MVP Launch Checklist: What to Do in the First 30 Days
Launching an MVP isn't just pressing "publish." The first 30 days are your highest-leverage window for learning — when users are newest, feedback is rawest, and the product is still malleable. Most founders waste this window by either over-preparing before launch or under-preparing for what comes after.
This checklist covers both.
Week 0: Before You Launch
Technical checklist
- Error tracking is set up (Sentry or an equivalent)
- Uptime monitoring is configured (Better Uptime, UptimeRobot — both have free tiers)
- Analytics is installed and tracking page views and key events
- You've run the full user flow end-to-end on a real device, not just localhost
- Email delivery is tested (sign-up confirmation, password reset)
- Payments are tested with real cards in production (do one real transaction yourself)
- The 404 and error pages are reasonable, not blank screens
- SSL is active on your custom domain
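Several of these checks can be scripted so you can rerun them right before launch. Here's a minimal sketch in Python; the URLs and expected statuses are placeholders for your own domain and routes, and the fetcher is passed in so the checks are easy to run against anything:

```python
from typing import Callable

# Each check: (name, url, expected HTTP status).
# These URLs are placeholders -- substitute your own domain and routes.
CHECKS = [
    ("homepage over SSL", "https://example.com/", 200),
    ("404 page", "https://example.com/definitely-missing", 404),
    ("health endpoint", "https://example.com/health", 200),
]

def run_checks(checks, fetch: Callable[[str], int]) -> list[str]:
    """Run each check through `fetch` (url -> status code).

    Returns the names of checks that failed, so an empty list
    means the launch checklist items above pass.
    """
    failures = []
    for name, url, expected in checks:
        try:
            status = fetch(url)
        except Exception:
            status = None  # network error, DNS failure, etc. counts as a failure
        if status != expected:
            failures.append(name)
    return failures

# A real fetcher could use the standard library, e.g.:
#   import urllib.request, urllib.error
#   def fetch(url):
#       try:
#           return urllib.request.urlopen(url, timeout=10).status
#       except urllib.error.HTTPError as e:
#           return e.code
```

Injecting the fetcher also means you can dry-run the script without touching the network, which keeps it usable in CI later.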
Communication checklist
- You have a list of 20–50 target users to notify at launch
- You've written a short launch message (3–4 sentences: what it is, who it's for, link)
- You have a way for users to give feedback (even just an email address or a Typeform)
- Support email or contact form is working
Expectations checklist
- You've defined what "successful launch" means in specific terms (not just "people use it")
- You have 3 specific questions you want this launch to answer
- You've accepted that v1 won't be perfect and that's fine
Week 1: Launch and First Responses
Launch actions
- Notify your personal network first — they're the most forgiving and most likely to give honest feedback
- Post in 2–3 relevant communities (Slack groups, Reddit, LinkedIn, Twitter) — don't spam everywhere
- Message the 20–50 specific targets directly — personalized outreach converts far better than broadcast
What to watch
- Where are users dropping off? (look at analytics funnel, not just total signups)
- What questions are people asking before they sign up? (these are objections your copy didn't answer)
- What's the first thing people do after signing up? (this tells you what they actually came for)
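One way to make the drop-off question concrete: count users at each funnel step and compute the step-to-step conversion, so the biggest drop stands out instead of hiding behind the total signup number. A minimal sketch (step names and counts below are illustrative, not real data):

```python
def funnel_dropoff(steps: list[tuple[str, int]]) -> list[tuple[str, float]]:
    """Given ordered (step, user_count) pairs, return the conversion
    rate from each step to the next. The lowest rate marks the
    biggest drop-off point."""
    rates = []
    for (name_a, count_a), (name_b, count_b) in zip(steps, steps[1:]):
        rate = count_b / count_a if count_a else 0.0
        rates.append((f"{name_a} -> {name_b}", round(rate, 2)))
    return rates

# Illustrative numbers only:
steps = [
    ("visited landing page", 400),
    ("started signup", 120),
    ("finished signup", 90),
    ("did key action", 30),
]
```

With these numbers, the worst conversion is landing page to signup start, which points at copy or positioning rather than the product itself.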
What to resist
- Resist fixing every piece of feedback immediately — collect first, then identify patterns
- Resist adding features before you understand why people aren't using the ones you have
- Resist comparing day 1 metrics to mature products
Week 2: First Real Learning
By now you should have some signal. The goal of week 2 is to understand it.
User interviews
Talk to at least 5 people who signed up — ideally by video call. Ask:
- What problem were you hoping this would solve?
- What did you try before finding this?
- Walk me through what you did after you signed up
- What was confusing or frustrating?
- Would you pay for this? Why or why not?
These conversations will tell you more than any analytics dashboard.
Prioritize by pattern
Take all feedback and look for patterns, not one-offs. If 8 people say the onboarding is confusing and 1 person asks for a dark mode, fix the onboarding.
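Tallying feedback by theme makes those patterns visible. A minimal sketch, assuming you've hand-tagged each piece of feedback with a short theme label (the tags below are hypothetical):

```python
from collections import Counter

def top_patterns(tagged_feedback: list[str],
                 min_mentions: int = 3) -> list[tuple[str, int]]:
    """Return themes mentioned at least `min_mentions` times,
    most common first. One-off requests drop out automatically."""
    counts = Counter(tagged_feedback)
    return [(theme, n) for theme, n in counts.most_common() if n >= min_mentions]

# Hypothetical tags from week-2 interviews:
feedback = (["onboarding confusing"] * 8
            + ["wants dark mode"]
            + ["pricing unclear"] * 3)
```

Even a tally this crude forces the decision the section describes: eight mentions of confusing onboarding beat one request for dark mode.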
Week 3: First Iteration
Based on week 2 learning, make 1–3 targeted improvements. Not a full rewrite — targeted changes to the highest-friction points.
Good week-3 improvements:
- Simplifying onboarding (fewer steps, clearer copy)
- Fixing the most common confusion point
- Improving the empty state (what does a new user see when they have no data yet?)
Poor week-3 improvements:
- Full redesign
- New features nobody asked for
- Performance optimizations for scale you don't have yet
Ship the iteration, then tell your week-1 users about it. "We listened to your feedback and made these changes" is a great re-engagement message.
Week 4: Decide What's Next
At the end of 30 days, you should be able to answer:
- Is there a core group of users who find real value in this? Even 10 genuinely happy users is a signal worth following.
- What's the clearest path to more users? Content? Referrals? Direct outreach? Paid?
- What one change would have the biggest impact? Not a wish list — one thing.
The goal of the first 30 days isn't growth. It's learning enough to know whether and how to grow.
The Mindset That Makes the Difference
The founders who learn the most from their MVP launches treat every piece of feedback as data, not criticism. They're curious, not defensive. They change plans based on evidence, not pride.
If your MVP doesn't get the response you hoped for, that's not failure — it's information. The best founders treat a bad launch as the most useful thing that could have happened.
If you're about to launch and want a second set of eyes on your setup, let's talk.