build in public · conversion rate · landing page · growth experiments · funnel optimization

350 Visitors, 1 Signup, and a 0.29% Conversion Rate

Our landing page was converting at 0.29%. Here's how we diagnosed the funnel, what we found at each step, and what we changed. Real numbers, no theory.

by Nova Yu


TL;DR: We got 350 pageviews and 1 signup — a 0.29% conversion rate. Instead of guessing what to fix, we walked through the funnel step by step. Here’s what each layer told us.


The Number That Forced a Diagnosis

If you’re building in public, you collect data. Sometimes the data tells you something you don’t want to hear.

After three weeks of running CrossMind’s marketing — social posts, Reddit comments, direct outreach — we’d driven roughly 350 unique visitors to the landing page. From that traffic, exactly one person signed up for the waitlist.

0.29%.

For context: an average SaaS landing page converts at roughly 2–5%. We were an order of magnitude below "bad."

The instinct is to start tweaking buttons. Change the CTA color. Rewrite the headline. Add social proof. But those are guesses without a diagnosis. So we didn’t guess. We went layer by layer.

Layer 1: Where Was the Traffic Coming From?

First question: was the problem the page, or the audience?

We pulled PostHog data and segmented by source. The traffic breakdown was roughly:

  • Direct + organic: ~60%
  • Twitter/X links: ~25%
  • Reddit: ~10%
  • Other: ~5%
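The per-source segmentation above is simple enough to sketch in a few lines of Python. The row counts below just scale the rough percentages to 350 visitors, and attributing the single signup to Reddit is an assumption (the post only says the Reddit cohort converted best), so treat the numbers as illustrative:

```python
from collections import Counter

# Hypothetical per-visitor export: (source, signed_up). Counts scale the
# rough percentages from the post to 350 visitors; the signup's source
# is an assumption, not a confirmed data point.
visitors = (
    [("direct", False)] * 210                         # ~60% direct + organic
    + [("twitter", False)] * 88                       # ~25% Twitter/X
    + [("reddit", False)] * 34 + [("reddit", True)]   # ~10% Reddit, incl. the 1 signup
    + [("other", False)] * 17                         # ~5% other
)

def conversion_by_source(rows):
    """Return {source: (visitors, signups, rate)} from (source, signed_up) rows."""
    totals, signups = Counter(), Counter()
    for source, converted in rows:
        totals[source] += 1
        signups[source] += int(converted)
    return {s: (totals[s], signups[s], signups[s] / totals[s]) for s in totals}

stats = conversion_by_source(visitors)
overall = sum(s for _, s, _ in stats.values()) / sum(v for v, _, _ in stats.values())
print(f"overall: {overall:.2%}")  # 1 / 350 ≈ 0.29%
for source, (v, s, rate) in sorted(stats.items()):
    print(f"{source:8s} {v:4d} visitors  {s} signups  {rate:.2%}")
```

The point of splitting it out this way is that a single blended rate (0.29%) hides the fact that one segment is working and the others are not.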

The Reddit traffic came from our one successful comment — that cohort was high-intent and actually engaged. The Twitter traffic was mostly cold: people scrolling through a feed, seeing a link, clicking.

This told us something important: the conversion rate wasn’t uniform. It varied by source. Reddit visitors stayed longer, scrolled deeper, and converted at a higher rate. Twitter visitors bounced faster.

Diagnosis #1: The page wasn’t failing equally for everyone. It was failing for cold traffic — people who arrived without context about what CrossMind does or why they should care.

Layer 2: What Did Cold Visitors Actually See?

We looked at scroll depth and time on page for the low-converting segment. Most visitors — about 70% — scrolled past the hero section but never made it to the “How It Works” block.
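Bucketing visitors by how far they scrolled is the kind of thing you can do directly on a scroll-depth export. A minimal sketch, assuming section boundaries (hero ends around 25% of page height, "How It Works" starts around 55%) and depth readings that are purely illustrative:

```python
from collections import Counter

# Assumed section boundaries as fractions of page height. These are
# hypothetical positions, not measured from the actual page.
HERO_END = 0.25
HOW_IT_WORKS_START = 0.55

def funnel_stage(max_depth):
    """Bucket a visitor by the deepest point they scrolled to."""
    if max_depth < HERO_END:
        return "bounced in hero"
    if max_depth < HOW_IT_WORKS_START:
        return "past hero, missed How It Works"
    return "reached How It Works"

# Hypothetical max-scroll-depth readings, one per visitor, shaped to
# mirror the post's ~70% figure.
depths = [0.10, 0.30, 0.35, 0.40, 0.45, 0.30, 0.35, 0.70, 0.90, 0.40]
stages = Counter(funnel_stage(d) for d in depths)
for stage, count in stages.most_common():
    print(f"{stage}: {count}/{len(depths)}")
```

The useful output isn't the counts themselves but the middle bucket: visitors who cleared the hero and then stalled before the explanation, which is exactly where the page was losing people.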

The hero said something like “AI cofounder for growth.” If you didn’t already know what CrossMind was, that sentence raised more questions than it answered:

  • “AI cofounder” — cofounder of what? A company?
  • “for growth” — what kind of growth? Revenue? Users? Followers?

The next section listed features: content creation, competitor analysis, user interviews, legal document review. Four things. None of them were the thing our ICP actually needed — which was finding their first users.

Diagnosis #2: The hero spoke to us (what the product is) instead of to them (what problem it solves). And the feature list described a Swiss army knife, not a specific outcome.

Layer 3: The Positioning Mismatch

Here’s the uncomfortable truth: the landing page was written for a broader audience than our ICP.

Our real user — the vibe coder who shipped a product and has no idea how to get users — doesn’t need “competitor analysis” or “legal document review.” They need one thing: someone to tell them where their users are and help them get in front of those people.

The page was offering a general-purpose AI assistant. Our best user came from a thread where someone was literally asking “where do I start with marketing?”

Diagnosis #3: Feature breadth was killing conversion. More options don’t help someone who’s stuck — they add decision fatigue. The page needed to say one thing, clearly, for one person.

What We Changed

Three changes, in priority order:

1. Rewrote the hero. From “AI cofounder for growth” to something that named the problem directly: if you’ve built something and can’t find users, that’s what we solve. The new copy leads with the pain, not the product.

2. Killed the feature list. Replaced four generic capabilities with a four-step user journey: find where your users are → enter their conversations → automate the follow-up → track what’s working. Specific, sequential, outcome-oriented.

3. Sharpened the CTA. From “Get Early Access” (which tells you nothing about what happens next) to “Tell us what you’re building” (which is specific and starts the agent’s research process).

The Trilogy, Summarized

This post closes a loop we started documenting three weeks ago.

The meta-lesson: distribution is a system, not a channel. Fixing any one part (better outreach, better landing page, better onboarding) doesn’t work if the rest of the system is broken. We had to fix the pipeline end to end.

0.29% is where we started. The number is moving now. Not because of any single tweak, but because we stopped guessing and started diagnosing.


CrossMind is an AI cofounder that finds your first users. Tell us what you’re building, and we’ll figure out where to find the people who need it.
