
The AI Trust Deficit

This is Part 1 of a series on MonitorIntent. It covers our first product, why AI-written outreach failed, and the pivot to intent-based lead generation.

The first product we built at MonitorIntent was clever - maybe too clever for its own good.

We’d written a tool that recursively crawled LinkedIn’s “more profiles like this” feature to find semantically similar profiles matching a given ICP. Then it wrote hyper-personalized AI messages to each of them, pulling details straight from their profiles.
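The crawl itself was conceptually simple: a breadth-first walk over the "similar profiles" graph, keeping anything that matched the ICP. A minimal sketch, where `similar_profiles` and `matches_icp` are hypothetical stand-ins for the real scraping and ICP-matching logic:

```python
from collections import deque

def crawl_similar(seed, similar_profiles, matches_icp, max_profiles=50):
    """Breadth-first crawl over 'more profiles like this' links.

    `similar_profiles(p)` returns profiles LinkedIn suggests next to p;
    `matches_icp(p)` decides whether p fits the ideal customer profile.
    Both are placeholders for the actual implementation.
    """
    seen = {seed}
    queue = deque([seed])
    matched = []
    while queue and len(seen) < max_profiles:
        profile = queue.popleft()
        if matches_icp(profile):
            matched.append(profile)
        for nxt in similar_profiles(profile):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return matched
```

The `max_profiles` cap matters in practice: without it, a crawl like this fans out indefinitely through near-duplicate profiles.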

The messages were realistic. That was actually the problem.

AI writing wasn’t the issue. AI judgment was.

The content didn’t sound like AI. It sounded like a human who hadn’t done their homework.

The models would make qualitative errors that no attentive person would make. Things like congratulating someone on a “new role” they’d been in for a year. Small mistakes, but the kind that make you look careless - or worse, automated.

Our customers didn’t trust the AI writing. Full stop. They weren’t worried about grammar or tone. They were worried about brand reputation - and what counted as “acceptable” was entirely subjective. Every customer had different boundaries on what topics to broach, what tone to strike, what was too forward. There was no universal standard to optimize for. One bad message to the wrong prospect and you look like a spam operation.

So we’d end up regenerating messages en masse, cycling through rounds of customer feedback until the output was… “OK.” Not great. Not even good. Just OK enough that they’d reluctantly hit send.

That’s just manual outreach with extra steps - not exactly a good sign for a new product.

The missing ingredient: intent

There was a deeper problem underneath the writing quality issues.

None of these people had intent. They matched an ICP on paper - right industry, right title, right company size. But matching a profile isn’t the same as catching someone in a buying moment. We were sending polished messages to people who likely had zero reason to care.

Multiple customers asked us about intent signals. Not one or two - many of them. That’s when it clicked: the targeting mattered more than the messaging.

The pivot

We decided to flip the whole approach. Instead of writing AI messages to a broad list of profile-matched leads, we’d:

  1. Find the few leads who actually showed buying intent
  2. Skip the AI-written outreach entirely
  3. Use AI to build rich dossiers on those high-intent prospects instead

Let the salespeople write their own messages. Give them better targets and better context to work with.

It felt right. It addressed every complaint we’d heard. So before building anything, we did the responsible thing - we talked to people.

Why good people lie

We sat down with roughly 10 lead generation industry experts. These were generous people who gave us their time, and we’re genuinely grateful for that.

They gave us the lay of the land. How they find prospects, what tools they use, where the gaps are. All useful background.

But here’s what they couldn’t give us: honest product feedback.

If you’ve read The Mom Test, you already know why. When someone is sitting across from a couple of optimistic startup founders, the last thing they want to do is cast aspersions on the path you’re trying to walk. So they agree. They nod. They say “yeah, that could be interesting.”

One of our interviewees showed interest in our first idea - the AI outreach tool. A couple months later, when we pitched the intent-based pivot, they said something like: “Wow, this is so much better than your first idea. That one was kind of weird.”

When we later offered them the product for free, we never heard from them again.

100+ conversations, 5 months

In total, our CEO probably talked to 100+ people over 5 months. In person, over calls, at events. That’s a lot of conversations. It felt like validation.

But volume of conversations isn’t the same as quality of signal. Most people will tell you what you want to hear, especially if you’re buying the coffee.

The real test was always going to be: will they buy it?

Post 1 of 4 in On MonitorIntent
