Published in Artificial Intelligence

What happens when you run an AI agent for LinkedIn outreach?

By Altamira team

The article outlines practical steps for automating business processes - LinkedIn outreach campaign with AI agent. Inside, we break down the real performance data: acceptance rates, interest rates, campaign variance, and uncover what actually drives results in B2B outreach.

Introduction

Recently, we built an AI agent to handle our LinkedIn lead generation. Not to assist with it or help optimize it, but to actually run it. The agent managed everything: targeting, messaging, connection requests, follow-ups, all of it.

Two months later, we had data from over 6,000 connection requests across 40 different campaigns. And what we learned wasn't really about LinkedIn at all.

The experiment: Automated LinkedIn outbound

The setup was quite simple. Nine team members agreed to let an AI agent use their LinkedIn profiles for outbound sales. Each profile would run multiple campaigns targeting different industries, job titles, and geographies. The AI handled the entire workflow.

It's obvious that AI can send nicely personalized messages faster than humans can. What we wanted to see was whether AI could learn what actually works in B2B outreach. Could it find patterns we'd miss? Could it adapt messaging based on what was landing?

Between mid-December 2025 and early February 2026, the agent ran 40 campaigns. Some targeted legacy modernization opportunities. Others pitched AI solutions to clinics, legal firms, and defense contractors. A few focused on outstaffing. The agent sent connection requests, tracked acceptance rates, monitored replies, and logged every data point.

Here's what happened.

The numbers tell half the story: Building relationships

Out of 6,394 connection requests, 1,026 people accepted. That's a 16% acceptance rate, which sounds reasonable until you realize the range was 8% to 30% depending on the campaign.

From those 1,026 connections, we got 61 interested responses. Four turned into qualified opportunities. Do the math, and you're looking at roughly a 6% interest rate from accepted connections, or about 1% from total automated outreach.
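The funnel above reduces to three divisions. A minimal sketch using the totals from our experiment (variable names are illustrative):

```python
# Campaign totals from the experiment described above.
requests_sent = 6394
accepted = 1026
interested = 61
qualified = 4

acceptance_rate = accepted / requests_sent     # share of requests accepted
interest_rate = interested / accepted          # share of connections that replied with interest
overall_interest = interested / requests_sent  # interest as a share of all outreach

print(f"acceptance: {acceptance_rate:.1%}")
print(f"interest (of accepted): {interest_rate:.1%}")
print(f"interest (of all sent): {overall_interest:.1%}")
```

Each stage of the funnel compounds: a 16% acceptance rate multiplied by a ~6% interest rate leaves about 1% of total outreach showing interest.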

Those numbers are fine: not great, not terrible. What mattered was the variance.

Some campaigns got 8.7% of connections to express interest. Others got zero. Same AI, same company, same value proposition. The difference was everything else.

What actually drove results: LinkedIn automation

Altamira’s Business Development Manager ran a campaign targeting product managers at companies with 10+ year old legacy systems. Interest rate: 8.7%. A week earlier, he'd run a campaign pitching AI agents to a broader tech audience. Interest rate: 0.2%.

The legacy modernization campaign worked because it led with a problem people already knew they had. The AI agent pitch required explaining why someone should care about a technology they weren't actively looking for.

That pattern held across everything. Our latest DefTech campaign pulled a 28% acceptance rate and a 7.3% interest rate by targeting 200 highly specific decision-makers, while a broad Middle East tech campaign sent to 522 people generated 0.2% interest.

The AI wasn't the variable; the targeting was.

We also ran three campaigns for legal firms. The first two pitched "AI agents for legal work" and got 8-10% acceptance with almost no interest. The third pitched "AI copilot for proposal writing," and acceptance jumped to 24% with 2.3% interest. Same audience, same AI tool, but different framing of the problem.

The part we didn't expect: key insights

Around week four, something changed. Acceptance rates across all profiles started climbing, even though we hadn't changed the messaging. The AI was still sending the same connection requests.

What changed was profile age. LinkedIn's algorithm rewards accounts with consistent activity and established networks. Fresh profiles running cold outreach get filtered more aggressively. Profiles with 30+ days of activity and growing networks get better visibility.

The AI couldn't hack its way around that, so it just had to wait.

We also noticed campaign fatigue around message 10 in each sequence. Messages 1-3 generated 80% of all responses. Messages 7-10 got almost nothing. Either people were interested in the first three messages, or they tuned out completely.

That's a human behavior pattern, not an AI limitation. If your first three touchpoints don't resonate, message 47 won't save you.
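The fatigue pattern is easy to check if the agent logs which message in the sequence each reply came after. A minimal sketch, with hypothetical reply tallies shaped like the pattern we saw (the real per-message counts weren't published):

```python
# Hypothetical replies by message position in a 10-message sequence.
# Illustrative numbers only; they mirror the "messages 1-3 generate
# 80% of responses" pattern, not our actual logs.
replies_by_position = {1: 28, 2: 14, 3: 6, 4: 4, 5: 3, 6: 2, 7: 1, 8: 1, 9: 1, 10: 0}

total = sum(replies_by_position.values())
early = sum(n for pos, n in replies_by_position.items() if pos <= 3)
late = sum(n for pos, n in replies_by_position.items() if pos >= 7)

print(f"share of replies from messages 1-3: {early / total:.0%}")
print(f"share of replies from messages 7-10: {late / total:.0%}")
```

Running this kind of tally per campaign is what tells you where to truncate the sequence instead of guessing.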

Where AI helped a lot

The AI didn't magically write better copy than a human. It didn't find some secret LinkedIn algorithm hack. What it did do was run 40 campaigns in parallel without getting tired, forgetting to follow up, or making emotional decisions about which leads to prioritize.

For example, we ran six outstaffing campaigns over eight weeks. Acceptance rates ranged from 14% to 28%. Interest rates stayed between 0% and 4.5%. A human sales team would have gotten discouraged and probably stopped after the third campaign. The AI kept going, collected data, and eventually found that week 5 campaigns in that vertical performed better than week 1.

Why? We still don't know. But we wouldn't have seen the pattern without the volume.

The AI also caught things we would have missed, like the fact that acceptance rate and interest rate don't correlate. Our tender campaign had a 20% acceptance rate but a 14% interest rate. That's rare. Most campaigns with 20% acceptance got 0-2% interest.

High acceptance usually means your targeting is broad. High interest means your message resonates. When both are high, you've found product-market fit for that campaign. And our AI agent automatically flagged that combination.
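The flagging logic itself is simple. A sketch of one way to do it (campaign names are from the figures above, but the thresholds and function are illustrative, not the agent's actual rule):

```python
# Per-campaign stats; acceptance and interest are fractions of requests
# sent and connections accepted, respectively.
campaigns = [
    {"name": "DefTech", "acceptance": 0.28, "interest": 0.073},
    {"name": "Tender", "acceptance": 0.20, "interest": 0.14},
    {"name": "Broad ME tech", "acceptance": 0.12, "interest": 0.002},
]

def flag_fit(campaign, min_acceptance=0.20, min_interest=0.05):
    """Flag campaigns where both targeting (acceptance) and message (interest) land."""
    return campaign["acceptance"] >= min_acceptance and campaign["interest"] >= min_interest

flagged = [c["name"] for c in campaigns if flag_fit(c)]
print(flagged)  # DefTech and Tender clear both thresholds
```

The point of automating the flag is that it fires across all 40 campaigns at once, instead of depending on someone eyeballing a spreadsheet.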

What we got wrong

We ran too many campaigns too fast. Forty campaigns in eight weeks sounds like rapid iteration, but it's actually the opposite. We couldn't analyze results quickly enough to adjust messaging or targeting mid-campaign. By the time we understood what worked, the campaign was over.

A human operator would have slowed down after week three. The AI just kept launching campaigns because that's what we told it to do.

We also assumed geography wouldn't matter much for digital services. Wrong. Middle East and Asia campaigns consistently underperformed. Acceptance rates dropped to 10-12%. Interest rates barely broke 1%. Meanwhile, European defense campaigns were pulling 28% acceptance and 7% interest.

The AI didn't flag this early enough because we hadn't told it to weight geographic campaign performance separately. That's a human oversight, not an AI limitation.

What this means for AI in sales

The real lesson here isn't about LinkedIn. It's about what AI can and can't do in complex human processes like sales.

AI is great at executing processes, maintaining consistency, and spotting patterns in large datasets. It ran 40 campaigns without a single missed follow-up. It tracked every metric we asked for. It didn't get tired, discouraged, or distracted.

But AI doesn't understand context the way humans do. It didn't know that pitching AI agents to legal firms would be harder than pitching legacy modernization to product managers. It couldn't intuit that defense contractors want different messaging than healthcare clinics. We had to run the experiments to generate that data.

The best results came when we combined AI execution with human insight. The AI ran campaigns. Humans analyzed patterns and adjusted targeting. The AI sent 6,000+ messages without complaining. Humans decided which 200 prospects actually mattered.

That's probably the model going forward. AI handles the repetitive execution and data collection. Humans make judgment calls about strategy, positioning, and which patterns actually matter.

What we're doing now

We're running 12 campaigns instead of 40. Each one targets under 300 prospects. Each one has messaging built around a specific problem we've already solved for similar companies.

The AI still manages execution. But now we're spending more time upfront defining who we're targeting and why. We're treating the AI like a very fast, very consistent operator who needs clear instructions.

If you're thinking about using AI for automating your business processes, start with this question: Do you know what good looks like? Because if you don't, the AI will just do the wrong thing faster.

Get in touch to identify new opportunities for AI automation!
