Cold Email Strategy

Frame Over Structure: The Only Variable We Change Between Tests

11 min read

Mitchell Keller

Founder & CEO, LeadGrow · Managed 3,626+ cold email campaigns. 6.74% average reply rate. Booked 2,230+ meetings in 2025.

TL;DR

  • **Most teams A/B test copy mechanics** (subject lines, CTAs, length). These rarely move the needle more than 0.5 to 1%.
  • **Frame = how you position the same offer to the same person.** Problem-focused, peer-comparison, and tactical-methodology are three frames for the same service.
  • **Changing the frame can swing reply rates by 3 to 5x.** We've seen the same offer go from 1.8% to 9.2% by changing nothing but the frame.
  • **Testing hierarchy: Frame > offer angle > audience segment > copy mechanics.** Test in this order or waste months.
  • **We test 24 to 48 variants in the first month** of every engagement. Most of those are frame tests, not word tests.


Everyone Tests. Almost Nobody Tests the Right Thing.

You've probably run A/B tests on your cold emails. Subject line A versus subject line B. Short email versus long email. Soft CTA versus hard CTA.

And you probably got results that looked like noise. One version wins by 0.3%. The other wins next week. Nothing feels conclusive because it isn't.

That's because you're testing the garnish while ignoring the recipe.

The variable that moves cold email performance isn't the subject line or the CTA or the email length. It's the frame. How you position the same offer to the same person. Change the frame and reply rates swing by 3 to 5x. Change the subject line and you might move 0.5%.

We call this "frame over structure." It's the principle that guides every test we run across 3,626+ campaigns. The concept shows up throughout our stealth offers methodology, where the winning offer emerges from frame testing, not brainstorming.

What Is a Frame?

A frame is the lens through which you present your offer. Same service, same audience, different angle of approach.

Imagine you sell outbound lead generation services. Your offer is the same regardless of how you describe it. But the frame changes everything about how the recipient receives it.

Frame 1: Problem-Focused

"Are you still relying on referrals and inbound for most of your pipeline? Most B2B companies we talk to have inconsistent lead flow because they don't have a repeatable outbound system. We build that system. One client went from 3 meetings per month to 12 in 60 days."

Frame 2: Peer-Comparison

"Your competitors in [industry] are booking 8 to 12 meetings per week through multi-channel outbound while most companies still rely on inbound alone. We work with companies like [similar company] to build that same engine. They hit 83 meetings in their first 90 days."

Frame 3: Tactical-Methodology

"We find prospects in buying situations (not just buying roles) and reach them across email plus LinkedIn before they start searching for solutions. It's a different approach than targeting demographics or chasing intent signals. One client went from 1.2% to 12.4% reply rates after switching to situation-based targeting."

Same company. Same service. Same price. Three completely different emotional triggers.

Frame 1 hits the pain of inconsistency. Frame 2 hits the fear of falling behind. Frame 3 hits the curiosity about a better methodology.

Each frame resonates with a different psychological state. And the psychological state of your audience on any given day determines whether they reply or delete.

Real Example: Same Offer, 3 Frames, Wildly Different Results

We ran this test for a B2B SaaS client selling a sales engagement platform. Same list (VP of Sales at mid-market SaaS companies, 50 to 200 employees). Same sender. Same sending schedule. Same subject line format. Only the frame changed.

| Frame | Core Angle | Reply Rate | Meeting Rate |
| --- | --- | --- | --- |
| Problem-focused | "SDR team underperforming?" | 4.2% | 1.8% |
| Peer-comparison | "How [competitor] books 3x more meetings" | 9.2% | 4.1% |
| Tactical-methodology | "The signal-based approach to outbound" | 5.7% | 2.4% |

The peer-comparison frame outperformed the problem-focused frame by 2.2x on reply rate and 2.3x on meetings. Same offer. Same list. Same week. The only thing that changed was the frame.
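Those lift figures are simple ratios of the rates in the table. A quick sketch of the arithmetic (rates copied from the test above):

```python
# Reply and meeting rates (in %) from the three-frame test described above.
frames = {
    "problem-focused":      {"reply": 4.2, "meeting": 1.8},
    "peer-comparison":      {"reply": 9.2, "meeting": 4.1},
    "tactical-methodology": {"reply": 5.7, "meeting": 2.4},
}

baseline, winner = frames["problem-focused"], frames["peer-comparison"]

reply_lift = winner["reply"] / baseline["reply"]        # 9.2 / 4.2
meeting_lift = winner["meeting"] / baseline["meeting"]  # 4.1 / 1.8

print(f"reply lift: {reply_lift:.1f}x")      # reply lift: 2.2x
print(f"meeting lift: {meeting_lift:.1f}x")  # meeting lift: 2.3x
```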

If this team had been testing subject line variations within the problem-focused frame, they might have improved from 4.2% to 4.8%. Maybe. Instead, by testing frames, they jumped to 9.2%.

That's the difference between testing the right variable and testing the easy variable.

Why Most Teams Test the Wrong Things

Copy Mechanics Feel Productive

It's easy to swap "Quick question" for "Thought on [company]" and feel like you're optimizing. You're making changes. You're running tests. You're looking at data. The problem is you're making changes that don't matter enough to produce meaningful differences. You're optimizing the color of the button when the page doesn't have the right headline.

Frames Require Strategic Thinking

Coming up with three genuinely different frames for the same offer is harder than writing three variations of the same email. It requires understanding your audience's psychology, their competitive landscape, and their internal motivations. Most teams default to what's easier.

Frame Tests Take Longer to Run

Each frame needs its own email body, its own flow, sometimes its own subject line. That's more work upfront. But the payoff is a 3 to 5x improvement versus a 0.5% improvement. The math isn't close.

The Testing Hierarchy

Here's the order in which variables matter for cold email performance, based on running 3,626+ campaigns.

Level 1: Frame (Test This First)

How you position the offer. Problem-focused vs. peer-comparison vs. tactical-methodology vs. curiosity-driven. This is where 60 to 70% of performance variation lives.

Expected impact: 2 to 5x reply rate difference between frames

Level 2: Offer Angle (Test This Second)

The specific benefit you emphasize within the winning frame. If peer-comparison wins, do you lead with "your competitors are booking 3x more meetings" or "your competitors are using situation-based targeting"? Both are peer-comparison frames. Different offer angles within that frame.

Expected impact: 1.5 to 2x reply rate difference between angles

Level 3: Audience Segment (Test This Third)

Which slice of your list responds best to the winning frame plus angle. VPs of Sales at 50 to 100 person companies versus 100 to 200. SaaS versus professional services. Companies that recently raised funding versus bootstrapped. Our situations beat markets guide covers how to build these segments using signal combinations.

Expected impact: 1.3 to 2x reply rate difference between segments

Level 4: Copy Mechanics (Test This Last)

Subject lines, CTAs, email length, personalization depth, send time. This is where most teams start. It should be where they finish.

Expected impact: 0.5 to 1.2x reply rate difference between variations

If you test copy mechanics before frames, you're polishing a strategy that might be fundamentally wrong. Find the right frame first. Then optimize the copy within it.

How to Generate Frame Variants

Every offer can be framed through at least four lenses. Here's how to find them.

Step 1: List the Emotional Triggers

What emotions does your offer tap into? Fear of falling behind? Frustration with the status quo? Curiosity about a new approach? Relief from a painful process?

Write down every emotional state your ideal customer might be in when they'd want your product.

Step 2: Map Each Trigger to a Frame

| Emotional Trigger | Frame Type | Opening Angle |
| --- | --- | --- |
| Fear of falling behind | Peer-comparison | "Your competitors are doing X" |
| Frustration with status quo | Problem-focused | "Still dealing with X?" |
| Curiosity about better methods | Tactical-methodology | "There's a different approach to X" |
| Pride in being cutting-edge | Innovation-focused | "The top 5% of teams do X differently" |
| Anxiety about risk | Risk-focused | "What happens to X when you scale?" |
| Desire for efficiency | ROI-focused | "Cut X by 70% without losing Y" |

Step 3: Write One Email Per Frame

Don't write three versions of the same email. Write three genuinely different emails, each built around a different frame. The structure should be different. The opening should be different. The proof point you cite should be different (matching the frame).

Step 4: Test Simultaneously

Run all frames against comparable audience segments in the same week. 200 to 300 sends per frame. Measure reply rate after 7 days (including follow-ups). Kill the bottom 1 to 2. Scale the winner.
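The kill-and-scale step amounts to ranking frames by reply rate after the 7-day window. A minimal sketch of that pass; the frame names and counts below are hypothetical, purely for illustration:

```python
# Hypothetical frame-test results: sends and replies per frame after 7 days.
results = {
    "problem-focused":      {"sends": 280, "replies": 12},
    "peer-comparison":      {"sends": 275, "replies": 25},
    "tactical-methodology": {"sends": 290, "replies": 16},
    "roi-focused":          {"sends": 268, "replies": 9},
}

# Rank frames from best to worst reply rate.
ranked = sorted(
    results.items(),
    key=lambda kv: kv[1]["replies"] / kv[1]["sends"],
    reverse=True,
)

# Scale the winner; kill the bottom two.
winner, *_, second_worst, worst = ranked
print("scale:", winner[0])
print("kill:", second_worst[0], "and", worst[0])
```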

Frame Testing in Practice: Our Process

Here's exactly how we run frame tests for clients at LeadGrow.

Week 1: Research the client's offer, ICP, and competitive landscape. Generate 3 to 4 frame variants. Build separate email sequences for each frame (2 emails per sequence). Launch all simultaneously.

Week 2: First data comes in. We need 200+ sends per frame for statistical reliability. Preliminary patterns emerge but we don't make decisions yet.

Week 3: With 500+ sends per frame, we have clear signal. Kill the bottom 1 to 2 frames. The winning frame becomes the foundation.
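Whether 500 sends per frame is "clear signal" depends on the size of the gap. One way to sanity-check that a lead is real before killing frames is a standard two-proportion z-test (ordinary statistics, not a LeadGrow-specific method); the reply counts below are hypothetical:

```python
from math import sqrt

def two_proportion_z(replies_a, sends_a, replies_b, sends_b):
    """z-score for the difference between two reply rates."""
    p_a = replies_a / sends_a
    p_b = replies_b / sends_b
    # Pooled proportion under the null hypothesis of equal rates.
    p = (replies_a + replies_b) / (sends_a + sends_b)
    se = sqrt(p * (1 - p) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical: 46 replies on 500 sends vs. 21 replies on 500 sends.
z = two_proportion_z(46, 500, 21, 500)
print(f"z = {z:.2f}")  # |z| > 1.96 is significant at roughly the 95% level
```

With these numbers z comes out above 3, so the gap is very unlikely to be noise; a gap of a few replies on the same volume would not clear the 1.96 bar, and the test should keep running.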

Week 4: Test offer angle variations within the winning frame. "Your competitors are booking 3x more meetings" versus "Your competitors switched to situation-based targeting" (both peer-comparison, different angles).

Month 2+: Optimize copy mechanics within the winning frame plus angle combination. Now subject line tests, CTA tests, and length tests actually move the needle because you're optimizing within a proven positioning.

We test 24 to 48 variants in the first month of every engagement, following our sprint, test, scale methodology. The majority are frame and offer angle tests. By month 2, we've usually found a combination that produces 2 to 3x better results than whatever the client was running before.

Three Frame Variants for Common B2B Offers

To make this concrete, here are frame variants for three common offer types.

Offer: Marketing Agency Services

Problem frame: "Are you getting leads from content but struggling to turn them into meetings?"

Peer frame: "Companies like [competitor] are converting 15% of content leads into meetings through structured follow-up sequences."

Methodology frame: "We use a 5-touch, multi-channel cadence that turns content engagement into booked calls within 48 hours."

Offer: Sales Enablement Software

Problem frame: "Is your sales team spending more time in the CRM than on calls?"

Peer frame: "Top-performing sales teams in SaaS spend 70% of their day selling. Most spend 35%. The difference is their enablement stack."

Methodology frame: "We auto-log calls, surface buying signals, and pre-build proposals so reps sell instead of doing admin. One team went from 35% selling time to 72%."

Offer: Data Enrichment Tool

Problem frame: "How much of your pipeline dies because contact data is wrong or missing?"

Peer frame: "Your competitors are enriching every lead with 40+ data points before the first email. You're sending to [firstname]@[company].com and hoping."

Methodology frame: "We waterfall 12 data providers to get 95% coverage on email, phone, title, and tech stack. One client went from 62% to 97% data completeness."

Why Frame Testing Saves Months

Consider two teams running cold email campaigns.

Team A tests copy mechanics first. They spend 4 weeks testing subject lines, find a winner, then test CTAs for another 4 weeks, then test email length. After 3 months, they've optimized their 3% reply rate to 4.1%. Improvement, but not transformational.

Team B tests frames first. They spend 3 weeks testing 3 frames and find that the peer-comparison frame wins at 8.4% versus 3.2% for the problem frame. They spend weeks 4 to 6 optimizing within the winning frame. After 6 weeks (half the time), they're at 9.8%.

Team B got to a better result in half the time. Not because they're smarter. Because they tested the variable that matters most, first.

If you start with copy mechanics on the wrong frame, you're stuck optimizing inside a ceiling. You might get that 3% to 4%. But you'll never know that a different frame would have started you at 8%.

Frame first. Structure second. That's the entire methodology.

Applying This Beyond Cold Email

Frame over structure isn't just a cold email concept. It applies to any persuasive communication.

  • **Landing pages:** test the hero headline framing before testing button colors.
  • **LinkedIn content:** test whether your audience responds to "here's what I learned" frames versus "here's what everyone gets wrong" frames.
  • **Sales calls:** test whether opening with a pain question outperforms opening with a case study.

The principle is universal. The variable that changes how someone perceives your offer (the frame) will always outperform the variable that changes how the offer is delivered (the structure).

At LeadGrow, we average 6.74% reply rates across all campaigns. That number doesn't come from magic words or secret subject lines. It comes from finding the right frame for each market through systematic testing, then optimizing everything else within that frame.

Stop testing subject line variations. Start testing how you position the offer. The frame is the unlock. If you want to see how frame testing applies to the entire cold email copywriting process, we break down all 9 techniques that compound together.


Want us to run this playbook for you?

Book a strategy call and we'll show you how these frameworks apply to your business.

Book Strategy Call