The 6 Step Learning Framework That Applies to Cold Email and Everything Else
Mitchell Keller
Founder & CEO, LeadGrow · Managed 3,626+ cold email campaigns. 6.74% average reply rate. Booked 2,230+ meetings in 2025.
TL;DR
- The 6 step learning framework works for cold email, GTM, fitness, gaming, and anything else you want to get good at fast.
- The steps: deconstruct, define success, track inputs, study the top 1%, accept the suck, optimize for iteration speed.
- Iteration speed is the ultimate moat. 12 tests per week beats 2 tests per week by 6x in learning velocity.
- We find winning cold email campaigns in 48 hours by running 12 variants across 3 angles on day 1.
- The meta always shifts. The skill is not knowing the current answer. The skill is finding the next answer faster than everyone else.
I was top 250 in League of Legends in North America. Out of millions of players. And the learning framework that got me there is the exact same one we use to find winning cold email campaigns in 48 hours.
This is not a metaphor. The structure is identical. And it applies to GTM strategy, fitness, sales, content creation, and anything else where getting better faster is the whole game.
Miyamoto Musashi wrote "from one thing, know ten thousand things." Skills transfer across domains. What you learn deeply in one place compounds everywhere else. The 6 step framework is how you learn deeply in any place.
The Gaming Origin Story (and Why It Matters for Your Learning Framework)
League of Legends has 160+ characters, thousands of item combinations, constantly shifting game balance patches, and a meta that changes every two weeks. The players who stay at the top are not the ones who memorize the current best strategy. They are the ones who figure out the next best strategy faster than everyone else.
Top 250 meant I was processing information, adapting to change, and executing under pressure faster than 99.99% of the player base. Not because I was more talented. Because I had a system for learning that compressed weeks of improvement into days.
When I started running cold email campaigns, I realized the game was identical. The "meta" shifts constantly. What worked 6 months ago does not work today. Apollo's data quality is declining. AI is changing how buyers search. Signal based targeting is replacing firmographic targeting. The teams that adapt fastest win.
We typically find a winning cold email angle within 48 hours of launch. Most agencies take 4 to 6 weeks. The difference is not talent. It is iteration speed.
LeadGrow internal data, 2025
The 6 Step Learning Framework
This framework works for any skill. I have used it for competitive gaming, powerlifting, cold email copywriting, sales methodology, and building a GTM agency. The steps do not change. The domain does.
Step 1: Deconstruct the Skill
Break the complex thing into its smallest learnable components. Cold email is not one skill. It is targeting, copywriting, infrastructure management, deliverability, offer design, CTA optimization, and follow up sequencing. Each is a separate skill with separate feedback loops.
Most people try to get better at "cold email" as a whole. That is like trying to get better at "basketball" without separating shooting, dribbling, defense, and court vision. You end up mediocre at everything instead of excellent at the pieces that matter most.
In League of Legends, I deconstructed the game into laning phase, team fighting, map awareness, champion mechanics, and item optimization. Each got separate practice time.
Step 2: Define What Success Looks Like
Set specific, measurable targets for each component. Not vague goals like "get better at copywriting." Specific ones like "achieve 8% positive reply rate on first step emails within 500 contacts."
For cold email, our success metrics are clear.
| Metric | Target | What It Tells You |
|---|---|---|
| Reply rate | 2 to 8% | Is the message getting attention? |
| Positive reply rate | 8%+ of replies | Is the offer landing? |
| Booking rate | Varies by market | Is the worldview alignment right? |
| Close rate | Varies by offer | Are these ready-to-buy leads? |
Without specific targets, you cannot tell if you are improving. You are just doing stuff and hoping.
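The targets in the table above can be expressed directly as code. A minimal sketch, assuming hypothetical campaign counts (the function names and sample numbers are illustrative, not from LeadGrow's actual tooling):

```python
def campaign_metrics(sent: int, replies: int, positive: int) -> dict:
    """Compute the funnel metrics from raw counts."""
    return {
        "reply_rate": replies / sent,
        "positive_reply_rate": positive / replies if replies else 0.0,
    }

def meets_targets(m: dict) -> dict:
    """Check each metric against the targets from the table."""
    return {
        "reply_rate": 0.02 <= m["reply_rate"] <= 0.08,          # 2 to 8% of sends
        "positive_reply_rate": m["positive_reply_rate"] >= 0.08,  # 8%+ of replies
    }

# Example: 500 contacts, 20 replies, 3 positive
m = campaign_metrics(sent=500, replies=20, positive=3)
print(m)                 # reply_rate 0.04, positive_reply_rate 0.15
print(meets_targets(m))  # both True
```

The point is not the arithmetic. It is that a target like "8% positive reply rate within 500 contacts" is checkable by a machine, which means you can tell in one glance whether a component is improving.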
Step 3: Track Your Inputs (Not Just Outputs)
Outputs are results. Reply rates, meetings booked, revenue. Inputs are the actions that produce those results. Emails sent, variants tested, angles explored, markets contacted.
Most teams only track outputs. They look at their reply rate once a week and either celebrate or panic. But they cannot explain why the number changed because they are not tracking what they actually did.
We track inputs obsessively. How many variants did we test this week? How many new angles? How many new markets? The team that runs 12 tests per week learns 6x faster than the team running 2 tests per week. That learning velocity is the real output.
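Input tracking can be as simple as a weekly log. A hedged sketch, with illustrative field names (not LeadGrow's actual schema), showing where the 6x figure comes from:

```python
from dataclasses import dataclass

@dataclass
class WeeklyInputs:
    """Actions taken this week, not results observed."""
    variants_tested: int
    new_angles: int
    new_markets: int

def learning_velocity(fast: WeeklyInputs, slow: WeeklyInputs) -> float:
    """Ratio of tests run per week between two teams."""
    return fast.variants_tested / slow.variants_tested

fast_team = WeeklyInputs(variants_tested=12, new_angles=3, new_markets=1)
slow_team = WeeklyInputs(variants_tested=2, new_angles=1, new_markets=0)
print(learning_velocity(fast_team, slow_team))  # 6.0
```

When the reply rate moves, this log is what lets you say "we changed the angle, not the market" instead of guessing.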
Step 4: Study the Top 1%
Find the people or campaigns that are producing outlier results and reverse engineer what they are doing differently. Not what they say they are doing. What they actually do.
In our data, hyper specific campaigns (targeting specific state regulations, named deadlines, exact situations) produce 21.7% reply rates. Broad campaigns targeting "all businesses" produce 2.9%. That is a 7x difference. The top 1% of our campaigns all share one trait: extreme specificity in both targeting and copy.
The pattern repeats everywhere. The top 1% of gamers study replays frame by frame. The top 1% of cold emailers study winning copy word by word. The top 1% of athletes study film rep by rep. The method is universal.
Step 5: Accept the Suck
The first 20 hours of any new skill are terrible. You will be bad. Your campaigns will underperform. Your copy will be generic. Your targeting will be off.
This is normal. It is not a sign that cold email does not work or that you are not cut out for it. It is the price of entry. In gaming they call it "feeding" because you are dying repeatedly while learning. In cold email, you are burning through contacts while finding what resonates.
The teams that quit during the suck phase never find their winning angle. The teams that push through it with systematic testing find winners in weeks.
Step 6: Optimize for Iteration Speed
This is the one that separates amateurs from professionals. Everything else is table stakes. Iteration speed is the moat.
In gaming, the player who plays 10 ranked games per day with intentional review improves faster than the player who plays 2 games and watches a coaching video. Volume of deliberate practice, combined with fast feedback loops, is the formula.
In cold email, this translates directly to our testing methodology.
Day 1: Launch 12 variants across 3 angles (4 variants per angle).
Day 4: Winners and losers identified. Take winning principles, apply to losing variants.
Day 7: Second round of 12 variants incorporating learnings.
Day 14: Winning campaign identified. Begin scaling.
LeadGrow sprint phase methodology
Most agencies test 2 to 4 variants per month. We test 12 in the first week. By the end of month 1, we have run 24 to 48 tests. That is not a small advantage. That is a structural one.
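The day 1/4/7/14 cadence above can be sketched as a simple schedule generator. This is an illustrative sketch of the structure, not LeadGrow's internal tooling; the counts follow the timeline in the text:

```python
def sprint_schedule(angles: int = 3, variants_per_angle: int = 4) -> list[dict]:
    """Generate the test-sprint cadence: 12 variants on day 1, review, repeat."""
    total = angles * variants_per_angle  # 12 variants across 3 angles
    return [
        {"day": 1,  "action": f"launch {total} variants across {angles} angles"},
        {"day": 4,  "action": "identify winners; apply winning principles to losers"},
        {"day": 7,  "action": f"launch second round of {total} variants"},
        {"day": 14, "action": "pick winning campaign, begin scaling"},
    ]

for step in sprint_schedule():
    print(f"Day {step['day']}: {step['action']}")
```

Two full 12-variant rounds inside 14 days is what makes 24 to 48 tests in month 1 possible without changing headcount.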
Applying the Learning Framework to GTM Strategy
The framework scales beyond individual skills. It applies to your entire go to market approach.
Deconstruct: GTM is not one thing. It is ICP definition, channel selection, offer design, messaging, infrastructure, and sales process. Each needs separate iteration.
Define success: Not "grow revenue." Something like "book 8 to 12 qualified meetings per month from cold outbound within 90 days."
Track inputs: How many markets did you test? How many offer angles? How many infrastructure setups? Inputs predict outputs.
Study the top 1%: Find companies in your space that are crushing outbound. Reverse engineer their approach. What signals are they using? What offers? What channels?
Accept the suck: Month 1 of any GTM initiative is messy. The data will be noisy. The results will be inconsistent. That is the cost of finding what works.
Optimize for iteration speed: The company that tests 48 offer variants in 30 days will find market fit before the company that tests 4 variants in the same period.
The Meta Always Shifts
The biggest lesson from competitive gaming is that the meta always changes. The strategy that dominates this month will be average next month. The character that seems unbeatable gets nerfed. The build that carries you to the top gets countered.
Cold email is the same. Apollo's data quality is declining. AI is changing how buyers research solutions. Spam filters are getting smarter. The tactics that worked in 2024 are less effective in 2026.
The skill is not knowing the current answer. The skill is finding the next answer faster than everyone else. That is what the learning framework gives you. Not a static playbook. A dynamic system for continuous adaptation.
From one thing, know ten thousand things.
Want to see our iteration speed in action?
We run 24 to 48 offer tests in your first month to find what works. Most teams take 6 months to run that many tests. Book a strategy call and we will show you exactly how we would test your market.
Or check out our case studies to see what happens when iteration speed meets the right market.