Campaign Operations
Design proper A/B tests for cold email: subject lines, opening lines, CTAs, send times, and sequences. Statistical rigor without the stats degree. Free Claude skill.
The Problem
Most "A/B testing" in cold email isn't testing at all. Teams change the subject line, the opening, the CTA, and the send time simultaneously, then declare whichever version gets more replies the "winner." That's not an A/B test — that's guessing with extra steps.
Real A/B testing requires changing one variable at a time, calculating the right sample size, running the test long enough to reach statistical significance, and correctly interpreting the results. Most sales teams have never taken a statistics course and don't have a data analyst on the team.
The result is wasted effort and wrong conclusions. A subject line gets declared "the winner" based on 30 sends when you needed 200 to have any confidence. Bad data leads to bad decisions.
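To make the "30 sends vs. 200 sends" point concrete, here is a minimal sketch of the standard two-proportion sample-size calculation, using only Python's standard library. The baseline and target open rates are illustrative assumptions, not figures from the skill itself.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p_base, p_target, alpha=0.05, power=0.8):
    """Approximate sends needed PER VARIANT to detect a lift in a rate
    (e.g. open rate) from p_base to p_target, two-sided test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    p_bar = (p_base + p_target) / 2
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p_base * (1 - p_base)
                                   + p_target * (1 - p_target))) ** 2
    return math.ceil(numerator / (p_target - p_base) ** 2)

# Assumed scenario: detect a 50% -> 60% open-rate lift
n = sample_size_per_variant(0.50, 0.60)  # 388 per variant, ~776 total
```

Detecting a 10-point lift already takes hundreds of sends per variant; detecting a 2-point lift takes thousands, which is why 30-send "tests" prove nothing.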
What It Does
A/B Test Designer creates structured test plans for cold email campaigns. It ensures you test one variable at a time, calculates the sample size needed for meaningful results, generates the variant copy, and tells you how to interpret the results.
Every output includes a hypothesis, the variable under test, the factors held constant, variant copy, step-by-step setup instructions, and the sample size required.
Example
[ab-test] Design an A/B test for my cold email campaign — I want to test whether a curiosity-based subject line or a relevance-based subject line gets more opens

Hypothesis: Curiosity-based subject lines (“quick thought”) will generate a higher open rate than relevance-based subject lines (“your sdr team”) for VPs of Sales at mid-market SaaS companies.
Variable Being Tested: Subject line only
Held Constant: Email body, CTA, send time, sender account, target segment
Variant A (Curiosity): quick thought
Variant B (Relevance): your sdr team
1. Create two identical campaigns in Instantly
2. Same lead list, randomly split 50/50
3. Same email body, same sender accounts, same schedule
4. Only difference: subject line
5. Run for minimum 8 days or until 760 sends total
6. Compare open rates — declare winner only if statistically significant
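Step 6 above asks whether the open-rate gap is statistically significant. A minimal sketch of that check is a two-proportion z-test; the open counts below are made-up illustrations, not skill output.

```python
from statistics import NormalDist

def z_test_two_proportions(opens_a, sends_a, opens_b, sends_b):
    """Two-sided z-test for a difference in open rates.
    Returns (z, p_value); declare a winner only if p_value < 0.05."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)  # pooled rate
    se = (p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical result after 760 sends, split 380/380:
z, p = z_test_two_proportions(210, 380, 175, 380)  # p ~ 0.01 -> significant
```

If the p-value comes back above 0.05, the honest conclusion is "no detectable difference yet," not "Variant A won by a little."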
Test type: A/B (single variable)
Variable: Subject line
Sample required: 760 sends total
Estimated duration: 8-15 days
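Step 2 of the setup calls for a random 50/50 split of the lead list. A minimal sketch (the fixed seed is an illustrative choice so the split is reproducible):

```python
import random

def split_leads(leads, seed=42):
    """Randomly split a lead list 50/50 into variant A and variant B."""
    shuffled = list(leads)
    random.Random(seed).shuffle(shuffled)  # seeded for reproducibility
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

variant_a, variant_b = split_leads(range(760))  # 380 leads per variant
```

Random assignment matters: splitting alphabetically or by list source can bake a hidden bias into one variant.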
Time to generate: ~10 seconds
Get Started
Open Claude Projects
Go to claude.ai and open your Projects section. Create a new project or use an existing one.
Add the skill instructions
Paste the skill prompt into your Project instructions. This teaches Claude the skill.
Start using it
Type `[ab-test]` followed by what you want to test — subject line, opening line, CTA, angle, or send time. The skill designs a proper test plan with variants, sample sizes, and success criteria.
Use Cases
Instead of guessing which subject line is best, run a controlled test with enough volume to be statistically meaningful.
Run a "pain-focused" email against a "benefit-focused" email to learn which framing resonates with your audience.
Test "soft ask" vs. "direct meeting request" CTAs. The data will tell you what your audience prefers, regardless of what your VP thinks.
Test 9am vs. 2pm sends, or Tuesday vs. Thursday. Send time optimization can lift open rates by 10-20%.
Use structured test plans to train your team on data-driven decision making. Each test produces a learning that improves every future campaign.
Skill Stack
Generate subject line variants first, then design a proper A/B test around the top candidates.
Write two sequence variants with different angles and test which performs better.
Generate CTA variants and test them in a controlled experiment.
After testing, use Campaign Optimizer to implement the winning variants across all campaigns.
ClaudeGTM includes every skill, video walkthroughs, templates, and a complete GTM system in a box.
Get ClaudeGTM — $297