I built a $0.24 SEO audit tool because the enterprise ones felt like overkill

Every SEO tool I have used has the same problem: a thousand metrics, zero prioritization. Ahrefs, SEMrush, Screaming Frog — they are fine at showing you data. But when a client asks “so what should we focus on first?”, I would still end up in a spreadsheet stitching together numbers from five different tabs.
So I wrote my own thing. SEO/AEO/GEO Analyzer is a Python tool that pulls competitive data, finds keyword gaps, checks performance, looks at schema markup, and spits out an HTML report with a prioritized 30/60/90-day plan. Cost per run: about $0.24.
The problem
At Snezzi, I run SEO audits for e-commerce and SaaS clients. The workflow used to look like this:
- Pull keyword data from DataForSEO or Ahrefs
- Run PageSpeed on 10-20 URLs
- Manually check schema markup across competitor sites
- Cross-reference everything in a spreadsheet
- Write up findings in a Google Doc
- Format it nicely so the client does not fall asleep reading it
This took hours. And half the time I was doing the same steps, just for a different domain. I wanted to type in a domain, list some competitors, and get back something I could send to a client.
What it does
The tool chains six scripts in sequence — four data collectors, a report generator, and an exporter:
┌──────────────────────────────────────────────────┐
│                 run_analysis.py                  │
│            (interactive orchestrator)            │
├──────────────────────────────────────────────────┤
│                                                  │
│ 1. collect_data.py → sitemap + social       ~30s │
│ 2. dataforseo_collection.py → keywords      ~25m │
│ 3. geo_analyzer.py → schema/JSON-LD         ~30s │
│ 4. performance_check.py → Core Web Vitals    ~3m │
│ 5. generate_report.py → interactive HTML         │
│ 6. export_data.py → CSV / Excel / PDF       ~10s │
│                                                  │
└──────────────────────────────────────────────────┘
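At its core, run_analysis.py is a loop over those scripts. A rough sketch — the script names come from the diagram above, everything else (flags, error handling) is illustrative, not the actual orchestrator:

```python
import subprocess
import sys

# Pipeline stages, in the order the diagram shows them.
PIPELINE = [
    "collect_data.py",
    "dataforseo_collection.py",
    "geo_analyzer.py",
    "performance_check.py",
    "generate_report.py",
    "export_data.py",
]

def run_pipeline(config_path):
    """Run each stage in sequence, stopping at the first failure."""
    for script in PIPELINE:
        result = subprocess.run(
            [sys.executable, script, "--config", config_path]
        )
        if result.returncode != 0:
            raise RuntimeError(
                f"{script} failed with exit code {result.returncode}"
            )
```

Sequential rather than parallel on purpose: a failed stage stops the run before you burn API credits on data you cannot use.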
Keyword gap analysis finds keywords your competitors rank for but you do not, sorted by search intent. This alone used to take me an hour per client.
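The core of the gap computation is just set logic. A minimal sketch (intent classification omitted; function and data are illustrative):

```python
def keyword_gap(target_keywords, competitor_keywords):
    """Keywords any competitor ranks for that the target does not,
    mapped to the competitors ranking for each."""
    target = {kw.lower() for kw in target_keywords}
    gap = {}
    for competitor, keywords in competitor_keywords.items():
        for kw in keywords:
            if kw.lower() not in target:
                gap.setdefault(kw.lower(), []).append(competitor)
    return gap

gap = keyword_gap(
    ["organic skincare"],
    {
        "competitor-one.com": ["organic skincare", "vegan moisturizer"],
        "competitor-two.com": ["vegan moisturizer", "natural face cream"],
    },
)
# "vegan moisturizer" and "natural face cream" come back as gaps
```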
AEO scoring checks how well your content is set up for AI search results (ChatGPT, Perplexity, Google SGE). It looks at FAQ schema, answer formatting, and structured data that these engines tend to pull from.
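To make that concrete, here is an illustrative slice of the idea — these three signals and the equal weighting are examples for this post, not the tool's actual rubric:

```python
import re

def aeo_checks(html, first_paragraph):
    """Score a page on a few AEO-style signals: FAQPage JSON-LD,
    question-style headings, and a first paragraph short enough
    for an answer engine to quote whole. Weights are made up."""
    checks = {
        "faq_schema": '"FAQPage"' in html,
        "question_headings": bool(
            re.search(r"<h[23][^>]*>[^<]*\?\s*</h[23]>", html, re.IGNORECASE)
        ),
        "concise_answer": len(first_paragraph.split()) <= 50,
    }
    score = round(100 * sum(checks.values()) / len(checks))
    return score, checks
```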
GEO analysis extracts and audits JSON-LD schema markup across your site and competitors. Missing LocalBusiness schema? It tells you what to add, with copy-paste code.
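The extraction itself is straightforward. A stdlib sketch (the tool uses Beautiful Soup; this version handles only top-level objects, ignoring `@graph` and JSON-LD arrays for brevity):

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collect every <script type="application/ld+json"> block."""

    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld:
            try:
                self.blocks.append(json.loads(data))
            except json.JSONDecodeError:
                pass  # skip malformed JSON-LD rather than crash

def schema_types(html):
    """Return the set of @type values a page declares."""
    parser = JsonLdExtractor()
    parser.feed(html)
    return {b.get("@type") for b in parser.blocks if isinstance(b, dict)}
```

Diff the resulting type sets for your site and a competitor's, and the missing-schema recommendations fall out directly.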
Performance benchmarking runs Core Web Vitals for your URLs and your competitors, side by side.
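The data comes from Google's PageSpeed Insights v5 API. A sketch of the fetch — the audit keys below are the standard Lighthouse ids, but verify field names against the current PSI docs before relying on them:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_url(page_url, api_key=None, strategy="mobile"):
    """Build the request URL; the API works without a key at low volume."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

def core_web_vitals(page_url, api_key=None):
    """Pull a few lab metrics from the Lighthouse audits."""
    with urlopen(psi_url(page_url, api_key)) as resp:
        audits = json.load(resp)["lighthouseResult"]["audits"]
    return {
        "LCP": audits["largest-contentful-paint"]["displayValue"],
        "CLS": audits["cumulative-layout-shift"]["displayValue"],
        "TBT": audits["total-blocking-time"]["displayValue"],
    }
```

Run it once per URL, yours and the competitors', and you have the side-by-side table.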
The report is the part I like most. It generates a self-contained HTML file with sortable tables, color-coded scores, and a 30/60/90-day roadmap. I send it to clients as-is.
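The "self-contained" part matters: no external CSS, JS, or images, so the file survives email and client laptops intact. A toy version of that idea (not the real generator, which also does sortable tables and the roadmap):

```python
import html

def render_report(title, rows):
    """Render (label, value) pairs as one inline-styled HTML file."""
    body = "".join(
        f"<tr><td>{html.escape(str(k))}</td><td>{html.escape(str(v))}</td></tr>"
        for k, v in rows
    )
    return (
        "<!doctype html><html><head><meta charset='utf-8'>"
        f"<title>{html.escape(title)}</title>"
        "<style>td{padding:4px 12px;border-bottom:1px solid #ddd}</style>"
        f"</head><body><h1>{html.escape(title)}</h1>"
        f"<table>{body}</table></body></html>"
    )
```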
How I use it
1. Config
Each client gets a YAML config file:
target:
  domain: "example-store.com"
  name: "Example Store"

competitors:
  - "competitor-one.com"
  - "competitor-two.com"
  - "competitor-three.com"

location:
  country: "United States"
  language: "English"

keywords:
  seed:
    - "organic skincare"
    - "natural face cream"
    - "vegan moisturizer"

branding:
  primary_color: "#2563EB"
  company_name: "Snezzi"
2. Run it
python run_analysis.py --config configs/example-store.yaml
I usually pick the automated mode and let it run for 30 minutes while I do something else.
3. Review and send
Output lands in output/:
output/
├── example-store_report.html ← interactive report
├── example-store_keywords.csv ← raw keyword data
├── example-store_export.xlsx ← Excel workbook
└── example-store_summary.pdf ← executive summary
I open the HTML report, sanity-check the recommendations, tweak a few priorities based on what I know about the client, and send it. What used to take half a day now takes about 40 minutes, most of which is waiting.
Why $0.24?
The tool uses DataForSEO for keyword and SERP data. They charge per API call, not per month. A typical audit hits their API a few hundred times, which works out to roughly $0.24. An Ahrefs or SEMrush subscription is $99-299/month.
PageSpeed data comes from Google’s free API. Schema extraction is HTML parsing with Beautiful Soup. The report is generated in pure Python.
I top up my DataForSEO balance maybe $20 at a time and it lasts for dozens of audits.
What I would change
The 25-minute keyword collection is slow. DataForSEO has rate limits and I am being conservative. I could parallelize more, but the sequential approach is reliable and I just run it in the background.
The AEO scoring is opinionated. There is no standard for what makes content “AI search optimized.” I based my scoring on patterns from running hundreds of queries through ChatGPT and Perplexity, looking at what gets cited and what gets ignored. Works for my clients. Your mileage may vary.
I should add GSC integration. Right now the tool does not pull Search Console data directly. It would make the “current state” analysis much better if it had real impressions and click data alongside the competitive numbers.
Try it
git clone https://github.com/usood/seo-aeo-geo-analyzer.git
cd seo-aeo-geo-analyzer
pip install -r requirements.txt
You need a DataForSEO account (they give you $1 free credit to start) and optionally a Google PageSpeed API key. Copy the example config, fill in your domain and competitors, and run it.
PRs welcome if you improve on it.