
We Open Sourced Our Technical SEO Audit Process. Here’s How It Works

We built an open source skill for Claude that turns raw crawl data from Screaming Frog, Sitebulb, or any crawler into a fully prioritised, business impact scored technical SEO audit. Here's how the process works, what it costs, and how to install it yourself.

Author - Suganthan Mohanadasan

Estimated reading time: 8 minutes

Last updated: March 4, 2026

At Snippet Digital, we have spent years running technical SEO audits the traditional way.

Screaming Frog exports. Sitebulb crawl reports. Spreadsheets upon spreadsheets.

The crawl tools themselves were never the problem.

They are brilliant at what they do.

The problem was always what happens after the crawl finishes.

We would stare at a 50,000 row CSV and spend hours turning raw data into something a client could actually act on.

Prioritising issues.

Writing fix instructions.

Scoring business impact.

Formatting the deliverable.

That part of the process was manual, slow, and honestly, the bit where human error crept in the most.

So we built something to fix it.

An open source technical SEO audit skill for Claude (Anthropic’s AI) that takes crawl data from any tool and produces a fully prioritised, business impact scored audit report plus an actionable XLSX spreadsheet. And today, we are sharing exactly how it works and how you can install it yourself.

How a Traditional Technical SEO Audit Works

For those newer to SEO, a technical SEO audit is a systematic review of all the behind the scenes factors that affect how search engines crawl, index, and rank your website.

This covers everything from broken links and redirect chains to page speed, structured data, canonical tags, and site architecture.

The traditional process typically follows these steps:

  1. Crawl the site using a tool like Screaming Frog, Sitebulb, or a cloud based crawler like Firecrawl
  2. Export the data into CSV or Excel format
  3. Manually review hundreds or thousands of rows looking for patterns and issues
  4. Categorise findings across areas like indexability, on page elements, performance, and security
  5. Prioritise issues based on severity and (if you are thorough) business impact
  6. Write fix instructions tailored to the client’s platform (WordPress, Shopify, custom build, etc.)
  7. Format the deliverable into a report the client can understand and a spreadsheet the dev team can work from

Steps 1 and 2 take minutes. Steps 3 through 7 take hours, sometimes days, depending on site size. And that is where most of the cost sits.

The Cost Problem

Here is the reality of what a technical SEO audit costs today, whether you are an agency, freelancer, or running your own crawls in house:

Crawler Tool Costs

| Tool | Licence Cost | What You Get |
| --- | --- | --- |
| Screaming Frog | £199/year | Desktop crawler, unlimited URLs, excellent data extraction, custom configurations. Industry standard for a reason. |
| Sitebulb | From £132/year (Lite) to £288/year (Pro) | Beautiful visual reports, accessibility audits, hint based issue detection. Outstanding for client facing output. |
| Firecrawl | From $0 (500 credits) to $499/month (Enterprise) | Cloud based API crawler with JavaScript rendering. Ideal for headless and SPA sites. |
| Ahrefs Site Audit | Included in plans from $129/month | Integrated with backlink data, great for holistic SEO analysis. |

These tools are exceptional at collecting data.

They crawl fast, they surface issues accurately, and they keep improving.

We use them daily and recommend them without hesitation.

But here is the gap: none of them score issues by actual business impact.

None of them know that a canonical error on your highest revenue page matters more than a missing alt tag on a blog post from 2019.

None of them write platform specific fix instructions tailored to your WordPress theme or Shopify setup.

That translation layer, from raw crawl data to prioritised action plan, has always been manual work.

And that manual work is where SEO audit costs balloon.

A freelancer might charge £500 to £1,500.

An agency, £2,000 to £10,000+.

Not because the crawl is expensive, but because the analysis, prioritisation, and reporting take significant human hours.

The New Way: AI Powered Crawl Data Analysis

This is where our open source skill changes the game.

It does not replace your crawler.

It sits on top of whatever crawl data you already have and handles the analysis, prioritisation, and reporting that used to eat up your day.

Here is what the skill does once it receives your crawl data:

1. Auto detects your crawl tool and normalises the data

Whether you feed it a Screaming Frog internal_all.csv, a Sitebulb URLs export, or Ahrefs Site Audit pages.csv, it recognises the column format and maps everything to a standard schema.
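Conceptually, that normalisation step looks something like the sketch below. The alias lists and field names here are illustrative stand-ins; the skill's real mapping lives in references/data-ingestion.md and covers far more tools and columns.

```python
import csv

# Illustrative column aliases only (a small sample of what real
# crawler exports use); the skill's data-ingestion reference is
# considerably more extensive.
COLUMN_ALIASES = {
    "url": ["Address", "URL", "Page URL"],
    "status_code": ["Status Code", "HTTP Status Code", "Code"],
    "title": ["Title 1", "Page Title", "Title"],
    "canonical": ["Canonical Link Element 1", "Canonical URL"],
}

def normalise_row(row: dict) -> dict:
    """Map whatever headers the crawler used onto one standard schema."""
    out = {}
    for field, aliases in COLUMN_ALIASES.items():
        for alias in aliases:
            if alias in row:
                out[field] = row[alias]
                break
    return out

def load_crawl(path: str) -> list[dict]:
    """Read a supported CSV export and return normalised rows."""
    with open(path, newline="", encoding="utf-8-sig") as fh:
        return [normalise_row(r) for r in csv.DictReader(fh)]
```

Because every downstream check reads the standard schema rather than tool-specific column names, the same analysis code works regardless of which crawler produced the export.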

2. Identifies your platform and site type automatically

It reads URL patterns, meta generator tags, and response headers to determine whether you are running WordPress, Shopify, Magento, or a custom build. This matters because fix instructions need to be platform specific.
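A minimal sketch of that detection idea follows. The fingerprints below are common, simplified signals, not the skill's actual ruleset, which combines many more checks.

```python
def detect_platform(html: str, headers: dict) -> str:
    """Guess the CMS from page HTML and response headers.

    The fingerprints here are simplified examples; real detection
    combines more signals (URL patterns, cookies, asset paths).
    """
    body = html.lower()
    header_keys = {k.lower() for k in headers}
    if 'content="wordpress' in body or "/wp-content/" in body:
        return "WordPress"
    if "cdn.shopify.com" in body or "x-shopid" in header_keys:
        return "Shopify"
    if 'content="magento' in body or "/skin/frontend/" in body:
        return "Magento"
    return "Custom / unknown"
```

For example, a page whose meta generator tag reads `content="WordPress 6.4"` would be flagged as WordPress, and its fix instructions would then reference plugins rather than Liquid templates.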

3. Runs analysis across 10 audit categories

Crawlability, indexability, on page elements, site architecture, performance, mobile readiness, structured data, security, international SEO, and AI/future readiness. Each category contains multiple specific checks.

4. Scores every issue on three dimensions

SEO Impact (1 to 10), Business Impact (1 to 10), and Fix Effort (1 to 10). The priority score formula weights impact heavily and rewards easy fixes:

(SEO Impact × 0.4) + (Business Impact × 0.4) + ((10 − Fix Effort) × 0.2)

This means high impact, easy to fix issues surface first.
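In code, the formula is a one-liner. The example scores below are illustrative, but they show the intended behaviour: maximum impact plus low effort lands near the top of the scale, while an equally serious issue that needs heavy dev work drops down the list.

```python
def priority_score(seo_impact: float, business_impact: float, fix_effort: float) -> float:
    """All three inputs are on a 1-10 scale; higher fix effort lowers the score."""
    return round(seo_impact * 0.4 + business_impact * 0.4 + (10 - fix_effort) * 0.2, 1)

# A maximum-impact issue that takes minutes to fix:
print(priority_score(10, 10, 2))  # → 9.6

# A high-impact issue that requires substantial development work:
print(priority_score(9, 8, 8))    # → 7.2
```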

5. Generates two deliverables

A comprehensive Markdown report with executive summary, categorised findings, and strategic recommendations. Plus an XLSX spreadsheet with every issue, priority score, affected URLs, fix instructions, and an implementation timeline.
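As a rough illustration of the Markdown side of that output, a report renderer can be sketched as below. The issue fields and wording are hypothetical, not the skill's exact template.

```python
def render_report(issues: list[dict]) -> str:
    """Render scored issues as a Markdown report, highest priority first."""
    lines = ["# Technical SEO Audit", "", "## Prioritised Findings", ""]
    for n, issue in enumerate(sorted(issues, key=lambda i: -i["priority"]), 1):
        lines.append(f"{n}. **{issue['title']}** (priority {issue['priority']}/10)")
        lines.append(f"   Fix: {issue['fix']}")
    return "\n".join(lines)

# Hypothetical sample issues for demonstration:
sample = [
    {"title": "Missing alt text on blog images", "priority": 4.2,
     "fix": "Add descriptive alt attributes"},
    {"title": "Canonical tag points away from a ranking page", "priority": 9.6,
     "fix": "Point the canonical tag at the page itself"},
]
print(render_report(sample))
```

The XLSX deliverable follows the same ordering, with one row per issue plus affected URLs and timeline columns.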

The entire process, from uploading your crawl CSV to having both deliverables in hand, takes minutes rather than hours.

A Real World Example

We recently ran this skill against a random site (www.ukmodels.co.uk, a modelling support service).

The audit pulled data from Ahrefs and DataForSEO APIs, analysed 6 key pages plus a 1,000 keyword dataset, and delivered findings in under 10 minutes.

The headline discovery

A canonical tag on their /modelling/ page was pointing to the homepage instead of itself. This page ranks #1 for “modelling” (5,200 monthly searches).

One misconfigured line of code was actively telling Google to ignore one of their most valuable pages.

A human analyst might catch this, but it would be buried in row 847 of a spreadsheet.

The skill surfaced it as the #1 priority issue with a 9.6/10 score because it combined high SEO impact, high business impact, and near zero fix effort.

The full audit identified 20 issues across all categories, scored and ranked each one, and produced a 4 week implementation timeline.

Total cost: the time it took to run the skill.

Pros and Cons – Let’s Be Honest

What This Approach Does Well

  1. Speed: What used to take 2 to 4 hours now takes 10 to 20 minutes including data gathering.
  2. Consistency: Every audit follows the same rigorous methodology. No more “it depends on the analyst’s mood on a Friday afternoon”.
  3. Business impact scoring: Issues are prioritised by actual revenue and traffic impact, not just generic severity labels.
  4. Platform awareness: Fix instructions adapt to your CMS. WordPress users get plugin recommendations, Shopify users get Liquid template guidance.
  5. Cost reduction: The analysis layer is essentially free (Claude subscription cost aside), meaning you only pay for your crawl tool licence.
  6. Open source: You can inspect, modify, and extend the skill to match your agency’s methodology.

Where It Has Limitations

  1. Depends on your crawl data quality: Garbage in, garbage out. If your Screaming Frog crawl missed sections due to config issues, the skill cannot analyse what it cannot see.
  2. No JavaScript rendering by default: Unless your crawl tool rendered JavaScript, SPAs and heavily JS dependent sites may have gaps in the data.
  3. Context still matters: The skill asks contextual questions (revenue pages, business model) but a seasoned SEO consultant’s intuition about a specific industry is hard to replicate fully.
  4. Not a replacement for the crawl tools: You still need Screaming Frog, Sitebulb, or similar to collect the data. This skill is the analysis and reporting layer.
  5. Requires Claude Max or Pro subscription: You need access to Claude with the ability to install custom skills.

How to Install the Technical SEO Audit Skill

The skill runs inside Claude’s desktop application (Cowork mode) or Claude Code.

Here is the complete setup guide.

Prerequisites

  1. A Claude Pro or Max subscription from Anthropic
  2. Claude Desktop app (macOS or Windows) or Claude Code CLI installed
  3. Git installed on your machine (CLI method only)

Option A: Claude Desktop (Easiest)

This is the recommended route for most SEO professionals.

No terminal required.

Step 1: Download the Skill Folder

Go to the GitHub repository at https://github.com/Suganthan-Mohanadasan/tech-seo-audit-skill and click Code > Download ZIP.

Extract the ZIP file somewhere on your computer.

Step 2: Open Claude Desktop and Select a Folder

Launch Claude Desktop and switch to Cowork mode (the toggle in the bottom left).

Click Select folder and choose a working folder on your machine.

This is where you will place your crawl data exports and where Claude will save its audit outputs.

Step 3: Add the Skill

Copy the extracted technical-seo-audit folder (the one containing SKILL.md) into the .skills/skills/ directory inside your selected working folder. The path should look like this:

your-selected-folder/
└── .skills/
    └── skills/
        └── technical-seo-audit/
            ├── SKILL.md
            ├── references/
            │   ├── analysis-modules.md
            │   ├── impact-scoring.md
            │   ├── api-crawling.md
            │   └── data-ingestion.md
            └── scripts/
                └── analyse_crawl.py

The .skills folder may be hidden by default on your operating system.

On macOS, press Cmd + Shift + . in Finder to reveal hidden folders.

On Windows, enable “Show hidden items” in File Explorer’s View tab.

Step 4: Test It

Start a new conversation in Claude Desktop and say: “Run a technical SEO audit on [your domain]”.

Claude will detect the skill and begin the audit process.

You can also upload your Screaming Frog or Sitebulb CSV files directly into the chat.

Option B: Claude Code CLI (For Developers)

If you prefer working in the terminal or already use Claude Code for development workflows, this route gives you full control.

Step 1: Clone the Repository

Open your terminal and clone the skill repository:

git clone https://github.com/Suganthan-Mohanadasan/tech-seo-audit-skill.git

Step 2: Set Up the Skills Directory

Claude looks for skills in a specific location. Create the directory structure if it does not already exist:

# macOS / Linux
mkdir -p ~/.claude/skills/technical-seo-audit

# Windows (PowerShell)
New-Item -ItemType Directory -Force -Path "$env:USERPROFILE\.claude\skills\technical-seo-audit"

Step 3: Copy the Skill Files

Copy the cloned repository contents into the skills directory:

# macOS / Linux
cp -r tech-seo-audit-skill/* ~/.claude/skills/technical-seo-audit/

# Windows (PowerShell)
Copy-Item -Recurse tech-seo-audit-skill\* "$env:USERPROFILE\.claude\skills\technical-seo-audit\"

Step 4: Verify the Skill Structure

Your skills directory should now contain:

~/.claude/skills/technical-seo-audit/
├── SKILL.md              # Main skill instructions
├── references/
│   ├── analysis-modules.md   # Detailed check specifications
│   ├── impact-scoring.md     # Scoring methodology
│   ├── api-crawling.md       # API integration docs
│   └── data-ingestion.md     # Column mapping logic
└── scripts/
    └── analyse_crawl.py      # Automated data processing

Step 5: Configure API Access (Optional)

If you want the skill to pull supplementary data from Ahrefs or DataForSEO, set your API keys as environment variables:

# Add to your shell profile (.bashrc, .zshrc, etc.)
export AHREFS_API_KEY="your_ahrefs_api_key_here"
export DATAFORSEO_LOGIN="your_login_here"
export DATAFORSEO_PASSWORD="your_password_here"

Step 6: Launch and Test

  1. Open Claude Desktop (Cowork mode) or launch Claude Code
  2. Select a working folder that contains your crawl data exports
  3. Ask Claude: “Run a technical SEO audit on [your domain]”
  4. The skill will activate, detect your data source, and begin the audit

Troubleshooting

If the skill does not trigger automatically, you can invoke it directly by saying “use the technical seo audit skill” in your prompt.

Ensure your crawl data CSV files are in the working folder Claude has access to.

For API based crawling (without uploading CSVs), the skill can use Firecrawl, DataForSEO On Page API, or Ahrefs Site Audit endpoints directly.

Just specify your preferred method when starting the audit.

Why We Open Sourced This

At Snippet Digital, we believe the SEO industry needs to work smarter, not just harder.

We have been at the forefront of integrating AI into our technical SEO workflows since the early days of Claude, and the results speak for themselves: faster turnaround, more consistent quality, and the ability to focus our human expertise on strategic recommendations rather than data wrangling.

Open sourcing this skill is not about giving away our competitive advantage.

It is about raising the bar for the entire industry.

The real value an agency provides is not the ability to run a crawl or format a spreadsheet. It is the strategic thinking, the client relationship, and the ability to connect technical findings to business outcomes.

This skill handles the mechanical work so that SEOs can focus on what actually matters.

We are continuing to develop and refine the skill. Contributions, feedback, and feature requests are welcome on the GitHub repository.

About Snippet Digital: We are a specialist SEO agency based in the West Midlands. We have been integrating AI into our workflows since 2020 and are committed to sharing tools and methodologies that push the industry forward.

Get in touch if you would like us to run a technical SEO audit for your site.

Suganthan Mohanadasan

Co-founder

Suganthan Mohanadasan is a Norwegian entrepreneur and SEO consultant. He co-founded Snippet Digital, Keyword Insights, and the KWI SEO Community, helping businesses and marketers navigate search, AI, and content strategy.
