Claude Cowork has transformed how SEO teams approach repetitive analysis tasks. Instead of spending 15-20 hours per week on manual data processing, teams can now focus on strategy and execution. This article details eight specific, named workflows that teams at agencies and in-house departments are using right now to move the needle on rankings.
Each workflow is documented with step-by-step setup instructions, required data inputs, and example prompt templates you can copy directly into your Cowork environment. These aren't theoretical exercises—they're patterns extracted from teams that have already deployed Cowork into their production SEO operations.
Why Workflows Matter for SEO Teams
SEO is a data-intensive discipline. Your team probably spends significant time:
- Exporting SERP data from SEMrush or Ahrefs and categorizing feature types
- Manually reviewing hundreds of backlink prospects and scoring their relevance
- Crawling websites with Screaming Frog and identifying on-page issues
- Analyzing GA4 reports to find underperforming content candidates for refresh
- Building internal linking maps and finding orphaned pages
- Tracking competitor featured snippet strategies and updating your own targeting
- Aggregating local SEO citation data from multiple platforms
- Generating performance summaries for stakeholder reports
Each of these tasks requires human judgment, pattern recognition, and synthesis—skills Claude excels at. But they also have clear inputs and outputs, making them ideal candidates for automation with Cowork.
This article covers the eight workflows that have the highest ROI for SEO teams. For a comprehensive overview of how Cowork fits into your broader SEO toolset, see Claude Cowork for SEO Specialists: The Complete Playbook.
Workflow 1: SERP Feature Tracking Automation
What It Does
This workflow automates the categorization and tracking of SERP features for your target keywords. Export your keyword list and current SERP data from SEMrush or Ahrefs, and Cowork will automatically classify each result by feature type (rich snippets, people also ask, knowledge panel, featured snippet, image pack, video carousel, etc.), identify which of your own pages rank in those features, and flag new opportunities.
Manual SERP analysis typically takes 2-3 hours for 100 keywords. This workflow reduces it to 10-15 minutes.
Setup Requirements
- CSV export from SEMrush containing: keyword, volume, current rank, current SERP features, URL ranking
- Access to Cowork Skills library
- SEO analyst familiar with SERP feature terminology
Step-by-Step Process
- Export your target keyword list with SERP data from SEMrush (Organic Research → Competitors or Domain Report)
- Create a new Cowork skill or tool called "SERP Feature Analyzer"
- Upload your CSV as the input data source
- Use the prompt template below to configure Cowork's instruction set
- Review the generated output in Cowork's dashboard, which will categorize features and highlight opportunities
- Export results as CSV for import into your tracking spreadsheet or SEO platform
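As a rough illustration of what the feature categorization in step 5 is doing under the hood, here is a minimal sketch that tallies SERP features from an export. The column names and sample keywords are hypothetical; match them to whatever your SEMrush export actually contains.

```python
import csv
import io
from collections import Counter

# Hypothetical sample of a SEMrush export; real column names vary by tool.
SAMPLE = """Keyword,Volume,Position,SERP Features
best crm,5400,8,Featured snippet; People also ask; Sitelinks
crm pricing,1900,12,People also ask; Image pack
what is a crm,9900,3,Featured snippet; Knowledge panel
"""

def feature_counts(csv_text):
    """Tally how often each SERP feature appears across the keyword set."""
    counts = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        for feature in row["SERP Features"].split(";"):
            counts[feature.strip()] += 1
    return counts

counts = feature_counts(SAMPLE)
print(counts.most_common())
```

A tally like this is what lets Cowork flag which feature types dominate your keyword set before it drills into per-keyword opportunities.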
Prompt Template
Analyze this SERP data and classify each keyword by the types of rich features it displays. For each keyword:
1. List all SERP features present (featured snippet, people also ask, knowledge panel, image pack, video carousel, news carousel, related searches, etc.)
2. Identify which features show our URL ranking (if any)
3. For featured snippet opportunities, note the current position and content format (definition, list, table, paragraph)
4. Flag any features where our competitors rank but we don't
5. Categorize each keyword by opportunity tier (Quick Win, Medium Effort, High Effort) based on feature difficulty and our current position
Output as CSV with columns: Keyword, Volume, Current Rank, Features Present, Our Feature Position, Opportunity Type, Priority Tier
This template works because it gives Claude explicit categorization criteria and a structured output format. Adjust the feature types based on your industry vertical.
Workflow 2: Link Building Prospect Research
What It Does
Instead of manually scrolling through Ahrefs link profiles and copying URLs, this workflow automatically identifies high-quality link prospects from your competitor analysis. It extracts referring domains from competitor backlinks, scores them by relevance, domain authority, and content context, then generates personalized outreach summaries for each prospect.
Typical outreach prospect research takes 3-4 hours per competitor analyzed. This workflow handles it in 20 minutes.
Setup Requirements
- CSV export from Ahrefs: Backlinks report filtered by referring domain (exclude direct competitors)
- Your target keywords and business description for relevance matching
- Cowork text processing skill
Step-by-Step Process
- In Ahrefs, navigate to any competitor's Backlinks report
- Filter by referring domain (use advanced filters to exclude other competitors, your own domains, and low-authority sites—set Domain Rating minimum to 25)
- Export the filtered list as CSV containing: referring domain, referring page URL, anchor text, domain rating, traffic, last update date
- Create a Cowork skill called "Link Prospect Analyzer"
- Upload your CSV and competitor URL
- Use the template below to score prospects and generate outreach angles
- Filter results by your assigned priority score and begin outreach
Prompt Template
You are a link building researcher. Analyze this list of backlinks to our competitor and identify the best prospect domains for outreach.
For each referring domain:
1. Score relevance to our topic (1-10) based on the anchor text, page content type, and industry alignment
2. Calculate prospect priority: (Domain Rating / 10) × (Relevance Score / 10) × (Content Recency Score)
3. Identify the type of content that earned the link (resource page, roundup, guide, list, mention, etc.)
4. Write a 1-line outreach hook that explains why this domain is relevant to THIS specific link (don't be generic; be specific to their content)
Sort by priority score (highest first). Include only domains with relevance 6+ and DR 30+.
Output: Referring Domain | DR | Relevance | Content Type | Outreach Hook | Priority Score
This prompt is effective because it forces Claude to evaluate multiple dimensions simultaneously and generate immediately actionable outreach hooks. Teams report 25-30% higher response rates using these hooks versus generic outreach templates.
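The priority formula from the template can be sketched in code to sanity-check the scoring before you trust it at scale. The `prospect_priority` helper and the sample domains below are hypothetical illustrations, not part of Cowork:

```python
def prospect_priority(domain_rating, relevance, recency):
    """Priority formula from the prompt template:
    (DR / 10) x (Relevance / 10) x recency score (0.0-1.0)."""
    return (domain_rating / 10) * (relevance / 10) * recency

# Made-up prospects to show the filtering rule (relevance 6+, DR 30+).
prospects = [
    {"domain": "exampleblog.com", "dr": 60, "relevance": 8, "recency": 0.9},
    {"domain": "lowrel.net", "dr": 70, "relevance": 4, "recency": 1.0},
    {"domain": "weakdr.org", "dr": 20, "relevance": 9, "recency": 0.5},
]
qualified = [p for p in prospects if p["relevance"] >= 6 and p["dr"] >= 30]
qualified.sort(
    key=lambda p: prospect_priority(p["dr"], p["relevance"], p["recency"]),
    reverse=True,
)

score = prospect_priority(60, 8, 0.9)  # DR 60, relevance 8/10, recent link
```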
Workflow 3: Technical SEO Monitoring
What It Does
This workflow processes Screaming Frog crawl exports to automatically identify on-page technical issues at scale. Rather than manually reviewing thousands of rows in a spreadsheet, Cowork categorizes errors by severity, groups related issues, and generates prioritized fix recommendations with estimated impact.
Manual Screaming Frog analysis typically requires 4-6 hours per crawl. This workflow processes it in 15 minutes.
Setup Requirements
- Screaming Frog crawl export as CSV (Internal: All)
- Your target keyword list (for matching against on-page content)
- Cowork skill with text processing and categorization abilities
Step-by-Step Process
- Run a full crawl in Screaming Frog (ensure you have enough crawl budget; limit to 10,000-25,000 URLs for first analysis)
- Export as CSV: Internal → All
- Create a Cowork skill "Technical SEO Auditor"
- Upload your crawl CSV and your target keyword list
- Use the prompt template to identify, categorize, and score issues
- Review results grouped by severity and estimated traffic impact
- Assign issues to your development team with prioritized recommendations
Prompt Template
Analyze this Screaming Frog crawl data and identify all technical SEO issues. For each issue category:
1. Count total affected pages
2. Classify severity (CRITICAL: affects indexing/ranking | HIGH: reduces crawlability or rankings | MEDIUM: minor UX/SEO signals | LOW: nice-to-have improvements)
3. Identify the root cause (template issue, plugin bug, configuration error, etc.)
4. Estimate traffic impact: multiply affected page count by average traffic per page
5. Provide specific fix recommendation with estimated effort (quick = <1 hour, medium = 1-4 hours, extensive = 4+ hours)
Group by severity. Output prioritized action plan starting with CRITICAL issues.
Include these checks:
- Missing/duplicate meta descriptions
- Duplicate page titles
- Missing H1 tags or multiple H1s
- Pages with 0 word count
- Redirect chains (3+ hops)
- Missing alt text on images
- Internal links to 4xx/5xx pages
- Canonical tag errors
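The redirect-chain check in the list above is worth spot-verifying yourself. A minimal sketch, assuming a hypothetical source-to-target map extracted from the crawl export:

```python
# Hypothetical redirect map extracted from a crawl export (source -> target).
redirects = {
    "/old-a": "/old-b",
    "/old-b": "/old-c",
    "/old-c": "/final",
    "/promo": "/final",
}

def chain_length(url, redirect_map, limit=10):
    """Count redirect hops starting at url; stop at `limit` to avoid loops."""
    hops = 0
    while url in redirect_map and hops < limit:
        url = redirect_map[url]
        hops += 1
    return hops

# Flag chains of 3+ hops, matching the check in the prompt template.
long_chains = [u for u in redirects if chain_length(u, redirects) >= 3]
```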
Workflow 4: Content Refresh Prioritization
What It Does
Content refresh is one of the highest-ROI SEO activities, but figuring out which pages to refresh is where teams get stuck. This workflow analyzes your GA4 data alongside current SERP positions to identify the best candidates for updates—pages that are already ranking (which reduces risk) but are losing impressions or slipping in position.
Setup Requirements
- GA4 export: Pages report with traffic metrics, plus impressions, clicks, and CTR via the linked Search Console integration (last 90 days)
- SERP tracking data: current rank positions for your target keywords
- CSV with publish dates for each page (from your CMS)
Step-by-Step Process
- In GA4, navigate to Reports → Engagement → Pages and screens
- Set date range to last 90 days, filter to your domain, export as CSV
- Export your SERP tracking data from your platform (showing position, volume, CTR for each keyword)
- Combine both files (merge on page URL)
- Create a Cowork skill "Content Refresh Analyzer"
- Upload combined data and your competitor content benchmarks
- Use the template to score refresh candidates and identify specific improvement areas
Prompt Template
Identify the best content refresh candidates. A good refresh target is:
- Currently ranking (position 5-20) so we avoid content rewrite risk
- Has lost impressions or clicks in the last 90 days (declining trajectory)
- Serves high-volume keywords where competitors have published more recent content
- Has low CTR relative to search volume (suggests weak title/meta)
For each page:
1. Calculate "refresh ROI score" = (keyword volume × days since publish date) / (current position)
2. Identify why it's declining: title/meta weak, outdated stats, missing SERP features, competitor has published fresher content in last 3 months
3. Suggest specific improvements: new section topics, updated statistics, format changes to capture featured snippet
Rank by refresh ROI score and output: Page URL | Keyword | Volume | Current Rank | Days Old | ROI Score | Primary Improvement Area | Estimated Impact
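The refresh ROI score in step 1 works out like this. The fixed `today` date and the sample values are illustrative only; pass your own report date in practice:

```python
import datetime

def refresh_roi(volume, publish_date, position,
                today=datetime.date(2025, 1, 1)):
    """Refresh ROI score from the template:
    (keyword volume x days since publish) / current position."""
    days_old = (today - publish_date).days
    return (volume * days_old) / position

# A 1,000-volume keyword on a year-old page ranking #10.
score = refresh_roi(1000, datetime.date(2024, 1, 1), 10)
```

Note the formula deliberately rewards older pages on higher-volume keywords, and penalizes pages that already rank well (less upside from a refresh).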
Workflow 5: Internal Linking Audit
What It Does
Internal linking is a powerful SEO lever that most sites underutilize. This workflow analyzes your site structure and content, identifies orphaned pages (pages with no inbound internal links), finds pages that compete with each other on the same keywords, and generates specific internal linking recommendations to consolidate authority and improve crawlability.
Setup Requirements
- Screaming Frog crawl export with internal link data
- Your target keyword mapping (keyword → target URL)
- Content inventory (page URL, title, word count, publish date)
Step-by-Step Process
- In Screaming Frog, crawl your site and ensure "Follow Internal Links" is enabled
- Export: Internal → All (with internal link columns)
- Create content inventory: export page list from your CMS or analytics
- Create Cowork skill "Internal Linking Strategist"
- Upload crawl data and content inventory
- Use template to identify orphaned pages, silos, and linking opportunities
- Implement recommended links using your CMS or internal linking plugin
Prompt Template
Analyze our site structure and internal linking. Identify:
1. Orphaned pages: any page with 0 inbound internal links from other pages (not homepage). Flag as high-priority—either link them in or delete/redirect.
2. Keyword cannibalization: multiple pages targeting the same keyword that aren't explicitly linked to each other. Recommend consolidation or clear silo separation.
3. Linking opportunities: high-value pages (high traffic, important keywords) that could receive more internal link authority. For each, suggest 3 specific pages that should link to it with recommended anchor text.
4. Dead-end pages: pages with 0 outbound internal links. Recommend at least 2-3 contextual outbound links.
Output as prioritized recommendations:
- Orphaned pages (high priority)
- Cannibalization fixes (medium priority)
- New internal links to implement (with anchor text) (medium priority)
- Outbound linking gaps (low priority)
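The orphaned-page and dead-end checks above reduce to simple set operations over an internal-link edge list. The pages and links in this sketch are hypothetical:

```python
# Hypothetical internal-link edge list (source page -> destination page).
links = [
    ("/", "/blog"),
    ("/", "/pricing"),
    ("/blog", "/blog/post-1"),
    ("/blog", "/pricing"),
]
all_pages = {"/", "/blog", "/pricing", "/blog/post-1", "/blog/post-2"}

sources = {src for src, _ in links}
linked_to = {dst for _, dst in links}

# Orphaned: no inbound internal links (homepage exempt, as in the template).
orphaned = sorted(all_pages - linked_to - {"/"})
# Dead ends: no outbound internal links.
dead_ends = sorted(p for p in all_pages if p not in sources)
```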
Workflow 6: Featured Snippet Optimization
What It Does
Featured snippets drive significant click-through traffic, especially in voice search and AI-powered search engines. This workflow identifies keywords where you're currently ranking but not capturing the snippet, analyzes the current snippet format and content, and generates optimized snippet-targeting content that you can add to your existing page.
Setup Requirements
- SEMrush or Ahrefs SERP data filtered for featured snippet keywords
- Your current ranking URLs for those keywords
- Your current page content (for comparison against winning snippet format)
Step-by-Step Process
- In SEMrush, export keywords where you rank positions 2-10 but don't own the snippet
- Filter to keywords with featured snippet (use advanced filters)
- Create Cowork skill "Snippet Optimizer"
- Upload your SERP data, current page content, and competitor snippet content
- Use template to analyze format and generate optimized snippet content
- Add recommended sections to your existing pages (no major rewrites needed)
Prompt Template
I want to capture featured snippets for these keywords. For each keyword:
1. Analyze the current featured snippet: What format is it (definition, list, table, paragraph)? What content types does it include (statistics, steps, examples)?
2. Analyze our current ranking page: Does it have the same content in the same format? What's missing?
3. Generate optimized snippet content: Write a 40-60 word definition, list, or table that matches the winning snippet format while using our target keyword naturally.
4. Recommend page implementation: Should we add this as a new FAQ section, callout box, or main content addition? Where on the page?
Output: Keyword | Current Snippet Format | Missing Content Element | Recommended New Section | Suggested Placement on Page
Workflow 7: Local SEO Citation Building
What It Does
Local SEO depends on consistent business citations (Name, Address, Phone) across platforms. But manually checking dozens of citation sites is tedious. This workflow analyzes your current citation data, identifies missing or inconsistent listings, prioritizes which citation platforms have the most SEO impact, and generates a structured action plan for citation cleanup and expansion.
Setup Requirements
- Current list of citation sites where you have listings (export from Whitespark, Moz Local, or manual)
- Your canonical NAP data (correct name, address, phone formatting)
- Spreadsheet of your local keywords and target locations
Step-by-Step Process
- Export your current citations from your citation management tool or compile manually from your audits
- Create spreadsheet: Site Name | Your Listing Status (verified/unverified/missing) | NAP Consistency | Citation Quality (DA/authority)
- Create Cowork skill "Citation Manager"
- Upload your citation list and canonical NAP data
- Use template to identify gaps and generate action plan
- Implement new citations in priority order (focus on high-authority platforms first)
Prompt Template
I have a local business. Analyze my citation presence and create an action plan.
First, identify:
1. Which citation platforms I'm already on
2. Which high-impact citation sites I'm missing (prioritize by local SEO relevance: Google Business Profile, Yelp, Apple Maps, Facebook, local chambers of commerce, Yellow Pages, industry directories)
3. Any NAP consistency issues in my current listings
Then create action plan:
- Phase 1 (urgent): Fix any NAP inconsistencies in existing verified listings
- Phase 2 (this month): Add listings to top 5-10 missing high-authority sites
- Phase 3 (next month): Add industry-specific directory listings
Output format:
CURRENT LISTINGS: [list with NAP consistency status]
MISSING HIGH-IMPACT SITES: [prioritized list]
ACTION PLAN: [phased implementation]
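Phase 1's NAP consistency check hinges on normalizing superficial formatting differences before comparing listings. A minimal sketch, with a hypothetical `normalize_nap` helper and made-up business details:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize NAP fields so superficial formatting differences
    (case, punctuation, phone separators) don't count as mismatches."""
    digits = re.sub(r"\D", "", phone)[-10:]  # keep last 10 digits
    addr_words = re.sub(r"[^\w\s]", "", address).lower().split()
    return (name.strip().lower(), " ".join(addr_words), digits)

canonical = normalize_nap("Acme Dental", "123 Main St., Suite 4",
                          "(555) 010-0199")
listing = normalize_nap("ACME DENTAL", "123 Main St Suite 4",
                        "555.010.0199")
consistent = canonical == listing
```

Real-world cleanup also has to handle abbreviation variants ("Street" vs "St"), which is exactly the fuzzier judgment Cowork handles in the template.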
Workflow 8: Content Performance Review & Reporting
What It Does
Monthly or quarterly reporting takes significant time. This workflow automatically pulls data from GA4, GSC, and your SERP tracking platform, synthesizes it into a comprehensive performance narrative, highlights the key wins and problems, and generates an executive summary with recommended next steps.
Setup Requirements
- GA4 data export: top pages, top keywords, traffic, conversions (monthly period)
- Google Search Console data: top queries, clicks, impressions, CTR, position
- SERP rank tracking data: position changes, new keywords ranked
- Previous month's goals/targets for comparison
Step-by-Step Process
- Export GA4 monthly report (Reports → Traffic Acquisition → Organic Search)
- Export GSC data (Performance report, filter to last 30 days)
- Export SERP tracking data (position changes across all tracked keywords)
- Create Cowork skill "Performance Reporting"
- Upload all data sources with previous month's targets
- Use template to generate narrative report
- Copy generated report into your presentation tool and distribute
Prompt Template
Generate an executive summary SEO report. Analyze this month's data compared to last month and our targets.
Include:
1. Performance summary: Total organic traffic (vs target), top traffic-driving pages, conversion rate
2. Ranking analysis: Number of keywords in top 10/top 20, keywords that improved by more than 3 positions, keywords that declined by more than 3 positions
3. Search visibility: Total impressions, total clicks, average CTR, top performing keywords
4. Top wins: 3-5 specific examples of pages/keywords that exceeded targets or had major improvements
5. Problem areas: 2-3 keywords/pages underperforming vs target, with recommended fixes
6. Content performance: Best-performing content by traffic and by conversion rate
7. Recommendations: 3-4 highest-ROI projects for next month
Format: Suitable for presentation to marketing director/stakeholder. Use clear metrics and specific examples.
Common Implementation Challenges
Teams report three common blockers when implementing these workflows:
Data Consistency
Cowork works best with clean, consistently formatted data. Before uploading CSVs, verify that:
- Columns are named consistently across files
- Key fields contain no special characters
- Date formats are uniform (YYYY-MM-DD)
- Numeric fields don't contain units (keep numbers and units in separate columns)
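That pre-upload cleanup can be sketched as a small normalizer. The `date` and `load_time` column names are hypothetical, and the sketch assumes US-style input dates:

```python
import re
from datetime import datetime

def clean_row(row):
    """Normalize one CSV row before upload: uniform dates, unitless numbers."""
    out = dict(row)
    # Normalize e.g. "03/15/2024" to "2024-03-15" (assumes US-style input).
    out["date"] = datetime.strptime(row["date"], "%m/%d/%Y").strftime("%Y-%m-%d")
    # Strip units like "4.2s" or "1240ms" down to a bare number string.
    out["load_time"] = re.sub(r"[^\d.]", "", row["load_time"])
    return out

row = clean_row({"date": "03/15/2024", "load_time": "4.2s"})
```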
Prompt Refinement
The templates above are starting points. Your first run will likely reveal adjustments needed. Cowork lets you iterate on prompts in real-time. If output format isn't right, adjust the prompt's output specification. If Claude is missing certain data points, make the categorization criteria more explicit.
Team Adoption
These workflows save time, but they require analysts to change their process. Before rollout, run a pilot with 1-2 team members. Let them provide feedback on output format and usefulness. Often, small adjustments to how Cowork presents results significantly increase adoption.
Next Steps: Connecting Workflows Together
Once you have individual workflows running, the next level is connecting them into a larger automation system. For example, the SERP feature workflow feeds into the featured snippet workflow, which feeds into your content calendar.
For guidance on integrating Cowork with SEMrush, Ahrefs, and GA4 APIs, see Claude Cowork + SEMrush, Ahrefs and GA4: Connecting AI to Your SEO Data. For scaling these workflows across multiple clients as an agency, see How SEO Agencies Use Claude Cowork to Scale Content Operations.
Frequently Asked Questions
How much data can Cowork process per run?
Cowork can handle datasets with thousands of rows. For Screaming Frog crawls, we recommend limiting initial analyses to 10,000-25,000 URLs to keep processing time under 5 minutes. For SERP data or link lists, you can easily process 500-1,000 rows per run. Larger datasets can be split into batches.
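Splitting an oversized dataset into batches is a one-liner; this sketch assumes the rows are already loaded as a list:

```python
def batch(rows, size):
    """Split a list of rows into chunks of at most `size` for separate runs."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

# e.g. a 2,300-row link list split into runs of 1,000.
chunks = batch(list(range(2300)), 1000)
```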
Can I schedule these workflows to run automatically?
Yes. Cowork integrates with your SEO platform APIs (SEMrush, Ahrefs, GA4, GSC) to pull data automatically on a schedule. Many teams run the "Content Performance Review" workflow weekly and the "Technical SEO Monitoring" workflow monthly. Automation reduces manual export/upload overhead from 20 minutes to near-zero.
What if Claude's recommendations are wrong or miss important nuances?
That's expected and normal. Cowork is designed for augmentation, not replacement. These workflows handle 80-90% of the analytical work, freeing your team to focus on high-judgment decisions and strategy. Always review Cowork's output before implementing, especially for technical fixes or major content changes. Many teams run Cowork output past a senior analyst as a quality gate.
Do I need to know how to code to use these workflows?
No. Cowork's Skills interface is no-code. You paste prompts into text fields, upload CSV files via the UI, and receive output in CSV or formatted text. No Python, SQL, or coding required. Basic CSV literacy and familiarity with your SEO tools is sufficient.
How do these workflows compare to existing SEO automation tools?
Traditional SEO tools (SEMrush, Ahrefs, Screaming Frog) excel at data collection and basic reporting. Cowork adds human-level analysis and synthesis on top of that data. For example, Ahrefs can list link prospects; Cowork can score them by relevance, identify outreach angles, and prioritize them. Cowork is complementary, not competitive, with your existing toolkit.
Ready to Scale Your SEO Operations?
These eight workflows represent the core SEO processes that Cowork automates most effectively. Teams implementing even 2-3 of these workflows report 30-40% time savings on analytical tasks within the first month.
Related Articles: Claude Cowork for Content Briefs • Claude Cowork for On-Page Optimisation • Connecting to SEMrush, Ahrefs & GA4 • Scaling with Cowork at Agencies