
Basilisk -- SEO Domination Specialist
Skill
Your SEO specialist that builds content strategies, optimizes pages, and tracks rankings -- own the search results.
About
name: basilisk
description: >
  Dominate search results with programmatic content, technical SEO audits,
  and SERP warfare strategies.
  USE WHEN: User needs keyword research, programmatic content generation,
  technical SEO audits, schema markup, internal linking strategy, or
  competitive SERP analysis.
  DON'T USE WHEN: User needs conversion copywriting (use Propaganda),
  content marketing calendars (use Megaphone), or brand voice development
  (use Ghost Writer).
  OUTPUTS: Keyword clusters, content briefs, technical SEO audit reports,
  schema markup, internal linking maps, programmatic content templates,
  competitive analyses.
version: 1.0.0
author: SpookyJuice
tags: [seo, serp, programmatic-content, keyword-research, technical-seo, link-building]
price: 14
author_url: "https://www.shopclawmart.com"
support: "brian@gorzelic.net"
license: proprietary
osps_version: "0.1"
Basilisk
Version: 1.0.0 Price: $14 Type: Skill
Description
SEO has split into two worlds. The old world runs on blog posts, backlink outreach emails, and keyword density -- tactics that stopped working when Google's algorithms got sophisticated enough to detect manufactured relevance. The new world runs on topical authority, programmatic content at scale, technical excellence, and SERP feature domination. Basilisk operates in the new world.
Most sites leave massive organic traffic on the table because they approach SEO as a content quantity problem. They publish 200 blog posts and wonder why they rank for nothing. The real leverage is in keyword clustering (owning entire topic spaces, not individual keywords), programmatic content (generating hundreds of targeted pages from structured data), technical SEO (making sure Google can actually crawl, render, and index your content), and SERP feature targeting (winning featured snippets, People Also Ask boxes, and knowledge panels).
This skill gives you the full SEO offense: from keyword research and clustering through content architecture, on-page optimization, schema markup, internal linking topology, and competitive analysis. Every strategy is designed for measurable organic traffic growth, not vanity metrics like domain authority.
Prerequisites
- Access to your site's Google Search Console
- SEO tooling: Ahrefs, Semrush, or Moz (at least one -- for keyword data and competitor analysis)
- Google Analytics or equivalent for traffic measurement
- CMS with programmatic page generation capability (Next.js, WordPress, or similar)
- Basic understanding of HTML, meta tags, and how search engines crawl
Setup
- Copy SKILL.md into your OpenClaw skills directory
- Prepare your SEO baseline:
- Google Search Console access (verify site ownership)
- Current organic traffic metrics and top-ranking pages
- Competitor list (3-5 sites ranking for your target terms)
- Your site's CMS and tech stack details
- Reload OpenClaw
Commands
- "Research keywords for [topic/niche/product]"
- "Build a keyword cluster map for [domain]"
- "Generate a programmatic content strategy for [page type]"
- "Audit the technical SEO of [URL/site]"
- "Create schema markup for [page type]"
- "Design an internal linking strategy for [site structure]"
- "Analyze competitor SERP strategy for [keyword/competitor]"
- "Optimize [page] for [target keyword]"
- "Build a content brief for [topic] targeting [keyword cluster]"
Workflow
Keyword Clustering and Topical Authority
- Seed keyword expansion -- start with 5-10 seed keywords in your domain. Use Ahrefs/Semrush to pull: related keywords, questions, long-tail variations, and "People Also Ask" queries. Export everything with search volume, keyword difficulty, CPC, and SERP features present. Target: 200-500 keyword candidates per topic area.
- SERP overlap analysis -- Google tells you which keywords belong together. If two keywords surface largely the same URLs in the top 10 results, Google treats them as one topic, so target them with a single page, not two. Group keywords by SERP similarity (70%+ overlap = same cluster) to prevent keyword cannibalization.
- Cluster prioritization -- rank clusters by: total cluster search volume (sum of all keywords in the cluster), average difficulty, business relevance (does ranking here drive revenue?), and current position (are you already on page 2 for any keywords in this cluster?). The sweet spot: high volume, moderate difficulty, high relevance, existing page 2 presence. These are your quick wins.
- Pillar-cluster architecture -- map clusters into a hub-and-spoke content structure. Each pillar page targets the cluster's primary keyword and covers the topic comprehensively. Supporting pages target long-tail keywords within the cluster and link back to the pillar. The pillar links out to supporting pages. This structure signals topical authority to Google.
- Content gap analysis -- for each prioritized cluster, check: do you have a page targeting this? Does it rank? What position? What's missing compared to the top 3 results? Identify gaps: clusters with no page (create), clusters with a page ranking poorly (optimize), and clusters where competitors have content you don't (competitive gap).
- Search intent mapping -- every keyword has an intent: informational (how to, what is), navigational (brand searches), commercial investigation (best X, X vs Y), or transactional (buy X, X pricing). Match your page type to the intent: blog posts for informational, comparison pages for commercial, product pages for transactional. Mismatched intent = no ranking.
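The SERP-overlap grouping above can be sketched as a greedy pass over exported SERP data. A minimal illustration, not a production clusterer -- the keywords and URLs are placeholders, and real Ahrefs/Semrush exports need parsing into this `{keyword: top-10 URLs}` shape first:

```python
def serp_overlap(urls_a, urls_b):
    """Share of top-10 URLs two keywords have in common."""
    a, b = set(urls_a), set(urls_b)
    if not a or not b:
        return 0.0
    return len(a & b) / min(len(a), len(b))

def cluster_keywords(serps, threshold=0.7):
    """Greedy clustering: a keyword joins the first cluster whose
    representative keyword shares >= threshold of its SERP URLs."""
    clusters = []
    for kw, urls in serps.items():
        for cluster in clusters:
            if serp_overlap(urls, serps[cluster[0]]) >= threshold:
                cluster.append(kw)
                break
        else:
            clusters.append([kw])  # no match: start a new cluster
    return clusters

serps = {
    "running shoes": ["a.com", "b.com", "c.com", "d.com"],
    "best running shoes": ["a.com", "b.com", "c.com", "e.com"],
    "trail shoes": ["x.com", "y.com", "z.com", "a.com"],
}
print(cluster_keywords(serps))
```

Comparing only against each cluster's first keyword keeps the pass fast on a few hundred candidates; for thousands of keywords you would compare against every cluster member or use a proper graph-clustering step.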
Programmatic Content at Scale
- Template identification -- find page types that follow a repeatable pattern: "[City] + [Service]" pages, "[Product] vs [Product]" comparisons, "[Tool] integrations" pages, "[Industry] + [Use Case]" pages. The pattern must: target real search volume (verify each variation has searches), provide genuine value (not thin content), and be generatable from structured data.
- Data sourcing -- build or acquire the structured data that powers your templates: city databases, product catalogs, industry lists, tool directories. Each data record becomes a page. Validate: is the data accurate? Is it comprehensive? Can you update it programmatically? Stale data produces stale pages that Google eventually deindexes.
- Template design -- create a page template with: dynamic sections (populated from data -- stats, features, comparisons), static sections (boilerplate that provides context and structure), and unique sections (content specific to each variation -- local insights, product-specific analysis). The ratio matters: at least 40% of each page should be unique to that variation. Below 40% unique content, Google may flag pages as thin or duplicate.
- Quality signals -- programmatic pages need the same quality signals as hand-written content: unique title tags and meta descriptions per page, relevant internal links, schema markup, images with alt text, and structured data. Automate these: title template with variables, meta description template, automatic internal linking based on topical relevance.
- Indexation management -- 10,000 programmatic pages can overwhelm your crawl budget. Implement: XML sitemaps organized by page type, proper canonicalization (no duplicate content), noindex on low-value variations (cities with no search volume), and internal linking that prioritizes high-value pages. Monitor Google Search Console for crawl errors and indexation coverage.
- Content quality monitoring -- set up alerts for: pages with zero impressions after 30 days (indexation issue or thin content), pages with high impressions but low CTR (title/meta description needs work), and pages with declining traffic (content staleness or competitor movement). Review and update or prune underperforming pages quarterly.
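The template-design and quality-floor steps can happen in one rendering pass. A minimal sketch for a hypothetical "[City] + [Service]" template -- the record fields, site name, and boilerplate are all illustrative, not a real schema:

```python
def render_page(record, boilerplate):
    """Render one programmatic page from a data record and enforce
    the 40% unique-content floor. Field names are illustrative."""
    unique = (
        f"{record['service']} in {record['city']}: {record['local_insight']} "
        f"Typical local price: {record['avg_price']}."
    )
    page = {
        "title": f"{record['service']} in {record['city']} | Example Co",
        "meta": f"Compare {record['service'].lower()} options in {record['city']}.",
        "body": f"{unique}\n\n{boilerplate}",
    }
    # Quality floor: pages under 40% unique content get flagged, not published
    page["unique_ratio"] = len(unique) / len(page["body"])
    page["thin"] = page["unique_ratio"] < 0.40
    return page

record = {
    "city": "Austin",
    "service": "Roof Repair",
    "local_insight": "Hail season peaks in spring, so book inspections early.",
    "avg_price": "$450",
}
page = render_page(record, boilerplate="How our marketplace works ...")
print(page["title"], round(page["unique_ratio"], 2), page["thin"])
```

The character-length ratio is a crude proxy for "unique content"; a real pipeline would also check that the unique sections carry substance (stats, local data), not just interpolated variables.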
Technical SEO Audit
- Crawlability -- verify that Google can discover and access your pages. Check: robots.txt isn't blocking important pages, internal links reach all indexable pages (no orphan pages), redirect chains are short (max 2 hops), and server response times are under 200ms. Use Screaming Frog or Sitebulb to crawl your site and identify issues.
- Core Web Vitals -- Google's page experience signals: LCP (Largest Contentful Paint -- under 2.5s), INP (Interaction to Next Paint -- under 200ms), and CLS (Cumulative Layout Shift -- under 0.1). Run PageSpeed Insights on your top 20 pages. Common fixes: optimize images (WebP, lazy loading), reduce JavaScript bundle size, preload critical fonts, and reserve space for dynamic content.
- Indexation audit -- compare: pages you want indexed vs. pages Google has actually indexed (check Search Console Coverage report). Common issues: pages blocked by noindex tags you forgot about, canonical tags pointing to the wrong URL, pages returning soft 404s, and JavaScript-rendered content that Google can't see. Fix discrepancies between intended and actual indexation.
- Schema markup -- implement structured data for your page types: Article, Product, FAQ, HowTo, LocalBusiness, BreadcrumbList, and Organization. Use JSON-LD format (Google's preference). Validate with Google's Rich Results Test. Schema doesn't directly boost rankings, but it wins SERP features (rich snippets, FAQ accordions, product ratings) that dramatically increase CTR.
- Internal linking topology -- map your site's link graph. Every important page should be reachable within 3 clicks from the homepage. Identify: pages with few internal links (boost them), pages with too many outbound links (diluting PageRank), and opportunities for contextual links between topically related content. Implement breadcrumbs for hierarchical navigation.
- Mobile and internationalization -- verify mobile rendering: responsive design, no horizontal scroll, tap targets are large enough, and content is identical to desktop (Google uses mobile-first indexing). If you serve multiple languages or regions: implement hreflang tags correctly, use appropriate URL structures (subdirectories or subdomains), and avoid auto-redirecting based on IP (Google crawls from the US).
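The schema step lends itself to generation from the same structured data that powers your pages. A minimal sketch building FAQPage JSON-LD from question/answer pairs -- validate the output with Google's Rich Results Test before shipping, as the audit step above prescribes:

```python
import json

def faq_schema(pairs):
    """Build FAQPage JSON-LD from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

snippet = json.dumps(
    faq_schema([("What is keyword clustering?",
                 "Grouping keywords that share the same top-10 SERP results.")]),
    indent=2,
)
# Embed in the page head as a JSON-LD script tag
print(f'<script type="application/ld+json">\n{snippet}\n</script>')
```

The same pattern extends to Article, Product, or BreadcrumbList types: one generator function per page type, fed from your templates' data records.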
Output Format
BASILISK -- SEO STRATEGY
Domain: [domain.com]
Focus: [Keyword Research / Programmatic / Technical / Full Audit]
Date: [YYYY-MM-DD]
=== KEYWORD CLUSTERS ===
| Cluster | Primary KW | Volume | Difficulty | Intent | Status |
|---------|-----------|--------|------------|--------|--------|
| [name] | [keyword] | [vol] | [diff] | [intent] | [gap/exists/ranking] |
=== CONTENT ARCHITECTURE ===
Pillar: [pillar page topic and URL]
Supporting: [list of supporting pages with target keywords]
=== PROGRAMMATIC TEMPLATE ===
Pattern: [template pattern]
Variations: [N pages]
Data Source: [source]
Unique Content Ratio: [%]
=== TECHNICAL AUDIT ===
| Issue | Severity | Pages Affected | Fix |
|-------|----------|---------------|-----|
| [issue] | [high/med/low] | [N] | [fix description] |
=== SCHEMA MARKUP ===
Page Type: [type]
Schema: [JSON-LD snippet]
=== INTERNAL LINKING MAP ===
[Hub-and-spoke diagram with link directions and anchor text]
=== COMPETITIVE ANALYSIS ===
| Competitor | DR | Traffic | Content Gap | Opportunity |
|-----------|-----|---------|-------------|-------------|
| [site] | [DR] | [traffic] | [what they rank for that you don't] | [strategy] |
Common Pitfalls
- Keyword cannibalization -- targeting the same keyword on multiple pages forces Google to choose which one to rank, and it often picks the wrong one. Use SERP overlap analysis to identify cannibalization and consolidate competing pages into one authoritative resource.
- Thin programmatic content -- generating 5,000 pages with 90% boilerplate and 10% variable data is a fast path to a Google penalty. Every programmatic page needs enough unique, valuable content to justify its existence. If you can't make a variation genuinely useful, don't create the page.
- Ignoring search intent -- a beautifully written blog post will never rank for "buy running shoes" because the intent is transactional, not informational. Always check what type of content currently ranks for your target keyword and match that format.
- Technical debt accumulation -- redirect chains that grow to 5 hops, orphan pages that accumulate after redesigns, and broken schema markup that nobody validates. Run a technical audit quarterly, not annually. Small issues compound into large ranking problems.
- Chasing difficulty over opportunity -- spending 6 months trying to rank #1 for a keyword with difficulty 95 when there are 50 keywords with difficulty 30-40 that collectively drive more traffic. Prioritize clusters where you can win quickly and build authority incrementally.
- No measurement framework -- publishing content without tracking its organic performance means you can't tell what's working. Set up: keyword rank tracking (weekly), organic traffic by page (monthly), and conversion tracking from organic traffic (ongoing).
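The weekly rank-tracking loop in the last point can start as a simple diff of two snapshots. A sketch, assuming each snapshot is a `{keyword: position}` dict exported from whatever rank tracker you use (the export format is an assumption -- adapt the loading step to your tool):

```python
def ranking_alerts(prev, curr, drop_threshold=3):
    """Flag keywords whose position worsened by drop_threshold or more
    places between two weekly {keyword: position} snapshots."""
    return [
        (kw, prev[kw], pos)          # (keyword, old position, new position)
        for kw, pos in curr.items()
        if kw in prev and pos - prev[kw] >= drop_threshold
    ]

last_week = {"running shoes": 3, "trail shoes": 8}
this_week = {"running shoes": 9, "trail shoes": 8, "hiking boots": 50}
print(ranking_alerts(last_week, this_week))
```

Newly tracked keywords (present only in the current snapshot) are skipped rather than alerted, since they have no baseline yet.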
Guardrails
- No cloaking or hidden content. All content served to users is identical to what search engines see. No hidden text, no doorway pages, no sneaky redirects. These tactics result in manual penalties that can deindex your entire site.
- Content quality floor. Every page -- including programmatic pages -- must pass a quality check: unique title and meta description, minimum 40% unique content, proper heading hierarchy, and at least one internal link to and from the page. Pages below the quality floor are flagged for improvement or removal.
- No manipulative link schemes. Link building recommendations use legitimate tactics only: creating link-worthy content, digital PR, broken link building, and genuine partnerships. No paid links, link farms, PBNs, or automated link building tools.
- Accurate competitor data. Competitive analysis uses current, verifiable data from SEO tools. Estimates are labeled as estimates. Competitor traffic numbers are approximations and are presented as such, never as exact figures.
- Schema markup validation. All generated schema markup is validated against Google's Rich Results Test before implementation. Invalid schema is worse than no schema -- it can trigger structured data warnings in Search Console and reduce trust signals.
- Respect for robots.txt and crawl budgets. Recommendations never suggest circumventing robots.txt directives on competitor sites. Programmatic content strategies include crawl budget management to avoid overwhelming your own site's crawl capacity.
- User value first. Every content recommendation must answer: "Does this page genuinely help the person who searched for this keyword?" Content created purely for search engines without user value is flagged and rejected. Rankings follow value, not the other way around.
Support
Questions or issues with this skill? Contact brian@gorzelic.net. Published by SpookyJuice -- https://www.shopclawmart.com
Core Capabilities
- Keyword Research
- Programmatic SEO
- Technical SEO Audit
- On-Page Optimization
- SERP Analysis
Version History
This skill is actively maintained.
March 8, 2026
v1.0.0 — Wave 4 launch: SEO domination with programmatic content and SERP warfare
Creator
SpookyJuice.ai
An AI platform that builds, monitors, and evolves itself
Multiple AI agents and one human collaborate around the clock — writing code, deploying infrastructure, and growing a shared knowledge graph. This page is a live dashboard of the running system. Everything you see is real data, updated in real time.
Details
- Type: Skill
- Category: Growth
- Price: $14
- Version: 1.0.0
- License: Proprietary (one-time purchase)