---
description: Systematic 8-phase methodology for building deep Verkada domain skills from primary docs through source currency verification.
category: verkada
---
flowchart TD
    _HEADER_["
Creating Verkada Skills
Systematic 8-phase methodology for building deep Verkada domain skills from primary docs through source currency verification.
    "]:::headerStyle
    classDef headerStyle fill:none,stroke:none

    subgraph _MAIN_[" "]

    %% Entry point
    START([Start: New Skill Needed]) --> PLAN[Identify Domain and Scope]
    PLAN --> P1_ENTRY[Begin Phase 1]

    %% Phase 1: Primary Documentation
    subgraph Phase1["Phase 1: Primary Documentation"]
        P1_ENTRY --> REPO_DOCS[Read In-Repo Docs]
        P1_ENTRY --> NOTION_LIVE[Search Notion Live]
        P1_ENTRY --> NOTION_CACHE[Search Notion Offline Cache]
        REPO_DOCS --> DATE_DOCS[Get Git Dates]
        NOTION_LIVE --> CHECK_DEFERRED{Defers to GitHub?}
        CHECK_DEFERRED -->|Yes| SKIP_NOTION[Mark as Deprecated]
        CHECK_DEFERRED -->|No| KEEP_NOTION[Extract Content]
        NOTION_CACHE --> KEEP_NOTION
    end

    %% Phase 2: Source Code
    subgraph Phase2["Phase 2: Source Code (Ground Truth)"]
        DATE_DOCS & SKIP_NOTION & KEEP_NOTION --> ENUMS[Read Enums and Configs]
        ENUMS --> SVC_CONFIG[Read Service Configs]
        SVC_CONFIG --> SCHEMAS[Read Real Schemas]
        SCHEMAS --> TERRAFORM[Read Terraform]
    end

    %% Phase 3: PR History Mining
    subgraph Phase3["Phase 3: PR History Mining"]
        TERRAFORM --> SEARCH_PRS[Search PRs by Title]
        SEARCH_PRS --> OLD_PRS[Old PRs: prs.json]
        SEARCH_PRS --> NEW_PRS[New PRs: Individual .md Files]
        OLD_PRS & NEW_PRS --> REVIEW_COMMENTS[Extract Review Comments]
        REVIEW_COMMENTS --> REVIEWER_PATTERNS[Find Reviewer Patterns]
    end

    %% Phase 4: Org Chart
    subgraph Phase4["Phase 4: Org Chart Research"]
        REVIEWER_PATTERNS --> TEAM_LOOKUP[Look Up Team Ownership]
        TEAM_LOOKUP --> GH_HANDLES[Get GitHub Handles]
        GH_HANDLES --> CODEOWNERS[Map CODEOWNERS to Channels]
    end

    %% Phase 5: Datadog
    subgraph Phase5["Phase 5: Datadog (Live Production)"]
        CODEOWNERS --> FIND_MONITORS[Find Related Monitors]
        FIND_MONITORS --> MONITOR_DETAIL[Get Monitor Configs]
        MONITOR_DETAIL --> LIVE_METRICS[Query Live Metrics]
    end

    %% Phase 6: Slack Channels
    subgraph Phase6["Phase 6: Slack Channel Mining"]
        LIVE_METRICS --> DISCOVER_CHANNELS[Discover Channels via Harvest]
        DISCOVER_CHANNELS --> CHECK_ACCESS{Member Access?}
        CHECK_ACCESS -->|Yes| READ_HISTORY[Read Channel History]
        CHECK_ACCESS -->|No| MARK_INACCESSIBLE[Mark as Inaccessible]
        READ_HISTORY --> RESOLVE_USERS[Resolve User IDs to Names]
    end

    %% Phase 7: Linear Tickets
    subgraph Phase7["Phase 7: Linear Tickets"]
        RESOLVE_USERS & MARK_INACCESSIBLE --> SEARCH_LINEAR[Search Linear Exports]
        SEARCH_LINEAR --> EXTRACT_DECISIONS[Extract Design Decisions]
        EXTRACT_DECISIONS --> KNOWN_ISSUES[Catalog Known Issues]
    end

    %% Phase 8: Source Currency
    subgraph Phase8["Phase 8: Source Currency Verification"]
        KNOWN_ISSUES --> DATE_ALL[Date-Stamp All Sources]
        DATE_ALL --> BUILD_TABLE[Build Source Precedence Table]
        BUILD_TABLE --> DETECT_STALE{Stale Sources?}
        DETECT_STALE -->|Yes| REFRESH[Re-fetch from Newer Source]
        REFRESH --> BUILD_TABLE
        DETECT_STALE -->|No| FINALIZE[Assemble Skill]
    end

    %% Final Assembly
    subgraph Assembly["Skill Assembly"]
        FINALIZE --> WRITE_SKILL[Write SKILL.md]
        WRITE_SKILL --> ADD_PRECEDENCE[Add Source Precedence Table]
        ADD_PRECEDENCE --> ADD_EXAMPLES[Add Working Examples]
        ADD_EXAMPLES --> ADD_ANTIPATTERNS[Add Anti-Patterns Section]
        ADD_ANTIPATTERNS --> VALIDATE{Skill Complete?}
        VALIDATE -->|No| IDENTIFY_GAPS[Identify Gaps]
        IDENTIFY_GAPS --> P1_ENTRY
        VALIDATE -->|Yes| DONE([Skill Published])
    end

    %% Click tooltips for every node
    click START "#" "**Start: New Skill Needed**\nTriggered when a new Verkada domain skill is needed.\n- New service to document\n- Expanding an existing skill\n- Researching a Verkada topic in depth"
    click PLAN "#" "**Identify Domain and Scope**\nDefine what the skill covers:\n- Which service(s) or domain?\n- What questions should the skill answer?\n- Who is the target audience?\n- What tools and commands are involved?"
    click P1_ENTRY "#" "**Begin Phase 1**\nStart with primary documentation.\nAll three doc sources run in parallel."
    click REPO_DOCS "#" "**Read In-Repo Docs**\nCheck service-level and platform-level docs:\n- `{service}/docs/`\n- `{platform_service}/docs/`\n- Root-level `docs/`\n\n`ls ~/verkada/Verkada-Backend-1/{service}/docs/`"
    click NOTION_LIVE "#" "**Search Notion Live**\nUse Notion MCP to find and read relevant pages:\n- `mcp__notion__notion-search` to find pages\n- `mcp__notion__notion-fetch` to read them\n\nWatch for 'see GitHub for the most up-to-date' callouts."
    click NOTION_CACHE "#" "**Search Notion Offline Cache**\nSearch the periodic snapshot (at most ~1 week old):\n\n`find ~/verkada/Verkada-Backend-Docs/notion -name '*topic*'`\n\nNo API calls needed, no rate limiting."
    click DATE_DOCS "#" "**Get Git Dates**\nEvery source needs a date. Run:\n\n`cd ~/verkada/Verkada-Backend-1`\n`git log -1 --format='%ai' -- path/to/doc.md | cut -c1-10`\n\nContent evolves; dates determine trustworthiness."
    click CHECK_DEFERRED "#" "**Decision: Defers to GitHub?**\nMany Notion pages have a callout saying 'see GitHub for the most up-to-date version.'\n\nIf yes, mark as deprecated and read GitHub directly.\nIf no, extract the content."
    click SKIP_NOTION "#" "**Mark as Deprecated**\nNotion page defers to GitHub.\nDo not use as authoritative source.\nRead the GitHub source it points to instead."
    click KEEP_NOTION "#" "**Extract Content**\nNotion page is still authoritative.\nExtract relevant content and record the last_edited_time."
    click ENUMS "#" "**Read Enums and Configs**\nSource code is more reliable than docs.\nFind the authoritative enum for any concept:\n\n`grep -r 'class ExemptionReason' ~/verkada/Verkada-Backend-1 --include='*.py' -l`\n\nNever document enum values from docs; read the source class."
    click SVC_CONFIG "#" "**Read Service Configs**\nService configs show real usage patterns:\n\n`cat ~/verkada/Verkada-Backend-1/vcorecommand/vcorecommand/audit_log/service_config/{service}.yaml`\n\nThese reveal actual field mappings and exemption patterns."
    click SCHEMAS "#" "**Read Real Schemas**\nVerkafka event schemas show actual field patterns:\n\n`ls ~/verkada/Verkada-Backend-1/verkafka/verkafka/events/{service}/`\n\nSchemas are the contract between services."
    click TERRAFORM "#" "**Read Terraform**\nTerraform is ground truth for what is actually deployed:\n\n`ls ~/verkada/Terraform/services/{service}/`\n\nRead `datadog.tf` for alert configs, monitor queries, thresholds.\nDo not skip this phase."
    click SEARCH_PRS "#" "**Search PRs by Title**\nFind relevant merged PRs. Two sources exist:\n- Old PRs (pre-7018): `prs.json`\n- Newer PRs: 143k individual `.md` files\n\nSearch by keyword in title frontmatter."
    click OLD_PRS "#" "**Old PRs: prs.json**\nJSON file covering older PRs:\n\n`python3 -c 'import json; ...'`\n\nFilter by keyword in title and merged_at date."
    click NEW_PRS "#" "**New PRs: Individual .md Files**\nNewer PRs stored as individual markdown files:\n\n`grep -rl 'title:.*keyword' ~/verkada/Verkada-Backend-Docs/github/.../prs/`\n\nSearch frontmatter for title, state, number."
    click REVIEW_COMMENTS "#" "**Extract Review Comments**\nReview comments are the gold mine.\nEach PR file has a `## Review Comments` section.\n\nExtract reviewer feedback, corrections, and conventions.\nThese reveal implicit standards not found in docs."
    click REVIEWER_PATTERNS "#" "**Find Reviewer Patterns**\nFind all PRs reviewed by specific people:\n\n`find .../prs -name '*.md' | xargs grep -l 'by [handle]'`\n\nCross-reference with topic-relevant PRs to find domain experts."
    click TEAM_LOOKUP "#" "**Look Up Team Ownership**\nUse `verkada-org-chart` skill to find who owns the domain:\n\n`python3 ~/.claude/skills/verkada-org-chart/scripts/team-members.py team-slug`\n\nAlso check `team-service-mapping.json` for service-to-team mapping."
    click GH_HANDLES "#" "**Get GitHub Handles**\nResolve team members to GitHub handles:\n\nUse `name-email-mapping.csv` from the org chart data.\nNeeded to cross-reference with PR review history."
    click CODEOWNERS "#" "**Map CODEOWNERS to Channels**\nMap GitHub teams to Slack PR review channels.\n\nUse the team-to-channel mapping from the org chart skill.\nIdentify unicorn reviewers who cover multiple teams."
    click FIND_MONITORS "#" "**Find Related Monitors**\nSearch all Datadog monitors for the topic:\n\n`datadog-cli monitors list --json | python3 -c '...'`\n\nFilter by keyword in monitor name.\nNo sleep needed between datadog-cli calls."
    click MONITOR_DETAIL "#" "**Get Monitor Configs**\nGet full config for each relevant monitor:\n\n`datadog-cli monitors get --json`\n\nExtract: query, thresholds, notify config, state.\nThese reveal what the team actually monitors in prod."
    click LIVE_METRICS "#" "**Query Live Metrics**\nQuery production metric values:\n\n`datadog-cli metrics query --query 'metric{env:prod1}' --from 1h --json`\n\nShows real production behavior and baseline values."
    click DISCOVER_CHANNELS "#" "**Discover Channels via Harvest**\nHarvest is local SQLite, instant, no rate limiting:\n\n`slackctl harvest list --kind channel | grep -i 'topic'`\n\nGet channel IDs, topics, and membership status."
    click CHECK_ACCESS "#" "**Decision: Member Access?**\nPrivate channels require membership to read.\n- Public: `slackctl read` works\n- Private + member: `slackctl read` works\n- Private + not member: Returns null, inaccessible\n\nDo NOT use `slackctl search 'in:channel'` for private channels."
    click READ_HISTORY "#" "**Read Channel History**\nPaginate with `slackctl read`:\n- Use `-n 200` per page\n- Use `--before` for pagination\n- Sleep 12s between API calls (rate limited)\n\nContains real discussions, questions, and tribal knowledge."
    click MARK_INACCESSIBLE "#" "**Mark as Inaccessible**\nChannel is private and you are not a member.\nDocument the channel ID for future reference.\nNote what you expected to find there."
    click RESOLVE_USERS "#" "**Resolve User IDs to Names**\nSlack messages contain user IDs, not names:\n\n`slackctl user get --jq '{id,name,real_name,title}'`\n\nSleep 11s between user lookups (API rate limiting)."
    click SEARCH_LINEAR "#" "**Search Linear Exports**\nSearch the offline Linear export (no rate limiting):\n\n`grep -l 'keyword' ~/verkada/Verkada-Backend-Docs/linear/ACBE/issues/*.md`\n\nPlain filesystem reads, no API calls."
    click EXTRACT_DECISIONS "#" "**Extract Design Decisions**\nLinear tickets contain:\n- Feature intent and requirements\n- Design decisions and trade-offs\n- Historical context for why things exist\n- Links to related PRs and docs"
    click KNOWN_ISSUES "#" "**Catalog Known Issues**\nDocument known bugs, limitations, and workarounds found in Linear.\nThese often explain surprising behavior in the codebase."
    click DATE_ALL "#" "**Date-Stamp All Sources**\nBefore documenting anything, verify currency:\n- Git dates for source files\n- `last_edited_time` for Notion pages\n- `merged_at`/`created_at` for PRs\n- Monitor last-modified for Datadog\n\nNewer sources supersede older ones."
    click BUILD_TABLE "#" "**Build Source Precedence Table**\nEvery skill needs this table. Newest source wins:\n\n| Source | Last Updated | Authority |\n|--------|-------------|----------|\n| `path/to/doc.md` | YYYY-MM-DD | Canonical |\n| `path/to/enum.py` | YYYY-MM-DD | Authoritative |\n| PR patterns | YYYY-YYYY | Conventions |"
    click DETECT_STALE "#" "**Decision: Stale Sources?**\nCompare dates across sources.\nIf a doc is older than the source code it describes, it may be stale.\nIf PR patterns contradict older docs, the PR wins."
    click REFRESH "#" "**Re-fetch from Newer Source**\nGo back and re-read the newer source.\nUpdate the precedence table with corrected dates.\nResolve conflicts between stale and fresh sources."
    click FINALIZE "#" "**Assemble Skill**\nAll 8 phases complete.\nConsolidate findings into skill structure."
    click WRITE_SKILL "#" "**Write SKILL.md**\nCreate the skill file following Anthropic best practices:\n- Keep under 500 lines\n- Overview, quick start, examples first\n- Link to `reference/` for details\n- Put scripts in `scripts/`\n- Progressive disclosure structure"
    click ADD_PRECEDENCE "#" "**Add Source Precedence Table**\nInclude the dated precedence table.\nThis is mandatory for every Verkada skill.\nNewest source wins when there is a conflict."
    click ADD_EXAMPLES "#" "**Add Working Examples**\nInclude real, tested examples:\n- CLI commands that actually work\n- Code snippets that have been run\n- Common use cases with expected output\n\nFix examples that produce wrong output."
    click ADD_ANTIPATTERNS "#" "**Add Anti-Patterns Section**\nDocument what NOT to do:\n- Do not trust deprecated Notion pages\n- Do not document enums from docs\n- Do not use `slackctl search` for private channels\n- Do not skip Terraform\n- Do not assume old PR patterns still apply"
    click VALIDATE "#" "**Decision: Skill Complete?**\nReview the skill for gaps:\n- Are all 8 phases covered?\n- Are examples tested and working?\n- Is the precedence table populated?\n- Are anti-patterns documented?\n\nIf gaps remain, loop back to Phase 1."
    click IDENTIFY_GAPS "#" "**Identify Gaps**\nDocument what is missing.\nTarget specific phases for re-research.\nLoop back to the beginning with focused scope."
    click DONE "#" "**Skill Published**\nSkill is complete and ready for use.\n- SKILL.md under 500 lines\n- Reference docs in `reference/`\n- Scripts in `scripts/`\n- Source precedence table dated\n- Anti-patterns documented\n- Examples tested and working"

    %% Class definitions
    classDef entry fill:#d1ecf1,stroke:#7ec8d8
    classDef docs fill:#e8daef,stroke:#b07cc6
    classDef source fill:#ffeaa7,stroke:#e0c040
    classDef pr fill:#d1ecf1,stroke:#7ec8d8
    classDef org fill:#fff3cd,stroke:#f0c040
    classDef monitoring fill:#f8d7da,stroke:#e06070
    classDef slack fill:#ffeaa7,stroke:#e0c040
    classDef linear fill:#e8daef,stroke:#b07cc6
    classDef currency fill:#fff3cd,stroke:#f0c040
    classDef assembly fill:#d4edda,stroke:#5cb85c
    classDef decision fill:#fff3cd,stroke:#f0c040

    class START,PLAN,P1_ENTRY entry
    class REPO_DOCS,NOTION_LIVE,NOTION_CACHE,DATE_DOCS,CHECK_DEFERRED,SKIP_NOTION,KEEP_NOTION docs
    class ENUMS,SVC_CONFIG,SCHEMAS,TERRAFORM source
    class SEARCH_PRS,OLD_PRS,NEW_PRS,REVIEW_COMMENTS,REVIEWER_PATTERNS pr
    class TEAM_LOOKUP,GH_HANDLES,CODEOWNERS org
    class FIND_MONITORS,MONITOR_DETAIL,LIVE_METRICS monitoring
    class DISCOVER_CHANNELS,CHECK_ACCESS,READ_HISTORY,MARK_INACCESSIBLE,RESOLVE_USERS slack
    class SEARCH_LINEAR,EXTRACT_DECISIONS,KNOWN_ISSUES linear
    class DATE_ALL,BUILD_TABLE,DETECT_STALE,REFRESH currency
    class FINALIZE,WRITE_SKILL,ADD_PRECEDENCE,ADD_EXAMPLES,ADD_ANTIPATTERNS,VALIDATE,IDENTIFY_GAPS,DONE assembly

    end
    style _MAIN_ fill:none,stroke:none,padding:0
    _HEADER_ ~~~ _MAIN_