Commit ba99993

Add AI Adoption Score documentation

1 parent 6a4b8fd commit ba99993

11 files changed: +689 −1 lines changed


.gitignore

Lines changed: 1 addition & 1 deletion

```diff
@@ -70,4 +70,4 @@ qdrant_storage/
 # Act Secret Files
 .secrets
 # Architect plans
-plans/
+./plans/
```
Lines changed: 239 additions & 0 deletions

@@ -0,0 +1,239 @@
---
sidebar_label: For Team Leads
---

# AI Adoption Dashboard for Team Leads

This guide covers how engineering managers and team leads can use the AI Adoption Dashboard to drive AI integration, identify gaps, and communicate progress to stakeholders.
8+
9+
## Reading Team-Wide Metrics
10+
11+
### The Organization View
12+
13+
Disable the **"Only my usage"** toggle to see aggregated metrics across your entire team. This view shows:
14+
15+
- **Overall AI Adoption Score** — Your single benchmark number
16+
- **Dimension breakdown** — Frequency, Depth, and Coverage contributions
17+
- **Week-over-week trends** — Direction and magnitude of change
18+
- **Historical timeline** — Score progression over days, weeks, or months
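The exact formula that blends the three dimensions into the headline score isn't published here, so as a mental model only, here is a hypothetical sketch of how three 0–100 dimension scores might combine into a single benchmark number (the equal weights, function name, and validation are all illustrative assumptions, not the product's actual algorithm):

```python
# Hypothetical illustration only: the real Adoption Score formula is not
# documented on this page. Equal weights are invented for the sketch.
DIMENSION_WEIGHTS = {"frequency": 1 / 3, "depth": 1 / 3, "coverage": 1 / 3}

def combined_score(frequency: float, depth: float, coverage: float) -> float:
    """Blend three 0-100 dimension scores into one 0-100 number."""
    dims = {"frequency": frequency, "depth": depth, "coverage": coverage}
    for name, value in dims.items():
        if not 0 <= value <= 100:
            raise ValueError(f"{name} must be in [0, 100], got {value}")
    return round(sum(DIMENSION_WEIGHTS[d] * v for d, v in dims.items()), 1)

# A team strong on Frequency but weak on Coverage still lands mid-pack,
# which is why the dimension breakdown matters more than the headline number.
print(combined_score(80, 60, 31))  # → 57.0
```

Whatever the real weighting is, the takeaway is the same: one weak dimension drags the single number down, so always drill into the breakdown before acting.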
### Dimension Detail Panels

Click on any dimension card (Frequency, Depth, or Coverage) to open its detail panel. Each panel provides:

- A focused timeline for that dimension
- The goal statement for that dimension
- Three actionable improvement suggestions tailored to what that dimension measures

Use these panels to diagnose specific issues and identify targeted actions.

### Comparing Time Periods

Switch between time filters to understand different patterns:

| Filter         | Best For                                         |
| -------------- | ------------------------------------------------ |
| **Past Week**  | Recent changes, sprint-level trends              |
| **Past Month** | Adoption initiative tracking, onboarding results |
| **Past Year**  | Long-term trends, seasonal patterns              |
| **All**        | Historical baseline, major milestones            |

---
## Identifying Adoption Gaps

### Low Coverage Signals

A low Coverage score often indicates adoption gaps—pockets of your team that aren't using AI.

**Questions to investigate:**

- Are all team members logged in and active?
- Are certain roles or squads under-represented?
- Is usage concentrated on specific days (spiky pattern)?

**Actions:**

1. Check your Organization Dashboard for inactive seats
2. Look for patterns in who's not using AI (new hires? certain roles?)
3. Consider targeted onboarding or pairing sessions
### Low Depth Signals

Low Depth indicates that developers may be trying AI but not trusting or shipping its output.

**Questions to investigate:**

- Are acceptance rates low? (Developers rejecting suggestions)
- Is AI-generated code being merged?
- Are developers using AI across multiple stages (plan → build → review)?

**Actions:**

1. Enable [Managed Indexing](/advanced-usage/managed-indexing) to improve context quality
2. Review whether suggestions are relevant to your codebase
3. Introduce chained workflows to increase multi-stage usage
### Low Frequency Signals

Low Frequency suggests AI hasn't become a daily habit.

**Questions to investigate:**

- Are developers aware of all available AI surfaces (IDE, CLI, Cloud)?
- Is AI usage triggered only by specific, infrequent problems?
- Have developers built AI into routine tasks?

**Actions:**

1. Map AI to existing daily tasks (stand-ups, PRs, documentation)
2. Ensure the CLI is installed for terminal workflows
3. Run a "try autocomplete for a week" challenge

---
## Running Adoption Initiatives

### Setting Goals

Use the score tiers as milestones:

| Current Tier    | Reasonable Next Goal         |
| --------------- | ---------------------------- |
| 0–20 (Minimal)  | Reach 30–40 within 4–6 weeks |
| 21–50 (Early)   | Reach 55–65 within 4–6 weeks |
| 51–75 (Growing) | Reach 75–80 within 6–8 weeks |
| 76–90 (Strong)  | Maintain and optimize        |

**Tip:** Focus on one dimension at a time rather than trying to improve everything at once.
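The milestones table can be turned into a small helper for status reports. A minimal sketch, using only the boundaries and goals from the table above; scores above 90 aren't named there, so this sketch labels them "Strong+" as an assumption:

```python
# Tier boundaries and goals taken from the milestones table above.
# Scores above 90 are not named in the table; "Strong+" is this sketch's guess.
TIERS = [
    (20, "Minimal", "Reach 30–40 within 4–6 weeks"),
    (50, "Early", "Reach 55–65 within 4–6 weeks"),
    (75, "Growing", "Reach 75–80 within 6–8 weeks"),
    (90, "Strong", "Maintain and optimize"),
]

def tier_and_goal(score: int) -> tuple[str, str]:
    """Map a 0-100 Adoption Score to its tier and a reasonable next goal."""
    for upper, name, goal in TIERS:
        if score <= upper:
            return name, goal
    return "Strong+", "Maintain and optimize"

print(tier_and_goal(57))  # → ('Growing', 'Reach 75–80 within 6–8 weeks')
```

Note that the suggested goals deliberately overshoot the next tier boundary slightly, which keeps a team from stalling right at a threshold.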
### Initiative Ideas

**For Frequency:**

- "Autocomplete Week" — Everyone commits to using autocomplete daily
- CLI onboarding session — 30-minute walkthrough of terminal AI
- Daily AI tip in Slack — Share one use case per day

**For Depth:**

- "Chain Challenge" — Complete one feature using plan → build → review
- Managed Indexing rollout — Enable better context for the whole team
- Deploy previews — Validate AI output before merging

**For Coverage:**

- New hire onboarding includes Kilo setup
- Weekly "AI wins" sharing in stand-ups
- Pair low-usage developers with enthusiastic adopters

### Tracking Progress

1. **Set a baseline** — Note your score at the start of an initiative
2. **Check weekly** — Watch for trend changes, not absolute numbers
3. **Adjust tactics** — If a dimension isn't moving, try a different approach
4. **Celebrate wins** — Acknowledge when the team hits a milestone
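The weekly check in step 2 is about direction, not level. As an illustration only (the function, the sample scores, and the two-point "flat" band are invented for this sketch, not part of the dashboard), one way to summarize direction from a list of weekly scores:

```python
# Illustrative sketch: summarize the direction of weekly Adoption Scores.
# The two-point band for "flat" is an arbitrary choice for this example.
def weekly_trend(scores: list[int], flat_band: int = 2) -> str:
    """Compare the latest weekly score to the previous one."""
    if len(scores) < 2:
        return "not enough data"
    delta = scores[-1] - scores[-2]
    if abs(delta) <= flat_band:
        return f"flat ({delta:+d})"
    return f"{'up' if delta > 0 else 'down'} ({delta:+d})"

weekly_scores = [38, 41, 45, 44, 48]  # e.g. an initiative's first month
print(weekly_trend(weekly_scores))  # → up (+4)
```

A small week-to-week dip (like 45 → 44 above) reads as flat rather than a reversal, which matches the advice to watch trends rather than absolute numbers.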
---

## Benchmarking Against Goals

### Internal Benchmarking

Use the score to compare:

- **Teams within your organization** — Which teams are leading adoption?
- **Before vs. after** — Did a specific initiative move the needle?
- **This quarter vs. last** — Are you trending up or down?

### Communicating to Stakeholders

The AI Adoption Score is designed to be quotable:

> "Last quarter we were at 38. This quarter we're at 57. Our goal is to reach 70 by Q2."

**When presenting scores:**

- Lead with the trend, not just the number
- Explain the tier and what it means
- Connect to business outcomes ("Higher adoption → faster development cycles")
- Share specific actions you're taking

### Sample Stakeholder Update

> **AI Adoption Update — January 2025**
>
> - **Current Score:** 57 (Growing adoption tier)
> - **Last Month:** 48
> - **Change:** +9 points, driven by improved Depth scores
>
> **Key Actions Taken:**
>
> - Enabled Managed Indexing for better AI context
> - Introduced Code Reviews for all PRs
> - Onboarded 3 inactive team members
>
> **Next Steps:**
>
> - Target 65 by end of February
> - Focus on Coverage—spread usage across the full week

---
## Privacy and Data Considerations

### Anonymous Data

Individual usage data is anonymized in the dashboard. While you can see aggregate metrics, the dashboard does not expose individual developer activity to managers.

### Focus on Teams, Not Individuals

The Dashboard is designed for:

- Team-level insights
- Organizational trends
- Comparative benchmarking

It is **not** designed for:

- Individual performance evaluation
- Identifying specific low performers
- Surveillance of developer activity

Use the score to identify adoption **gaps**, not to judge individual developers.

---
## Future Enhancements

### Code Contribution Tracking

A future enhancement will track AI-contributed code from feature branch to main branch:

- What percentage of AI-suggested code actually ships?
- How much of the codebase was AI-assisted?

This metric is separate from the Adoption Score but valuable for measuring AI impact on output.

### Team Comparison Views

Additional views for comparing multiple teams within an organization are planned, enabling leadership to identify best practices from high-performing teams.

---

## Quick Reference: Dashboard Actions

| What You Want to Know        | Where to Look                               |
| ---------------------------- | ------------------------------------------- |
| Overall adoption level       | Main score display                          |
| Which dimension needs work   | Trend indicators (look for negative trends) |
| Specific improvement actions | Click dimension → detail panel              |
| Historical patterns          | Timeline chart with time filter             |
| Your personal usage          | Toggle "Only my usage"                      |
| Week-over-week change        | Metric cards at bottom                      |

## Next Steps

- [Understand what each dimension measures](/plans/adoption-dashboard/understanding-your-score)
- [Learn strategies to improve your score](/plans/adoption-dashboard/improving-your-score)
- [Return to the dashboard overview](/plans/adoption-dashboard/overview)
