# Review Bottleneck Detector

- **Command:** `lore bottlenecks [--since ]`
- **Confidence:** 85%
- **Tier:** 2
- **Status:** proposed
- **Effort:** medium — join MRs with first review note, compute percentiles

## What

For MRs in a given time window, compute:

1. **Time to first review** — created_at to first non-author DiffNote
2. **Review cycles** — count of discussion resolution rounds
3. **Time to merge** — created_at to merged_at

Flag MRs above P90 thresholds as bottlenecks.

## Why

Review bottlenecks are one of the biggest developer productivity drains. Making them visible and measurable is the first step to fixing them. This provides data for process retrospectives.

## Data Required

All of this exists today:

- `merge_requests` (created_at, merged_at, author_username)
- `notes` (note_type='DiffNote', author_username, created_at)
- `discussions` (resolved, resolvable)

## Implementation Sketch

```sql
-- Time to first review per MR
SELECT
  mr.id,
  mr.iid,
  mr.title,
  mr.author_username,
  mr.created_at,
  mr.merged_at,
  p.path_with_namespace,
  MIN(n.created_at) AS first_review_at,
  (MIN(n.created_at) - mr.created_at) / 3600000.0 AS hours_to_first_review,
  (mr.merged_at - mr.created_at) / 3600000.0 AS hours_to_merge
FROM merge_requests mr
JOIN projects p ON mr.project_id = p.id
LEFT JOIN discussions d ON d.merge_request_id = mr.id
LEFT JOIN notes n
  ON n.discussion_id = d.id
  AND n.note_type = 'DiffNote'
  AND n.is_system = 0
  AND n.author_username != mr.author_username
WHERE mr.created_at >= ?1
  AND mr.state IN ('merged', 'opened')
GROUP BY mr.id
ORDER BY hours_to_first_review DESC NULLS FIRST;
```

## Human Output

```
Review Bottlenecks (last 30 days)

  P50 time to first review:  4.2h
  P90 time to first review:  28.1h
  P50 time to merge:         2.1d
  P90 time to merge:         8.3d

  Slowest to review:
    !234  Refactor auth        72h to first review  (alice, still open)
    !228  Database migration   48h to first review  (bob, merged in 5d)

  Most review cycles:
    !234  Refactor auth     8 discussion threads, 4 resolved
    !225  API versioning    6 discussion threads, 6 resolved
```

## Downsides

- Doesn't capture review done outside GitLab (Slack, in-person)
- DiffNote timestamp != when reviewer started reading
- Large MRs naturally take longer; no size normalization

## Extensions

- `lore bottlenecks --reviewer alice` — how fast does alice review?
- Per-project comparison: which project has the fastest review cycle?
- Trend line: is review speed improving or degrading over time?
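The percentile and flagging step is not part of the SQL above; it would live in the CLI layer. A minimal sketch, assuming the query rows arrive as `(iid, hours_to_first_review)` tuples with `NULL` (Python `None`) for unreviewed MRs — function and field names here are illustrative, not the actual `lore` API:

```python
def percentile(values, p):
    """Nearest-rank percentile: smallest value with at least p% of data at or below it."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

def flag_bottlenecks(rows):
    """rows: list of (mr_iid, hours_to_first_review); returns MRs above the P90 threshold.

    MRs with no review yet (hours is None) are excluded from the percentile
    computation; the query's NULLS FIRST ordering surfaces them separately.
    """
    hours = [h for _, h in rows if h is not None]
    if not hours:
        return []
    p90 = percentile(hours, 90)
    return [(iid, h) for iid, h in rows if h is not None and h > p90]
```

Nearest-rank is chosen here for simplicity; an interpolating method (e.g. `statistics.quantiles`) would give smoother P50/P90 figures on small samples.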