# Project Metrics
> **Note:** The Metrics page is under active development, and the components below may evolve.

The Metrics tab is split into five sub-tabs, selectable from the chip row at the top. Each chip writes a `?metricstab=…` parameter to the URL, so individual tabs can be deep-linked or shared. The default tab is Productivity.

## Productivity

The Productivity tab is the default landing view. It surfaces a 7-card KPI row in playbook diagnostic order: when productivity is below target, you walk these ratios left to right to find where to look first. The cards, in order:

- **Productivity:** rolling 7-day productivity vs the project's AP target (100% = on target).
- **Time allocation:** drawing-tool time ÷ reported time. Lower means more time went to meetings, admin, or waiting. The healthy band is roughly 75–90%.
- **AP/h:** annotation points produced per reported hour. Should sit 15–20% above the project's AP target to absorb correction and review overhead. The headline number is colour-coded: red below APT, green at or above APT × 1.15, white in the healthy band.
- **Correction ratio:** correction time ÷ annotate time. Higher means more rework, often a sign of quality, learning, instruction, or tooling friction. The trend should head toward zero.
- **QA overhead:** (review + correction time) ÷ drawing-tool time. Higher means more tool time goes to QA than to producing output.
- **Reported time:** total time the team reported working on this project over the past 7 days.
- **Active users:** number of users with reported working time over the past 7 days.

Each card displays the headline value, a comparison against the previous 7 days (with a coloured chevron indicating whether the change is good or bad for that metric), and an 8-week graph with per-dot tooltips. The most recent graph point may differ slightly from the headline number, because the headline is a rolling 7-day value while the graph shows weekly buckets.

### APT calibration warning

A banner appears above the KPI row when the project's effective AP target looks miscalibrated (for example, when no annotator has reached 100% productivity in the available data). The playbook: when nobody is hitting target, the target itself is likely wrong. Recalibrate the project's APT before reading the diagnostic ratios, since they all assume a realistic benchmark.

### Productivity vs target

A weekly distribution chart showing how the team splits across three fixed bands: above target, approaching target, and below target.

### Weekly AP produced

A bar chart of total annotation points produced per week, with value labels on each bar.

### Tables

Below the charts sit a weekly summary table and a per-user breakdown table. Clicking a user opens their full productivity report; the originating project is preserved via `?projectid=<id>`, so the breadcrumb back button returns to this metrics view.

## Efficiency

The Efficiency tab focuses on the drivers behind the productivity headline. It contains:

- **Efficiency drivers table:** per-user efficiency breakdown.
- **AP/h trend:** weekly AP/h over time.
- **Productivity spread:** distribution of productivity across the team.
- **QA ratio trend:** review + correction time as a share of tool time, over time.
- **Active users trend:** weekly active user counts.
- **Productivity vs target bucket trend:** how the distribution across the three target bands has shifted week over week.

## Delivery

Delivery groups the throughput-oriented charts:

- **Weekly counts:** inputs delivered per week.
- **Weekly detailed counts:** the same data broken down by phase or status.
- **Turnaround time:** average time from request creation to delivery.

## Composition

Composition tracks what is in the data: shapes per input, objects per input, and geometry-type density across requests. The data source toggle and shape prediction overlay still apply here.

## Quality

Quality KPIs are split out into their own tab. The headline cards cover review quality (the share of annotated objects accepted by quality reviewers without changes) and expert verification (the share of QR-approved objects also accepted by experts). A sub-metric breakdown table covers precision, recall, geometry accuracy, property accuracy, and the count of reviewed scenes.

## Who sees what

Access controls are applied per tab:

- Productivity and Quality require the team overview permission. Productivity is also hidden when time spent is hidden.
- Efficiency requires the team overview permission.
- Delivery and Composition are visible to all users with access to the Metrics tab.

When a user lacks permission to see a tab, the chip is hidden from the row.
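The per-tab access rules can be expressed as a small predicate. This is an illustrative sketch only: the permission flags and function names below are assumptions, not the product's actual API; the rules themselves (team overview gating, the time-spent exception, chips hidden rather than disabled) are the ones described above.

```python
from dataclasses import dataclass

# Chip row order: productivity is the default (leftmost) tab.
TABS = ["productivity", "efficiency", "delivery", "composition", "quality"]


@dataclass
class Viewer:
    has_team_overview: bool  # hypothetical flag for the "team overview" permission
    time_spent_hidden: bool  # hypothetical flag for the "hide time spent" setting


def visible_tabs(viewer: Viewer) -> list[str]:
    """Return the metric tab chips shown to this viewer, in row order."""
    tabs = []
    for tab in TABS:
        if tab in ("productivity", "efficiency", "quality"):
            if not viewer.has_team_overview:
                continue  # chip is hidden from the row, not disabled
            if tab == "productivity" and viewer.time_spent_hidden:
                continue  # productivity is also hidden when time spent is hidden
        # delivery and composition: visible to anyone with metrics access
        tabs.append(tab)
    return tabs
```

For example, a viewer without the team overview permission would see only the Delivery and Composition chips.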

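The diagnostic ratios on the Productivity tab are simple quotients of the reported time buckets, and the APT calibration warning is a check over per-user productivity values. A minimal sketch of those formulas, assuming hypothetical field and function names (the real data model is not documented here):

```python
def time_allocation(drawing_tool_h: float, reported_h: float) -> float:
    """Drawing-tool time / reported time; healthy band is roughly 0.75-0.90."""
    return drawing_tool_h / reported_h


def ap_per_hour(annotation_points: float, reported_h: float) -> float:
    """Annotation points produced per reported hour."""
    return annotation_points / reported_h


def correction_ratio(correction_h: float, annotate_h: float) -> float:
    """Correction time / annotate time; trend should head toward zero."""
    return correction_h / annotate_h


def qa_overhead(review_h: float, correction_h: float, drawing_tool_h: float) -> float:
    """(Review + correction time) / drawing-tool time."""
    return (review_h + correction_h) / drawing_tool_h


def ap_h_colour(ap_h: float, apt: float) -> str:
    """Headline colour: red below APT, green at or above APT x 1.15, white between."""
    if ap_h < apt:
        return "red"
    if ap_h >= apt * 1.15:
        return "green"
    return "white"


def apt_looks_miscalibrated(user_productivity: list[float]) -> bool:
    """True when no annotator has reached 100% productivity (target likely wrong)."""
    return not any(p >= 1.0 for p in user_productivity)
```

For instance, `ap_h_colour(11.0, apt=10.0)` falls in the white band (above target but below the 15% buffer), and a team whose per-user productivity values all sit below 1.0 would trigger the calibration banner.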