Project Metrics
The metrics page is under active development: cards, charts, and breakdowns will change as we iterate. This overview covers what's on the page today and what each section is trying to tell you.

The Metrics tab on a project gives you a single view of how your team is performing, blending production signals (throughput, time, active users) with quality signals (review outcomes). It's built for production managers who need to answer "are we on track this week?" without pulling numbers into a spreadsheet.

The sections

Weekly counts

How many inputs have been delivered per week. Use this to spot a slowdown or a catch-up week at a glance.

Production

Your headline productivity and quality numbers for the project.

- KPI cards: a single row showing this week's productivity %, reported time, QA overhead ratio, and two quality cards (quality review, expert verification) with scores from the past 14 days. Each card shows a delta vs the previous week so you can spot trends at a glance. Hover any card for a plain-English breakdown of what went into the number.
- Weekly AP chart: AP produced per week across the project.
- Weekly breakdown table: per-week totals in a sortable table.
- User breakdown table: one row per annotator, with metrics on a 7-day rolling window rather than all-time totals. Week column headers show their date range on hover, so there's no guessing which "week 14" means.

Production trends

A time-series view of the same signals, so you can see direction rather than a snapshot. Capped to the last 5 full weeks to keep the view readable.

- AP/h trend: average AP/h over time.
- Spread ribbon: how the distribution of AP/h across your team is moving (tightening, widening, shifting).
- Performance comparison: top- vs bottom-ranked users per interval (shown once you have at least 5 users).
- QA ratio over time: share of tool time spent on review and correction.
- Active users per week: distinct users with reported time each week; hover a bar for a weekday breakdown with dates.

You can toggle between daily and weekly intervals, and overlay a configurable lower AP/h threshold line to mark what "below expectation" looks like for your project.

Density & distribution

Per-request charts for shapes per input, objects per input, and geometry type distribution: useful when comparing the difficulty or shape of the work across requests. Toggle the data source between delivered inputs and last completed inputs, depending on which slice you want to analyse.

If a shape prediction model is available for the project, predicted average shapes per input are drawn as an overlay with 80% and 90% confidence intervals. A checkbox controls whether predictions are also drawn over already annotated requests.

Who sees what

The Production and Production trends sections are only visible to users with access to team-level performance data. If you can't see time data on your other views, you won't see these sections here either. Weekly counts and Density & distribution are visible to everyone who can open the Metrics tab.
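To make the headline numbers concrete, here is a minimal sketch of how the QA overhead ratio and the week-over-week deltas on the KPI cards could be computed (the field names and exact formulas are illustrative assumptions, not the product's actual implementation):

```python
from dataclasses import dataclass


@dataclass
class WeekTotals:
    """Per-week aggregates (hypothetical field names, for illustration only)."""
    ap: float            # annotation points produced
    hours: float         # total reported tool time
    review_hours: float  # time spent on review + correction


def qa_overhead_ratio(week: WeekTotals) -> float:
    """Share of tool time spent on review and correction (assumed formula)."""
    return week.review_hours / week.hours if week.hours else 0.0


def week_over_week_delta(current: float, previous: float) -> float:
    """Relative change vs the previous week, like the deltas on the KPI cards."""
    return (current - previous) / previous if previous else 0.0


this_week = WeekTotals(ap=1200, hours=80, review_hours=12)
last_week = WeekTotals(ap=1000, hours=80, review_hours=16)

print(qa_overhead_ratio(this_week))                      # 0.15
print(week_over_week_delta(this_week.ap, last_week.ap))  # 0.2
```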

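Similarly, the 7-day rolling window behind the user breakdown table can be sketched as follows (the event shape and windowing rule are assumptions; the product's actual aggregation may differ):

```python
from datetime import date, timedelta


def rolling_7d_ap_per_hour(events, today):
    """Average AP/h per user over the trailing 7 days (a sketch, not the real query).

    `events` is a list of (user, day, ap, hours) tuples -- a hypothetical shape.
    """
    cutoff = today - timedelta(days=7)
    totals = {}
    for user, day, ap, hours in events:
        if cutoff < day <= today:  # keep only the trailing 7-day window
            a, h = totals.get(user, (0.0, 0.0))
            totals[user] = (a + ap, h + hours)
    return {user: (a / h if h else 0.0) for user, (a, h) in totals.items()}


events = [
    ("ann", date(2024, 5, 1), 50, 2),   # outside the window, ignored
    ("ann", date(2024, 5, 9), 120, 4),
    ("ann", date(2024, 5, 10), 60, 2),
    ("bob", date(2024, 5, 10), 90, 3),
]
print(rolling_7d_ap_per_hour(events, date(2024, 5, 10)))
# {'ann': 30.0, 'bob': 30.0}
```

The point of the rolling window is that a user's row reflects their recent pace, not their lifetime average, so a recent slowdown shows up immediately.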