CAMPUS

# User Productivity

*Reading time: 13 min*
This page provides an overview of your overall performance.

## 1. Productivity

**Path:** Campus ▶︎ User Productivity ▶︎ Productivity tab

Get a daily view of your annotation output (points, tasks, objects) and dig into each task's details.

### 1.1 Controls & Filters

| Control | Description |
| --- | --- |
| Time interval | Select any date range. By default, shows the last 7 days (daily granularity). |
| Projects | Filter metrics to one or more projects. |
| Request type | Filter by request category. |
| Task type | Choose between Annotate, Correct, Review, or All. |

### 1.2 Trend Charts

| Chart | What it shows |
| --- | --- |
| Annotation points per day | Daily sum of annotation "points" (e.g. dataset-specific scoring). The dotted line is your average over the selected interval. |
| Tasks completed per day | Count of tasks you finished each day. The dotted line is the average. |

**Tip:** Hover over any point on the chart to see the exact value and date.

### 1.3 User Productivity Table

A pageable table listing every task in the selected range.

| Column | Description |
| --- | --- |
| Task | Clickable task ID; opens the task in the annotation tool. |
| Last update | Timestamp of the most recent state change or submission. |
| Category | Internal grouping (e.g. "productivity", "training"). |
| Type | Task type: A = Annotate, C = Correct, R = Review. |
| Status | Current status code (e.g. C = Completed, O = Ongoing, E = Expired). |
| Annotation points | Total points earned for that task (scoring rules vary by project). |
| Annotation time | Total time spent on the task (formatted as hh:mm). |
| Objects produced | Number of individual objects annotated (sum of 2D and 3D instances). |
| Shapes produced | Total geometries created (object × frame count). |
| 2D shapes | Count of 2D-only shapes. |
| 3D shapes | Count of 3D-only shapes. |

## 2. Quality

**Path:** Campus ▶︎ User Productivity ▶︎ Quality tab

This page provides a detailed breakdown of the quality error types you have generated, helping you pinpoint your most common issues and focus on targeted improvements.

**Note:** All figures in the Quality tab are based on feedback; each "error" corresponds to one feedback entry.

### 2.1 Controls & Filters

| Control | Description |
| --- | --- |
| Time interval | Choose any date range (weekly summary by default). |
| Projects | Filter by project. |
| Request type | Filter by request. |

ℹ️ The Quality view defaults to weekly stats; switch to daily if needed.

### 2.2 Total Number of Errors

A summary card displaying the aggregate count of errors recorded in the selected time interval.

### 2.3 Error Type Distribution

A horizontal bar chart displaying each error type and its percentage. Example:

| Error type | Count | % |
| --- | --- | --- |
| MissingObject | 2 | 67% |
| FeedbackErrorType | 1 | 33% |

### 2.4 Individual Error Breakdown

A pageable table listing every feedback entry.

| Column | Description |
| --- | --- |
| Task | Click **Open task ▶** to jump to the corresponding annotation. |
| Reporter | Who left the feedback: usually a Quality Manager, but it could also be a lead, a client, or automatic feedback. |
| Error type | The error category selected when giving the feedback. |
| Comment | Free-form notes on what's wrong. |
| Feedback date | Timestamp when the error was logged. |

Filters above the table allow you to:

- Show only errors from certain reporter roles (e.g. QM, LQM)
- Filter by specific error types
- Search by keyword in comments

## 3. FAQs

**Q:** Who can see my errors?

**A:** Only you and users with QM/Admin roles.
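The dotted average line on the trend charts (section 1.2) is simply the mean of the daily values over the selected interval. The following sketch shows that aggregation; the function name and the `(date, points)` record shape are illustrative assumptions, not the actual Campus data model.

```python
def daily_totals_and_average(task_records):
    """Aggregate per-day annotation points and compute the interval average.

    `task_records` is a list of (date_string, points) pairs -- an assumed,
    simplified stand-in for the real task data.
    """
    totals = {}
    for day, points in task_records:
        totals[day] = totals.get(day, 0) + points
    # The dotted line is the mean of the daily totals over the interval.
    average = sum(totals.values()) / len(totals) if totals else 0.0
    return totals, average

records = [("2024-05-01", 10), ("2024-05-01", 5), ("2024-05-02", 7)]
totals, avg = daily_totals_and_average(records)
# totals == {"2024-05-01": 15, "2024-05-02": 7}, avg == 11.0
```

The same shape of computation applies to the "tasks completed per day" chart, counting one per task instead of summing points.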
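The percentages in the error type distribution (section 2.3) are each type's share of the total error count, where every feedback entry counts as one error. A minimal sketch of that calculation, with an illustrative function name and input shape (not part of any Campus API):

```python
from collections import Counter

def error_distribution(feedback_entries):
    """Return (error_type, count, percent) rows, most frequent first.

    `feedback_entries` is a list of error-type strings, one per feedback
    entry -- each "error" in the Quality tab is one feedback entry.
    """
    counts = Counter(feedback_entries)
    total = sum(counts.values())
    return [
        (error_type, count, round(100 * count / total))
        for error_type, count in counts.most_common()
    ]

# Reproduces the example table: 2 MissingObject + 1 FeedbackErrorType
rows = error_distribution(
    ["MissingObject", "MissingObject", "FeedbackErrorType"]
)
# -> [("MissingObject", 2, 67), ("FeedbackErrorType", 1, 33)]
```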