TEAM MANAGEMENT

Productivity Insights


This view is available to anyone with the Team Manager role. It surfaces your organization’s annotation and review productivity in one place, updating automatically every 10 minutes.

Note: “Annotation Points” is a project-level metric. See our Annotation Points & Targets guide for details.



Screenshot: the Productivity Insights view, showing the Annotators Overview with summary cards and filters.


2. Annotate Productivity Tab

Path: Campus ▶︎ Productivity Insights ▶︎ Annotate Productivity

Track how your annotators perform against targets, with quality checks in one view.

2.1 Summary Cards

  • Total Productivity: Overall annotation productivity (% of target). Shown only if Review tasks are not selected.
  • Total Tasks Completed: Sum of all completed tasks in the interval.
  • Team AP: Total Annotation Points awarded.
  • Total Objects: Sum of all objects created.
  • Total Shapes: Sum of all shapes (e.g. polygons, boxes) created.

2.2 Annotators Table

A pageable table listing each user’s metrics:

  • Name: User’s name and email. Click to drill into their personal User Productivity page.
  • Ongoing Tasks: Number of tasks currently in progress.
  • Productivity (AP): % of target AP achieved (green if ≥ 100%, red if < 100%). Actual AP in parentheses.
  • Tasks Completed: Count of tasks finished.
  • Quality: Estimated quality (%). Computed as 100 − (feedback items / reviewed objects × 100). Shows N/A if no reviewed work in the interval.
  • Tasks Reviewed: Number of tasks that received at least one review; higher values increase confidence in the Quality metric.
  • Working Time: Stopwatch-reported time. Falls back to annotation time if no stopwatch data (marked with an asterisk *).
  • Time Allocation: Ratio of annotation time to total working time (colored red/yellow/white to indicate too low, borderline, or too high allocation).
  • Objects Produced: Net objects created (can be negative if users remove objects in correction/review tasks).
  • Shapes Produced: Net shapes created (can be negative if shapes are removed in correction/review tasks).



2.3 Columns (Detailed explanation)

This section describes the different columns present in the table.

Name

The name and email of the user. Click a user's name to open their personal User Productivity page for more detailed insight into their performance.

Ongoing Tasks

The number of currently ongoing tasks that the user is assigned to.

Productivity (AP)

This column shows how productive each listed user was in the selected timespan compared with the expected productivity. The metric is displayed as a percentage, colored green if the user is meeting or exceeding the target and red otherwise. The actual number of produced Annotation Points is shown in parentheses. For example, if a user has worked 2 h in the selected period, produced 100 AP, and the target is 100 AP/h, their productivity is 50%. The productivity metric is computed using the stopwatch time reporting feature; if this feature is not used, no productivity metric can be computed and N/A is displayed instead.
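As a rough illustration of this calculation, here is a minimal sketch; the function and parameter names are made up for the example and are not part of the platform:

```python
from typing import Optional

def productivity_percent(actual_ap: float,
                         target_ap_per_hour: float,
                         stopwatch_hours: Optional[float]) -> Optional[float]:
    """Productivity as a percentage of the AP target for the reported time.

    Returns None (shown as N/A) when no stopwatch time was reported,
    since the target cannot be scaled without a working-time measurement.
    """
    if not stopwatch_hours:
        return None
    expected_ap = target_ap_per_hour * stopwatch_hours
    return actual_ap / expected_ap * 100

# Example from the text: 2 h reported, 100 AP produced, target 100 AP/h
# -> expected 200 AP -> 50% productivity.
print(productivity_percent(100, 100, 2))  # 50.0
```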

Tasks Completed

The number of tasks the user has completed during the selected timespan.

Quality

To help you understand whether a user is producing data of acceptable quality (not just at the expected rate), this column estimates the quality of the user's work. The estimate is presented as a percentage and is computed by subtracting the ratio of reported feedback items to reviewed objects from 100. For example, if a user has produced 100 objects across 2 tasks, and a reviewer has examined both tasks and reported 20 issues using the feedback tooling, the quality estimate is 80%. If none of the tasks the user completed in the selected timespan have been reviewed, no quality metric is available. This is a crude metric and should not be used for purposes other than ensuring that users are not working at an unsustainable pace. Note that this column will not contain any meaningful data unless the users' work is reviewed in a way that uses the feedback tooling to report issues.
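A minimal sketch of this quality formula, assuming the inputs are the number of feedback items and the number of reviewed objects (names are illustrative):

```python
from typing import Optional

def quality_estimate(feedback_items: int, reviewed_objects: int) -> Optional[float]:
    """Estimated quality: 100 minus the share of reviewed objects that
    received a feedback item. Returns None (shown as N/A) when no work
    was reviewed in the selected timespan."""
    if reviewed_objects == 0:
        return None
    return 100 - (feedback_items / reviewed_objects) * 100

# Example from the text: 100 reviewed objects, 20 reported issues -> 80%.
print(quality_estimate(20, 100))  # 80.0
```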

Tasks Reviewed

The number of tasks that the user has completed within the selected timespan that have also been reviewed. The higher this number, the more trustworthy the Quality estimate above is.

Working Time

How much time has been reported from Stopwatch during the selected timespan. If no time has been reported, the view falls back to showing the time spent annotating; in this case a red asterisk indicates that no stopwatch time is available, which also makes it impossible to compute a productivity metric.
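A small sketch of this fallback behaviour, with hypothetical parameter names:

```python
from typing import Optional, Tuple

def working_time(stopwatch_seconds: Optional[int],
                 annotation_seconds: int) -> Tuple[int, bool]:
    """Return (displayed working time, fallback flag).

    With no stopwatch data the view falls back to time spent annotating
    and flags the value; productivity then cannot be computed because
    there is no reliable working-time measurement."""
    if stopwatch_seconds is None:
        return annotation_seconds, True   # shown with a red asterisk (*)
    return stopwatch_seconds, False
```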

Time Allocation

The ratio of time allocated to annotation to the total time reported during the selected timespan. This metric is included to verify that users are not working too much or too little; if a user does not take any breaks during a workday, this probably indicates a problem.

The time allocation is colored according to the following bands to indicate whether a value is acceptable, too high, or too low (a small sketch of the banding logic follows the table):

  • ≤ 0.70: 🔴 Too Low
  • 0.70–0.75: 🟡 Border
  • 0.75–0.85: ⚪️ Optimal
  • 0.85–0.90: 🟡 Border
  • ≥ 0.90: 🔴 Too High
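A sketch of this banding logic; the bands in the table share their boundary values, so the exact tie-breaking below is an assumption:

```python
def allocation_indicator(annotation_time: float, total_working_time: float) -> str:
    """Map the annotation-time / total-working-time ratio to the colour
    band used in the Time Allocation column."""
    ratio = annotation_time / total_working_time
    if ratio <= 0.70:
        return "🔴 Too Low"
    if ratio < 0.75:
        return "🟡 Border"
    if ratio < 0.85:
        return "⚪️ Optimal"
    if ratio < 0.90:
        return "🟡 Border"
    return "🔴 Too High"

print(allocation_indicator(6.0, 7.5))  # ratio 0.80 -> ⚪️ Optimal
```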

Objects Produced

This is how many objects the user has created during the selected timespan. Note that this is a net figure (created minus removed, as sketched at the end of this section) and can be negative if:

  • The user is doing Correction tasks and is removing objects
  • The user is doing Review tasks and is removing objects
  • The user started the day by removing objects from a scene started on a previous day, and the timespan only accounts for today.

Shapes Produced

This is how many shapes the user has created during the selected timespan. As with objects, this is a net figure and might be negative if:

  • The user is doing Correction tasks and is removing shapes
  • The user is doing Review tasks and is removing shapes
  • The user started the day by removing shapes from a scene started on a previous day, and the timespan only accounts for today.

For a distinction between objects and shapes, see Objects and Shapes.
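Both columns boil down to a simple net count over the selected timespan, sketched below with illustrative names:

```python
def net_produced(created_in_window: int, removed_in_window: int) -> int:
    """Net production: items created minus items removed in the window."""
    return created_in_window - removed_in_window

# A correction task that mostly deletes objects can push the net below zero.
print(net_produced(5, 12))  # -7
```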

3. Review Productivity Tab

Path: Campus ▶︎ Productivity Insights ▶︎ Review Productivity

Gauge how your reviewers are keeping up with and finding errors in annotation work.

3.1 Summary Cards

  • Review Coverage: % of annotation points reviewed out of annotation points annotated (see the sketch below).
  • Total Tasks Completed: Number of review tasks submitted.
  • Found Errors: % of errors detected by the reviewer compared to those found by a follow-up reviewer.
  • Total Follow-up Reviews: Count of second-pass reviews performed.
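As a rough sketch of how these percentages can be read: Review Coverage compares reviewed annotation points to produced annotation points, and Found Errors compares the errors the reviewer flagged against those surfaced by a follow-up review. The exact Found Errors formula is not spelled out here, so the version below is an assumption, and all names are illustrative:

```python
from typing import Optional

def review_coverage(reviewed_ap: float, annotated_ap: float) -> Optional[float]:
    """Share of produced annotation points that have been reviewed."""
    if annotated_ap == 0:
        return None
    return reviewed_ap / annotated_ap * 100

def found_errors(reviewer_errors: int, followup_errors: int) -> Optional[float]:
    """Assumed reading: share of all known errors (first pass plus
    follow-up) that the first reviewer caught."""
    total_errors = reviewer_errors + followup_errors
    if total_errors == 0:
        return None
    return reviewer_errors / total_errors * 100

print(review_coverage(150, 200))  # 75.0
print(found_errors(18, 2))        # 90.0
```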



Screenshot: the Reviewer Productivity table.


3.2 Reviewers Table

  • Name: Reviewer’s name and email.
  • Review Coverage: % of annotation points reviewed out of annotation points created.
  • Tasks Completed: Number of review tasks finished.
  • Found Errors: % of errors the reviewer flagged versus those found by the follow-up review.
  • Follow-up Reviews: Count of follow-up reviews on this reviewer’s work.
  • Review Time: Time spent in the annotation tool performing reviews.
  • Review Ratio: Review time / annotation time ratio.
  • Feedback Created: Number of feedback items the reviewer submitted.

4. FAQs

Q: Why is working time not showing correctly?

A: Make sure the Task Category filter includes Training and Onboarding.

Q: Why is working time not showing correctly even though I've toggled the additional filters?

A: If working time is not showing correctly and time utilization is > 1.0, the annotator most likely turned off the Stopwatch accidentally while working. To get an accurate measurement of annotation time, check the graphs in User Productivity.

Q: How often does this view update?

A: Every 10 minutes; no manual refresh needed.

Q: Who can see this view?

A: Only users with the Team Manager role.

Q: Why is my quality metric N/A?

A: Quality is based on feedback in reviewed tasks. If none of your completed tasks were reviewed, quality cannot be computed.