Review
What is a review phase?

A request can contain multiple review phases, and each review phase has its own tab inside Request details. A review phase acts as a quality-control step inside the request workflow, where all annotated inputs, or a selection of them, must be approved by a reviewer to leave the phase. Rejected annotations are sent for correction and then re-reviewed.

There are two types of review phases: full review and sampled review. In phases of the type full review, all annotated inputs that enter the phase are sent for review. In phases of the type sampled review, a percentage of the annotated inputs entering the phase are sent for review, while the rest remain in the phase until a manager has decided whether they should be reviewed or sent to the next phase without review.

How does a full review phase work?

In phases of the type "full review", all intermediate annotations entering the phase are sent for review. A reviewer must accept the quality of an annotated input for it to complete and leave the phase. If the annotation quality isn't acceptable, a task with the action "correct" is created, and a labeler is asked to improve the quality according to the reviewer's feedback. The updated annotation is then reviewed again; if the quality still isn't acceptable, the cycle repeats until it is.

An essential characteristic of the review phase is that it allows for giving structured feedback on the annotation. You can read more about this in the Feedback tools chapter.

How does a sampled review phase work?
In phases of the type "sampled review", only a subset of the annotated inputs (intermediate annotations) that enter the phase is initially selected for review (approximately 20 % in the flowchart above). The inputs not selected for review stay in the phase until a manager decides, based on the result of the sampled review, whether the non-selected inputs should be reviewed or sent directly to the next phase without being reviewed and potentially corrected.

Annotated inputs selected for review (both initially selected and selected through the phase decision) must have their quality approved by a reviewer to complete and leave the phase. If the annotation quality isn't acceptable, a task with the action "correct" is created, and a labeler is asked to improve the quality according to the reviewer's feedback. The updated annotation is then reviewed again; if the quality still isn't acceptable, the cycle repeats until it is.

An essential characteristic of the review phase is that it allows for giving structured feedback on the annotation. You can read more about this in the Feedback tools chapter.

How is the sampling/selection done?

When entering the phase, each annotated input has a set likelihood of being selected for review. That means if the sample size is set to 20 %, each annotated input entering the phase has a 20 % likelihood of being selected for review, independently of the other inputs. The final sample size can therefore end up just below or just above the configured sample size; with 100 inputs and a 20 % sample size, for example, you might see 17 or 23 inputs selected rather than exactly 20.

What is the decision, and how does it work?
This phase contains a mandatory decision in which you, as a manager, determine, based on the result of the initial review, what should happen to the annotated inputs that haven't been selected for initial review (including inputs that haven't entered the phase yet). You can either decide to send them directly to the next phase or send them all for review. You can read more about the decision in the section Decision below.

Available actions

Decision

The decision section is only available for review phases of the type "sampled review". This sampled review phase contains a mandatory decision in which you, as a manager, determine what should happen to the annotated inputs that haven't been selected for initial review (including inputs that haven't entered the phase yet), based on the result of the initial review.

Making a decision

1. Evaluate the review results. Start by evaluating the review results and consider whether all annotated inputs require a review to ensure sufficient annotation quality.
2. Decide whether all annotated inputs should be reviewed:
   a. If the sampled review results indicate that the annotated inputs are of sufficient quality when entering the phase, and a review-and-correction cycle isn't needed: send the inputs not selected for review directly to the next phase.
   b. If a review-and-correction cycle seems required to reach sufficient quality: send all annotated inputs through review.

Both of the available decisions only affect annotated inputs not selected for review. Inputs that have been selected for review must be accepted to exit the phase; you can accept such an input in a review task or use the quick accept action. The decision is permanent and can't be reverted.

Quick accept and quick reject

With these actions, you can ensure that specific phase inputs either move to the next phase without review or go directly to review correction.

Quick accept

Quick accept accepts inputs with unstarted review tasks and moves them to the next phase without them needing to be accepted by a reviewer inside a review task. Inputs with unstarted review tasks have the task type "review" and the current task state "to do".

1. Find the input or inputs you want to quick accept inside the phase inputs table. You can use the search bar or the input ID filter if needed.
2. To quick accept a single input, use the "quick accept" option in the input's options menu.
3. To quick accept multiple inputs, select them using the row checkboxes and then click the "quick accept" option inside the action bar that appears at the bottom of the page.

Quick reject

Quick reject rejects inputs with unstarted review tasks without requiring a reviewer to reject them inside a review task. When rejecting, a review correction task is created and automatically assigned to the original annotator. Inputs with unstarted review tasks have the task type "review" and the current task state "to do".

1. Find the input or inputs you want to quick reject inside the phase inputs table. You can use the search bar or the input ID filter if needed.
2. To quick reject a single input, use the "quick reject" option in the input's options menu.
3. To quick reject multiple inputs, select them using the row checkboxes and then click the "quick reject" option inside the action bar that appears at the bottom of the page.

Assigning tasks

If you want to view or edit the task allocation of review and correct tasks, you can do that inside the Tasks tab.

Monitoring progress

In this section, you can monitor the current progress of the review phase and answer questions such as:

- How many inputs with annotations have been accepted and have, therefore, moved to the next phase?
- How many annotated inputs are currently being corrected or re-reviewed?
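The review-and-correct cycle that inputs go through in this phase, including the round counter used by the monitoring views below, can be sketched as a small loop. This is an illustration of the workflow, not Kognic's implementation; `review_passes` is a stand-in for the reviewer's judgment:

```python
from dataclasses import dataclass


@dataclass
class AnnotatedInput:
    input_id: str
    round: int = 1        # round 1 contains only the initial review
    completed: bool = False


def run_review_phase(item: AnnotatedInput, review_passes) -> AnnotatedInput:
    """Cycle an input through review and correction until a reviewer accepts it.

    `review_passes` is called once per review and returns True when the
    annotation quality is acceptable (a stand-in for the reviewer's judgment).
    """
    while not review_passes(item):
        # Rejected: a "correct" task is created, the labeler improves the
        # annotation, and the input moves to the next round for re-review.
        item.round += 1
    item.completed = True  # accepted: the input leaves the phase
    return item
```

For example, an input that is rejected twice before being accepted finishes in round 3, which is where it would appear in the "Distribution of inputs in rounds" card described below.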
Inputs in phase

The number of inputs currently in the phase.

Completed inputs

The percentage out of all inputs that have completed and left the phase. In review phases, "completed" means that the input has been accepted by a reviewer, either via a review task or a quick accept action.

General phase progress

In this graph, you get insight into how the request's inputs are distributed in relation to this phase and its workflow stages. You can see how many inputs are in an earlier phase, how many are inside the phase and in which workflow stage they are, as well as how many have been accepted and left the phase.

Distribution of inputs in rounds

In the review phase, an annotated input can pass through multiple review rounds, depending on how many reviews and corrections it takes to reach acceptable quality. Each round contains a cycle of correction and review, with the exception of round 1, which only contains a review step. When an annotated input is rejected, it moves to a new round.

In the card Distribution of inputs in rounds, you can view how the different inputs are currently spread across states and rounds. Per round, you can see how many annotated inputs are currently in correction, in review (first review or re-review), or have been accepted and are, therefore, delivery ready.

Task acceptance and rejection

The chart visualizes the absolute number and percentage of accepted and rejected review tasks per round, which helps you understand the acceptance ratio of annotations entering the phase as well as in later rounds.

Phase inputs

In the phase inputs table, you can see all inputs that are currently inside the phase. For each input, you can see when it changed workflow stage, how many tasks have been done on it in the current phase, and the state and type of any current task. The actions quick accept and quick reject are available for inputs with unstarted review tasks; you can read more about them in the section Actions above.

Edit summary

The edit summary is still under development and, therefore, only available to a limited number of organizations. If you would like us to enable this for your organization, contact Kognic.

The edit summary gives insight into how the annotations entering the phase had to be adjusted to meet the reviewers' quality expectations. This is done by comparing an input's annotation when entering the phase to when it leaves the phase after being accepted in a review task. Note that quick-accepted inputs aren't used in this comparison, and any edits done in the phase for these inputs won't be included as edits in the edit summary.

Currently, you can investigate the edits in three different tables: Added and removed objects, Edited objects, and Edited scene properties.

Added and removed objects

This table helps you understand how often specific objects were missed or incorrectly included, thus impacting the recall and precision of the annotations. The table contains the following information per class:

Added objects

The percentage of added objects seen in the review-accepted inputs' annotations compared to their content when entering the phase. The numbers in this column enable insights such as:

💡 "To ensure all relevant vehicles were annotated, we had to add 20 % more objects during review phase 1. We need to understand what objects were missed and how we can prevent that from happening in the future."

Removed objects

The percentage of removed objects seen in the review-accepted inputs' annotations compared to their content when entering the phase. The numbers in this column enable insights such as:

💡 "We seem to have removed 50 % of all reviewed obstacle objects. Has the team misunderstood what defines an obstacle?"
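Percentages like the ones in the insights above can be derived by diffing an input's object IDs at phase entry against review acceptance. This is a minimal illustrative sketch, not Kognic's implementation, and the real edit summary aggregates per class across inputs:

```python
def added_removed_rates(objects_at_entry: set, objects_at_acceptance: set):
    """Compare an input's object IDs at phase entry vs. after review acceptance.

    Returns (added %, removed %) relative to the object count at phase entry.
    """
    if not objects_at_entry:
        raise ValueError("no objects at phase entry to compare against")
    added = objects_at_acceptance - objects_at_entry
    removed = objects_at_entry - objects_at_acceptance
    n = len(objects_at_entry)
    return 100 * len(added) / n, 100 * len(removed) / n


# 10 objects entered the phase; 2 were added and 5 removed during review:
entry = {f"obj-{i}" for i in range(10)}
accepted = (entry - {"obj-0", "obj-1", "obj-2", "obj-3", "obj-4"}) | {"obj-a", "obj-b"}
added_pct, removed_pct = added_removed_rates(entry, accepted)
# added_pct == 20.0 and removed_pct == 50.0, matching the insights above
```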
Objects after phase

The percentage difference in object count between review-accepted inputs' annotations and their content at the start of the phase.

Object review ratio

Compares the initial count of objects from review-accepted inputs (at phase entry) to the total number of objects from all inputs (including those currently in review correction and review) that have entered the phase. This helps you understand the current review sample of a specific class.

⚠️ Note that the review ratio is based on objects that have entered the phase; objects still in earlier phases are excluded. This means that the ratio doesn't represent the total object sample rate until all objects have completed the phase.

Edited objects

This table helps you understand how often properties and geometries had to be edited to meet the reviewers' quality expectations. Property edit and sample rates are presented per property, while geometry edits are presented per geometry type and type of edit (e.g., 2D box position).

Edited objects

The percentage of objects for which the attribute got edited between the annotated input entering the phase and being accepted in review.

💡 "15 % of the reviewed 3D box objects have been resized; were the changes in size significant or just minor adjustments?"

💡 "35 % of the reviewed objects with the property "age" had their property value changed. Is the definition of the different values unclear to the team?"
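The edit rate above amounts to the share of review-accepted objects whose attribute value changed between phase entry and acceptance. A sketch under assumed field names ("review_accepted", "at_entry", "at_acceptance" are illustrative, not the platform's data model):

```python
def attribute_edit_rate(objects, attribute):
    """Share of review-accepted objects whose value for `attribute` changed
    between phase entry and review acceptance.

    Each object is a dict with "review_accepted" (bool) plus "at_entry" and
    "at_acceptance" attribute maps; these field names are assumptions made
    for illustration.
    """
    reviewed = [o for o in objects if o["review_accepted"]]
    if not reviewed:
        raise ValueError("no review-accepted objects to compare yet")
    edited = sum(
        1 for o in reviewed
        if o["at_entry"].get(attribute) != o["at_acceptance"].get(attribute)
    )
    return 100 * edited / len(reviewed)


# 20 review-accepted objects, 7 of which had their "age" value changed:
objects = [
    {"review_accepted": True,
     "at_entry": {"age": "adult"},
     "at_acceptance": {"age": "child" if i < 7 else "adult"}}
    for i in range(20)
]
rate = attribute_edit_rate(objects, "age")  # 35.0, as in the insight above
```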
Object review ratio

The percentage of objects that are review accepted compared to all objects in this phase. The value updates when inputs are reviewed and accepted, or when new inputs enter the phase.

⚠️ Note that the review ratio is based on objects that have entered the phase; objects still in earlier phases are excluded. This means that the ratio doesn't represent the total object attribute sample rate until all objects have completed the phase.

Edited scene properties

This table helps you understand how often individual scene properties had to be edited to meet the reviewers' quality expectations.

Edited inputs

The percentage of inputs where the scene property was edited between initial phase entry and review acceptance.

💡 "For 23 % of the reviewed inputs, the scene property "weather" got edited. Were there any particular property values that the team members had a hard time distinguishing between?"

Input review ratio

The percentage of inputs that are review accepted compared to all inputs in this phase. The value updates when inputs are reviewed and accepted, or when new inputs enter the phase.

⚠️ Note that the review ratio is based on inputs that have entered the phase; inputs still in earlier phases are excluded. This means that the ratio doesn't represent the total input property sample rate until all inputs have completed the phase.

Error summary

With the error summary, you get insight into what issues reviewers have found and commented on during the phase's review tasks. It helps you understand the most common and less frequent identified issues. The error summary insights are based on feedback items written by the phase's reviewers; absolute numbers represent actual feedback items, not the edits made in response to the feedback. If you are interested in understanding how the annotations were edited based on the feedback, you can use the edit summary described in the section above.

No. of correction requests

The number of feedback items of the type "correction request". This is the sum of all errors shown in the "Error type distribution" to the right.

No. of feedback items

The number of feedback items of the type "advice". These are excluded from the charts "Error type distribution" and "Suggested properties".

Error type distribution

Shows the absolute count and relative share of all feedback items categorized as "correction requests", grouped by their error type.

Suggested properties

For those items with the error type "properties", this shows the distribution of the properties that were affected. Each error indicated as "properties" has a single property connected to it.

Individual feedback items

This section helps you get an overview of all given feedback, to answer questions such as:

- How detailed and critical is the feedback of my fellow reviewers?
- Are the reviewers giving valid feedback given the current guideline?
- What is feedback where the reviewer and annotator are discussing in the comments?
- How does feedback of the type "MissingObject" look?
- What type of feedback is marked as "invalid"?
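The error summary charts described above boil down to counting correction-request feedback items per error type, with advice items excluded. A minimal sketch, assuming illustrative field names ("type", "error_type") rather than the platform's data model:

```python
from collections import Counter


def error_type_distribution(feedback_items):
    """Count correction requests per error type and their relative share.

    Each feedback item is a dict with a "type" ("correction_request" or
    "advice") and, for correction requests, an "error_type". Advice items
    are excluded, mirroring the "Error type distribution" chart.
    """
    corrections = [i for i in feedback_items if i["type"] == "correction_request"]
    counts = Counter(i["error_type"] for i in corrections)
    total = sum(counts.values())
    return {err: (n, 100 * n / total) for err, n in counts.items()}


items = [
    {"type": "correction_request", "error_type": "geometry"},
    {"type": "correction_request", "error_type": "geometry"},
    {"type": "correction_request", "error_type": "properties"},
    {"type": "correction_request", "error_type": "missing_object"},
    {"type": "advice"},  # excluded from the distribution
]
dist = error_type_distribution(items)
# e.g. dist["geometry"] == (2, 50.0): 2 items, half of all correction requests
```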
The items are split up by their feedback type.

Correction requests

In this section, you see feedback items of the type correction request: things that the reviewer wants corrected before accepting the review. You can filter the feedback items by their resolved status, the error type, whether a discussion thread exists, or whether the overall review of the input has been accepted yet. Below is a description of the information available for each correction request.

Status

- Unresolved: when created by the reviewer and not yet approved.
- Corrected: when the annotator has fixed the issue mentioned in the correction request.
- Resolved: when the reviewer has approved the annotator's fix.
- Invalid: an item can be marked as "invalid" by the user if they think it's not accurate with respect to the guidelines of the task; this can be due to mistakes in machine-generated feedback or from human reviewers.

An item can be "unresolved" even if the overall review was accepted, or the other way around.

Error type

The type of error that was selected in the correction request.

Suggested property

If the error type is "properties", this column shows which property and value was suggested by the reviewer.

Comment

Shows the description that the reviewer might have given.

Thread exists

Will say "yes" if there was any reply to the item, i.e., a discussion thread has been started in relation to the item.

External scene ID

The scene ID of the reviewed annotation.

Current round

The review round in which the input of this feedback item currently is. All inputs start in round 1; with each rejected review, they progress one round forward.

Accepted review

Whether the overall review was accepted or not.

Feedback

In this section, you see feedback items of the type advice. As the underlying data has less structure, the table has fewer columns and filtering options, but otherwise it looks the same as the one above.
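The status values above form a small lifecycle. The sketch below uses the status names from the table, but the exact transition rules (for example, whether a reviewer can re-open a corrected item) are an illustrative assumption, not documented platform behavior:

```python
# Hypothetical correction-request lifecycle; status names come from the
# documentation, the transitions are an illustrative assumption.
VALID_TRANSITIONS = {
    "unresolved": {"corrected", "invalid"},   # annotator fixes it, or it's marked invalid
    "corrected": {"resolved", "unresolved"},  # reviewer approves the fix or re-opens it
    "resolved": set(),                        # reviewer approved the fix (terminal)
    "invalid": set(),                         # marked as not accurate (terminal)
}


def advance(status: str, new_status: str) -> str:
    """Move a correction request to `new_status`, enforcing the lifecycle."""
    if new_status not in VALID_TRANSITIONS[status]:
        raise ValueError(f"cannot go from {status!r} to {new_status!r}")
    return new_status
```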