Review
A review is a decision that determines whether an annotation is of sufficient quality or whether it needs improvement. To facilitate efficient improvements, the person or system that made the decision can provide feedback data that details potential errors or mistakes. Please refer to our introduction regarding feedback to gain insight into how these concepts are used in the platform. Note that the introduction describes the full set of possibilities and workflows for dealing with reviews and feedback; the possibilities enabled by this integration are currently somewhat limited, as detailed below.

Posting a review

The current integration capabilities only allow posting reviews for delivered annotations, and for preliminary annotations in phases where you have review access. Annotations are identified by their OpenLABEL UUID, which can be found in the metadata (and in the file name) of downloaded annotations (see Download annotations).

When posting a review, the API expects feedback data as well as a boolean accepted that indicates whether the quality is perceived as sufficient. The API also expects an enum workflow that details how the improvement should happen. For a rejected review with the workflow Correct, a single correction task is triggered, with no follow-up review task; the completed correction task becomes the new delivered annotation.

A successful review post returns a UUID that identifies the review and can be used to fetch the posted feedback data. It also creates a task according to the chosen workflow. Depending on the project setup, this might need to be coordinated with the project's managers.

Here is an example from kognic-io:

# KognicIOClient, setup_logging and the review models (ReviewRequest,
# AddFeedbackItem, ...) come from the kognic-io package; imports are omitted.
def run(client: KognicIOClient, open_label_uuid: str, error_type_id: str) -> ReviewResponse:
    review = ReviewRequest(
        feedback_items=[
            AddFeedbackItem(
                sensor_id="<the id/name of the sensor>",
                frame_id="<the id of the frame>",  # in our OpenLABEL file this is frame.frame_properties.external_id
                object_id="<the id of the object>",
                pin=AddFeedbackItemPin(x=0.0, y=0.0, z=0.0),
                description="I post this via the Python client",
                suggested_property=AddFeedbackItemSuggestedProperty(
                    property_name="propertyName",
                    suggested_property_value="suggestedPropertyValue",
                ),
                error_type_id=error_type_id,
                metadata={"key": "value"},
            )
        ],
        workflow=ReviewWorkflowEnum.Correct,
        accepted=False,
    )
    return client.review.create_review(open_label_uuid=open_label_uuid, body=review)


if __name__ == "__main__":
    setup_logging(level="DEBUG")
    client = KognicIOClient()
    open_label_uuid = "<the UUID of the OpenLABEL annotation>"
    error_type_id = "<the id of the error type>"
    annotation = run(client, open_label_uuid, error_type_id)
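Before posting a review, you need a valid error_type_id. As a minimal sketch, assuming only the get_error_types call mentioned in the Feedback section below (and that it returns an iterable of error types), the available error types can be listed like this:

# List the error types configured for your project so that their ids can be
# used as error_type_id when posting reviews. The exact shape of the returned
# items may vary between kognic-io versions, so we simply print them.
from kognic.io.client import KognicIOClient

client = KognicIOClient()
for error_type in client.review.get_error_types():
    print(error_type)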
Posting a partial review

A partial review is an incomplete review with optional feedback data. This capability may be used to achieve one or more of the following:

- prepare a review task with manual or machine-generated feedback
- select what should be reviewed

This feature is available in requests where your organization has one of the roles described in Request details.

When a partial review is posted, a review task is created for the annotation. This review task contains any feedback that was supplied when posting the partial review. The reviewer can then delete or invalidate any feedback they disagree with, or add their own feedback. It is currently not possible to override the workflow of a partial review; the default workflow is a review loop, where the correction of a rejected review task is followed by a new review task. If the review is accepted by the reviewer, the annotation is delivered again, updated to account for any changes the reviewer might have made.

Feedback

A feedback item details something noteworthy in an annotation. To do this, the following information can be added, and it will be available to annotators when improving the annotation:

error_type_id: a UUID selected from the list of available error types (fetch these using KognicIOClient().review.get_error_types()).
description: a string describing what should be improved.
sensor_id: the identifier of the sensor where the error appears (see Overview).
frame_id: the identifier of the frame the error appears in (see Overview); in our OpenLABEL file this is frame.frame_properties.external_id.
object_id: the identifier of the particular object that is subject to this particular feedback.
pin: a pointer to a specific area of interest; this can be used to indicate missing objects. A pin contains mandatory x and y coordinates and an optional z coordinate. The coordinate unit is pixels in images and is sensor specific in 3D data.
suggested_property: when inappropriate property values are discovered, this field can be used to indicate a more appropriate value.
metadata: this field may contain any data; it could, for example, be used to identify the version of the system that generated the feedback.
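To make the fields above concrete, here is a minimal sketch of a feedback item that flags a missing object by placing a pin in a specific sensor and frame. It reuses the AddFeedbackItem and AddFeedbackItemPin names from the example above and assumes that the optional fields object_id and suggested_property can be omitted when they do not apply; the placeholder values are illustrative only.

# Hypothetical feedback item flagging a missing object: no object_id or
# suggested_property is set, and the pin marks where the missing object is.
# Assumes the optional fields can simply be left out in your kognic-io version.
missing_object_feedback = AddFeedbackItem(
    sensor_id="<the id/name of the sensor>",
    frame_id="<the id of the frame>",
    description="There appears to be an unannotated object here",
    pin=AddFeedbackItemPin(x=512.0, y=384.0),  # pixel coordinates in an image sensor; z omitted
    error_type_id="<an error type id from get_error_types()>",
    metadata={"generated_by": "my-qa-script"},  # free-form metadata, e.g. the producing system
)

Such an item can be included in the feedback_items list of a ReviewRequest in the same way as the item in the earlier example.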