Guide: Create a new project
Only Project Managers and Workforce Managers can create new projects.
Projects are used to group and organize annotation work in the Kognic platform. Inside the project, the annotation work is further organized into annotation requests detailing what data to annotate and how. Below is a guide on how to create a new project.
If you want to create a new request in an existing project, read more in the chapter Guide: Create a request.
You can create a new project by selecting the page "Create new project" in the Project Management menu.
As the project name is referred to heavily inside and outside the Kognic app, the name must be straightforward and easy to remember. Therefore, we recommend you give the project a name that refers to its content and try to avoid using names made up of random letters and numbers.
When you create a project, you must also create its first annotation request alongside it.
Annotation request is the core concept in the organization of annotation work at Kognic. The request's configuration specifies what data to annotate (Input batch), how to annotate the data (Annotation Instruction: Taxonomy and guideline), which process the data should be produced by (Workflow), and who should annotate and quality-assure the data (Team).
Name the request
When naming the request, use a straightforward name that refers to the type of annotation you plan to produce with it. Try to avoid names made up of random letters and numbers.
Name the input batch
When creating a request, you need to create a new input batch alongside it. The input batch is referenced when uploading your data to the request.
An input batch is a group of inputs. An input is a set of sensor data you want to annotate. It usually consists of at least one image/frame. It can also contain images from several cameras, a point cloud, a video, or a sequence of images. Read more in our API Documentation.
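The relationship between inputs and input batches can be sketched as a simple data model. The class and field names below are illustrative only, chosen to mirror the description above; they are not the Kognic API schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative data model only; not the actual Kognic API schema.
@dataclass
class Input:
    """One set of sensor data to annotate: at least one image/frame,
    optionally frames from several cameras and/or a lidar point cloud."""
    images: List[str]                  # paths or URIs to camera frames
    point_cloud: Optional[str] = None  # optional lidar point cloud

@dataclass
class InputBatch:
    """A named group of inputs, referenced when uploading data to a request."""
    name: str
    inputs: List[Input] = field(default_factory=list)

# Hypothetical example: one lidar-and-camera input in a batch.
batch = InputBatch(name="highway-batch-01")
batch.inputs.append(
    Input(images=["cam_front/000001.jpg"], point_cloud="lidar/000001.pcd")
)
```

The batch name is what you reference later when uploading data to the request, which is why a recognizable name matters here just as it does for projects and requests.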
Select an annotation type
Kognic uses annotation types to categorize the annotations produced in different requests. Select the one that best matches the annotations you plan to produce in your request.
The organization with the request role Producer is responsible for producing annotated data in the request. Users from this organization are given more detailed monitoring and management options and can configure the request team in detail, ensuring they can successfully monitor and manage the production process.
The input type decides what data type you can upload to your newly created input batch and later annotate in the request.
The available input types are:
| Input type | Description |
| --- | --- |
| Lidar and Cameras | A single frame containing both camera data (from one or multiple cameras) and lidar data. |
| Lidar and Cameras Sequence | A sequence of frames containing both camera data (from one or multiple cameras) and lidar data. |
| Cameras | A single frame of camera data. The data can be from one or multiple cameras. |
| Cameras Sequence | A sequence of camera frames. The data can be from one or multiple cameras. |
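The four input types vary along two axes: whether the data includes lidar, and whether it is a single frame or a sequence. A small illustrative enum makes this explicit; the names mirror the table above but are our own, not a Kognic API:

```python
from enum import Enum

# Illustrative only; mirrors the input-type table, not a Kognic API.
class InputType(Enum):
    # value = (has_lidar, is_sequence)
    LIDAR_AND_CAMERAS = (True, False)
    LIDAR_AND_CAMERAS_SEQUENCE = (True, True)
    CAMERAS = (False, False)
    CAMERAS_SEQUENCE = (False, True)

    @property
    def has_lidar(self) -> bool:
        return self.value[0]

    @property
    def is_sequence(self) -> bool:
        return self.value[1]
```

Viewed this way, choosing an input type is simply choosing a point in this 2x2 grid based on the sensor data you plan to upload.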
The request workflow determines which steps (tasks) are used to produce annotations and in what order they are completed before an annotation is deliverable. Below we describe the available workflows.
Annotate + Review
Every input is first annotated and then reviewed. If the annotation's quality is insufficient, the reviewer can reject the annotation. If rejected, the annotation is sent for a review correction and then reviewed again. This loop continues until the annotation is accepted, at which point the annotated input becomes delivery-ready.
Annotate + Correct + Review
Every input is first annotated, then corrected, and lastly reviewed. If the annotation's quality is insufficient, the reviewer can reject the annotation. If rejected, the annotation is sent for a review correction and then reviewed again. This loop continues until the annotation is accepted, at which point the annotated input becomes delivery-ready.
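Both workflows share the same review loop: an annotation cycles between review and review correction until it is accepted. A minimal simulation of that loop, using hypothetical helper names rather than platform code, looks like this:

```python
# Illustrative simulation of the review loop; not Kognic platform code.
def run_review_loop(review, max_rounds=10):
    """Repeat review -> (reject -> review correction) until the
    annotation is accepted and the input becomes delivery-ready."""
    history = ["annotate"]
    for _ in range(max_rounds):
        history.append("review")
        if review():  # reviewer accepts?
            history.append("delivery-ready")
            return history
        history.append("review-correction")
    raise RuntimeError("annotation was never accepted")

# Hypothetical example: the reviewer rejects twice, then accepts.
verdicts = iter([False, False, True])
steps = run_review_loop(lambda: next(verdicts))
```

In the Annotate + Correct + Review workflow the only difference is an extra correction step between the initial annotation and the first review; the accept/reject loop afterwards is identical.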
Experimental workflows
Sometimes, we let our users test workflows that are still under development. If such a workflow is available for testing, you will recognize it by its blue label saying "Experimental workflow".
You can learn more about our experimental workflows in the chapter Flexible Workflows.
The error types are available in the Feedback Tools during Review tasks and during follow-up Correction tasks. If your organization has many error types configured, you may want to select the subset of error types that is relevant for this request. You can learn more about error types on the page.
Once everything above has been specified, you can create your new project and its first request.
After creating your project and its first request, you need to collaborate with Kognic to ensure the request is ready for production. This means ensuring the following things are done:
- Upload data to the input batch: You can read more about this process in our API documentation. If questions remain, reach out to your Kognic contact.
- Prepare an Annotation Instruction and ask Kognic to connect it to the request: A guideline and a taxonomy are needed to start the production of annotations in the request. They specify what types of annotations should be created and ensure the relevant information and tools are available to the team members. Together, the taxonomy and guideline form what we call an Annotation Instruction. You can create and publish Annotation Instructions in the app; see Annotation Instruction. Once you have set a revision of the Annotation Instruction to published, Kognic will add the additional settings and set it to ready for production. When creating a new request, you can select any revision that is ready for production; since this is a new project, you won't have any revisions ready yet. For changes to ongoing requests, you need to contact Kognic.
- Add team members: To get the data labeled according to your selected workflow, you need a team. This team is responsible for completing tasks during production. In the Team tab you can add team members and configure which types of tasks they can access and whether they receive tasks automatically. Read more about adding team members in the chapter Team.
- Activate automatic task allocation: When all configuration is complete and you are ready to start producing annotations in the request, we encourage you to activate our automatic task allocation system. You can read how the system works and how to activate it in Guide: Automatic task allocation.
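The four readiness steps above can be summarized as a simple checklist. The sketch below is illustrative only; the flag names are ours and do not correspond to any Kognic API:

```python
from dataclasses import dataclass

# Illustrative readiness checklist; not Kognic platform code.
@dataclass
class RequestReadiness:
    data_uploaded: bool = False           # inputs uploaded to the input batch
    instruction_connected: bool = False   # published Annotation Instruction, set ready by Kognic
    team_added: bool = False              # team members added in the Team tab
    auto_allocation_active: bool = False  # automatic task allocation activated

    def ready_for_production(self) -> bool:
        """Production can start only when every step is done."""
        return all((self.data_uploaded, self.instruction_connected,
                    self.team_added, self.auto_allocation_active))

# Hypothetical example: two of four steps completed so far.
state = RequestReadiness(data_uploaded=True, instruction_connected=True)
```

Until every item is checked off, the request is configured but not yet producing annotations, which is why the collaboration with Kognic on the remaining steps matters.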