DATASET EXPLORATION
Uploading predictions
Introduction

In this example, we'll walk you through how to upload predictions into an already existing dataset using our API.

Before you begin

See Introduction (docid: ogy7sv553esiklurze4pp) and learn about the Prediction Format (docid: ypuzrd5pdurfztjox6i8w).

Steps

Create a new Python file and import the following libraries:

```python
import requests
from kognic.auth.requests.auth_session import RequestsAuthSession

base_url = "https://dataset.app.kognic.com/v1/"
client = RequestsAuthSession()
```

1. Get the UUID of the dataset

You can either access the tool and copy the UUID following `dataset/` in the URL, or use the datasets endpoint to get the UUID of the dataset:

```python
client.session.get("https://dataset.app.kognic.com/v2/datasets")
```

2. Get the UUID of an existing predictions group, or create a new one

2.a. Get the UUID of an existing predictions group

In order to upload predictions, a predictions group needs to exist. Predictions can be organized into groups for any purpose imaginable. The UUID of an existing predictions group can be found in the URL after `predictions/`, or by using the endpoint:

```python
client.session.get(base_url + f"datasets/{dataset_uuid}/predictions-groups")
```

2.b. Creating a predictions group (optional)

For datasets not containing segmentation tasks, a new predictions group can be created either by clicking in the app (Manage predictions in the upper right corner, then + Create predictions group), or by using the following code snippet:

```python
path = base_url + f"datasets/{dataset_uuid}/predictions-groups"
body = {"name": "My predictions group", "description": "A description of my new predictions group"}
try:
    response = client.session.post(path, json=body)
    response.raise_for_status()
    response_json = response.json()
    print(f"Created predictions group with uuid {response_json['data']}")
except requests.exceptions.RequestException as e:
    msg = e.response.text
    print(f"Request error {e}: {msg}")
```

Special case: segmentation datasets

Predictions groups connected to segmentation datasets must be created using the code snippet, and they also require one extra parameter called `classMapping`. The mapping is used when calculating disagreement between predictions and annotations, and will impact the sorting as well as how disagreements appear in the gallery.

The `classMapping` parameter is a list of dictionaries, where each dictionary contains the keys `annotated` and `predicted`. The `annotated` key is the class name in the annotations, and the `predicted` key is the class name in the predictions, e.g. `{"annotated": "oak", "predicted": "tree"}` if you have annotated different species of trees but only predict whether something is a tree or not. All class names in the predictions and the annotations must be present in the class mappings, even if they don't need to be mapped. In the annotations, non-segmented areas are labeled with the class name `background`.

Example body:

```python
body = {
    "name": "My predictions group",
    "description": "A description of my new predictions group",
    "classMapping": [
        {"annotated": "oak", "predicted": "tree"},
        {"annotated": "background", "predicted": "not tree"},
        {"annotated": "only in annotations"}
    ]
}
```

3. Upload predictions

For a small number of predictions, synchronous calls might work:

```python
import requests
from kognic.auth.requests.auth_session import RequestsAuthSession

base_url = "https://dataset.app.kognic.com/v1/"
client = RequestsAuthSession()

predictions_group_uuid = ""  # the predictions group uuid from step 2
openlabel_content = {"openlabel": ...}  # your OpenLABEL prediction content

data = {
    "sceneUuid": "",
    "openlabelContent": openlabel_content,
}

try:
    response = client.session.post(
        base_url + f"predictions-groups/{predictions_group_uuid}/predictions",
        json=data
    )
    response.raise_for_status()
    response_json = response.json()
    print(f"Created prediction with uuid {response_json['data']}")
except requests.exceptions.RequestException as e:
    msg = e.response.text
    print(f"Request error {e}: {msg}")
```

For larger amounts of predictions, asynchronous calls are recommended. The following example uses the async client from the kognic-auth library to make 100 asynchronous calls:
```python
import asyncio

from kognic.auth.httpx.async_client import HttpxAuthAsyncClient

base_url = "https://dataset.app.kognic.com/v1/"
predictions_group_uuid = ""  # the predictions group uuid from step 2
url = base_url + f"predictions-groups/{predictions_group_uuid}/predictions"
openlabel_content = {"openlabel": ...}  # your OpenLABEL prediction content
max_connections = 10


async def upload_prediction(payload, session, sem):
    # The semaphore limits the number of concurrent requests.
    async with sem:
        response = await session.post(url, json=payload)
        response.raise_for_status()
        return response.json().get("data")


async def main(n_runs: int):
    client = HttpxAuthAsyncClient()
    session = await client.session
    sem = asyncio.Semaphore(max_connections)
    tasks = []
    for i in range(n_runs):
        payload = {"sceneUuid": "", "openlabelContent": openlabel_content}
        task = upload_prediction(payload, session, sem)
        tasks.append(task)
    responses = await asyncio.gather(*tasks)
    await session.aclose()
    print(responses)


if __name__ == '__main__':
    asyncio.run(main(100))
```

Setting `max_connections` to something bigger than 10 might not work and is not recommended.
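Since the concurrency limit stays at 10, a very large upload mainly costs memory: creating every task up front means holding every payload at once. A small batching helper keeps that bounded by gathering the uploads in chunks. This is plain Python, not part of the Kognic API, and `payloads` is a hypothetical list of prediction payloads built as in the example above; a minimal sketch:

```python
def chunks(items, size):
    """Yield successive slices of `items`, each containing at most `size` elements."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


# Usage sketch inside main(), assuming `payloads` is your list of prediction
# payloads and upload_prediction/session/sem are defined as in the example above:
#
#     for batch in chunks(payloads, 100):
#         await asyncio.gather(*(upload_prediction(p, session, sem) for p in batch))

print(list(chunks([1, 2, 3, 4, 5], 2)))  # [[1, 2], [3, 4], [5]]
```

Batching this way also makes partial failures easier to handle, since you can retry one batch without re-submitting everything.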