Integrations
On this page you can create and manage your data integrations. A data integration connects the Kognic Platform to one of your cloud buckets, so you can import data without having to upload it from your local machine. After setting up a data integration, you can use Kognic IO to import data from the specified bucket.

Right now it is possible to set up data integrations with the following cloud storage providers:
- Google Cloud Platform (GCP)
- Amazon Web Services (AWS)
If you are using a different cloud provider that you would like to set up a data integration with, reach out to your Kognic contact.
To set up a new integration, click on Create new integration. Follow the steps below depending on which cloud provider you will set up the integration with.
Note that the guides below walk you through the process of giving Kognic read access to one of your cloud buckets.
Choose cloud provider
Select the option Google Cloud Platform.
Provide Bucket information
In GCP, navigate to the bucket you want to integrate with. Copy the bucket’s name and paste it in the Name field.
Grant service account access
In the Kognic Platform, copy the Kognic service account number. In GCP, within the bucket's Permissions tab, click Grant access, paste the service account number as a New principal, and assign it the role Storage Object Viewer.
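Conceptually, this step adds an IAM policy binding on the bucket that grants the Kognic service account read access to objects. The sketch below shows the shape of that binding; the service account identifier is a placeholder, and the real value is the one shown in the Kognic Platform.

```python
# Sketch of the IAM policy binding created by the Grant access step.
# The service account address below is a placeholder, not a real Kognic value.

def storage_viewer_binding(service_account: str) -> dict:
    """Build the IAM binding that grants object read access on a bucket."""
    return {
        "role": "roles/storage.objectViewer",
        "members": [f"serviceAccount:{service_account}"],
    }

binding = storage_viewer_binding("kognic-sa@example-project.iam.gserviceaccount.com")
```

The role Storage Object Viewer (`roles/storage.objectViewer`) allows reading objects and their metadata, which is all Kognic needs to import your data.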
Finalize setup
In Kognic, click Create integration in the bottom right corner.
Choose cloud provider
Select the option Amazon Web Services.
Provide Bucket information
In this step you need to provide the name and AWS region of the bucket you will integrate with.
a) Specify bucket name
In AWS, navigate to the bucket you want to integrate with. Copy the bucket’s name and paste it in the Name field in Kognic.
b) Specify AWS region
For the same bucket, specify its AWS region, e.g. eu-north-1, in the AWS region field in Kognic.
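If you script your setup, a quick sanity check on the region string can catch typos early. The helper below is a hypothetical convenience, not part of the Kognic API; the pattern covers the common AWS region form and is not an authoritative region list.

```python
import re

# Rough sanity check for an AWS region string such as "eu-north-1".
# Matches the usual <area>-<location>-<number> form (plus GovCloud regions);
# it is a convenience check, not an authoritative list of AWS regions.
REGION_PATTERN = re.compile(r"^[a-z]{2}(-gov)?-[a-z]+-\d$")

def looks_like_aws_region(region: str) -> bool:
    return REGION_PATTERN.fullmatch(region) is not None
```

For example, `looks_like_aws_region("eu-north-1")` passes, while a GCP-style name such as `europe-north1` does not.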
Create a role for Kognic in AWS
Next, you need to set up a Role for Kognic in AWS and give it S3 read access to your bucket.
a) Create a new role in AWS
In AWS, navigate to Identity and Access Management (IAM), select Roles and click on Create role. For Trusted entity type select Custom trust policy.
Copy the Trust policy from the Kognic Platform, go back to AWS and paste it in the Custom trust policy section.
Click Next and skip the permissions step for now by clicking Next again. Give your role a descriptive name and click the Create role button.
b) Specify the Role ARN
In AWS, click on the role you just created and copy the Amazon Resource Name (ARN). Paste it in the Role ARN field in Kognic.
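A role ARN always has the form `arn:aws:iam::<12-digit account id>:role/<role name>`. The hypothetical helper below splits an ARN into those parts, which can be handy for double-checking the value before pasting it into Kognic; it is illustrative, not part of any Kognic tooling.

```python
import re

# An IAM role ARN looks like: arn:aws:iam::123456789012:role/my-role-name
ARN_PATTERN = re.compile(r"^arn:aws:iam::(\d{12}):role/([\w+=,.@/-]+)$")

def parse_role_arn(arn: str) -> tuple[str, str]:
    """Return (account id, role name) for a well-formed IAM role ARN."""
    match = ARN_PATTERN.fullmatch(arn)
    if match is None:
        raise ValueError(f"not a valid IAM role ARN: {arn!r}")
    return match.group(1), match.group(2)

account, role = parse_role_arn("arn:aws:iam::123456789012:role/kognic-read-access")
```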
c) Give the role S3 read access
In AWS, for the role that you just created, scroll down to the Permissions policies section and click on Add permissions followed by Create inline policy.
Copy the Inline policy from the Kognic Platform, head back to AWS and in the policy editor, click on JSON and paste the policy.
Click Next, give the policy a name and click Create policy.
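The authoritative inline policy is the one shown in the Kognic Platform, so copy it from there. For orientation only, an S3 read-only policy scoped to a single bucket typically has roughly the shape sketched below; the bucket name is a placeholder.

```python
import json

# Illustrative sketch of a typical S3 read-only policy for one bucket.
# Use the actual inline policy shown in the Kognic Platform, not this one.
BUCKET = "my-example-bucket"  # placeholder

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",       # needed for ListBucket
                f"arn:aws:s3:::{BUCKET}/*",     # needed for GetObject
            ],
        }
    ],
}

policy_json = json.dumps(policy, indent=2)
```

Note that `s3:ListBucket` applies to the bucket ARN itself, while `s3:GetObject` applies to the objects inside it (`/*`), which is why both resource entries appear.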
Finalize setup
In Kognic, click Create integration in the bottom right corner.
When an integration has been created, you can test that it works by following the steps below:
Find the integration you want to test
In the Kognic Platform, locate the integration you want to test in the Integrations table. Click the three dots next to it and, in the context menu that appears, click Run a test...
Find and copy an object's URI
In GCP:
Navigate to the bucket and open it. Next to any folder or file, click the three dots and, in the context menu that appears, click Copy gsutil URI.
In AWS:
Navigate to the bucket and open it. Click on a file to open it and then click on Copy S3 URI.
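Both URI styles identify the same thing: a bucket name plus an object key, prefixed with `gs://` for GCS or `s3://` for S3. The hypothetical helper below splits either form, just to make the structure explicit; it is not part of the Kognic API.

```python
# Split a gs:// or s3:// object URI into (bucket, key).
# Illustrative helper, not part of any Kognic tooling.

def split_object_uri(uri: str) -> tuple[str, str]:
    for scheme in ("gs://", "s3://"):
        if uri.startswith(scheme):
            bucket, _, key = uri[len(scheme):].partition("/")
            return bucket, key
    raise ValueError(f"expected a gs:// or s3:// URI, got {uri!r}")

bucket, key = split_object_uri("gs://my-bucket/images/frame_001.jpg")
```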
Run integration test
Go back to the Kognic Platform and paste the object URI into the "URI" field in the dialog window. Click Test integration.
If the test was successful, you will get a success message and can click OK, I got it!. The Test status value in the table will change to "Successful".
If the test fails, a notification will appear with an error message that you can use for debugging.
Use cases such as creating scenes with a data integration are currently being worked on. Stay tuned for more updates!