Integration Process¶
The steps for integrating with TaskMonk are detailed below. Two environments are used during integration: a staging environment for testing and a production environment for the live integration.
Endpoints¶
The following endpoints are used for the integration:
API Endpoints
Staging: https://preprod.taskmonk.io
Production: https://api.taskmonk.io
Portal
The portal can be used for viewing reports and status.
Please ensure that the above endpoints are accessible from within your firewall.
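A client typically keeps both base URLs in configuration and selects one by environment. A minimal sketch, assuming nothing beyond the two URLs listed above (the environment labels and helper function are illustrative, not part of the TaskMonk API):

```python
# Base URLs from the documentation; the selection helper is illustrative.
ENDPOINTS = {
    "staging": "https://preprod.taskmonk.io",
    "production": "https://api.taskmonk.io",
}

def base_url(environment: str) -> str:
    """Return the API base URL for the given environment."""
    try:
        return ENDPOINTS[environment]
    except KeyError:
        raise ValueError(f"Unknown environment: {environment!r}")
```

Keeping the choice in one place makes it easy to run the same client code against staging first and switch to production after testing.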
Access Credentials¶
API Endpoints
The APIs use the OAuth2 client credentials mechanism for authentication. The client_id and client_secret will be provided by TaskMonk. These can be generated for each project, or shared across projects that belong to the same team.
Portal
Users can register on the portal using their email ID. Each email ID is then configured for the projects that the user has access to.
Note: Production credentials and access are typically provided after successful integration and testing against the staging environment.
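The OAuth2 client credentials exchange can be sketched as below. This is a sketch under stated assumptions: the token path ("/api/oauth2/token") and the JSON response field are placeholders, not taken from the TaskMonk documentation; use the token URL that TaskMonk provides with your credentials.

```python
import json
import urllib.parse
import urllib.request

# ASSUMPTION: this token path is illustrative only; use the actual
# token URL supplied by TaskMonk along with your credentials.
TOKEN_PATH = "/api/oauth2/token"

def build_token_request(base_url, client_id, client_secret):
    """Build the OAuth2 client-credentials token request (URL + form body)."""
    url = base_url.rstrip("/") + TOKEN_PATH
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return url, body

def get_token(base_url, client_id, client_secret):
    """Exchange the client credentials for a bearer access token."""
    url, body = build_token_request(base_url, client_id, client_secret)
    req = urllib.request.Request(url, data=body, method="POST")
    with urllib.request.urlopen(req) as resp:
        # ASSUMPTION: a standard OAuth2 JSON response with "access_token".
        return json.load(resp)["access_token"]
```

The returned token is then sent as a Bearer token on subsequent API calls; both SDKs handle this exchange for you, so this is only needed when integrating with the REST APIs directly.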
Project Creation¶
The project will typically be set up by the Annotation Partner. This is done from the portal, and the project_id is shared with the users.
API Integration¶
Once the credentials are available, the APIs can be integrated in one of three ways:
Directly integrate with the REST APIs. The documentation for this is available at REST API.
Integrate with the Java SDK. The documentation for this is available at Java SDK.
Integrate with the Python SDK. The documentation for this is available at Python SDK.
Please go through Concepts, which will help with the integration process.
Typical steps that the client code would execute are:
Create a new batch. This will return a batch_id.
Upload tasks to a batch. This will return a job_id for the upload process.
Check the job progress and wait until the upload completes.
Check the status for completion of the tasks by the Annotation Partner.
Once the tasks are completed by the Annotation Partner, get the output file, specifying the fields to be included in the output.
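The steps above can be sketched as a single client flow. Every method name on `client` here is a hypothetical stand-in for the corresponding REST or SDK call (the real names are in the REST API, Java SDK, and Python SDK documentation); the polling helper is a generic pattern, not a TaskMonk API.

```python
import time

def wait_for(check, poll_seconds=5.0, timeout=600.0, sleep=time.sleep):
    """Poll check() until it returns True or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check():
            return True
        sleep(poll_seconds)
    return False

def run_batch(client, project_id, tasks_file, output_fields):
    """End-to-end flow for one batch. `client` is a hypothetical wrapper
    around the REST API or an SDK; method names are illustrative."""
    batch_id = client.create_batch(project_id)           # 1. new batch -> batch_id
    job_id = client.upload_tasks(batch_id, tasks_file)   # 2. upload -> job_id
    wait_for(lambda: client.job_done(job_id))            # 3. wait for upload job
    wait_for(lambda: client.batch_done(batch_id))        # 4. wait for annotation
    return client.get_output(batch_id, fields=output_fields)  # 5. fetch output
```

In practice the annotation step (4) can take much longer than the upload job (3), so clients often replace the second `wait_for` with a scheduled or webhook-driven check rather than blocking.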
For testing, contact the Annotation Partner to complete the tasks uploaded to the staging server. In certain circumstances, to help with testing, the Annotation Partner can set up the project to fill in random values for the output fields and submit the tasks automatically.