Data table Mass Insert and Mass Delete

Has anyone been able to perform a mass insert or delete of multiple rows in a custom data table? There are two endpoints that let you insert a row and delete a specific row:

POST /api/v2/flows/datatables/{datatableId}/rows
DELETE /api/v2/flows/datatables/{datatableId}/rows/{rowId}

These work great for one record, but in our scenario we have a data table containing employee data that a validation workflow looks at when an employee calls into a hotline. I am looking to empty this table and reload it daily from data stored in our data warehouse. Right now, with these two endpoints, I would need to create a job that does this through loops, and I am concerned about the time it would take to run and the number of API calls it would use up.
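To make the concern concrete, here is a rough sketch of the per-row approach and what it costs in API calls (the helper names and counts are illustrative, not from the Genesys SDK):

```javascript
// Hypothetical sketch of the naive per-row approach.
const base = '/api/v2/flows/datatables';

// DELETE /api/v2/flows/datatables/{datatableId}/rows/{rowId}
function deleteRowPath(datatableId, rowId) {
  return `${base}/${datatableId}/rows/${encodeURIComponent(rowId)}`;
}

// POST /api/v2/flows/datatables/{datatableId}/rows
function insertRowPath(datatableId) {
  return `${base}/${datatableId}/rows`;
}

// A daily reload costs one DELETE per existing row plus one POST per new row.
function apiCallsForReload(existingRowCount, newRowCount) {
  return existingRowCount + newRowCount;
}
```

So a table of a few hundred employees already means hundreds of calls per day just for the reload.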

If anyone has tried this and had success, or found a way to format the JSON in the body of the call to make this work, your help would be appreciated. I know there is a bulk import option available for the insert, which I can use, but I still have not found anything for the mass delete. Dropping the entire table is not an option as it would break the workflow.


Hello,

Have you tried POST /api/v2/flows/datatables/{datatableId}/import/jobs with importMode set to ReplaceAll (in request body)?
This should allow you to replace the old records with the ones from the import.
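As a sketch, the step-one request would look something like this (the helper name and datatableId value are illustrative only; the endpoint and body are the ones named above):

```javascript
// Hypothetical helper that builds the request for
// POST /api/v2/flows/datatables/{datatableId}/import/jobs
function buildImportJobRequest(datatableId, authToken) {
  return {
    path: `/api/v2/flows/datatables/${datatableId}/import/jobs`,
    method: 'POST',
    headers: {
      Authorization: 'bearer ' + authToken,
      'Content-Type': 'application/json'
    },
    // ReplaceAll drops the existing rows and loads the imported ones
    body: JSON.stringify({ importMode: 'ReplaceAll' })
  };
}
```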

Regards,

Hi Jerome,

I was not aware that the bulk import could do that. Is there any further documentation with examples, or an overview of how to set up the import and trigger it, besides the API Explorer?

I usually check what Genesys Cloud Desktop does (using the Chrome Developer Tools - Network tab). That's a good way to see how an endpoint or a set of endpoints should be used. Actually, that's what I did before answering your question :) I tried Manage Imports in the Genesys Cloud Desktop Admin UI for a data table (there is a replace all or append choice at the bottom).

Regards,

Right, I can see that, but what I cannot figure out and can't find in the documentation is how to specify, in the body of the call, the local file to upload. So what I did manually is:

  1. Download the import template
  2. Fill the template with a couple of rows and keep the header row.
  3. Import the file with the ReplaceAll command.
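For reference, the filled template from steps 1 and 2 ends up being a simple CSV with the header row kept; something like this (the column names below other than key are made up for illustration — yours come from the downloaded template):

```csv
key,employeeId,department
1001,E-1001,Support
1002,E-1002,Sales
```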

If I use the Chrome developer tools I see this:

[screenshot; I blocked out my data table ID]

And this:

[screenshot]

The upload URI is being generated by PureCloud because I manually uploaded the file. But if I want to send this file from the local machine that my job will be running on, how can I pass it in the API call?

Sorry. I thought you were already familiar with this import endpoint.

The import process for a data table is similar to the upload of a contact list, a DNC list, etc.
It is a two-step process.

The first step is to request the import.
This is what POST /api/v2/flows/datatables/{datatableId}/import/jobs is doing.
If you want to replace all existing content with the new rows (rip and replace), you just need to send "importMode":"ReplaceAll" in the request body.
I mean:

{
    "importMode":"ReplaceAll"
}

If Genesys Cloud is ok with the import request, it will answer with a 202 Accepted.
In the response body you can see "status": "WaitingForUpload" and the "uploadURI" to use in the next step.
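Trimmed to those two fields, the 202 response body looks roughly like this (the URI value is a placeholder, not a real endpoint):

```json
{
    "status": "WaitingForUpload",
    "uploadURI": "https://example-upload-host/uploads/..."
}
```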

The second step is to post the content of the file to this uploadURI.
This is done with an HTTP POST to the uploadURI, with Content-Type set to "multipart/form-data", including your Authorization header/token, and with the body content being your file (as form data).
You should see the request to this endpoint just after the one from the first step (in your Chrome Network tab).

It is similar to what is described here for a Contact List - Upload Contact Lists.
The example is written for a web page, so it makes a jQuery Ajax request to format the request properly. But you can do a similar thing in whatever language you are using for your tool.
Unlike the Contact List example, only the file part is necessary.

I mean something like:

function uploadDatatableRows(file) {
   // Only the 'file' part is needed for a data table import
   var data = new FormData();
   data.append('file', file);

   $.ajax({
       url: 'YOUR_UPLOAD_URI', // the uploadURI returned by step one
       type: 'POST',
       headers: {
           // authToken is your OAuth access token
           Authorization: 'bearer ' + authToken
       },
       data: data,
       processData: false, // let the browser serialize the multipart body
       contentType: false  // so it can set the multipart boundary itself
   });
}
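Since your job runs on a local machine rather than in a browser, here is a hedged sketch of the same step outside jQuery (assuming a Node 18+ runtime, where fetch, FormData, and Blob are built in; buildUploadRequest is a hypothetical helper, and the uploadUri still comes from step one's response):

```javascript
// Build the upload request for step two of the import.
function buildUploadRequest(authToken, csvText) {
  const form = new FormData();
  // Unlike the Contact List example, only the 'file' part is needed.
  form.append('file', new Blob([csvText], { type: 'text/csv' }), 'rows.csv');
  return {
    method: 'POST',
    headers: { Authorization: 'bearer ' + authToken },
    body: form // fetch sets the multipart Content-Type boundary itself
  };
}

// Usage: await fetch(uploadUri, buildUploadRequest(token, csvText));
```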

Regards,

I just want to give you a big thank you. I was using Postman for testing and had to tweak a few things, but I got it to work. This should really get added to the API documentation.
