Download recordings

Genesys team,

I am working on downloading recordings, trying to run the example at https://developer.mypurecloud.com/api/tutorials/recordings-downloader/index.html?language=java
I am stuck at credentialing. What type of authentication do I set up in Admin to allow for verification? Is there any documentation to review for this specific example?
Thanks

Hello,

At step 4 of the tutorial, it is mentioned that "Client Credentials Grant Type will be used".

You will need to create an OAuth Client in the configuration (PureCloud - Admin), using Client Credentials Grant as Grant Type.
See here for information on how to create an OAuth client: https://help.mypurecloud.com/articles/create-an-oauth-client/

Because this application makes use of the analytics conversation details query and recording queries (in the API), you will need to assign a role to your OAuth client that contains the necessary permissions (such as "analytics - conversationDetail - View" for the analytics conversation details query, and "recording" permissions for the rest).

Regards,

Thanks Jerome, I got past the credentialing. Now I am working on getting the list of conversationIds. My array size is zero. I am checking if I need any specific permissions to grab all conversations for the day.

The permission required to run it is the "analytics - conversationDetail - view".
I can't say for sure as I don't have access to your org.

But one thing to check is the string containing the date interval.
The format is YYYY-MM-DDThh:mm:ss/YYYY-MM-DDThh:mm:ss or YYYY-MM-DDThh:mm:ss.000Z/YYYY-MM-DDThh:mm:ss.000Z.
The times are expressed in UTC.
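
If it helps, here is a quick sketch of building that interval string in Java (buildUtcInterval is just an illustrative helper name, not something from the tutorial):

import java.time.ZonedDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

// Illustrative helper: format a start/end pair as the UTC interval string expected by the
// analytics conversation details query, e.g. 2020-05-08T12:00:00.000Z/2020-05-09T12:00:00.000Z
public static String buildUtcInterval(ZonedDateTime start, ZonedDateTime end) {
    DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
    return start.withZoneSameInstant(ZoneOffset.UTC).format(fmt)
            + "/" + end.withZoneSameInstant(ZoneOffset.UTC).format(fmt);
}

For example, buildUtcInterval(ZonedDateTime.now().minusDays(1), ZonedDateTime.now()) would give the last 24 hours.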

The other thing to check is whether you have multiple Divisions defined in your PureCloud environment and the call occurred in a Division you didn't give your OAuth Client access to. https://help.mypurecloud.com/articles/about-access-control/

Regarding your query, you can use the Developer Tools to check what the Analytics Conversation Details query brings back.
Here for the Analytics Query Builder (select Conversation Detail as Query Type at the top): https://developer.mypurecloud.com/developer-tools/#/analytics-query-builder
You can also try all of the API requests directly using the API Explorer: https://developer.mypurecloud.com/developer-tools/#/api-explorer

Jerome, Thanks for the guidance. I am able to download the recordings.
Question: Is there a way to reconcile and double check that I am receiving all the recordings?

Also, are there any stats on how the Integral Care instance is performing on Genesys? I am looking for more of a systems admin view: system uptime, total calls, latencies, etc.

"Question: Is there a way to reconcile and double check that I am receiving all the recordings?"

I am not familiar with the recordings downloader script, but having looked at the code, that is essentially what the script does at the beginning.

The script first retrieves all conversations in a specified interval - via conversationsApi.postAnalyticsConversationsDetailsQuery (Analytics - Query for conversation details).

Then, for each of the conversations, the script retrieves the conversation recording metadata - via recordingApi.getConversationRecordingmetadata (Recording - Get recording metadata for a conversation).
If recording was not enabled for the conversation, this request will return an empty array for the conversation.
If there were one or multiple recordings for the conversation, this request will return an array containing an entry for each of the recorded files (each entry contains an "id" attribute which corresponds to the recordingId, and a "conversationId" attribute which corresponds to the conversationId).

From what I see in the code, the script uses the recordingId and the conversationId to build the filename.

If you want to double check, you could run that part of the code (as a separate script) and write the (conversationId, recordingId) entries to a file.
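
As an illustration only, that reconciliation pass could look something like this (writeExpectedRecordings is a hypothetical helper name; it reuses the conversationsApi and recordingApi instances from the tutorial script, and it ignores paging, so for a large interval you would need to loop over the result pages):

// Hypothetical reconciliation sketch: write every (conversationId, recordingId) pair
// reported by the API for an interval to a CSV file, so it can be compared against
// the files that were actually downloaded.
private static void writeExpectedRecordings(String interval, String csvPath) throws Exception {
    ConversationQuery query = new ConversationQuery().interval(interval);
    AnalyticsConversationQueryResponse details = conversationsApi.postAnalyticsConversationsDetailsQuery(query);
    try (PrintWriter out = new PrintWriter(new FileWriter(csvPath))) {
        if (details.getConversations() == null) {
            return; // no conversations in the interval
        }
        for (var c : details.getConversations()) {
            // One metadata entry per recorded file; empty list if recording was not enabled
            for (Recording r : recordingApi.getConversationRecordingmetadata(c.getConversationId())) {
                out.println(r.getConversationId() + "," + r.getId());
            }
        }
    }
}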

Regarding your second question, I don't have that information and I am not aware of such metrics in the API.
You could possibly raise the question on the Genesys Cloud Community Forum, which is about configuration and capabilities of the platform. https://community.genesys.com/communities/purecloud?CommunityKey=bab95e9c-6bbe-4a13-8ade-8ec0faf733d4

Thanks Jerome. Do you have any guidance on bulk export? I am looking for a Java example similar to https://developer.mypurecloud.com/api/tutorials/recordings-bulk-actions/index.html?language=python&step=1
Thanks

You mean a Java version of the Bulk delete/export of recordings tutorial?
If so, no - I unfortunately don't have a Java example of that.
You will have to adapt the script/sample from Node.js or Python to Java.

The beginning of the script would be similar to the download recordings one, for the login credentials part. Then you would need to change the API methods used to the ones used in the Bulk delete/export tutorial.
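
To give a rough idea only, the job submission part might look something like the sketch below. I am guessing the Java class and method names (RecordingJobsQuery, AsyncConversationQuery, postRecordingJobs) from the Python tutorial's equivalents and have not verified them against the Java SDK, so please check the SDK documentation before using this:

// Hedged sketch of submitting a bulk export job - the class/method names are assumed Java
// equivalents of the Python tutorial's recording_api.post_recording_jobs(query) call.
RecordingJobsQuery query = new RecordingJobsQuery()
        .action(RecordingJobsQuery.ActionEnum.EXPORT)    // EXPORT (or DELETE for bulk delete)
        .actionDate(new Date())                          // earliest date the action may run
        .integrationId("your-s3-integration-id")         // placeholder for your AWS S3 integration id
        .conversationQuery(new AsyncConversationQuery()
                .interval("2020-05-08T00:00:00.000Z/2020-05-09T00:00:00.000Z"));
RecordingJob job = recordingApi.postRecordingJobs(query);
System.out.println("Recording job id=" + job.getId() + ", state=" + job.getState());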

Note that the Bulk export is based on this: https://help.mypurecloud.com/articles/about-the-aws-s3-recording-bulk-actions-integration/

Thanks Jerome, will review. Appreciate the quick response and support.

Jerome/PureCloud forum users,

Downloading recordings with the bulk/batch download code (passing conversation IDs and recording IDs) is extremely slow, to the point that it's not feasible. I tried the query and also passing multiple conversation IDs, but I see very slow throughput; some downloads are about 9 minutes apart. Is there a faster way to download recordings? We have about 1,000 a day to download.

Batch download code:

import java.io.*;
import java.text.*;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.net.*;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.*;

import com.mypurecloud.sdk.v2.Configuration;
import com.mypurecloud.sdk.v2.api.ConversationsApi;
import com.mypurecloud.sdk.v2.api.RecordingApi;
import com.mypurecloud.sdk.v2.model.*;
import com.mypurecloud.sdk.v2.PureCloudRegionHosts;
import com.mypurecloud.sdk.v2.ApiClient;
import com.mypurecloud.sdk.v2.ApiResponse;
import com.mypurecloud.sdk.v2.ApiException;
import com.mypurecloud.sdk.v2.extensions.AuthResponse;

public class BulkDownloadApp {

public static void main(String[] args) {
    authenticate();
    downloadConversations(dates);
}

private static ConversationsApi conversationsApi;
private static RecordingApi recordingApi;
private static ApiClient apiClient;
private static String clientId;
private static String clientSecret;
private static String dates, date1, date2;

private static void authenticate() {
    System.out.println("authenticate");

    // Input
    /*
     * Scanner s = new Scanner(System.in);
     * System.out.print("Client ID: "); clientId = s.nextLine();
     * System.out.print("Client secret: "); clientSecret = s.nextLine();
     * System.out.print("Dates: "); dates = s.nextLine();
     */

    // Input
    clientId = "fc716efa-af4b-4a1f-8a07-bc6a49abc578";
    System.out.println("Client ID: " + clientId);
    clientSecret = "09vYM0PtkjakaSGvQkfO_-rH4fzreDK_Dxy05DqWS_s";
    System.out.println("Client secret: " + clientSecret);
    dates = "2020-05-08T12:00:00.000Z/2020-05-09T12:00:00.000Z";
    //date1 = localToGMT("05/08/2020T06:01:01:001 UTC+06:00");
    //date2 = localToGMT("05/08/2020T07:01:01:001 UTC+06:00");
    //dates = date1 + "/" + date2;
    System.out.println("Dates: " + dates);

    // Set Region
    PureCloudRegionHosts region = PureCloudRegionHosts.us_west_2;
    apiClient = ApiClient.Builder.standard().withBasePath(region).build();

    try {
        ApiResponse<AuthResponse> authResponse = apiClient.authorizeClientCredentials(clientId, clientSecret);
        System.out.println("authResponse=" + authResponse.toString());
    } catch (Exception e) {
        System.err.println("Exception when authenticating.");
        e.printStackTrace();
        System.out.println(((ApiException) e).getRawBody());
    }
    System.out.println("Got past credentialing");

    // Use the ApiClient instance
    Configuration.setDefaultApiClient(apiClient);

    // Create API instances
    //conversationsApi = new ConversationsApi();
    recordingApi = new RecordingApi();
}

private static void downloadConversations(String dates) {
    System.out.println(" +++ downloadConversations");
    // Call conversation API, pass date inputted to extract conversationIds needed
    try {
        //ConversationQuery body = new ConversationQuery().interval(dates);
        //body.setPaging(new PagingSpec().pageSize(100));
        //AnalyticsConversationQueryResponse result = conversationsApi.postAnalyticsConversationsDetailsQuery(body);
        //extractConversationDetails(result);
        extractConversationDetails();
    } catch (Exception e) {
        System.err.println("Exception when calling ConversationsApi#postAnalyticsConversationsDetailsQuery");
        e.printStackTrace();
        System.out.println(((ApiException) e).getRawBody());
    }
    System.out.println(" +++ downloadConversations done");
}

// Format conversation details to objects inside an array. Get every mediatype
// per conversation
private static void extractConversationDetails() {
    // System.out.println("extractConversationDetails");
    // Create conversationIds array to store all conversationIds (read from a file here)
    ArrayList<String> conversationIds = ReadFileLineByLineUsingBufferedReader();
    /*conversationIds.add("8635a126-8824-4526-ab05-90217c12a1d0");
    conversationIds.add("a98d003a-099b-447d-b12e-eb86a96bfbb7");
    conversationIds.add("57f431b5-2e13-4740-969c-368bbda16d16");
    conversationIds.add("1e3d8943-752e-48e8-887e-239b10728e0a");
    conversationIds.add("79074ed4-a965-405f-a5a4-b562d652c5ad");
    conversationIds.add("0b978bac-1996-468b-9dc7-843c574919e9");
    conversationIds.add("a3876250-856d-4e37-ba9b-ee5a2cafe488");
    conversationIds.add("66ce5569-83fc-4c62-b33a-6b46d25c6aec");
    conversationIds.add("eed957c2-bea7-47c3-8a10-8fbb5df93774");
    conversationIds.add("c1078ecc-4bbc-4f87-9ccb-567d0bef12bd");
    conversationIds.add("f07b9c82-ab7a-4fd0-ace4-722735d4071c");
    conversationIds.add("e826ad46-0642-43b6-889e-4cb73dcc2a72");
    conversationIds.add("d7f95d00-630f-4484-85e9-637efb7cd8c2");
    conversationIds.add("7b4d1703-8042-4ca9-9a78-1371a291cd79");
    conversationIds.add("e96c769e-8e47-4b82-ab08-61c191087674");
    conversationIds.add("0e9b11b5-f3f7-4de0-9fad-eb11974b8d85");
    conversationIds.add("0ec6f582-decd-4e21-a9ae-0c7f18fde478");
    conversationIds.add("9a68663f-5af3-41cc-83f7-8ecd40bb8a87");
    conversationIds.add("3781bac5-8b7b-4be4-b1cb-358ef1830020");
    */
    System.out.println(" +++ extractConversationDetails, conversationIds=" + conversationIds.size());
    // Call getRecordingMetaData function through all IDs
    conversationIds.forEach((c) -> getRecordingMetaData(c));
    System.out.println(" +++ extractConversationDetails done");
}

private static ArrayList<String> ReadFileLineByLineUsingBufferedReader() {
    ArrayList<String> conversationIds = new ArrayList<String>();
    // try-with-resources ensures the reader is closed even if an exception is thrown
    try (BufferedReader reader = new BufferedReader(new FileReader(
            "C:/GenesysCallRecordings/ConversationIds.txt"))) {
        String line = reader.readLine();
        while (line != null) {
            System.out.println(line);
            conversationIds.add(line);
            // read next line
            line = reader.readLine();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return conversationIds;
}

// Format conversation details to objects inside an array. Get every mediatype
// per conversation
private static void extractConversationDetails(AnalyticsConversationQueryResponse conversationDetails) {
    // System.out.println("extractConversationDetails");
    // Create conversationIds array to store all conversationIds
    ArrayList<String> conversationIds = new ArrayList<String>();
    System.out.println(" +++ extractConversationDetails, conversationDetails array size=" + conversationDetails.getConversations().size());
    conversationDetails.getConversations().forEach((c) -> {
        System.out.println(" +++ getConversationId List val--" + c.getConversationId());
        // This add must not be commented out, otherwise conversationIds stays empty
        // and no recordings are fetched on this query path.
        conversationIds.add(c.getConversationId());
    });
    // Call getRecordingMetaData function through all IDs
    conversationIds.forEach((c) -> getRecordingMetaData(c));
    System.out.println(" +++ extractConversationDetails done");
}

// Get the recording metadata (recordingIds) for every conversationId
private static void getRecordingMetaData(String conversationId) {
    // System.out.println("getRecordingMetaData");
    try {
        List<Recording> result = recordingApi.getConversationRecordingmetadata(conversationId);
        System.out.println(" +++ getRecordingMetaData, recording result=" + result.size());
        // Iterate through every result; there may be zero, one, or several recordingIds
        // per conversation
        result.forEach((r) -> getSpecificRecordings(r));
    } catch (Exception e) {
        System.err.println("Exception when calling RecordingApi#getConversationRecordingmetadata");
        e.printStackTrace();
        System.out.println(((ApiException) e).getRawBody());
    }
    System.out.println(" +++ getRecordingMetaData done");
}

// Submit a batch download request for each (conversationId, recordingId) pair
private static void getSpecificRecordings(Recording recording) {
    // System.out.println(" +++ getSpecificRecordings, conversationid=" + recording.getConversationId() + ",recid=" + recording.getId());
    BatchDownloadRequest batchRequest = new BatchDownloadRequest().conversationId(recording.getConversationId())
            .recordingId(recording.getId());
    List<BatchDownloadRequest> batchRequestList = new ArrayList<>();
    batchRequestList.add(batchRequest);
    BatchDownloadJobSubmission body = new BatchDownloadJobSubmission().batchDownloadRequestList(batchRequestList);
    try {
        BatchDownloadJobSubmissionResult result = recordingApi.postRecordingBatchrequests(body);
        recordingStatus(result);
    } catch (Exception e) {
        System.err.println("Exception when calling RecordingApi#postRecordingBatchrequests");
        e.printStackTrace();
        System.out.println(((ApiException) e).getRawBody());
    }
    //System.out.println(" +++ getSpecificRecordings done");
}

// Poll the batch request status until the download URL for the recording is available
private static void recordingStatus(BatchDownloadJobSubmissionResult recordingBatchRequest) {
    //System.out.println(" +++ recordingStatus=" + recordingBatchRequest.getId());
    try {
        BatchDownloadJobStatusResult result = recordingApi.getRecordingBatchrequest(recordingBatchRequest.getId());
        System.out.println(" +++ recordingStatus=" + result.getExpectedResultCount() + " == " + result.getResultCount() + "==" + recordingBatchRequest.getId());
        // Compare with equals(): the counts are Integer objects, so == would compare references
        if (result.getExpectedResultCount().equals(result.getResultCount())) {
            getExtension(result);
        } else {
            //System.out.println(" +++ recordingStatus, before sleep" + recordingBatchRequest.getId());
            Thread.sleep(300);
            recordingStatus(recordingBatchRequest);
        }
    } catch (Exception e) {
        System.err.println("Exception when calling RecordingApi#getRecordingBatchrequest");
        e.printStackTrace();
        System.out.println(((ApiException) e).getRawBody());
    }
    //System.out.println(" +++ recordingStatus done");
}

// Get the extension of every recording
private static void getExtension(BatchDownloadJobStatusResult batchRequestData) throws IOException {
    // System.out.println("getExtension");
    // Store the content type in a variable that will be used later to determine the
    // extension of the recording
    String contentType = batchRequestData.getResults().get(0).getContentType();
    // Slice the text to get the extension that will be used for the recording file
    String ext = contentType.substring(contentType.lastIndexOf("/") + 1);
    downloadRecording(ext, batchRequestData);
}

// Download recordings
private static void downloadRecording(String ext, BatchDownloadJobStatusResult batchRequestData)
        throws IOException {
    //System.out.println("Processing please wait...");
    String conversationId = batchRequestData.getResults().get(0).getConversationId();
    String recordingId = batchRequestData.getResults().get(0).getRecordingId();
    String sourceURL = batchRequestData.getResults().get(0).getResultUrl();
    //System.out.println("sourceURL==" + sourceURL);
    String targetDirectory = "c:/GenesysCallRecordings";
    URL url = new URL(sourceURL);
    String fileName = conversationId + "_" + recordingId;
    Path targetPath = new File(targetDirectory + File.separator + fileName + "." + ext).toPath();
    //System.out.println("targetDirectory=" + targetDirectory + "=fileName=" + fileName);
    Files.copy(url.openStream(), targetPath, StandardCopyOption.REPLACE_EXISTING);
}

public static String localToGMT(String dateString) {
    /* "2020-05-08T11:00:00.000Z/2020-05-08T11:59:59.000Z"; */
    /*String output = ZonedDateTime.now(ZoneOffset.UTC).withNano(0).toString();
    SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSSZ");
    sdf.setTimeZone(TimeZone.getTimeZone("UTC"));
    String gmt = null;
    try {
        Date strDt = sdf.parse(inDate);
        gmt = sdf.format(strDt);
    } catch (Exception e) {
        e.printStackTrace();
    }
    System.out.println("gmt time =" + gmt);
    return gmt;*/

    DateTimeFormatter DATE_TIME_FORMATTER = DateTimeFormatter
            .ofPattern("MM/dd/yyyy'T'HH:mm:ss:SSS z");

    // Date string with zone information
    // String dateString = "08/03/2019T16:20:17:717 UTC+05:30";

    // Instance with the given zone
    ZonedDateTime zdtInstanceAtOffset = ZonedDateTime.parse(dateString, DATE_TIME_FORMATTER);

    // Instance in UTC
    ZonedDateTime zdtInstanceAtUTC = zdtInstanceAtOffset.withZoneSameInstant(ZoneOffset.UTC);

    // Formatting to string
    String dateStringInUTC = zdtInstanceAtUTC.format(DATE_TIME_FORMATTER);

    System.out.println(zdtInstanceAtOffset);
    System.out.println(zdtInstanceAtUTC);
    System.out.println(dateStringInUTC);

    // Convert ZonedDateTime to an Instant, which is in UTC
    // System.out.println(zdtInstanceAtOffset.toInstant());
    return zdtInstanceAtUTC.toString();
}
}

Hello,

As suggested in this other similar post, you can open a case with Genesys Cloud (PureCloud) Care to investigate the download speeds.

Otherwise, if it fits your model, you could also investigate the use of the AWS S3 recording bulk actions integration, which will allow you to automatically export recordings, screen recordings, attachments, and metadata in bulk and store them in a configured AWS S3 bucket.

Thanks Jerome, but Amazon S3 won't fit our model, hence the download approach.

I took a quick look over your code and it appears it's functioning fully synchronously on a single thread. Adding multithreading in your app and/or using async operations would allow you to process operations in parallel, which can significantly increase your app's throughput.
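
For example, here is a minimal sketch of fanning the per-conversation work out to a fixed thread pool, reusing the getRecordingMetaData method from your code. The pool size is arbitrary, and it assumes the SDK client instances are safe to share across threads, which is worth verifying:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative sketch: process several conversations in parallel instead of one at a time.
private static void downloadInParallel(List<String> conversationIds) throws Exception {
    ExecutorService pool = Executors.newFixedThreadPool(10);   // pool size chosen arbitrarily here
    List<Future<?>> futures = new ArrayList<>();
    for (String conversationId : conversationIds) {
        // getRecordingMetaData(...) is the existing per-conversation method from your code
        futures.add(pool.submit(() -> getRecordingMetaData(conversationId)));
    }
    for (Future<?> f : futures) {
        f.get();                                               // wait for each task and surface failures
    }
    pool.shutdown();
}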

Thanks Tim for the feedback, I will try that. I am also trying this example:
https://developer.usw2.pure.cloud/api/tutorials/recordings-bulk-actions/index.html?language=python&step=3

Where can I find the integration-id for a data action, and not S3?
query.integration_id = "integration-id"

You can't integrate the recording export API with a data action. It's for S3.
