Batch request and file retention

Hi!

I'm working on an integration that uses the batchrequests endpoint (POST /api/v2/recording/batchrequests) to download recordings.
I know each link we obtain when querying the job status (GET /api/v2/recording/batchrequests/{jobId}) has an expiration, but if I query the job again I can obtain new presigned URLs.
Is there any time constraint between the job being ready (that is, all results being available for download) and the query operation that returns the presigned URLs?
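For context, this is roughly the flow I'm following. It's a minimal sketch against the raw REST endpoints; the region, token, and response field names (expectedResultCount, resultCount, results) are from my reading of the docs, so treat them as assumptions:

```python
import time
import requests

API_BASE = "https://api.mypurecloud.com"   # adjust to your region
TOKEN = "<oauth-access-token>"              # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

def submit_batch(recording_requests):
    """POST /api/v2/recording/batchrequests to start a batch download job."""
    body = {"batchDownloadRequestList": recording_requests}
    resp = requests.post(f"{API_BASE}/api/v2/recording/batchrequests",
                         json=body, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["id"]

def wait_for_results(job_id, poll_seconds=10):
    """GET /api/v2/recording/batchrequests/{jobId} until all results are ready."""
    while True:
        resp = requests.get(f"{API_BASE}/api/v2/recording/batchrequests/{job_id}",
                            headers=HEADERS)
        resp.raise_for_status()
        job = resp.json()
        # Assumption: the job is complete when resultCount reaches expectedResultCount.
        if job.get("expectedResultCount") == job.get("resultCount"):
            return job.get("results", [])   # each result should carry a presigned URL
        time.sleep(poll_seconds)
```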

Thanks!

I believe recording download URLs have a TTL of 1 hour. Be sure to look into the AWS S3 recording integration to see if it fits your use case; it's the preferred way to export many recordings as a bulk process. The batchrequests endpoint is meant for an actual end user downloading more than one recording to their local machine.
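If you do stick with batchrequests, a rough way to cope with the TTL is to re-query the job right before downloading, so every presigned URL you use has just been issued. This is only a sketch; field names such as resultUrl and recordingId reflect my reading of the response schema, so verify them against the API reference:

```python
import requests

API_BASE = "https://api.mypurecloud.com"   # adjust to your region
TOKEN = "<oauth-access-token>"              # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def download_results(job_id, dest_dir="."):
    # Re-fetching the job returns freshly signed URLs, so long-running downloads
    # don't depend on links obtained earlier that may have expired.
    resp = requests.get(f"{API_BASE}/api/v2/recording/batchrequests/{job_id}",
                        headers=HEADERS)
    resp.raise_for_status()
    for item in resp.json().get("results", []):
        url = item.get("resultUrl")
        if not url:
            continue  # entry errored or has no downloadable media
        media = requests.get(url, stream=True)  # presigned URL, no auth header needed
        media.raise_for_status()
        with open(f"{dest_dir}/{item['recordingId']}", "wb") as fh:
            for chunk in media.iter_content(chunk_size=1 << 20):
                fh.write(chunk)
```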

Tim, thanks for your response!

I see, I'll check the bulk export option as well.

Do you know if the files that I obtain using the presigned URLs have some kind of retention time?

Storage and retention of recordings are fully controlled by your recording and retention policies.

