I'm currently running a bulk export job into an S3 bucket. The job report says it's in a Fulfilled state:
{
  id: '0167b14f-6dfe-402b-8aa6-92067406d2a0',
  state: 'FULFILLED',
  recordingJobsQuery: [Object],
  dateCreated: '2023-08-17T12:33:07.160Z',
  totalConversations: 39,
  totalRecordings: 32,
  totalSkippedRecordings: 0,
  totalFailedRecordings: 0,
  totalProcessedRecordings: 32,
  percentProgress: 100,
  failedRecordings: '/api/v2/recording/jobs/0167b14f-6dfe-402b-8aa6-92067406d2a0/failedrecordings',
  selfUri: '/api/v2/recording/jobs/0167b14f-6dfe-402b-8aa6-92067406d2a0',
  user: [Object]
},
The job started at 13:33 BST and recordings are still arriving at 14:57, with 6 left to appear in the S3 bucket. That's all good, no problems there.
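For completeness, the report above is just a GET on the job's selfUri, and I'm re-checking it with something along these lines (Node 18+ sketch; the token and region host are placeholders for my own values):

// Re-check the export job by calling its selfUri (see the report above).
// GENESYS_TOKEN and the region host are placeholders for my own setup.
const token = process.env.GENESYS_TOKEN;
const host = 'https://api.mypurecloud.com'; // adjust to your Genesys Cloud region
const jobId = '0167b14f-6dfe-402b-8aa6-92067406d2a0';

async function checkJob() {
  const res = await fetch(`${host}/api/v2/recording/jobs/${jobId}`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`Job status request failed: HTTP ${res.status}`);
  const job = await res.json();
  console.log(`${job.state}: ${job.totalProcessedRecordings}/${job.totalRecordings} processed`);
  return job;
}

checkJob().catch(console.error);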
My question is about the process itself. I have a lot more recordings to export and I'd like to get some idea of timings. The batch job reports "FULFILLED", but it is clearly still working, as not all of the recordings have reached S3 yet. So what's going on? It feels as though Genesys has handed the recordings to an AWS queue, marked the job done, and AWS is now working through that queue and filing them away in S3.

If that is the case, is there any way to monitor the progress of the delivery? It's fine with only 32 recordings, but I'm planning to move thousands. What would the consequences be of kicking off a batch covering a month rather than a day? Would it affect system performance? I'm hoping not, given this is a cloud app rather than on-prem!
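Right now the only way I can see to track actual delivery is to count what has landed in the bucket, roughly like this (Node sketch using the AWS SDK v3; bucket name, prefix and region are placeholders for my own setup), but I'd love to hear if there's a better way:

// Count objects under the export prefix to see how many recordings have landed.
// BUCKET, PREFIX and the region are placeholders for my own setup.
import { S3Client, ListObjectsV2Command } from '@aws-sdk/client-s3';

const s3 = new S3Client({ region: 'eu-west-2' });
const BUCKET = 'my-export-bucket';
const PREFIX = 'recordings/';

async function countDelivered() {
  let count = 0;
  let continuationToken;
  do {
    const page = await s3.send(new ListObjectsV2Command({
      Bucket: BUCKET,
      Prefix: PREFIX,
      ContinuationToken: continuationToken,
    }));
    count += page.KeyCount ?? 0;
    continuationToken = page.IsTruncated ? page.NextContinuationToken : undefined;
  } while (continuationToken);
  return count;
}

countDelivered()
  .then((n) => console.log(`${n} objects delivered so far`))
  .catch(console.error);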