Anyone know how to delete unused exports from the API endpoint analytics_api.get_analytics_reporting_exports(page_number=page_number, page_size=page_size)? Does it have to be deleted by Genesys, or is the only option waiting?
Error reads: HTTP response body: {"message":"The total number of exports per user limit (1500) has been exceeded. Delete at least one export and try again.",..
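If you want to see how close you are to the per-user cap, paging through that same listing endpoint and counting the results works. A minimal sketch, assuming the SDK response exposes the page's exports as a list (the helper name and the response shape are my assumptions, not part of the SDK):

```python
def collect_all_exports(fetch_page, page_size=100):
    """Collect every export by paging until an empty page comes back.

    `fetch_page(page_number=..., page_size=...)` should return the list of
    export entities for that page (e.g. the `.entities` attribute of the
    SDK response) -- adapting the SDK response shape is left to the caller.
    """
    exports = []
    page_number = 1
    while True:
        entities = fetch_page(page_number=page_number, page_size=page_size) or []
        if not entities:
            break
        exports.extend(entities)
        page_number += 1
    return exports

# e.g. with the Python SDK (response shape is an assumption):
# all_exports = collect_all_exports(
#     lambda page_number, page_size: analytics_api.get_analytics_reporting_exports(
#         page_number=page_number, page_size=page_size).entities)
# print(len(all_exports), "of 1500 used")
```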
Same answer here, even though they edited the question out:
https://developer.genesys.cloud/forum/t/exports-per-user-limit/24492
Is there no other way? I saw your answer, but it seems like no one else has encountered this. It seems odd that they would tell me to delete an export without there being a way to delete it. Also, the online documentation I found said 2500, but now it's 1500?
@tim.smith any official word on this? I couldn't find an official response besides the API documentation, which said 2500 was the export limit.
Is there no other way?
Not that anyone has admitted to. They explicitly tell you not to use it the way you appear to be using it, so it's not really in their best interest to provide a way to go against their design more easily.
I saw your answer, but it seems like no one else has encountered this.
We're all using the analytics APIs like we were told to.
API documentation which said 2500 was the export limit
API documentation says 2500 is the maximum number of rows per export. The number of exports per user is 1500.
https://developer.dev-genesys.cloud/organization/organization/limits
Don't get too excited about those limits being listed as configurable. They can adjust them, but they won't do it just because you don't want to do it the "right" way.
If all avenues to avoid limiting have been exhausted, Genesys Cloud customers can engage Customer Care to make a case for increasing a limit.
@Eos_Rios Alright, since you know so much: how do you pull the counts for the abandoned-call summary the way they appear in the export? I've searched through the APIs and couldn't find one that shows the abandon counts for the same time buckets. Also, how are you doing a bulk upload of data for 5 months? That's the only reason I hit the limit.
Are you referring to where it's bucketed into different interval slots?
I achieve that using https://developer.genesys.cloud/devapps/api-explorer-standalone#post-api-v2-analytics-conversations-aggregates-query and parsing the resultant tAnswered and tAbandon values into whatever buckets I choose:
tAnswered10_Sec, tAnswered15_Sec, tAnswered20_Sec, tAnswered30_Sec, tAnswered40_Sec, tAnswered50_Sec, tAnswered60_Sec, tAnswered70_Sec, tAnswered80_Sec, tAnswered90_Sec, tAnswered120_Sec,
tAnswered150_Sec, tAnswered180_Sec, tAnswered210_Sec, tAnswered240_Sec, tAnswered270_Sec, tAnswered_05_Min, tAnswered_10_Min, tAnswered_15_Min, tAnswered_20_Min, tAnswered_25_Min, tAnswered_30_Min,
tAnswered_45_Min, tAnswered_60_Min, tAnswered_90_Min, tAnswered_120_Min, tAnswered_Over_120, tAbandon10_Sec, tAbandon15_Sec, tAbandon20_Sec, tAbandon30_Sec, tAbandon40_Sec, tAbandon50_Sec,
tAbandon60_Sec, tAbandon70_Sec, tAbandon80_Sec, tAbandon90_Sec, tAbandon120_Sec, tAbandon150_Sec, tAbandon180_Sec, tAbandon210_Sec, tAbandon240_Sec, tAbandon270_Sec, tAbandon_05_Min, tAbandon_10_Min,
tAbandon_15_Min, tAbandon_20_Min, tAbandon_25_Min, tAbandon_30_Min, tAbandon_45_Min, tAbandon_60_Min, tAbandon_90_Min, tAbandon_120_Min, tAbandon_Over_120
Also, how are you doing a bulk upload of data for 5 months? That's the only reason I hit the limit.
Loops are your friend. I do a lot of looping and stitching.
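One way to loop over a 5-month range is to split it into smaller ISO-8601 intervals, run one aggregates query per chunk, and stitch the results afterwards. A sketch of the chunking (the 7-day chunk size is an arbitrary choice of mine, not a documented limit):

```python
from datetime import datetime, timedelta

def interval_chunks(start: datetime, end: datetime, days: int = 7):
    """Split [start, end) into ISO-8601 'start/end' interval strings."""
    chunks = []
    cursor = start
    while cursor < end:
        chunk_end = min(cursor + timedelta(days=days), end)
        chunks.append(f"{cursor.isoformat()}Z/{chunk_end.isoformat()}Z")
        cursor = chunk_end
    return chunks

# Each string can then be used as the `interval` of one aggregates query body,
# and the per-chunk results merged client-side.
```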
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.