Everything works great except the export is cut short. For a client that has 25 prompts with that prefix, we only get 15 back in the exported .tf and audio files. For clients that have 30, it only exports 20. These are just examples; the number of missing files varies by client. Interestingly, though, it's always the same files, and the same count of files, missing for a given client no matter how many times I run the export.
Any ideas whether this is a bug, or has anyone else experienced this issue? Of note: we have more prompts than can be displayed in the Architect > Prompts page.
To confirm: you're trying to export user prompts whose names start with "ClientName_"? I tried recreating the bug you're seeing by creating 25 prompts whose names start with "ClientName_" and exporting them. The regex looked like this:
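(The original snippet didn't carry over into this thread; a filter of that shape would look roughly like the sketch below, assuming the `genesyscloud_tf_export` resource's `include_filter_resources` entries take the form `resource_type::regex`. Treat the exact attribute names as illustrative and check the provider docs.)

```hcl
resource "genesyscloud_tf_export" "prompt_export" {
  directory     = "./genesyscloud"
  export_as_hcl = true

  # Export only user prompts whose names start with "ClientName_"
  include_filter_resources = [
    "genesyscloud_architect_user_prompt::ClientName_.*",
  ]
}
```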
Thanks @charlie.conneely, I'll update the regex and try again. It works with what I had; it just didn't return all the prompts, and there's nothing different about the prompts that are exported versus those that aren't. One notable thing is that we have more prompts than the get-prompts API seems to support (sdk_debug shows a last page of 100 at a page size of 100, and that's it). Is there a way you can test your .tf on our org? Or do you have a lab org with in excess of 10k prompts?
This could be an API issue if it is stopping at page 100. Inside our provider, we collect all prompts using a for loop that terminates when the counter equals the pageCount returned from GET /api/v2/architect/prompts (the counter is used as the pageNumber on each iteration).
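The collection logic is shaped roughly like this (a simplified sketch of the pattern described above, not the provider's actual code; the type and function names are stand-ins):

```go
// Package promptexport sketches the pagination pattern described above.
// All type and function names here are illustrative stand-ins.
package promptexport

// PromptPage mirrors the shape of one page from
// GET /api/v2/architect/prompts: the entities on that page plus the
// total page count the API reports.
type PromptPage struct {
	Entities  []string // prompt names (the real API returns full prompt objects)
	PageCount int      // total number of pages the API says exist
}

// getPromptsPage stands in for a call to
// GET /api/v2/architect/prompts?pageSize=100&pageNumber=<pageNum>.
func getPromptsPage(pageNum int) (PromptPage, error) {
	// ... perform the HTTP call and decode the response ...
	return PromptPage{}, nil
}

// getAllPrompts walks pages until the counter reaches the reported
// pageCount. If the API never reports a pageCount above 100, anything
// past the 10,000th prompt (100 pages x 100 per page) is never requested.
func getAllPrompts() ([]string, error) {
	var prompts []string
	for pageNum := 1; ; pageNum++ {
		page, err := getPromptsPage(pageNum)
		if err != nil {
			return nil, err
		}
		prompts = append(prompts, page.Entities...)
		if pageNum >= page.PageCount {
			break // counter has reached pageCount: stop paging
		}
	}
	return prompts, nil
}
```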
If you make a call to GET /api/v2/architect/prompts?pageSize=100 and the value of total is greater than 10000, but the pageCount is not over 100, then this is probably an API bug and not an issue with our provider.
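For example, a first-page response along these lines (values hypothetical, envelope fields as returned by that endpoint) would point to the API capping pageCount rather than a provider bug, since 12,500 prompts at a page size of 100 should yield a pageCount of 125:

```json
{
  "entities": ["... 100 prompt objects ..."],
  "pageSize": 100,
  "pageNumber": 1,
  "total": 12500,
  "pageCount": 100
}
```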
The GET you provided shows a total of exactly 10,000, but that seems inaccurate: the last prompt returned starts with "p" (results are sorted by name ascending), and we definitely have clients whose prompt prefix begins with "z". An API limitation?
After talking with the Architect team, it looks like 10,000 is indeed the API limit. That would also explain why it's always the same prompts missing: only the first 10,000 prompts sorted by name are ever returned, so anything sorting after that cutoff never makes it into the export. I've just created a ticket to investigate this further, and we will get a dev working on a solution as soon as we can.