Regarding Participant Data
The issue is that our customer is storing too much participant data (PD) on each call, which exceeds the API limits for the conversations that can be pulled (conversations with PD > 20KB are not returned by the Participant Attribute search API). How can we reduce the amount of PD that gets stored on each conversation from IVR flows?
Participant attributes are not a CRM system or a database. The best practice is to store large amounts of data in a 3rd party system that's intended for mass data storage and put an identifier for those records on the conversation as a participant attribute.
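As a rough sketch of that pattern in Python (outside of Architect): the external store URL, the `store_record` helper, and the `externalRecordId` attribute name are placeholders, and you should confirm the participant-attributes endpoint and payload in the API Explorer for your region and media type before relying on it.

```python
import requests

GENESYS_API = "https://api.mypurecloud.com"     # region-specific base URL
EXTERNAL_STORE = "https://example.com/records"  # hypothetical system of record

def store_record(payload: dict, store_token: str) -> str:
    """Write the full data set to the external system and return its record ID."""
    resp = requests.post(EXTERNAL_STORE, json=payload,
                         headers={"Authorization": f"Bearer {store_token}"})
    resp.raise_for_status()
    return resp.json()["id"]  # assumes the store echoes back an ID

def tag_conversation(conversation_id: str, participant_id: str,
                     record_id: str, genesys_token: str) -> None:
    """Keep only a small pointer on the conversation itself."""
    url = (f"{GENESYS_API}/api/v2/conversations/{conversation_id}"
           f"/participants/{participant_id}/attributes")
    body = {"attributes": {"externalRecordId": record_id}}
    resp = requests.patch(url, json=body,
                          headers={"Authorization": f"Bearer {genesys_token}",
                                   "Content-Type": "application/json"})
    resp.raise_for_status()
```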
Is this approach documented anywhere? My org has hit that limit (and some other limit below 20KB that Genesys has yet to explain) related to non-CRM data stored on conversations, and it would have avoided a lot of headaches if Genesys had communicated this approach proactively.
See https://developer.genesys.cloud/organization/search/conversation-participant-attribute-search
- We only store conversations with attributes <20KB
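If you want to audit how close your flows are getting to that limit, a rough check of the serialized attribute size is usually enough. This is only a sketch; the 20KB figure comes from the quote above, and the attributes dict stands in for whatever your flow is about to write.

```python
import json

ATTRIBUTE_LIMIT_BYTES = 20 * 1024  # searchable-attribute limit quoted above

def attribute_size_bytes(attributes: dict) -> int:
    """Approximate the stored size of participant attributes as UTF-8 JSON."""
    return len(json.dumps(attributes, ensure_ascii=False).encode("utf-8"))

def check_attributes(attributes: dict) -> None:
    """Raise if this set of attributes would push the conversation past the limit."""
    size = attribute_size_bytes(attributes)
    if size > ATTRIBUTE_LIMIT_BYTES:
        raise ValueError(f"Attributes are {size} bytes; conversations over "
                         f"{ATTRIBUTE_LIMIT_BYTES} bytes are not searchable.")
```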
The approach of not using participant data as a database isn't really a documented concept. It's more general common-sense guidance: if you're running into data size limitations in one system, you move the data to an external system of record that can handle the size you need and use an identifier pointing to the external system to link the two together.
The challenge is that we (as consumers of the service) have no way to audit/log/track activity or pass details to a screen pop other than participant data.
I agree some common sense about the number of fields added is required, and maybe we can trim those down to the essential pieces of information, but without better options (for me at least, for logging/debug info for flows) we have little other choice.
Any place you can get/set participant attributes you should be able to use a data action (i.e. Architect and Scripts). So instead of writing the data to a participant attribute, you write it to your external system of record. Reading it back works the same way. When you construct your screen pop URL, you get the external reference ID from the participant data, then query the external system to get the data into a variable, then construct the URL as normal.
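Here is roughly what that read path looks like, sketched in Python rather than in a data action. The `externalRecordId` attribute name, the lookup endpoint, the record fields, and the CRM pop URL are all placeholders; in a real org the lookup would be a data action and the URL would be assembled in the script.

```python
from urllib.parse import urlencode
import requests

EXTERNAL_STORE = "https://example.com/records"    # hypothetical system of record
CRM_POP_BASE = "https://crm.example.com/contact"  # hypothetical screen pop target

def build_screen_pop(participant_attributes: dict, store_token: str) -> str:
    """Resolve the external reference, then build the screen pop URL."""
    record_id = participant_attributes["externalRecordId"]

    # Equivalent to a data action call from Architect or a script
    resp = requests.get(f"{EXTERNAL_STORE}/{record_id}",
                        headers={"Authorization": f"Bearer {store_token}"})
    resp.raise_for_status()
    record = resp.json()

    # Pass only the handful of fields the agent actually needs
    query = urlencode({"account": record["accountNumber"],
                       "name": record["customerName"]})
    return f"{CRM_POP_BASE}?{query}"
```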
Simply using less data to stay within the constraints of the Genesys Cloud features you're using is certainly the technically easier option since it has fewer moving parts. But it should be extensible using an external system if you really do need all that data.
Are any integrations available to support this feature?
@tim.smith The problem with that is we are looking to keep Genesys-related data in the Genesys Cloud ecosystem.
Introducing external systems and databases to store data is a massive overhead.
Specifically for me, this is NOT about CRM data, but managing conversations within Genesys from cradle to grave. Interactions going through various flows and scripts and needing to audit or log activity to aid troubleshooting and present agent scripts.
The changes to allow historic playback of flows will help, but that will still not be as quick and as simple as pulling up a conversation and looking at the attributes.
@Dileepkaranki https://appfoundry.genesys.com/ is a good place to look for partner integrations.
@SimonBrown There are tradeoffs, I can appreciate that. I'm just trying to help guide you to work within the limitations that exist. Feature requests need to be made on the ideas site, and feedback about Genesys processes needs to be given to your TAM. This forum is only suitable for technical discussions about what's currently possible with Genesys Cloud.
@tim.smith I was also looking for the same behaviour @SimonBrown described, because our process performs ETL on Genesys data and produces insights from that data.
I understand. Your options are to work within the limits or use an external system to work around them. There isn't a 3rd option available where you can simply exceed the limit and still have it work. If you have questions about either of the two available options, please ask them here. If you want to request new features or lodge complaints about the existing features, please use the ideas site or contact your TAM.
Hi everyone,
Here are a couple of things to consider:
- Limits were put on participant attributes several years ago because many developers were shoving way too much data into these fields (one customer was putting entire customer records with history in the participant attribute field). This caused significant slowdowns in the overall performance of our analytics jobs and was causing problems with the overall stability of the platform.
- While most third-party packages and SaaS providers will offer fields to extend their data model, they do not leave them open-ended, for exactly this reason. I know; I spent the early part of my career working with Oracle Financials and would often hit these things as we tried to extend the data model by either using the "extension" fields or being creative with existing fields :).
- Participant attributes were always meant to hold a small amount of data, most of it being primary-key-style IDs that could be used to look up more detailed records using a data action. While not as convenient as storing the data in the participant attribute, data actions offer a pretty clean mechanism to look up any data that is not part of the core Genesys Cloud data model.
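To make that concrete, here is a minimal sketch of the kind of lookup service a data action could call, keyed by the ID stored on the conversation. Flask, the route, and the record shape are purely illustrative; any HTTP endpoint reachable from your data action integration works the same way.

```python
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Stand-in for the real system of record (database, CRM, data warehouse, ...)
RECORDS = {
    "rec-1001": {"accountNumber": "A-998", "customerName": "Jane Doe",
                 "lastIvrPath": "MainMenu>Billing>Balance"},
}

@app.route("/records/<record_id>")
def get_record(record_id: str):
    """Return the detailed record a data action looks up by its primary key."""
    record = RECORDS.get(record_id)
    if record is None:
        abort(404)
    return jsonify(record)

if __name__ == "__main__":
    app.run(port=8080)
```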
Per @tim.smith 's comment, the appropriate channel is to open an idea portal request. I have asked the conversation team's product manager to review this post and see if he has any feedback.
Thanks,
John Carnell
Director, Developer Engagement
Friendly reminder. If anyone in this thread has discussion relevant for this forum, please create a new topic.
Hi Davis,
I was just reading your message and did not know if you were aware of flow milestones and flow replay. I know the participant data limits are frustrating, but these can help with the troubleshooting side of things:
- Flow milestones let you capture "path" information about what is being hit in your call flows, and you can then report on them. They were originally put in to capture analytics about which parts of a call flow were being executed, and they can help track path information.
- I am pretty sure this is still in limited release, but you might want to talk to your TAM and see if you can get into the beta. The Architect team has built a flow replay that is essentially a debugger for flow calls. I have seen the tool; you can capture X days' worth of flow data, then select a specific call and walk through the flow to see what happened, visually.
I feel your frustration around debugging flows, but I do know the Architect team has been working on improving the overall experience. One of the challenges for them is building it so it scales, because providing functionality like this across a large data store in a multi-tenant service is always a challenge.
Thanks,
John