I wanted to understand why the Management Unit is not provided when uploading an agent schedule in JSON format. How can we add schedules for different Management Units?
I am getting an error while importing a schedule via the API. I am pasting the correlation ID. Can you let me know what the issue is?
I tried both of the following methods.
METHOD 01
Using the import workflow:
first making a POST request to api/v2/workforcemanagement/businessunits/{bu_id}/weeks/{weekId}/schedules/import/uploadurl,
then making a PUT request to the upload URL, and finally making a POST request to api/v2/workforcemanagement/businessunits/{bu_id}/weeks/{weekId}/schedules/import
METHOD 02
Using the update workflow:
first generating a blank schedule by making a POST request to /api/v2/workforcemanagement/businessunits/{bu_id}/weeks/{weekId}/schedules,
then a POST request to /api/v2/workforcemanagement/businessunits/{bu_id}/weeks/{weekId}/schedules/{new_schedule_id}/update/uploadurl,
then a PUT request with the planning data to the upload URL, and finally a POST request to /api/v2/workforcemanagement/businessunits/{bu_id}/weeks/{weekId}/schedules/{new_schedule_id}/update
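The three import steps above can be sketched as follows. This is a hedged sketch, not the official SDK: it assumes Node 18+ (global fetch), a valid OAuth token, and `api.mypurecloud.com` as your region's API host. The helper name `buildRoute` is my own, and the response field names (`url`, `uploadKey`, `headers`) follow the uploadurl response shape shown in the API Explorer; verify them for your region.

```javascript
const BASE = 'https://api.mypurecloud.com'; // assumption: adjust for your region

function buildRoute(buId, weekId, suffix) {
  return `${BASE}/api/v2/workforcemanagement/businessunits/${buId}/weeks/${weekId}/schedules${suffix}`;
}

async function importSchedule(token, buId, weekId, gzippedBody) {
  const auth = { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' };

  // 1. Request an upload URL, declaring the exact gzipped size up front
  const uploadUrlRes = await fetch(buildRoute(buId, weekId, '/import/uploadurl'), {
    method: 'POST',
    headers: auth,
    body: JSON.stringify({ contentLengthBytes: gzippedBody.length }),
  });
  const { url, uploadKey, headers } = await uploadUrlRes.json();

  // 2. PUT the gzipped schedule JSON to the returned URL with the returned headers
  await fetch(url, { method: 'PUT', headers, body: gzippedBody });

  // 3. Kick off the import, echoing back the uploadKey exactly as received
  await fetch(buildRoute(buId, weekId, '/import'), {
    method: 'POST',
    headers: auth,
    body: JSON.stringify({ uploadKey }),
  });
  // Completion (success or failure) arrives via the notification topic, not this response.
}
```

The update workflow (METHOD 02) differs only in the routes used and the extra blank-schedule POST at the start.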
We are unable to directly debug customer data in the forums. I can tell you that the completion notification will contain metadata that will include the details of the error.
The flows to import a new schedule and to update an existing one are nearly identical. Here is documentation on how to update a schedule:
It also lists the notification topics to subscribe to in order to receive completion notifications, and links to documentation on how to subscribe.
Also, do I need to use any other OAuth method to execute this? Currently I'm using Client Credentials.
In the document, I can see the uploadKey shown as {"uploadKey": "123abcd4-e567-890f-g123-456h789abc0d"}
But I'm getting a very long uploadKey, like userUploads/schedule/import/4bf4c980-c3da-4db7-be3d-8a15689e772b/b63a8f3b-252f-4249-a415-dfde94XXXX/3c0ca1f7-7e7b-4066-adca-2844458c9XXX/2024-06-24/ef3bfd84-9c3d-40f2-bXXXXb32063b09bf.input.json.gz
The document uses a placeholder upload key; you should use the one provided by the API. There are a couple of ways in the SDK to subscribe to notifications. One only delivers the notification body, while the other delivers the full event along with metadata. You need the latter to see the error details, as they are contained in the metadata section.
If you're not getting a 403 stating your credentials are invalid, then you're good on that front. You need to look at the error metadata on the notification to figure out what's wrong.
I'm currently using this notification topic: v2.workforcemanagement.businessunits.{id}.schedules, via API Explorer (genesys.cloud).
Which method shows us metadata?
I was digging around for an example to show you. This example uses the Java SDK; it covers Historical Adherence, but the notification part is the same.
Look at the bottom of the Java code snippet for the onEvent() method. You'll see that it accepts a NotificationEvent parameter. The method goes on to extract the body, but the NotificationEvent itself contains the metadata you need to debug your issue.
There's also an example here for JavaScript.
Copying the relevant lines, it looks like you would extract the "metadata" from the "data" variable here:
websocketClient.on('connect', connection => {
    console.log('WebSocket client connected and listening...');
    connection.on('message', message => {
        let data = JSON.parse(message.utf8Data);
        let topic = data.topicName;
        let eventBody = data.eventBody;
        let metadata = data.metadata; // the error details for a failed import live here
        // ... handle topic / eventBody / metadata ...
    });
});
In the documentation below, do we need to create a blank schedule first and then perform the steps mentioned in the document? Or will these steps directly create the schedule and upload all the data?
Importing a schedule will create one for you. You can also update a blank schedule using the update documentation if you prefer; the general process is the same. In each case, the expected schema for the PUT to the returned URL is included in the response from the /uploadurl route. To see it, open "API Responses", then the "200" response; it's under uploadBodySchema, and it's visible in the SDK in the same spot, even though the returned value will always be null.
Also, how do we determine contentLengthBytes for a .gz file? Do we need to provide the exact value?
Also, if a schedule already exists in WFM and we use the import API, will it overwrite it or throw an error?
Is there any API to check whether a schedule for that week already exists in WFM?
It's simply the file size in bytes. Yes, it must be the exact value.
A new, unpublished schedule will be created. If you want to update an existing schedule, you must use the update schedule workflow.
You can have multiple unpublished schedules for a given week, but only one published. Importing a schedule creates a new, unpublished schedule; it will not affect any existing data. Of course, if you later publish that schedule (via the update schedule API, or in the UI), it will unpublish any previously published schedule for the affected time range. Those schedules are not lost or deleted, however.
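To check what already exists for a week, a hedged sketch: I'm assuming a GET on the same /schedules route lists the week's schedules (check the API Explorer for the exact response shape). The `findPublished` helper and the `published` flag are my guess at the list-item shape, not confirmed field names.

```javascript
// Assumptions: Node 18+ (global fetch), region host, and a valid OAuth token.
async function listSchedules(token, buId, weekId) {
  const res = await fetch(
    `https://api.mypurecloud.com/api/v2/workforcemanagement/businessunits/${buId}/weeks/${weekId}/schedules`,
    { headers: { Authorization: `Bearer ${token}` } }
  );
  const body = await res.json();
  return body.entities || [];
}

function findPublished(schedules) {
  // only one schedule per week can be published at a time
  return schedules.filter(s => s.published === true);
}
```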
I"m sorry I'm unable to debug customer data in the forums. I would suggest posting your json data into a json validator (there are many online, or you can usually "pretty print" it in various IDEs and validate it. I can tell you from the error message that the problem is improperly formatted/bad json data somewhere in your code.
I would recommend using the SDK to generate your request body. I'm not sure which language you're coding in, but the SDK models can be serialized to JSON directly rather than hand-coding it.
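An illustrative sketch of that suggestion: build the request body as a plain object (or an SDK model) and serialize it, instead of concatenating JSON strings by hand, which is where malformed JSON usually comes from. The field names here are placeholders, not the real schedule schema.

```javascript
const requestBody = {
  agentSchedules: [
    { userId: '00000000-0000-0000-0000-000000000000', shifts: [] }, // placeholder fields
  ],
};

const json = JSON.stringify(requestBody); // always syntactically valid JSON
JSON.parse(json); // round-trip check: would throw if the JSON were malformed
```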