We have four environments in Genesys Cloud (Dev, Test, Prod, DR), and the infrastructure (objects) was created manually in all four of them.
We want to automate configuration deployment across all the orgs.
The Genesys objects in Dev need to be exported to Test and the other environments. The environments hold similar but not identical data. For instance, the Routing Skills object has 114 skills in Dev and about 109 in Test; the two are out of sync, with some skills existing only in Dev and others existing only in Test.
We are using AWS pipelines with the Terraform Genesys Cloud provider (mypurecloud/genesyscloud). First we tried establishing the connection and creating new skills, which now works. But when we try to export the Dev data to Test, the pipeline fails, and the logs indicate the resources already exist (since all the Test data was created manually). So we tried capturing the existing Test data in the Terraform state by running Terraform Export, downloading the state file, editing it, and replacing it. With that in place we are able to push the Dev data to Test, but anything that exists in Test and not in Dev gets destroyed. To be precise, Terraform is syncing Test to exactly mirror Dev.
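For context, a minimal sketch of what we are working with (the resource label and skill name below are made-up placeholders, and the import command is the documented alternative to hand-editing the state file, which is roughly what our edit was emulating):

```hcl
# Hedged sketch, not our actual pipeline config.
# The Genesys Cloud provider models a routing skill like this:
resource "genesyscloud_routing_skill" "example_skill" {
  name = "Example Skill"
}

# To adopt a skill that was created manually in Test without Terraform trying
# to recreate it, the existing object can be imported into state by its GUID:
#   terraform import genesyscloud_routing_skill.example_skill <existing-skill-guid>
```

The import only tells Terraform the object exists; it does not stop a later `terraform apply` from destroying imported objects that have no matching resource block in the configuration, which is the behavior we are running into.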
We need advice, suggestions, or an approach to achieve this such that no data is destroyed in either org: the sync should only create whatever is missing or newly added.
Thanks in advance!