Hi,

Current Process: Users cut an edition through a custom GUI (which allows users to move content from Pre-Production servers to Staging to Production servers) and deploy the attached files to the Production database. The script creates a file list of all the attached and included DCRs and then deploys them one by one to the DB.

Business Problem: Users need the ability to schedule deployments.

Issue: There could be instances where we are scheduling around 1500 DCR deployments for a single edition. We cannot schedule those 1500 deployments to run concurrently, as that might bring the server down. An alternative we discussed is to schedule each deployment instance 3-5 seconds apart, but that will take a long time to complete when we are running 1000-odd DCRs.

Is there a viable alternative approach to this business case?

Thanks,
Dhruv
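One middle ground between 1500 concurrent deployments and strictly serial runs spaced 3-5 seconds apart is bounded concurrency: run the deployments through a small worker pool so only N are in flight at once. Here is a minimal Python sketch under that assumption; `deploy_dcr` is a hypothetical stand-in for whatever script actually pushes one DCR to the production DB, not the real deployment API.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def deploy_dcr(dcr_path):
    # Hypothetical stand-in for the real per-DCR deployment call
    # (e.g. the script that deploys one DCR to the production DB).
    return dcr_path

def deploy_all(dcr_paths, max_workers=10):
    """Deploy every DCR with at most `max_workers` running at once.

    Bounded concurrency avoids both extremes: 1500 simultaneous
    deployments (risk of bringing the server down) and a purely
    serial schedule spaced 3-5 s apart (very long wall-clock time).
    """
    deployed = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(deploy_dcr, p): p for p in dcr_paths}
        for fut in as_completed(futures):
            deployed.append(fut.result())
    return deployed
```

Tuning `max_workers` lets you trade total run time against server load without changing the scheduling model.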
Hi,

Appreciate the quick reply. Reverting content due to conflicting deployments is not a concern. The deployments will be invoked through a custom GUI.

Scenario: A user creates 2 separate editions for deployment and schedules both of them for, let's say, midnight. Both jobs/editions have 100 attached DCRs each.
These 200 DCRs will grow to around 600-700 once we take the master-child relationships between DCRs into account. Each of these ~700 DCRs will be deployed individually.
As per your suggestion, we should group these 700 DCRs into batches of, let's say, 100, reducing the number of data deployments to 7.
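The batching step described above is simple to express in code. A minimal sketch (assuming the DCRs are available as a flat list of paths; the names here are illustrative, not the real script's):

```python
def batch(items, size):
    """Split `items` into consecutive batches of at most `size` each."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# 700 DCRs grouped into batches of 100 -> 7 deployment invocations,
# matching the 7 data deployments described above.
dcrs = [f"dcr_{i}" for i in range(700)]
batches = batch(dcrs, 100)
```

Each batch then becomes a single deployment job instead of 100 individual ones.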
The question is: "Can we do file-list-based data deployments, or do we have to run cron jobs for these batches?"
The scripts are grouped to deploy, let's say, 7 data types within the same category to 7 different tables. In this scenario, there are 7 definitions. If a file-list-based data deployment is possible, I am assuming we will have to create a separate file list for each definition.
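Building one file list per definition is just a grouping pass over the DCRs. A small sketch, assuming each DCR can be mapped to its definition somehow; here the definition is hypothetically encoded in the path prefix, which will not match the real repository layout:

```python
from collections import defaultdict

def build_file_lists(dcrs, definition_of):
    """Group DCR paths into one file list per data-type definition.

    `definition_of` maps a DCR path to its definition name, so each
    resulting list feeds the script that deploys that data type to
    its own table.
    """
    file_lists = defaultdict(list)
    for dcr in dcrs:
        file_lists[definition_of(dcr)].append(dcr)
    return dict(file_lists)

# Hypothetical example: 21 DCRs spread across 7 definitions,
# with the definition encoded as the first path component.
dcrs = [f"def{i % 7}/dcr_{i}.xml" for i in range(21)]
lists = build_file_lists(dcrs, lambda p: p.split("/")[0])
```

Each entry in `lists` could then be written out as a separate file list and handed to the corresponding deployment script.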