Documentum RPS conditional aging, event date is not populated
I am working on Documentum RPS and I am very new to the concept. My question is about conditional aging.
We create conditions and apply them to the retention policy in the Phases tab, but there is nowhere to configure what the condition should look for, such as a change in a date field. Say we have 100 records split across two folders, all of the same object type, but I want to hold one set for 7 years and the other for 10 years, each driven by a different date field. I cannot use chronological aging because the base date has a one-to-one mapping with the object type, so I created two policies with two different conditions. However, unless the event date is filled in, I cannot even see the content in the Qualification Manager. How do I configure the conditions to look at the date field and populate the event date automatically, instead of manually opening the applied retainer to update the event date?
Any help would be really appreciated.
Best Answers
-
T_G is correct. There is also another option that avoids using event-based retention. If you add a new date field to your custom type, you can make that new field the base date for the object type. When the original date field is set, your DFC program/job can copy its value into the new common field, which triggers the aging.
To handle the multiple retention policies, one approach is to link each object into the folder that carries the required retention policy. This avoids dealing with triggering the event and should meet your requirement. Setting the new date attribute does have to be done via DFC (rather than a DQL UPDATE) so that the aspects on the object (which RPS-managed objects have) fire and start the aging once the base date attribute has been set.
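A minimal sketch of the copy step described above using standard DFC calls, assuming a hypothetical custom type whose termination_date attribute is the field users actually fill in and whose base_date attribute is the new field mapped as the RPS base date; the repository name, credentials, and attribute names are illustrative only:
```java
import com.documentum.com.DfClientX;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.DfId;
import com.documentum.fc.common.IDfLoginInfo;
import com.documentum.fc.common.IDfTime;

public class CopyBaseDateJob {

    public static void main(String[] args) throws Exception {
        // Hypothetical connection details -- replace with your repository and account.
        IDfClient client = new DfClientX().getLocalClient();
        IDfSessionManager sessionManager = client.newSessionManager();
        IDfLoginInfo login = new DfClientX().getLoginInfo();
        login.setUser("dmadmin");
        login.setPassword("password");
        sessionManager.setIdentity("my_repository", login);

        IDfSession session = sessionManager.getSession("my_repository");
        try {
            // r_object_id of a document whose termination_date was just set,
            // passed in as the first program argument.
            IDfSysObject obj = (IDfSysObject) session.getObject(new DfId(args[0]));

            // Copy the source date into the attribute configured as the RPS base date.
            // Saving through DFC (not DQL UPDATE) lets the RPS aspects fire and start aging.
            IDfTime terminationDate = obj.getTime("termination_date");
            if (!terminationDate.isNullDate()) {
                obj.setTime("base_date", terminationDate);
                obj.save();
            }
        } finally {
            sessionManager.release(session);
        }
    }
}
```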
Answers
-
@T_G Oh, so we need to customize it. Do you by any chance have a documentation link?
-
So I am assuming that you are applying retention to the folders; based on your use case, you should be using individual retention instead of linked retention.
The way RPS works is that each object in the repository (other than folders) gets its own RPS retainer, and an event is created when the object is linked into or created in the folder. That means some process has to set the event date based on some logic (this can be done by the RMA client, which you already mentioned). There should be a public API (either Java or DFS services) to set the event date.
I don't know whether consulting has written something that looks at the object type and sets the event date via a job.
An alternative for your use case would be to use a 7-year retention policy with the disposition strategy set to unknown. When the 7 years expire and the retention policy is set to roll over, the retention manager could decide whether to add a 3-year retention policy or dispose of the content.
I worked on RPS many years ago.
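As a rough illustration of the kind of job mentioned above, the sketch below runs a DQL query through DFC to find documents of a hypothetical type my_employee_record whose termination_date has been set, then hands each one to a setEventDate helper. That helper is a placeholder: the real call would go through whatever public RPS API (Java or DFS) your version exposes for setting the event date.
```java
import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.common.IDfTime;

public class EventDateJob {

    // DQL to find records whose termination date is set; the type and attribute
    // names (my_employee_record, termination_date) are illustrative only.
    private static final String CANDIDATE_DQL =
        "SELECT r_object_id, termination_date FROM my_employee_record "
            + "WHERE termination_date IS NOT NULLDATE";

    public void run(IDfSession session) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL(CANDIDATE_DQL);
        IDfCollection results = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (results.next()) {
                String objectId = results.getString("r_object_id");
                IDfTime terminationDate = results.getTime("termination_date");
                setEventDate(session, objectId, terminationDate);
            }
        } finally {
            results.close();
        }
    }

    private void setEventDate(IDfSession session, String objectId, IDfTime date)
            throws Exception {
        // Placeholder: call the public RPS API (Java or DFS) that sets the
        // event date on the retainer applied to this object.
    }
}
```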
-
Based on your use case, you have a requirement to split your documents into multiple folders, where each folder has a different "base date". In that case, create a custom dm_folder subtype with a new attribute for the base date.
Map that folder attribute as the base date in the RPS configuration.
Create a folder of that custom type and link the documents into it. Assign the appropriate retention policy to the folder and set the folder's custom attribute that is mapped to the base date. Aging will begin from the folder's base date.
For example, you have two retention policies: 7-year chronological and 10-year chronological.
You create two folders: Folder 1 and Folder 2.
Assign the 7-year policy to Folder 1. If you want to age from the folder's r_creation_date, you do not need a custom attribute for the base date. If you need to fine-tune when aging should begin, use the custom date attribute as the base date and populate it; aging will then run from that date plus 7 years.
Set up the second and any further folders the same way.
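A minimal DFC sketch of that folder setup, assuming a hypothetical custom folder type my_retention_folder with a date attribute base_date that has been mapped as the base date in the RPS configuration; the cabinet path, folder name, and attribute names are illustrative:
```java
import com.documentum.fc.client.IDfFolder;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.DfId;
import com.documentum.fc.common.IDfTime;

public class RetentionFolderSetup {

    public void setUp(IDfSession session, String documentId, IDfTime baseDate)
            throws Exception {
        // Create a folder of the hypothetical custom type and set its base date.
        IDfFolder folder = (IDfFolder) session.newObject("my_retention_folder");
        folder.setObjectName("7yr Records");
        folder.setTime("base_date", baseDate);
        folder.link("/Records");          // parent cabinet/folder path (illustrative)
        folder.save();

        // Link an existing document into the folder so the folder's
        // retention policy (and base date) applies to it.
        IDfSysObject doc = (IDfSysObject) session.getObject(new DfId(documentId));
        doc.link("/Records/7yr Records");
        doc.save();
    }
}
```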
-
Thanks @T_G. Most of our documents are physical objects for employees, so the custom date is the termination date, which is only filled in when the employee is no longer working. A custom date on a folder with chronological aging will not work, since the date would be empty and it would fall back to the creation date, which we do not want. I think, as you said, we will have to create a custom job to update it.
@T_G / @Steve, if I am able to find the table where the event date is stored, can I update it directly with a stored procedure instead of a custom job with DFC?