We are planning to move 83 million documents (about 7 TB) from various file systems into Documentum as a single object type. The metadata for these files is stored in a legacy SQL Server database.
We are planning to write Java/DFC code that picks up each file from the file system, reads its metadata from SQL Server, and imports it into the repository, roughly along the lines of the sketch below.
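Here is a minimal sketch of the loop we have in mind, just to make the approach concrete. The repository name, credentials, custom type name ("my_legacy_doc"), attribute names, table/column names, and folder path are all placeholders, and a real run would obviously need batching, restartability, and error handling on top of this:

```java
import com.documentum.com.DfClientX;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.IDfLoginInfo;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class LegacyImportSketch {

    public static void main(String[] args) throws Exception {
        // --- Documentum session (placeholder repository and credentials) ---
        DfClientX clientX = new DfClientX();
        IDfClient client = clientX.getLocalClient();
        IDfSessionManager sessionMgr = client.newSessionManager();
        IDfLoginInfo login = clientX.getLoginInfo();
        login.setUser("dmadmin");
        login.setPassword("password");
        sessionMgr.setIdentity("my_repository", login);
        IDfSession session = sessionMgr.getSession("my_repository");

        // --- Read metadata from the legacy SQL Server table (placeholder schema) ---
        try (Connection sql = DriverManager.getConnection(
                "jdbc:sqlserver://legacyhost;databaseName=legacydb", "user", "pwd");
             Statement stmt = sql.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT doc_id, file_path, title FROM legacy_docs")) {

            while (rs.next()) {
                // Create one object per file in the (assumed) custom type
                IDfSysObject obj = (IDfSysObject) session.newObject("my_legacy_doc");
                obj.setObjectName(rs.getString("title"));
                obj.setContentType("pdf");                 // would derive from file extension in practice
                obj.setFile(rs.getString("file_path"));    // pulls the content from the file system
                obj.setString("legacy_doc_id", rs.getString("doc_id")); // assumed custom attribute
                obj.link("/Legacy Cabinet/Imported");      // placeholder folder path
                obj.save();
            }
        } finally {
            sessionMgr.release(session);
        }
    }
}
```

This single-threaded, object-by-object approach is only meant to show the shape of the migration; we expect we would have to parallelize it and commit in batches to get through 83 million documents.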
My questions, based on your experience:
- Roughly how long would it take to load 83 million documents, assuming the SQL Server tables are indexed and metadata reads are fast?
- Any best practices, catches, or things to watch out for?
- Are there alternative tools or approaches worth considering?