Web CMS (TeamSite)
Looping Deployment w/ constantly updating file
cnelson
I have a deployment that is scheduled to run every 5 minutes. Its purpose is to check one of our remote external FTP servers and, if anyone has uploaded any content, replicate that content to internal storage and then kick off a Perl script that sends an email notification to key users, depending on which directory the file was uploaded into.
The problem arises when someone uploads a very large file that takes a long time to transfer. Since OD runs every 5 minutes, each run sees that the file has changed, downloads the part that has arrived so far, and stops; it repeats this until the user finishes the FTP upload and OD has grabbed the final piece. If the file takes hours to upload (which happens often), the users could get 50+ emails (one every 5 minutes) even though the file is not complete.
Is there any way to avoid this?
I can think of the following options, but I don't know how to implement them:
1) Kick off a DNR script that checks whether the file size is still incrementing and, if so, errors out. Problem: I would REALLY like to exclude only this file from the transfer; there could be many files waiting to be replicated, and I don't want them all held up because only one file is still being updated.
2) Find a way to know when the file is completely uploaded, and don't email until then. So far I have been unable to figure this out from the logs; I get src-is-newer as the reason every time until the file is complete.
Any other ideas?
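The size-stability check in option 1 could be sketched in portable shell along these lines (the `is_stable` helper and its parameters are hypothetical, not part of any OD tooling): the file is sampled twice over a short interval, and a changed size is taken to mean the upload is still in progress.

```shell
# is_stable: succeeds if FILE's size is unchanged over INTERVAL seconds,
# i.e. nobody appears to be writing to it right now.
is_stable() {
  f=$1
  interval=${2:-5}
  s1=$(wc -c < "$f")
  sleep "$interval"
  s2=$(wc -c < "$f")
  [ "$s1" -eq "$s2" ]
}

# demo: a file nobody is writing to reads as stable
tmp=$(mktemp)
printf 'hello' > "$tmp"
if is_stable "$tmp" 1; then status=stable; else status=growing; fi
echo "$status"
rm -f "$tmp"
```

Note this is only a heuristic: an upload that stalls for longer than the interval would be misread as complete, so the interval should be chosen with the link speed in mind.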
Comments
Dwayne
So far looking at the logs I have been unable to figure this out. I get src-is-newer as the reason everytime until the file is complete
I assume by this that you mean looking at the OD logs. I would think the better choice would be to look at the FTP server's logs. Perhaps they're in a "parseable" format.
You don't say what platform your OD and FTP servers are running on. Some flavors of Unix have the 'lsof' command, which you could use to determine whether the file is open or not. You could also attempt to open the file in an exclusive mode; if that fails, then the file is still open somewhere else (probably by the FTP server).
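The lsof approach could be sketched as a small shell guard (assuming lsof is installed, which is true on many Unix flavors including Solaris via the freeware lsof package; the `in_use` helper name is hypothetical):

```shell
# in_use: succeeds when some process still has the file open.
# lsof exits 0 only when it finds at least one process with the
# named file open; otherwise (or if lsof is absent) it exits nonzero.
in_use() {
  lsof -- "$1" >/dev/null 2>&1
}

# demo: a freshly created temp file that no process holds open
tmp=$(mktemp)
if in_use "$tmp"; then decision="skip: still open"; else decision="safe to deploy"; fi
echo "$decision"
rm -f "$tmp"
```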
--
Current project: TS 5.5.2/6.1 W2K
cnelson
We are all on Solaris, so lsof is a possibility. I could also easily write a script to check whether the file size is still growing, parse the FTP logs, or install a real-time event handler in our FTPd to notify us when new uploads complete. However, with all of these solutions, my question #1 becomes an issue again.
How can I exclude particular files from the deployment once it has already started? If 10 files have finished uploading and 1 or 2 are still going, I don't want to hold up the deployment of those 10 files until the other 1 or 2 are complete. New files are constantly being uploaded, so if I waited until all files were complete, I might never complete a transfer.
Dwayne
Well, if you're running OD 6, you can use the new PayloadAdaptor interface. There isn't one "out of the box" that does precisely what you want, but you can write your own Java class and plug it in, so that you can do whatever filtering you want.
--
Current project: TS 5.5.2/6.1 W2K
Adam Stoller
When you initiate the deployment, are you using the '-inst' flag? If so, try doing it without the '-inst' flag; that should cause OD to abort a 2nd (and 3rd, and ...) instance of the same deployment until the currently running one has completed.
--fish
Senior Consultant, Quotient Inc.
http://www.quotient-inc.com
Dwayne
try doing it without the '-inst' flag and that should cause OD to abort running a 2nd (and 3rd, and ...) instance of the same deployment
I don't think that really addresses the issue. It's not a case of one instance of OD stepping on another; it could very well be that every one of these OD instances completes before the next one starts. The problem is that the file being deployed is constantly being updated outside the control of any OD instance (probably because it's coming in over a relatively slow link).
--
Current project: TS 5.5.2/6.1 W2K
Adam Stoller
Ah - good point - if the file is still being written at the time of the deployment, OD could be trying to deploy a partial file.
If you had the base server where the FTP drop was, you could write a script to initiate the deployment with a filelist of files that are at least 1-5 seconds old, which would probably help avoid trying to ship a file that is actively being written.
I'm not sure how the payload adapters work or whether they can be used in a reverse deployment, but that's probably the next thing to investigate.
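The age-based filelist idea could be sketched in portable shell. Since old Solaris find has no -mmin, a backdated reference file plus `find ! -newer` does the job; the Perl one-liner (in keeping with the thread's Perl usage) backdates via utime. The directory names and the 5-second threshold are hypothetical:

```shell
# Demo drop directory with one finished (old) upload and one that
# just arrived (new); the names are made up for illustration.
DROPDIR=$(mktemp -d)
touch "$DROPDIR/old.dat" "$DROPDIR/new.dat"
# backdate old.dat by 60 seconds (Perl utime, since Solaris touch
# has no relative-time option)
perl -e 'my $t = time() - 60; utime $t, $t, $ARGV[0];' "$DROPDIR/old.dat"

AGE=5
ref=$(mktemp)
# make a reference file whose mtime is AGE seconds in the past
perl -e 'my $t = time() - $ARGV[0]; utime $t, $t, $ARGV[1];' "$AGE" "$ref"

# keep only files whose mtime is at least AGE seconds old;
# anything newer than the reference file is still "in flight"
find "$DROPDIR" -type f -name '*.dat' ! -newer "$ref" > "$DROPDIR/filelist.txt"
cat "$DROPDIR/filelist.txt"
rm -f "$ref"
```

The resulting filelist would then be handed to the deployment invocation, so files still being uploaded are simply left for the next 5-minute run rather than holding everything up.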
--fish
Senior Consultant, Quotient Inc.
http://www.quotient-inc.com
Migrateduser
Using a payload adapter sounds like a reasonable approach, but it requires a base server on the source system. You can't do a reverse adapter-based deployment.
Todd Scallan
Director of Product Management
Interwoven
t: 408-530-7167
e: tscallan@interwoven.com
Migrateduser
Can I run the FTP adapter with a reverse deployment? If it's possible, where should I add the FTP adapter section?
Thanks
Jason