A couple of things...
We found a package on a remote site server that was still pending copy.
We manually copied the package from the NS to that site server, ran a delta update, then updated the client on the site server and did a refresh packages.
It showed as fully copied at that point.
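For anyone wanting to double-check a manual copy like this before trusting the console status, a quick hash comparison between the source and destination directories works. This is just a generic sketch, not part of any NS tooling; the function names and the idea of hashing every file are my own assumptions:

```python
import hashlib
import os

def dir_hashes(root):
    """Map each file's path (relative to root) to its SHA-256 digest."""
    hashes = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            with open(path, "rb") as f:
                hashes[rel] = hashlib.sha256(f.read()).hexdigest()
    return hashes

def copies_match(source_root, dest_root):
    """True only if the destination has exactly the same files with the same contents."""
    return dir_hashes(source_root) == dir_hashes(dest_root)
```

Running `copies_match(r"\\NS\Packages\<pkg>", r"D:\Packages\<pkg>")` (hypothetical paths) before the client update would confirm the files really landed intact.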
We're thinking we could load a third-party multicast client on the site servers and the NS, push the copyfile directory to the site servers through that multicast client, and then have the client recognize it already has the files when it updates.
How does this sound to you?
Second question is...
We also noticed some packages copying twice (some of these are 700 MB, 300 MB, etc.).
They have the same GUID but different package IDs.
These are copy tasks that we have re-uploaded files into.
We're going to go through and clean those up, but why would it create a whole new package and re-push the same GUID?
Thanks in advance.