Our uploads to Azure are very slow, taking 15-30 minutes for a moderately small site, and we only have the “Timestamp” file comparison method selected.
I had a look at the logs and it seems the Timestamp method doesn’t work. System DLLs are copied up every time, even though they are the same libraries, unchanged since the previous deployment. The dates being compared are the “last write time”, which means every file always appears changed because - duh - it had to be written to disk to be extracted from the NuGet package for deployment. This is effectively the same as “upload all files, every time” - so there is absolutely no benefit in comparing the dates, since they will always differ.
Or am I missing something obvious? Octopus v3.7.8 by the way
Partial log attached
extract.log (6 KB)
Thanks for getting in touch! You’re right on the money - when NuGet extracts a package locally, it updates each file’s timestamp to the extraction time. Every file therefore appears newer and gets deployed, regardless of whether it has actually changed. That’s just how NuGet packages behave with timestamps, so you’re correct - there is no real benefit to comparing the dates.
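To make the failure mode concrete, here is a minimal Python sketch (not Octopus’s actual extractor - the file name `lib.dll` and the archive layout are made up for illustration). A file archived with an old date comes out of a standard extraction stamped with the current time, so a “last write time” comparison will always see it as newer:

```python
import os
import tempfile
import time
import zipfile

workdir = tempfile.mkdtemp()
zip_path = os.path.join(workdir, "package.zip")

# Build a package containing a file stamped with an old date,
# simulating a system DLL unchanged since a build long ago.
with zipfile.ZipFile(zip_path, "w") as zf:
    info = zipfile.ZipInfo("lib.dll", date_time=(2016, 1, 1, 0, 0, 0))
    zf.writestr(info, b"unchanged system library")

# Extract it the way most tools do by default: the on-disk
# last-write time becomes the extraction time, not the archived time.
with zipfile.ZipFile(zip_path) as zf:
    zf.extract("lib.dll", workdir)

extracted = os.path.join(workdir, "lib.dll")
print(time.time() - os.path.getmtime(extracted) < 60)  # -> True: it looks brand new
```

Since the file is stamped “now” on every deployment, a Timestamp comparison can never match the previously deployed copy.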
One option to consider is using Checksum instead of Timestamp with your NuGet packages. That would only deploy files that have actually changed, so it may be well suited to your situation. It works for moderately-sized applications, but may be prohibitively slow and unreliable for large ones.
Another option would be to use Zip packages instead of NuGet. We have tested and confirmed that Zip packages preserve the original timestamps throughout the whole deployment process, so the Timestamp comparison can then skip unchanged files. Check out our blog post, which details these options for optimizing this type of scenario.
I hope that helps! Don’t hesitate to reach out if you have any further questions.