The problem: we have many web sites to deploy. Each of them is big (think hundreds of megabytes), but most of the sites have a lot in common, and a substantial number of binaries repeat from site to site. Some binaries we build ourselves; others come from external NuGet packages.
A naive application of Octopus Deploy would put each web site into a separate NuGet package. As a result, Octopus would have to move gigabytes of data over the network to every web server on each deployment, and the performance of such deployments would be dismal.
One thing we did (before we looked at Octopus) was to use TAR instead of ZIP and have it pack identical files as links rather than actual bits. This reduces the package size about 20 times and makes everything manageable again. Unfortunately, as I remember from a previous discussion with Octopus support, Octopus uses an open-source TAR implementation that does not understand linking (at least when unpacking), so we cannot use this workaround with Octopus.
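For reference, the trick is roughly the following (an illustrative Python sketch, not our actual packaging script; the folder names are made up): first hard-link byte-identical files to a single copy, then TAR the tree so the duplicates become link entries instead of repeated contents.

# Illustrative only: hard-link byte-identical files to one copy, then TAR
# the tree. Python's tarfile stores the repeated inodes as link entries,
# so identical bits are written to the archive only once.
import hashlib
import os
import tarfile

def dedup_with_hardlinks(root):
    """Replace byte-identical files under `root` with hard links to one copy."""
    first_seen = {}                    # content hash -> path of the first copy
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in first_seen:
                os.remove(path)                    # drop the duplicate bits
                os.link(first_seen[digest], path)  # link to the original instead
            else:
                first_seen[digest] = path

def pack(root, archive_path):
    """TAR the tree; the hard links come out as link entries, not full copies."""
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(root, arcname=os.path.basename(root))

dedup_with_hardlinks("site_payload")               # hypothetical staging folder
pack("site_payload", "site_payload.tar.gz")

Unpacking with a TAR implementation that understands links restores the full tree, and that is exactly the part that, as I understand it, does not work with the TAR library Octopus uses.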
Please let me know if you can come up with any creative solution to this problem; it is crucial for our decision on whether to adopt Octopus or not.
My own best idea so far is to make a single site's deployment package not one NuGet, but essentially one NuGet per DLL with explicit inter-package dependencies. That way identical DLLs would not be copied over again and again, since Octopus hopefully caches downloads and reuses packages it has already delivered to the target. Please let me know how feasible this plan is. The primary question, of course, is: would Octopus even understand a deployment NuGet package with dependencies?
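To make the question concrete, here is roughly what I have in mind, sketched as a small Python generator for the nuspecs (the package ids, the "Shared." prefix and the versioning are made up for illustration; whether Octopus would actually resolve such <dependencies> at deployment time is exactly what I am asking):

# Illustrative only: a thin "site" package whose nuspec just lists
# dependencies on per-DLL packages. Ids, versions and layout are assumptions.
NUSPEC_TEMPLATE = """<?xml version="1.0"?>
<package>
  <metadata>
    <id>{id}</id>
    <version>{version}</version>
    <authors>web-team</authors>
    <description>{description}</description>
    {dependencies}
  </metadata>
</package>
"""

def dependencies_block(deps):
    """Render a <dependencies> element from (package_id, version) pairs."""
    if not deps:
        return ""
    rows = "\n      ".join(
        '<dependency id="{0}" version="{1}" />'.format(package_id, dep_version)
        for package_id, dep_version in deps
    )
    return "<dependencies>\n      {0}\n    </dependencies>".format(rows)

def site_nuspec(site_id, version, shared_dlls):
    """Nuspec for a thin 'site' package that only depends on per-DLL packages."""
    deps = [("Shared." + dll[:-len(".dll")], version) for dll in shared_dlls]
    return NUSPEC_TEMPLATE.format(
        id=site_id,
        version=version,
        description="Deployment package for " + site_id,
        dependencies=dependencies_block(deps),
    )

print(site_nuspec("MyCompany.SiteA", "1.0.0", ["Common.dll", "Logging.dll"]))

The per-DLL packages would carry the actual binaries; the site package itself would be almost empty apart from its dependency list.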
Also, if you can come up with anything better than that, please let me know.
Thank you!
Konstantin