Incremental Deployments for Massive Projects

For perspective, I’m an ops guy…

We are loving Octopus for our newer .NET projects; however, we have some old legacy internal (non-.NET) websites with a codebase in excess of 40,000 files (shudder). We’re struggling to find a simple way to retrofit these projects into our new TFS/TeamCity/Octopus regime – obviously, a CI setup that builds a massive nupkg of 40,000+ files for every tiny modification is untenable.

Has anyone else conquered a similar situation and is willing to share some advice?

Hi Timuel,

Good question. As you know, packaging files for incremental deployments can be hard since each machine you deploy to may have a different baseline.

Depending on where your environments are located, one option might be to have your CI server push the changed files to a shared location (e.g., a network share) that represents the “current” version. Your NuGet package would then contain only one file, a Deploy.ps1 script, that uses RoboCopy to pull the changed files from the shared location to the machine being deployed to.
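As a rough sketch, that Deploy.ps1 could be as simple as the following (the share path and site root below are placeholder assumptions; adjust them to your environment):

```powershell
# Deploy.ps1 - minimal sketch; paths below are assumptions for illustration
$source      = '\\buildshare\legacy-site\current'   # "current" version on the LAN share
$destination = 'D:\Sites\LegacySite'                # web root on the target machine

# /MIR mirrors the share to the destination (copies changed files, removes deleted ones)
# /R:2 /W:5 retries locked files twice, waiting 5 seconds between attempts
robocopy $source $destination /MIR /R:2 /W:5

# RoboCopy exit codes 0-7 indicate success (e.g. 0 = nothing to copy, 1 = files copied);
# 8 and above indicate failures, so fail the deployment in that case
if ($LASTEXITCODE -ge 8) {
    throw "RoboCopy failed with exit code $LASTEXITCODE"
}
exit 0
```

Note that `/MIR` deletes files on the target that no longer exist on the share, so drop it in favour of plain `/E` if the target machine has local files that must survive a deployment.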

The benefit of this approach is that you never need to build or move a large package (either in CI or in Octopus). The downside is that it only works when the machines are on the same LAN. Would that work for you?