Parallel Processes across Tenants on a single Deployment Target

I’m just not grasping the concept here. We’re looking to speed up deployments in a tenanted environment.

  1. I have a single Project containing 6 Steps.
  2. I want to deploy the project to a single server (Deployment Target) that is home to 10 Tenants.

Is there a way to make this deployment run for all 10 Tenants simultaneously (or at least a few of them), or does it have to proceed sequentially, running Steps 1 to 6 for Tenant 1, then Steps 1 to 6 for Tenant 2, and so on?

I see the OctopusBypassDeploymentMutex setting, but the documentation then says that “deployments of the same project to the same environment (and, if applicable, the same tenant) are not able to be run in parallel even when using this variable.”

Is there a way to increase the deployment speed (in terms of simultaneous execution) across multiple Tenants on the same Target?

Jamie

Hey Jamie, thanks for reaching out!

Have you tested using OctopusBypassDeploymentMutex for your deployment? Since you’re deploying to 10 different tenants, they should still be able to deploy concurrently, up to your task cap limit.

For example, I have a simple tenanted IIS deploy to two tenants on the same target. I also have OctopusBypassDeploymentMutex set to true in my project variables. When I deploy the release, I can see both deployment actions running concurrently:

You can also see in the logs that they were deploying to IIS at the same time:

If you’re still running into concurrency issues with the Mutex variable set, we can look a little closer and try to find a better solution. Let me know how it goes, happy to help however we can!
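If you end up wanting to script the kick-off rather than clicking through the UI, here’s a rough sketch (untested, and the server URL, API key, and IDs are all placeholders) of queuing one deployment per tenant through the REST API. Each POST creates its own deployment task, so with the mutex bypassed they can run side by side up to the task cap:

```python
# Rough sketch (untested): queue one deployment per tenant via the Octopus
# REST API. The server URL, API key, and IDs below are all placeholders.
import requests

OCTOPUS_URL = "https://your-octopus-server"           # placeholder
API_KEY = "API-XXXXXXXXXXXXXXXX"                      # placeholder
RELEASE_ID = "Releases-123"                           # placeholder
ENVIRONMENT_ID = "Environments-1"                     # placeholder
TENANT_IDS = ["Tenants-1", "Tenants-2", "Tenants-3"]  # placeholders

headers = {"X-Octopus-ApiKey": API_KEY}

for tenant_id in TENANT_IDS:
    body = {
        "ReleaseId": RELEASE_ID,
        "EnvironmentId": ENVIRONMENT_ID,
        "TenantId": tenant_id,
    }
    # Each POST creates a separate deployment task. With
    # OctopusBypassDeploymentMutex set to true in the project variables,
    # those tasks can run concurrently, up to the server's task cap.
    response = requests.post(f"{OCTOPUS_URL}/api/deployments",
                             json=body, headers=headers)
    response.raise_for_status()
    print(f"Queued deployment for {tenant_id}: {response.json().get('TaskId')}")
```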

Cory,

I guess the part that was unclear is the statement “deployments of the same project to the same environment (and, if applicable, the same tenant) are not able to be run in parallel even when using this variable.” Is this specifically referring to multiple Deployment processes?
Example: you cannot expect the following two Deployments

  • (Deployment#1) Project#1 to Environment#1 for Tenant#1
  • (Deployment#2) Project#1 to Environment#1 for Tenant#2

to run simultaneously, even with the Mutex set to true?

We did try setting the variable to true and it appeared that for our single Deployment, all Tenants on the same Deployment Target were updated at the same time. This is what we were hoping was possible. Am I correct in assuming that the total number of Tasks (across multiple Deployment Targets) would then be limited by the Task Cap setting on the Octopus Server?

During our test of OctopusBypassDeploymentMutex we did run into some file access (file in use) issues. We’ll need to do more investigation there.

As always, thank you for the help.

Jamie

Hey @jamie.sidoti, you’re totally right that the docs aren’t as clear as they could be on that point. We’re taking that feedback on board and we’ll try to clarify it (and provide a reasonable example to help illustrate, if we can!)

You are correct - the callout is specifically about deployment processes, not the tasks themselves. Octopus does this by default to prevent files from being accessed concurrently and running into resource contention/locking issues. You are also correct that the task cap is the true upper limit across your deployment targets.

And as to the final bit - custom installation directories should help eliminate some of your file locking issues. However, it’s always good to review the files that were causing the locking errors - if they’re more central to the deployment (like the IIS metabase, rather than something included in your package), you may still run into locking and have to address it via a different method.
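If it helps, here’s a tiny sketch of what we mean by per-tenant directories. The path and tenant names are made up, and it assumes the built-in Octopus.Deployment.Tenant.Name variable; in Octopus itself you’d just put the template into the step’s Custom Installation Directory field, but the point is that each tenant’s deployment resolves to its own folder instead of fighting over a shared one:

```python
# Illustration only: why a per-tenant Custom Installation Directory avoids
# shared-directory locking. The path and tenant names below are made up.
TEMPLATE = r"C:\Sites\MyApp\#{Octopus.Deployment.Tenant.Name}"

def resolve(template: str, tenant_name: str) -> str:
    # Octopus performs this variable substitution for you at deployment time;
    # this function just mimics it so the resolved paths are visible.
    return template.replace("#{Octopus.Deployment.Tenant.Name}", tenant_name)

for tenant in ("Tenant A", "Tenant B"):
    print(resolve(TEMPLATE, tenant))
# C:\Sites\MyApp\Tenant A
# C:\Sites\MyApp\Tenant B
```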

Thanks for the questions, even though you already had proper answers for most of them :smile: Happy to help if you run into any additional issues!

Alright, I think we have a handle on it. Thanks for confirming the Mutex and Task Cap items. That’s helpful. As for the file access issue… yup, we just tracked it down to a directory that was getting shared between the processes. Easy fix there.

Thank you!

Jamie
