Build once, deploy everywhere ... with .exe's

My last project was a web project, and once I grasped Octopus's "build once" model, I ran with it. It just made sense and got rid of all the "staging" branches with separate builds for each branch. Everything is aweeessssooome.

But now I'm working on a thick client (WPF) application, and I want to go the "channels" route (not to be confused with Octopus's Channels; I'm talking about beta, stable, and nightly release channels). Or maybe I can use Octopus Channels; I'm not really sure.

I've been reading up on how other companies handle release channels (https://www.chromium.org/developers/tech-talk-videos/release-process#TOC-Web-vs-Client-app is an interesting read). I like this method as it does away with a lot of headaches, and I'd like to do something similar.

The problem is that the only way I see release channels working is with multiple branches (which puts me back to building multiple versions). I don't see any way around this. I've read http://docs.octopusdeploy.com/display/OD/Branching (the old process was "Environment Branches"), and it feels like I would be going back to "Environment Branches", except they would be "Channel Branches".

My current build in TeamCity gets kicked off like this: figure out the version, pass the Channel and Version into build.ps1, update AssemblyInfo.cs with the given version and channel ("My Client (Beta)", v 0.0.1.1), and spit out a MyClientBeta.0.0.1.1.nupkg. A release gets created. Or maybe it's MyClientStable.1.0.0.0.nupkg. But now I have several NuGet feeds (MyClientStable*.nupkg, MyClientBeta*.nupkg, etc.). So now I have 3 steps, one for each feed, each of which might be scoped to some channel.
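In rough terms, build.ps1 does something like this (a simplified sketch; the exact commands and paths are approximations, not the real script):

```powershell
param(
    [string]$Channel,   # Beta / Stable / Nightly
    [string]$Version    # e.g. 0.0.1.1
)

# Stamp the channel and version into AssemblyInfo.cs,
# e.g. "My Client (Beta)", v 0.0.1.1
$assemblyInfo = "MyClient\Properties\AssemblyInfo.cs"
(Get-Content $assemblyInfo) `
    -replace 'AssemblyVersion\(".*"\)', "AssemblyVersion(`"$Version`")" `
    -replace 'AssemblyProduct\(".*"\)', "AssemblyProduct(`"My Client ($Channel)`")" |
    Set-Content $assemblyInfo

# Build, then pack with the channel baked into the package ID,
# producing e.g. MyClientBeta.0.0.1.1.nupkg
# (packing via octo.exe here; nuget pack would do the same job)
& msbuild MyClient.sln /p:Configuration=Release
& octo pack --id "MyClient$Channel" --version $Version `
            --basePath .\MyClient\bin\Release --outFolder .\artifacts
```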

Sorry for the long post, but I feel the research involved is relevant. My final question is: how can I utilize Octopus (and maybe Octopus Channels) in my scenario? The only thing I can think of right now is to have three TeamCity builds (stable, beta, nightly), each creating a release in a matching Octopus channel, so there would be three channels (stable, beta, nightly).

Hi Josh,

Thanks for reaching out!

While I was reading your post I was thinking about an approach, and funnily enough you described it perfectly at the end: have three TeamCity builds (stable, beta, nightly), each creating a release in a matching Octopus channel, so there would be three channels (stable, beta, nightly).

That would be my recommended approach. Have you tried it already, and did you run into any problems with it?
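For reference, each of those channel-specific TeamCity builds would push its package and create the release in its channel, either with the plugin's "Create release" runner or by calling octo.exe directly. A rough sketch (the server URL, API key, and project/channel names are placeholders):

```powershell
# Rough sketch of what the Beta build configuration might run;
# the Stable and Nightly builds would differ only in package name and channel.
& octo push --package .\artifacts\MyClientBeta.0.0.1.1.nupkg `
            --server https://your-octopus-server --apiKey API-XXXXXXXXXX

& octo create-release --project "MyClient" --channel "Beta" `
                      --version 0.0.1.1 --packageVersion 0.0.1.1 `
                      --server https://your-octopus-server --apiKey API-XXXXXXXXXX
```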

Thanks,
Dalmiro

Yeah, I'm in the process of doing that now; I'm waiting on a TeamCity restart so I can update the Octopus Deploy TeamCity plugin from 2.4 to 3.4+ for channel support.

Here is the "weird" part that I don't really like; maybe there is a better way.

I had to create three lifecycles and three channels, with each channel assigned its corresponding lifecycle (the beta lifecycle goes to the beta channel, etc.). But this feels wrong; there is only one path: deploy to the [nightly, stable, beta] environment/channel. So I can't "move" releases across environments or channels (because the binaries are already built with channel-specific info). It makes me wonder if I should even be using Octopus Deploy for this.

Hi Josh,

I brought this conversation to the team, and we believe it could be solved by taking a different versioning approach, like this:

  • Use a single package ID like MyClient and then rely on versioning and pre-release tags for your different branches.

  • 1.0.0 is the current “Stable”. If you ever need to patch it, it becomes 1.0.1 and so on (no pre-release tag as this is the “stable” or “master” branch).

  • Beta would be 1.1.x-beta. Whenever Beta is finished, you merge your changes into the “stable” branch and bump that one to 1.1.0 (again, “stable/master” doesn’t have a pre-release tag).

  • You said you can't "move" releases across environments or channels because the binaries are already built with channel-specific info. The goal is to make your packages generic by not hardcoding any channel-specific info into them, and instead making everything configurable from a config file (which you can modify during the deployment using channel/environment-scoped variables). This will of course require a change in your application so that it relies more on config files than on hardcoded values.

  • You also said you now have 3 steps, one for each feed, scoped to a channel. By keeping the same package ID everywhere, you can have a single package deploy step that runs for all channels, but with different configuration and versioning rules: channel Stable will only deploy packages without a pre-release tag, such as MyClient.1.0.0.nupkg, while channel Beta will only deploy packages with the "beta" pre-release tag, such as MyClient.1.1.0-beta.nupkg (see the packaging sketch right after this list).
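To make that concrete, here is a minimal packaging sketch, assuming you pack with octo.exe (nuget pack would work just as well) and set up channel version rules along those lines:

```powershell
# Minimal sketch: same package ID on every branch, only the version changes.

# Stable branch build:
& octo pack --id MyClient --version 1.0.0 --basePath .\publish --outFolder .\artifacts
# -> MyClient.1.0.0.nupkg

# Beta branch build:
& octo pack --id MyClient --version 1.1.0-beta0007 --basePath .\publish --outFolder .\artifacts
# -> MyClient.1.1.0-beta0007.nupkg

# A single package deploy step then serves every channel; the channel version
# rules decide which versions each channel accepts, e.g.:
#   Stable  -> no pre-release tag allowed
#   Beta    -> pre-release tag "beta"
#   Nightly -> pre-release tag "nightly"
```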

The workflow would go like this (a rough command-line sketch of steps 3 and 4 follows the list):

  • You have 1.0.0 in your stable branch (in source control), so you have MyClient.1.0.0.nupkg deployed in your Stable channel all the way to Production.

  • Your dev team is working on some new features on the Beta branch, which is 1.1.x. At that point you are pushing MyClient.1.1.x-beta.nupkg to your Beta channel.

  • You find a bug in production that you need to fix right away. You push the fix to your stable branch, build 1.0.1, and deploy MyClient.1.0.1.nupkg in your Stable channel all the way to Staging. Once your testers approve it, you promote it to Production (still in the Stable channel).

  • Once you are confident that your new features are ready to go live, you merge all your changes from the Beta branch into Stable, bump the version of stable to 1.1.0 (creating MyClient.1.1.0.nupkg), and deploy it in your Stable channel.
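If it helps, steps 3 and 4 could look roughly like this from the command line (branch names, paths, server URL, and API key are placeholders, and pushing the .nupkg to the feed is omitted for brevity):

```powershell
# 3. Hotfix: commit to the stable branch, build 1.0.1, release it in the Stable channel.
git checkout stable
# ...commit and push the fix; the build then runs something like:
& octo pack --id MyClient --version 1.0.1 --basePath .\publish --outFolder .\artifacts
& octo create-release --project "MyClient" --channel "Stable" `
                      --version 1.0.1 --packageVersion 1.0.1 --deployto "Staging" `
                      --server https://your-octopus-server --apiKey API-XXXXXXXXXX

# 4. Beta is ready: merge it into stable, bump to 1.1.0, release it in the Stable channel.
git checkout stable
git merge beta
& octo pack --id MyClient --version 1.1.0 --basePath .\publish --outFolder .\artifacts
& octo create-release --project "MyClient" --channel "Stable" `
                      --version 1.1.0 --packageVersion 1.1.0 --deployto "Staging" `
                      --server https://your-octopus-server --apiKey API-XXXXXXXXXX
```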

This process is also explained (probably a lot better) here: http://gitversion.readthedocs.io/en/latest/reference/mainline-development/
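If you'd rather not bump those version numbers by hand, GitVersion (the tool behind that link) can derive them from your branches. A rough sketch, assuming GitVersion.exe is available on the build agent:

```powershell
# Let GitVersion work out the version from the current branch, then pack with it,
# e.g. 1.0.1 on the stable branch, 1.1.0-beta0012 on the beta branch.
$packageVersion = & GitVersion.exe /showvariable NuGetVersion

& octo pack --id MyClient --version $packageVersion --basePath .\publish --outFolder .\artifacts
```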

Hope that helps,
Dalmiro
