Extracted Package Directory

Hi,

I have a project which deploys a package to an azure cloud service. The package is a zip and contains the cspkg and cscfg files to deploy as well as a directory of sql scripts to run. The deployment is done in one step (well there are 2 steps - one per channel), and the sql scripts are to be run in a future step (there are some other steps that happen in between). I cannot seem to get the directory of the extracted package in the future step to allow me to get at the sql scripts. I’ve tried various Octopus parameters/variables but cannot seem to get the directory - all of them are empty. I’m tearing my hair out with this now so any help would be much appreciated.

Thanks
Colin

Hi Colin,

Thanks for getting in touch.

The package extraction path will be logged as an output variable for your cloud service deployment step/action. For example, if you had two deployment steps, named as follows:

  1. deploy my cloud service
  2. some scripting step

The Octopus.Action.Azure.PackageExtractionPath variable will be set as an output variable on your ‘deploy my cloud service’ step, and can be accessed as follows: Octopus.Action[deploy my cloud service].Output.Octopus.Action.Azure.PackageExtractionPath

The PowerShell for accessing this output variable would look something like the following:

Write-Host "Package info:"
$packageExtractionPath = $OctopusParameters["Octopus.Action[deploy my cloud service].Output.Octopus.Action.Azure.PackageExtractionPath"]
Write-Host "PackageExtractionPath: $($packageExtractionPath)"

To help visualise the output variables that are available to you, you can temporarily set the OctopusPrintVariables variable (as described here) for your project. Then you can view all the output variables that are set during your deployment (in your task log).
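If you just want a quick look without changing project settings, a temporary script step can also dump the variable dictionary directly. This is only a sketch, and it assumes it runs inside an Octopus script step where $OctopusParameters is available:

```powershell
# Sketch: list every variable visible to this step (run inside an Octopus script step).
# $OctopusParameters is the dictionary Octopus injects into PowerShell script steps.
$OctopusParameters.GetEnumerator() |
    Sort-Object Key |
    ForEach-Object { Write-Host "$($_.Key) = $($_.Value)" }
```

Be careful with this in projects that have sensitive variables, since it prints values to the task log.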

Let me know how you go.

If this does not work for you, could you please set the OctopusPrintVariables variable to True and attach a full task log and we will investigate further.

Cheers
Mark

Hi Mark,

This works perfectly. Not sure what I was doing wrong :) Anyway, I’ve also
managed to simplify the process as part of doing this so all good. Thanks
for the help.

Cheers
Colin

Hi Mark,

Actually it doesn’t work perfectly, the extracted folder I’m trying to
reference is removed after the Deploy to Cloud Service step completes. Is
there a way of preventing this from happening so that I can use the
extracted files?

Cheers
Colin

Hi Colin,

Sorry, you’re right. All script steps clean up their working directory after they’ve finished running (and the Azure Cloud Service step extracts your package, executes a script, then cleans up after itself). There’s no way to bypass this behaviour; however, you have some options.

As part of your Cloud Service package step, you could bundle your scripts as custom PostDeploy scripts (either include a PostDeploy file as part of your Cloud Service package, or use the ‘Configure features’ section of the step to select ‘Custom deployment scripts’, where you can then type your script content directly in the UI). These scripts will then get run as part of your step and should be able to reference things in your package (before your package is cleaned up).
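As a rough illustration, a bundled PostDeploy.ps1 could copy the sql scripts somewhere that survives the step’s cleanup. This is only a sketch: the SqlScripts folder name and the destination path are hypothetical examples, not Octopus conventions, and the script assumes it runs in the step’s working directory where the package was extracted:

```powershell
# PostDeploy.ps1 -- hypothetical sketch. Runs in the step's working directory
# (where the package has been extracted), before Octopus cleans it up.

# Example destination outside the working directory; choose your own path.
$stagingDir = "C:\Deploy\SqlScripts"

if (-not (Test-Path $stagingDir)) {
    New-Item -ItemType Directory -Path $stagingDir | Out-Null
}

# Copy the bundled sql scripts (assumed to live in .\SqlScripts in the package)
# to a location that later steps can reach.
Copy-Item -Path ".\SqlScripts\*" -Destination $stagingDir -Recurse -Force

Write-Host "Copied sql scripts to $stagingDir"
```

Your later step would then read the scripts from that staging path instead of the (now deleted) extraction directory.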

Alternatively, you could separate your sql scripts from your Cloud Service package altogether (have a separate package for your sql scripts), upload this package to your Octopus Server, deploy it to a custom installation directory, and then future steps could reference that directory as you wish. Or if this is just a single sql script you’re trying to run, you could do it all from one “Run a Script” step where you reference your package ID and target a specific script file inside your package.

Would either of those options work for you?

Cheers
Mark

Hi Mark,

I ended up copying the files I need to a temp directory as part of the
pre-deploy script, as I don’t actually run the sql scripts until some other
steps have been run and checked. It’s not ideal, because it means some messy
PowerShell cluttering up a step template that is only used in one or two
projects, and it doesn’t get cleaned up properly afterwards either.
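Roughly, the pre-deploy copy looks like this (paths and the SqlScripts folder name are simplified for illustration; it assumes the Octopus script context, where Set-OctopusVariable is available):

```powershell
# PreDeploy.ps1 -- sketch of the workaround described above (paths illustrative).
# Runs in the step's working directory before the extracted package is removed.

$tempDir = Join-Path $env:TEMP "SqlScriptsForLater"

# Start from a clean copy each deployment.
if (Test-Path $tempDir) { Remove-Item $tempDir -Recurse -Force }
New-Item -ItemType Directory -Path $tempDir | Out-Null

# Stash the sql scripts from the extracted package before Octopus cleans up.
Copy-Item -Path ".\SqlScripts\*" -Destination $tempDir -Recurse -Force

# Record the location as an output variable so the later sql step can find it.
Set-OctopusVariable -name "SqlScriptsDir" -value $tempDir
```

The later step then reads Octopus.Action[step name].Output.SqlScriptsDir to locate the scripts, and has to remember to delete the temp directory when it’s done.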

The only other thing I could think of would be to do as you say and upload
a sql package to Octopus but that would become complicated because it’s not
just one script and we’d have different versions per channel.

Anyway, got it working for now. I’ll tidy it up when I get some more time!

Thanks
Colin