Octopus scripts - Azure Databricks

Hello, I am trying to create an Octopus release for our Azure Databricks development, which would include service, cluster, and user creation. Azure Databricks currently doesn't support automatic generation of personal access tokens, which leaves me with a manual step in between my Octopus release steps. I was wondering how I can pass this token value to an already-running Octopus release. I do understand the use of 'Prompted Variables', but I don't think they can be used in this case. Can you please suggest what option I could use? Thanks!

Hi Satya,

Thanks for getting in touch. I am not familiar with Azure Databricks specifically, but let me see if I can help. Could you give a bit more detail about what steps you have and what each one is doing?

From what you've said, I'd probably agree that Prompted Variables are unlikely to be the best option. They work well for values that vary between deployments but are known at the start of the deployment. It sounds like you have a step that produces the values you need in subsequent steps? If that is the case, output variables may help.
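As a minimal sketch (the step name "create service" and variable name "DatabricksUrl" are just placeholders), an earlier script step publishes a value with Set-OctopusVariable, and a later step reads it back from $OctopusParameters:

    # In the earlier step (named "create service"): publish a value as an output variable
    Set-OctopusVariable -name "DatabricksUrl" -value "https://adb-example.azuredatabricks.net"

    # In a later step: read that output variable
    $url = $OctopusParameters["Octopus.Action[create service].Output.DatabricksUrl"]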

Regards
Shannon

Hi Shannon, Here are the steps I am trying to achieve using Octopus Release (Powershell)

  1. Create Azure Databricks Service - Works fine.
  2. Create an access token manually on the Databricks portal - automating this is currently not supported.
  3. Use the token in the rest of the subsequent steps in Octopus for creating the Clusters etc.

So my question is: if I add a manual step for Step 2 and grab the token, how can I pass it to the next step, given the release would already be in progress? Hope you can point me in the right direction.

Thanks
Satya

Hi Satya,

For your manual token problem, you can use a manual intervention to allow the user to enter the token value, which will be available to subsequent steps as a variable.

The value entered by the user in the manual intervention step is available in PowerShell via $OctopusParameters["Octopus.Action[get token].Output.Manual.Notes"], where 'get token' is the name of the manual intervention step.
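As a minimal sketch (the step name "get token", the workspace URL, and the endpoint are assumptions for illustration), a subsequent script step could read the token and call the Databricks REST API with it:

    # Read the token entered during the manual intervention step named "get token"
    $token = $OctopusParameters["Octopus.Action[get token].Output.Manual.Notes"]

    # Use it as a bearer token against the Databricks REST API (workspace URL is a placeholder)
    $headers = @{ Authorization = "Bearer $token" }
    Invoke-RestMethod -Uri "https://adb-example.azuredatabricks.net/api/2.0/clusters/list" `
        -Headers $headers -Method Get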

Looking at the documentation for Azure Databricks, the default authentication is Azure AD, which I think means you would be able to authenticate using an Azure Service Principal and then create the token using the Databricks CLI.
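If the Azure AD route works, you may be able to skip the manual step entirely. Here's a sketch using the token/create REST endpoint directly (the CLI wraps the same API); the Azure AD access token and workspace URL are assumed to come from earlier steps:

    # $aadToken is assumed to be an Azure AD access token for the service principal
    $headers = @{ Authorization = "Bearer $aadToken" }
    $body = @{ lifetime_seconds = 3600; comment = "created by Octopus" } | ConvertTo-Json

    # POST /api/2.0/token/create returns the new personal access token in token_value
    $result = Invoke-RestMethod -Uri "https://adb-example.azuredatabricks.net/api/2.0/token/create" `
        -Headers $headers -Method Post -Body $body
    $pat = $result.token_value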

Hopefully, this helps you out.

Regards
Ben

Hi Ben, thanks for suggesting the solution; it worked well for me, and I am able to pass the manually generated token to the next steps. Appreciate your help!

Satya

Hi Satya,

Good to hear that this helped you.

I was doing some reading on Azure Databricks authentication; it seems to support Azure AD Service Principals for authentication according to this doc page.

If this is possible, you can access an Azure Service Principal account by creating an Azure Account Variable in the project and then accessing the individual properties of that account to authenticate with Databricks. The properties of the account are documented on the Azure Account Variables page.
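As a sketch, assuming an Azure Account Variable named "Azure" (the name is a placeholder), you could read the service principal properties and request an Azure AD token for Databricks; 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is, to my knowledge, the well-known Azure Databricks application ID:

    # Read the service principal details from an Azure Account Variable named "Azure"
    $clientId     = $OctopusParameters["Azure.Client"]
    $clientSecret = $OctopusParameters["Azure.Password"]
    $tenantId     = $OctopusParameters["Azure.TenantId"]

    # Request an Azure AD token for the Azure Databricks resource
    $body = @{
        grant_type    = "client_credentials"
        client_id     = $clientId
        client_secret = $clientSecret
        resource      = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"  # Azure Databricks app ID (assumed)
    }
    $response = Invoke-RestMethod -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
        -Method Post -Body $body
    $aadToken = $response.access_token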

Regards
Ben

Hi Ben, I was able to authenticate with the service principal earlier, but per my understanding the current Databricks REST calls only support token generation after the first token has been generated manually, as a token needs to be passed in the requests.
I am going to go through these articles and double-check. Thanks for sharing them.

Regards

Satya

Hi Ben, I have another question which is sort of related to the Octopus scripts for the Databricks CI/CD. Please let me know if you want me to open another thread.

I have the Databricks Python notebooks in GitHub, and I have an external feed in Octopus set up for the repo. As the third-party Databricks PowerShell module currently supports importing into a Databricks workspace from a local directory, I want to get the notebooks onto the Octopus server and upload them from there. I have the files downloaded as a NuGet package on the Octopus server, but I'm not sure how I can unpack them there, as I don't see any built-in steps. I was trying something similar to what is listed in this article for the Python files:
https://help.octopus.com/t/extract-octopus-package-on-the-command-line/1837

Can you please suggest the right way to handle this?

Thanks

Satya

Hi Satya,

I think what you want is to run a script which uploads the contents of the GitHub repo.

On the script step, you can specify (as of version 2018.8.0) additional package references to be unpacked in the working directory alongside your script. You can add multiple package references to a single script step, and each one will be unpacked into a subdirectory with the same name as the package ID.

For your example, your script step can be defined inline on the step, and your Databricks deployment can be included in the working directory by adding a package reference, selecting your GitHub feed, and typing into the package name field to find the source repo.
If I have got that backwards and your deployment scripts are in the GitHub repo, then you can specify the script source as a package and select the GitHub feed and repo for the script source as well.
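As a minimal sketch of the inline script, assuming a package reference named "Notebooks" and the manual intervention step from earlier (the reference name, region, paths, and the Import-DatabricksFolder command from the third-party module are assumptions; check the module you're using):

    # Path where Octopus extracted the package reference named "Notebooks"
    $notebooksPath = $OctopusParameters["Octopus.Action.Package[Notebooks].ExtractedPath"]

    # Token from the earlier manual intervention step (step name is a placeholder)
    $token = $OctopusParameters["Octopus.Action[get token].Output.Manual.Notes"]

    # Hypothetical call into the third-party module to import the local directory
    Import-DatabricksFolder -BearerToken $token -Region "westeurope" `
        -LocalPath $notebooksPath -DatabricksPath "/Shared/notebooks"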

Regards
Ben

Hi Ben, I was able to get it working. Thanks so much for your inputs!

Regards

Satya