In another support post - which is closed now - the recommended solution was to deploy a package to our hosted cloud environment. While this seems like a great idea, the only local deploy target I can configure is an offline deployment. Is this the only way to deploy a package locally?
If so - how do I retrieve the path to the created .cmd file so that I can “deploy” that package locally and access the files?
So I figured out how to get the path to the cmd using PowerShell. It kinda sucks to have to add an extra step to do exactly the same thing, as well as deal with all the extra files that come with the package.
Is this the recommended way to do this??
I’ll drop the code here in case anyone is interested:
$environmentName = $OctopusParameters["Octopus.Environment.Name"]
$deploymentId = $OctopusParameters["Octopus.Deployment.Id"]
$projectName = $OctopusParameters["Octopus.Project.Name"]
$releaseNumber = $OctopusParameters["Octopus.Release.Number"]
$locationName = "Offline Drop Location" # Unable to find this in a system variable
# Set path to drop directory
Set-Location -Path "C:\Packages\$($environmentName)\$($projectName)\$($releaseNumber)\"
# Execute cmd file (it appears to be named after the deployment id)
& ".\$($deploymentId).cmd"
Thanks for getting in touch!
Would you mind describing your scenario a little more? Have you considered installing a Tentacle on the same machine as your Octopus Server to act as a local deployment target? That will let you use all of our system variables, such as
Let me know if that helps, or if I have misunderstood your scenario.
I’m trying to migrate to Octopus Cloud, so installing a Tentacle isn’t really an option.
Our current scenario is this:
- Together with our web app we publish a database migration package on our nuget feed.
- We deploy this package to a tentacle (running on the same vm as octopus) where it is automatically extracted by the deployment step.
- We then run an executable inside that package to perform the migrations.
I am now trying to migrate this to Octopus Cloud. The only way I could find to achieve the same is:
- Deploy the same package to an Offline Package Drop target.
- Execute the created .cmd file to extract the package.
- Run the executable inside the package using a post deployment script.
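To make that concrete, here is a rough sketch of what the post deployment script looks like for us. The drop path, the .cmd naming, and the executable name ("Migrate.exe") are assumptions based on our setup, so adjust for yours:

```powershell
$environmentName = $OctopusParameters["Octopus.Environment.Name"]
$projectName     = $OctopusParameters["Octopus.Project.Name"]
$releaseNumber   = $OctopusParameters["Octopus.Release.Number"]
$deploymentId    = $OctopusParameters["Octopus.Deployment.Id"]

# Directory the Offline Package Drop target writes to (path is specific to our setup)
$dropPath = "C:\Packages\$environmentName\$projectName\$releaseNumber"

# Step 2: run the generated .cmd to extract the package
# (in our drop the file is named after the deployment id)
& (Join-Path $dropPath "$deploymentId.cmd")

# Step 3: run the migration executable from the extracted package
# ("Migrate.exe" is a placeholder for whatever your package contains)
& (Join-Path $dropPath "Migrate.exe")
```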
By the way… another issue we have with the current approach is that running the executable on the cloud environment takes 5 minutes while it only takes up to 1 minute on our own VM.
Edit: never mind the slowness.
@Alex.Rolley any other suggestions besides a tentacle?
I’ve been working with our Hosted team to see what options we have for you. At this stage it looks like having a Tentacle installed on a machine is the only viable option that we can come up with for Octopus Cloud.
We kicked ideas around regarding Workers, but there is no guarantee that each step within a deployment will run on the same worker, so that doesn’t help us.
Is there any possibility that you can have a Tentacle installed on your DB server, or alternatively have it operating on a small capacity VM somewhere?
Let me know and we’ll keep discussing on this side,
Using a VM is not an option as the only reason for us to move to cloud is if we can ditch the VM we’re using.
Are there any problems with the approach I’m using now? Creating and running the package drop on the same machine. It seems to work fine, but obviously I don’t want to break any rules and would like to know if it stays supported.
Offline drops aren’t really supported on Cloud in their current form, and will be replaced shortly. Functionality will be the same; the main change will be that the drop itself will become an artifact of the deployment, rather than something that is created straight to disk/share.
Is there a reason why you can’t use the inbuilt worker (effectively the same as Run on Server previously)? That would run the scripts on your hosted Octopus Server, so access may be a problem (in that your Cloud server will need access to your database server), but other than that it should work as normal.
Let me know if either of those will be an issue for you,
I think I mentioned this already, but we see no need to move to Octopus Cloud if we have to keep maintaining a VM anyway. It’s as simple as that.
As for the alternative solution using the Offline Drop: it has proven to work well for us, yet you’re saying it will be replaced/removed.
Does this mean we cannot migrate to octopus cloud at this time?
Sorry about the confusion, I wasn’t clear enough in my last reply. I was referring to running the step on your Octopus Cloud instance, not on a separate server. I’ve looked at this a bit more and it seems that this probably isn’t suitable for your needs, however I do have some good news.
As part of our 2018.8 release we are adding the ability for a script step to directly reference a package (or packages) and run commands against it. This should allow you to do what you need to do directly on your Octopus Cloud instance, without needing any external VMs or Offline Drops. This release is currently in bug bash, so all going well it should be released in the next week or so.
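To give you an idea, a script step with a package reference exposes the extracted contents via the `Octopus.Action.Package[reference-name].ExtractedPath` system variable, so the migration run could look something like this (the reference name "DbMigrations" and the executable name are placeholders for your setup):

```powershell
# In a "Run a Script" step with a package reference named "DbMigrations",
# Octopus extracts the package and exposes its location in a system variable.
$extractedPath = $OctopusParameters["Octopus.Action.Package[DbMigrations].ExtractedPath"]

# Run the migration executable from the extracted package
# ("Migrate.exe" is a placeholder name)
& (Join-Path $extractedPath "Migrate.exe")
```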
Once it has been released feel free to drop me a line and I can get your instance upgraded so we can confirm that this is suitable for your scenario.
Thanks again @Alex.Rolley - so am I correct in saying that what we want is currently not supported, but something similar will be very soon?
Is there any way we can extend our trial? It will expire tomorrow.
I’m sure we can extend the trial for you, are you able to let me know your instance name?
@Alex.Rolley just wanted to inform you that we’ve got it working now with 2018.8! It greatly simplifies our deployment process.
That’s great news, thanks @tijs.hendriks!
Let me know if there is anything else you need,
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.