I’m trying to set up a connection between our existing Octopus Deploy server and our new Azure DevOps Services instance, and reading the Azure - Octopus Deploy link, it looks like it is for deploying to an Azure cloud target.
I need some assistance because, in our case, we are only using ADO Services to store code, but all our deployments go to on-prem servers. I tried using the API key, but when it’s time to set up a release pipeline, the Space dropdown is blank (see screenshot).
Thanks for reaching out and for all of the information.
Assuming the API key has all of the required permissions (which you can test with a quick GET request to the https://serverurl/api/spaces/all endpoint, for example via https://serverurl/swaggerui), this does look to be networking-related.
To get this working, your Octopus Server will need to be able to receive requests from your ADO server on port 443 as well as send responses back to it.
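If you prefer the command line over swaggerui, a quick sketch of that permissions check with curl looks like the following. The server URL and API key are placeholders; substitute your own values.

```shell
# Hypothetical values -- substitute your actual Octopus Server URL and API key.
OCTOPUS_URL="https://octopus.example.com"
OCTOPUS_API_KEY="API-XXXXXXXXXXXXXXXX"

# A 200 response (with a JSON array of spaces in the body) means the key can
# read spaces; a 401 suggests the key or its account lacks permissions.
curl -s -o /dev/null -w "%{http_code}\n" \
  -H "X-Octopus-ApiKey: ${OCTOPUS_API_KEY}" \
  "${OCTOPUS_URL}/api/spaces/all"
```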
Please let me know if you’re able to get it working with that information or if we need to dig in a bit more.
Yes, the API key has all the permissions it needs to connect. The account we used to generate the API key is a service account we use in Octopus. In the setup I’m trying to configure, we have an on-prem Octopus Server and we are using Azure DevOps Services (cloud), not Server. Our Octopus Deploy Server uses HTTPS but is not reachable from outside the company. I hope this helps clear the picture.
Oh, I forgot to mention: when the API key was created, it did not ask us to specify permissions, so we assume it inherits the permissions of the account it was created under, in this case the service account.
The Octopus plugin in ADO requires two-way communication with the Octopus Server. If you want to continue to use the cloud version of ADO in conjunction with Octopus Server, you will have to speak to your network team about getting a valid pathway between those two entities.
Another alternative you could potentially explore if your organization doesn’t allow for that would be to use a local ADO server on your intranet so that external communication is not required.
The first article you linked is, as you said, for setting up deployments to Azure resources.
The second article does apply to this scenario; however, the agent machine will need the same network access opened up that the Octopus Server itself would otherwise have needed. If your network team would be more comfortable opening up a VM explicitly for this purpose, acting as an ADO agent that communicates with Octopus, then this is an alternative you could explore. There is one caveat, though. With this method, you will need to hardcode any parameter values in the Octopus plugin steps, as ADO itself will not be able to talk to Octopus to retrieve information like Space, Project Group, Project, etc. ADO will instead pass this information to the agent on your intranet, which communicates with Octopus and does the work when you actually run the pipeline.
If you can, the preferred method would be the one you outlined there. You will have a much higher quality experience if you can create a valid pathway for communication from the ADO Cloud server to your Octopus Server. If that is not possible, the article you linked is an alternative.
What I meant by the caveat is that steps like the one where you are having problems setting the Space with that dropdown will continue to be a problem. You will need to go to your Octopus Server and look up the IDs for these fields manually. So, for instance, instead of selecting the space you want from the dropdown (which only populates when communication is open), you would have to find the space ID in Octopus and enter it into that field manually, for example Spaces-1. It’s a bit more cumbersome, so if you can, it would be best to open the network path to your Octopus Server. However, if your company can’t do that and can only open a path to a VM running an ADO agent, then this could be a possible avenue to explore.
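To save you clicking through the Octopus UI, here is a sketch of how you could list the space IDs and names from the API, again using placeholder values for the server URL and API key. The Id column (e.g. Spaces-1) is what you would paste into the plugin step’s Space field.

```shell
OCTOPUS_URL="https://octopus.example.com"      # hypothetical server URL
OCTOPUS_API_KEY="API-XXXXXXXXXXXXXXXX"         # hypothetical API key

# Fetch all spaces visible to this key and print "Id Name" per line.
curl -s -H "X-Octopus-ApiKey: ${OCTOPUS_API_KEY}" \
  "${OCTOPUS_URL}/api/spaces/all" |
  python3 -c 'import sys, json; [print(s["Id"], s["Name"]) for s in json.load(sys.stdin)]'
```

The same pattern works for project groups and projects via their respective /api endpoints.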
If the DevOps server is external to your Octopus Server then yes, you will need to expose the server to the internet in some form. It can be placed behind a firewall and/or proxy, but ultimately the DevOps machine has to be able to resolve the DNS and establish a connection to the Octopus Server.
You’re correct. What I meant by the ideal scenario was referring to what Paul said: you will need to expose the Octopus Server to the ADO server for best usability (if this is something your company/security team is willing to do).
Please let me know if you have any questions or if that helps.
I apologize, I was confused by this reply before, but I figured it out and was able to use the swaggerui to test. I was able to get a response on queries, so does this mean I can access it? Is the serverurl here pointing to our Octopus Deploy server?
That test was actually just for checking if your API Key had the correct permissions. The network path will still need to be made valid by your network team if you want to utilize Octopus with ADO, whether it is to the Octopus Server itself or to an ADO Agent on your intranet that will do the work, whichever your security/network team can approve.
Please let me know if you or they have any questions.
I’m just stepping in for Jeremy as he is offline for the day.
To get those dropdowns above to populate, the traffic will need to be opened by your network team from the ADO server to your Octopus Server; there is no workaround for this. Installing the ADO Agent on the server is not necessary.
If you want those dropdowns to work, you will need to get with your network team to open communications.
Thank you, Garrett, for confirming. By the way, I was able to confirm that ADO Services can call Octopus in our instance by manually typing in the IDs, so as far as connectivity goes, we are good. Per your comment, I just need our IT team to open up traffic so I don’t have to manually enter the IDs. Have a good weekend.
Do you have any guidance as to what needs to be opened? I know our IT team will ask. Is it a specific port? Is it port 443? I assume this will be for the Octopus Server, since ADO Services is hosted. Any information is greatly appreciated.
I’m glad to see that manually putting in the IDs is functional; we should be most of the way there at this point.
For your question, yes, 443 should be the proper port. If you have any networking appliances (e.g. a proxy or load balancer) that would alter the API request, your IT team will know which rules need to be put in place to handle that handoff.
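Once the rule is in place, a quick sanity check your IT team could run from a machine outside the corporate network might look like this (the hostname is a placeholder; note that the -z flag is supported by traditional and OpenBSD netcat, but not by ncat):

```shell
OCTOPUS_HOST="octopus.example.com"   # hypothetical public hostname of the Octopus Server

# -z: scan without sending data; -w5: give up after 5 seconds.
# Success means TCP 443 is reachable through the firewall.
if nc -zw5 "${OCTOPUS_HOST}" 443; then
  echo "port 443 reachable"
else
  echo "port 443 blocked"
fi
```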
If your IT team has questions please don’t hesitate to reach out.