Azure Cloud Service WebRole deployment started failing yesterday with no obvious changes

Hi,

Yesterday morning we deployed a TeamCity build via Octopus (version 2.6.5.1010) to an Azure WebRole Cloud Service. An hour later, the SAME build failed to deploy to the same environment with the error shown in the log below.

We have tried:
Rebooting our build server (a VM running on Azure)
Reviewing the Microsoft patches installed yesterday (there are some relating to TLS)
Reviewing the Event Viewer of the deployment machine (there is a TLS error message: "An TLS 1.2 connection request was received from a remote client application, but none of the cipher suites supported by the client application are supported by the server. The SSL connection request has failed." From my reading, TLS 1.2 is only supported in Octopus 3.1; however, these messages are not consistent with the failure times. One way to double-check the machine's protocol settings is sketched after this list.)
Deleting the Cloud Service (we tried deploying to different environments with the same results)
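
In case it is useful, the sketch below is one way to dump the machine's explicit Schannel protocol settings from the registry. The paths are the standard Schannel locations, and keys that are missing simply mean the OS defaults apply; this is just a diagnostic sketch, not part of our deployment script.

    # Diagnostic sketch: list which TLS/SSL protocols are explicitly
    # enabled or disabled on this machine. Missing keys mean OS defaults.
    $protocols = 'SSL 3.0', 'TLS 1.0', 'TLS 1.1', 'TLS 1.2'
    foreach ($protocol in $protocols) {
        foreach ($role in 'Client', 'Server') {
            $path = "HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\$protocol\$role"
            if (Test-Path $path) {
                $values = Get-ItemProperty -Path $path
                Write-Output ("{0} ({1}): Enabled={2}; DisabledByDefault={3}" -f $protocol, $role, $values.Enabled, $values.DisabledByDefault)
            }
            else {
                Write-Output ("{0} ({1}): no explicit setting (OS default)" -f $protocol, $role)
            }
        }
    }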

We are now running out of ideas and no deployments are working (even ones that previously succeeded). The odd thing is that we have a couple of web roles to deploy and some of them do deploy, so if it is a TLS handshake issue between Octopus and Azure it does not seem to be consistent.

Any ideas please?

Azure deployment parameters:

Info  10:53:22  Slot: Production
Info  10:53:27  Package URI: …j.5.21.0.9_bd1bdab9b737b00d36e03d797f1077375bee4152.cspkg
Info  10:53:33  Configuration file: …5.21.0.9_1\ServiceConfiguration.Cloud.cscfg
Info  10:53:37  Deployment label: Azure Admin PreProd v5.21.0.9
Info  10:53:42  Allow swap: False
Info  10:53:44  Importing Windows Azure modules
Info  10:53:47  Loading the management certificate
Info  10:53:52  Setting up the Azure subscription
Info  10:53:57  Starting the Azure deployment process
Info  10:55:32  Creating a new deployment…
Info  10:57:13  WARNING: The current subscription is being removed. Use Select-Subscription to select a new current subscription.
Error 10:58:23  New-AzureDeployment : The HTTP request to 'https://management.core.windows.net/c5c34be5-1f4e-49ed-b1f8-d01211176e0f/services/hostedservices/ptpreprodadmin/deploymentslots/Production' has exceeded the allotted timeout of 00:01:00. The time allotted to this operation may have been a portion of a longer timeout.
Error 10:58:24  At S:\Tentacle\app\PreProd\PropertyTree.Admin.Azure.Cloud\5.21.0.9_1\DeployToAzure.ps1:73 char:5
Error 10:58:24  +     New-AzureDeployment -Slot $OctopusAzureSlot -Package $OctopusAzurePackageUri ...
Error 10:58:25  + CategoryInfo          : CloseError: (:) [New-AzureDeployment], TimeoutException
Error 10:58:25  + FullyQualifiedErrorId : Microsoft.WindowsAzure.Management.ServiceManagement.HostedServices.NewAzureDeploymentCommand
Fatal 10:58:28  PowerShell script returned a non-zero exit code: 1
Tentacle version 2.6.5.1010

Hi Rodney,

Thanks for getting in touch! We think the following part of the error is the key to figuring this out:
New-AzureDeployment : The HTTP request to 'https://management.core.windows.net/c5c34be5-1f4e-49ed-b1f8-d01211176e0f/services/hostedservices/ptpreprodadmin/deploymentslots/Production' has exceeded the allotted timeout of 00:01:00. The time allotted to this operation may have been a portion of a longer timeout.

This error is returned by Azure itself, so we don't think it has anything to do with your deployments specifically. We found a couple of Stack Overflow threads with the same error, and they seem to point at networking problems; the timeout does not appear to have a setting that can be changed.
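
If the timeouts come back before Azure sorts itself out, one workaround you could try is wrapping the New-AzureDeployment call in your DeployToAzure.ps1 in a simple retry loop. The sketch below is untested against your setup; the -Slot and -Package values are the ones visible in your log, while the remaining variable names ($OctopusAzureConfigurationFile, $OctopusAzureServiceName, $OctopusAzureDeploymentLabel) are placeholders for whatever your script passes in the truncated part of that call.

    # Sketch only: retry the deployment call a few times when the Azure
    # management endpoint times out. Placeholder variable names stand in
    # for the parameters truncated in the log ("...").
    $maxAttempts = 3
    for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
        try {
            New-AzureDeployment -Slot $OctopusAzureSlot `
                                -Package $OctopusAzurePackageUri `
                                -Configuration $OctopusAzureConfigurationFile `
                                -ServiceName $OctopusAzureServiceName `
                                -Label $OctopusAzureDeploymentLabel `
                                -ErrorAction Stop
            break   # deployment request accepted, stop retrying
        }
        catch {
            if ($attempt -eq $maxAttempts) { throw }
            Write-Output "New-AzureDeployment failed on attempt $attempt ($($_.Exception.Message)); retrying in 60 seconds..."
            Start-Sleep -Seconds 60
        }
    }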


Unfortunately we have not seen this error reported to us before, so we cannot point you to a definite root cause.

Let me know if the above helps you find the issue.
Vanessa

Hi Vanessa,

Thanks for the prompt and detailed reply. It seems to have resolved itself, and we are assuming the issue was on the Azure side. It is always strange when something happens with no obvious cause!

We can consider this resolved, ta.


Hi Rodney,

Glad to hear it. We also aren't fans of not knowing why, especially when it's an external source.

Vanessa