Deploying a Cloud Service results in "Could not find a part of the path xxxxxxx.xxx" at least 50% of the time

We have a step in our deployment that deploys an Azure Cloud Service. We have around 30 tenants to deploy to, sometimes all in one run (though no more than 6 deploy at a time). Close to half of the deployments fail on the Cloud Role step with the error Could not find a part of the path '4kdjsdpj.cnk'. The random-looking file name is different every time. This error used to crop up in only about one out of 20 deployments, but now it hits almost every other one.

Has anyone seen this type of error before? My Google-fu has been defeated.

January 10th 2019 22:28:33 Error
Could not find a part of the path '4kdjsdpj.cnk'.
System.IO.DirectoryNotFoundException
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy, Boolean useLongPath, Boolean checkHost)
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, FileOptions options, String msgPath, Boolean bFromProxy, Boolean useLongPath)
   at System.IO.IsolatedStorage.IsolatedStorageFileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, IsolatedStorageFile isf)
   at MS.Internal.IO.Packaging.PackagingUtilities.SafeIsolatedStorageFileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, ReliableIsolatedStorageFileFolder folder)
   at MS.Internal.IO.Packaging.PackagingUtilities.CreateUserScopedIsolatedStorageFileStreamWithRandomName(Int32 retryCount, String& fileName)
   at MS.Internal.IO.Packaging.SparseMemoryStream.EnsureIsolatedStoreStream()
   at MS.Internal.IO.Packaging.SparseMemoryStream.SwitchModeIfNecessary()
   at MS.Internal.IO.Packaging.SparseMemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at MS.Internal.IO.Zip.ZipIOFileItemStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.Compression.DeflateStream.WriteDeflaterOutput(Boolean isAsync)
   at System.IO.Compression.DeflateStream.PurgeBuffers(Boolean disposing)
   at System.IO.Compression.DeflateStream.Dispose(Boolean disposing)
   at System.IO.Stream.Close()
   at MS.Internal.IO.Packaging.CompressStream.ChangeMode(Mode newMode)
   at MS.Internal.IO.Zip.ZipIOLocalFileBlock.FlushExposedStreams()
   at MS.Internal.IO.Zip.ZipIOLocalFileBlock.UpdateReferences(Boolean closingFlag)
   at MS.Internal.IO.Zip.ZipIOBlockManager.SaveContainer(Boolean closingFlag)
   at MS.Internal.IO.Zip.ZipIOBlockManager.SaveStream(ZipIOLocalFileBlock blockRequestingFlush, Boolean closingFlag)
   at MS.Internal.IO.Zip.ZipIOModeEnforcingStream.Dispose(Boolean disposing)
   at System.IO.Stream.Close()
   at Calamari.Azure.Deployment.Conventions.RePackageCloudServiceConvention.AddContent(Package package, PackageDefinition manifest, Uri partUri, String file)
   at Calamari.Azure.Deployment.Conventions.RePackageCloudServiceConvention.AddLocalContentParts(Package package, PackageDefinition manifest, LayoutDefinition layout, String baseDirectory, String relativeDirectory)
   at Calamari.Azure.Deployment.Conventions.RePackageCloudServiceConvention.AddLocalContentParts(Package package, PackageDefinition manifest, LayoutDefinition layout, String baseDirectory, String relativeDirectory)
   at Calamari.Azure.Deployment.Conventions.RePackageCloudServiceConvention.AddLocalContent(Package package, PackageDefinition manifest, String workingDirectory)
   at Calamari.Azure.Deployment.Conventions.RePackageCloudServiceConvention.Install(RunningDeployment deployment)

January 10th 2019 22:28:34 Error
System.ObjectDisposedException: Can not access a closed Stream.
   at System.IO.Compression.DeflateStream.EnsureNotDisposed()
   at MS.Internal.IO.Packaging.CompressStream.Flush()
   at MS.Internal.IO.Zip.ZipIOLocalFileBlock.FlushExposedStreams()
   at MS.Internal.IO.Zip.ZipIOLocalFileBlock.UpdateReferences(Boolean closingFlag)
   at MS.Internal.IO.Zip.ZipIOBlockManager.SaveContainer(Boolean closingFlag)
   at MS.Internal.IO.Zip.ZipIOBlockManager.SaveStream(ZipIOLocalFileBlock blockRequestingFlush, Boolean closingFlag)
   at MS.Internal.IO.Zip.ZipIOModeEnforcingStream.Dispose(Boolean disposing)
   at System.IO.Stream.Close()
   at System.IO.StreamWriter.Dispose(Boolean disposing)
   at System.IO.StreamWriter.Close()
   at System.Xml.XmlTextWriter.Close()
   at System.Xml.XmlWriter.Dispose(Boolean disposing)
   at System.IO.Packaging.ZipPackage.ContentTypeHelper.SaveToFile()
   at System.IO.Packaging.ZipPackage.Dispose(Boolean disposing)
   at System.IO.Packaging.Package.System.IDisposable.Dispose()
   at Calamari.Azure.Deployment.Conventions.RePackageCloudServiceConvention.Install(RunningDeployment deployment)
   at Calamari.Deployment.ConventionProcessor.RunInstallConventions()
   at Calamari.Deployment.ConventionProcessor.RunConventions()

January 10th 2019 22:28:34 Error
Running rollback conventions...
Can not access a closed Stream.
System.ObjectDisposedException
   at System.IO.Compression.DeflateStream.EnsureNotDisposed()
   at MS.Internal.IO.Packaging.CompressStream.Flush()
   at MS.Internal.IO.Zip.ZipIOLocalFileBlock.FlushExposedStreams()
   at MS.Internal.IO.Zip.ZipIOLocalFileBlock.UpdateReferences(Boolean closingFlag)
   at MS.Internal.IO.Zip.ZipIOBlockManager.SaveContainer(Boolean closingFlag)
   at MS.Internal.IO.Zip.ZipIOBlockManager.SaveStream(ZipIOLocalFileBlock blockRequestingFlush, Boolean closingFlag)
   at MS.Internal.IO.Zip.ZipIOModeEnforcingStream.Dispose(Boolean disposing)
   at System.IO.Stream.Close()
   at System.IO.StreamWriter.Dispose(Boolean disposing)
   at System.IO.StreamWriter.Close()
   at System.Xml.XmlTextWriter.Close()
   at System.Xml.XmlWriter.Dispose(Boolean disposing)
   at System.IO.Packaging.ZipPackage.ContentTypeHelper.SaveToFile()
   at System.IO.Packaging.ZipPackage.Dispose(Boolean disposing)
   at System.IO.Packaging.Package.System.IDisposable.Dispose()
   at Calamari.Azure.Deployment.Conventions.RePackageCloudServiceConvention.Install(RunningDeployment deployment)
   at Calamari.Deployment.ConventionProcessor.RunInstallConventions()
   at Calamari.Deployment.ConventionProcessor.RunConventions()
   at Calamari.Azure.Commands.DeployAzureCloudServiceCommand.Execute(String[] commandLineArguments)
   at Calamari.Program.Execute(String[] args)

January 10th 2019 22:28:37 Fatal
The remote script failed with exit code 100
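
For anyone skimming the trace: the repackaging goes through System.IO.Packaging, which buffers large part content into user-scoped isolated storage under a random file name, and that random name is the path it says it can't find. Just to show the shape of the API involved, here's a minimal sketch of that kind of packaging call (this is not Calamari's actual code; the class name, helper name and content type are mine):

    using System;
    using System.IO;
    using System.IO.Packaging;

    class PackageSketch
    {
        // Adds a single file to an OPC package (e.g. a .cspkg) at the given relative part path.
        static void AddFileToPackage(string packagePath, string filePath, string relativePartPath)
        {
            using (Package package = Package.Open(packagePath, FileMode.Open, FileAccess.ReadWrite))
            {
                Uri partUri = PackUriHelper.CreatePartUri(new Uri(relativePartPath, UriKind.Relative));
                PackagePart part = package.CreatePart(partUri, "application/octet-stream", CompressionOption.Normal);

                using (FileStream source = File.OpenRead(filePath))
                using (Stream destination = part.GetStream(FileMode.Create))
                {
                    // Large writes are buffered by SparseMemoryStream, which can switch to a
                    // user-scoped IsolatedStorage file with a random name (the '4kdjsdpj.cnk'
                    // style path in the error above).
                    source.CopyTo(destination);
                }
            } // Disposing the package flushes the zip container, which is where the trace fails.
        }
    }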

Hi justinself, thanks for getting in touch,

Could you please send through the raw task logs so that we can take a closer look? They should contain additional information which we can't see from the log excerpt you provided. I have created a secure upload location here which is only accessible by Octopus staff and will be removed, along with any files, once we are done.

Regards,
Shaun

Done. I removed any settings that were being printed but it’s otherwise untouched.

More Information:

For this particular instance, I disabled all the steps that had succeeded and ran only this cloud service step. It failed again but succeeded on a subsequent attempt.

During this failure, there were 5 other deployments of the same type in progress.

In case it's helpful, I've jotted down the time stamps for the first 7 deployments, showing when Calamari started re-packaging the cloud service (after it performed all config transforms and substitutions) and when it finished.

Tenant Id    Started Repackaging    Finished
Tenant 1     7:26:27                7:33:28
Tenant 2     7:31:09                7:36:50
Tenant 3     7:31:49                7:41:55
Tenant 4     7:31:59                7:45:07
Tenant 5     7:32:02                7:48:19
Tenant 6     7:32:04                7:48:45   FAILED
Tenant 7     7:59:09                8:05:49

Tenant 6 was the one that failed. These were the first deployments of the night; all of them were queued and scheduled to deploy at 7, and Tenant 6 failed after the first 5 had successfully repackaged the cloud service.

Looking through the Calamari source, there's a global semaphore that only allows one deployment to repackage at a time. It feels like that could be related.
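
For reference, a machine-wide lock like that would look roughly like this (purely illustrative; the semaphore name and wrapper are my own, not the actual Calamari implementation):

    using System;
    using System.Threading;

    class RepackagingLock
    {
        // Illustrative only: a named semaphore is visible to every process on the
        // machine, so concurrent deployment tasks would queue up here and only one
        // of them would repackage the cloud service at a time.
        static void RepackageWithGlobalLock(Action repackage)
        {
            using (var gate = new Semaphore(1, 1, @"Global\CloudServiceRepackaging"))
            {
                gate.WaitOne();
                try
                {
                    repackage(); // rebuild the .cspkg for this deployment
                }
                finally
                {
                    gate.Release();
                }
            }
        }
    }

If something like that gate (or the isolated storage behind it) is shared between the concurrent tasks, it would line up with the timestamps above, where the repackaging steps finish one after another.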

Hi Justin,

Thanks heaps for the update.
I am trying to reproduce this issue but I need a bit more time.
I’ll touch base again tomorrow.

Regards
John

Hi Justin,

A quick update.
I have made a few changes to Calamari that will hopefully make this better, although unfortunately I haven’t been able to reproduce the issue.
Hopefully we will release these changes this week.

Regards
John

Awesome, John!

We look forward to the update and will report back whatever happens.

Our team greatly appreciates your attention to this… if it works, it’ll save us several additional hours at night during deployments.

Hi Justin,

The changes I mentioned before will be included in v2018.12.0.
This release will come out next week, so please install it once it’s available and let me know whether it fixes this random issue.

Regards
John

Will do. We’ll be installing it this week.

@John_Simons Installed today (the 12.1). Did 20+ deploys… NO ERRORS.

You seriously made an impact on my personal time with family.

If you are ever in Austin, I’m buying you a beer and BBQ.

Thanks Justin.

I never say no to a beer :slight_smile:

Cheers
John

Hi

We have had this exact problem as well, and I was very happy when I found that it would be solved in 2018.12.0. However, my happiness was short-lived when I discovered that we are running 2018.12.1 and still have this issue. Did you reload the deployment step, or did you take any other actions beyond just upgrading the version? I have attached an image with the stack trace in case it says anything.

@Kahl thanks for responding to this…

So, a couple of weeks ago, we ended up having it happen again.

The good news is that it’s dramatically less frequent… maybe one out of every 20-30 deploys.

We also increased the specs of our VM (16 GB and 4 cores now) and only run 4 tasks concurrently. I do still see this error, but it’s nowhere near as frequent as it previously was.

OK, so more power and fewer concurrently running tasks could be a way to mitigate the issue.
We will look into that. Many thanks @justinself