Deployment fails because file is not downloaded properly. How to resolve it?

I was running my deployment to install PowerShell 7 in one environment and everything went great.
Then I switched to another Octopus environment, and very often I get failed deployments because the file I try to download from the same Artifactory repository is invalid. It seems it hasn't been downloaded properly: its regular size is around 85 MB, but it sometimes arrives at around 50 MB, and then of course I cannot extract the installation files and the deployment fails.
I have many components in ZIP format that should be downloaded from Artifactory, so I am afraid this will happen for all of them, and I am not sure what I can do. I don't know whether this is a network issue or not, but it is a real blocker for the whole process.
The point is that for the same deployment target and step, this download and deployment sometimes pass (the file arrives with its regular size of 85 MB) and sometimes fail because the file is not downloaded properly, on the same deployment target!

I got one proposal that maybe I could use the CRC of the file to validate whether the download was successful, or compare the hashes of the local and remote files, but I am not sure how to do this with the remote file in PowerShell.
I am downloading the file using Invoke-WebRequest:
Invoke-WebRequest -Headers @{'X-JFrog-Art-Api' = $ARTIFACTORY_API_KEY } $ARTIFACTORY_FILE_URL -OutFile "$DOWNLOAD_FOLDER\$(Split-Path $ARTIFACTORY_FILE_URL -Leaf)"
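One way to validate the download in PowerShell itself: Artifactory normally includes checksum headers (`X-Checksum-Sha1`, `X-Checksum-Md5`, and, depending on setup, `X-Checksum-Sha256`) in its download responses, which can be compared against `Get-FileHash`. This is only a sketch under that assumption; header names and availability depend on your Artifactory configuration:

```powershell
# Sketch only: assumes Artifactory returns an X-Checksum-Sha256 header.
# $ARTIFACTORY_FILE_URL, $ARTIFACTORY_API_KEY, $DOWNLOAD_FOLDER as in the snippet above.
$fileName  = Split-Path $ARTIFACTORY_FILE_URL -Leaf
$localPath = Join-Path $DOWNLOAD_FOLDER $fileName

# -PassThru returns the response object even when -OutFile writes to disk,
# so the checksum headers remain available for comparison.
$response = Invoke-WebRequest -Headers @{ 'X-JFrog-Art-Api' = $ARTIFACTORY_API_KEY } `
    -Uri $ARTIFACTORY_FILE_URL -OutFile $localPath -PassThru

$remoteHash = @($response.Headers['X-Checksum-Sha256'])[0]
$localHash  = (Get-FileHash -Path $localPath -Algorithm SHA256).Hash

if ($remoteHash -and ($localHash -ieq $remoteHash)) {
    Write-Host "Download verified: $fileName"
} else {
    # Failing the step here lets Octopus retry or surface the truncated download
    # instead of letting the extract step fail later.
    throw "Checksum mismatch for $fileName - expected '$remoteHash', got '$localHash'"
}
```

Wrapping this in a simple retry loop (re-download on mismatch, up to N attempts) would turn the intermittent truncation into a recoverable condition rather than a failed deployment.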

Thanks in advance for helping me…

Hi @veljkosbbb,

Thanks for getting in touch!

This does sound like a network issue or timeout causing the download to end early. As this is being handled by a custom script against a third-party repository, I don't have much advice I can offer. Comparing the hash is a good idea; however, you would need to check with Artifactory to see if there is a way to do this.
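For reference, Artifactory does expose checksums for stored artifacts through its storage REST API: a GET against `/api/storage/<repo>/<path>` returns a JSON document with `sha1`, `md5`, and `sha256` values. A hedged sketch, where `$ARTIFACTORY_BASE`, `$REPO_PATH`, and `$localPath` are hypothetical placeholders for your server URL, repository path, and downloaded file:

```powershell
# Sketch: ask Artifactory's storage API for the artifact's checksums and
# compare against the local copy. All variable names here are placeholders.
$infoUrl = "$ARTIFACTORY_BASE/api/storage/$REPO_PATH"

$info = Invoke-RestMethod -Headers @{ 'X-JFrog-Art-Api' = $ARTIFACTORY_API_KEY } `
    -Uri $infoUrl

$remoteSha256 = $info.checksums.sha256
$localSha256  = (Get-FileHash -Path $localPath -Algorithm SHA256).Hash

if ($localSha256 -ine $remoteSha256) {
    throw "Download corrupt: SHA-256 mismatch (remote $remoteSha256, local $localSha256)"
}
```

Whether `sha256` is populated depends on the Artifactory version and repository settings; `sha1` and `md5` are generally always present.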

One option that might work, but would require some testing, is to create a NuGet feed within Artifactory and add that to Octopus as an external feed. You could then rename the ZIPs that you're using to .nupkg and use the Octopus package transfer and deploy package options.
I'm not entirely sure whether the lack of a nuspec file would cause Octopus to refuse these packages, but it is an option you could explore.


Hi @paul.calvert

Thanks for the response. I am still looking at my options.
Can I somehow put this ZIP file into the Octopus packages and retrieve it from some internal Octopus repository?
Do you have an example of this? Could you provide instructions on how to upload the file and how to retrieve that ZIP package in that case (if I am NOT storing it in external Artifactory but within some internal Octopus repository)?

Thank you

Absolutely. There is a built-in repository that you can upload your zip files to directly. You can find it in Library > Packages.
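If you later want to automate the upload instead of using the portal, the Octopus Server also accepts package pushes to the built-in feed over its REST API. A sketch, assuming PowerShell 6+ for the `-Form` parameter; the server URL, API key, and package path below are hypothetical placeholders, and the file must follow the `<PackageId>.<version>.zip` naming convention:

```powershell
# Sketch: push a ZIP to the Octopus built-in feed via the REST API.
# $OctopusUrl, $OctopusApiKey and $packageFile are placeholders.
$OctopusUrl    = 'https://my-octopus.example.com'          # hypothetical server URL
$OctopusApiKey = $env:OCTOPUS_API_KEY                      # API key from your profile
$packageFile   = 'C:\packages\MyComponent.1.0.0.zip'       # PackageId.Version.zip

# -Form performs a multipart/form-data upload (PowerShell 6+).
Invoke-RestMethod -Method Post `
    -Uri "$OctopusUrl/api/packages/raw" `
    -Headers @{ 'X-Octopus-ApiKey' = $OctopusApiKey } `
    -Form @{ file = Get-Item $packageFile }
```

The Octopus CLI (`octo push`) wraps the same endpoint if you prefer a command-line tool over a script.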

Then in your deployment process you'd just add a Deploy a Package step and configure the desired package name and install location; if needed, you can add a post-deploy script to the same step.

Thanks @paul.calvert
Can you give me an example of the "Deploy a Package" step showing how I can retrieve the file from the repository? Can I extract it like every other file I download from an external repository?

Regarding the network connections: is it the same if I download this file on the deployment target from the Octopus repository as from some external system (like Artifactory, Bitbucket, and similar)? Will it give better performance or not?


You can find an example of the step here:

Just log in as a guest.

The basic step will copy the package to the target and then extract it.

The network performance will depend on where your Octopus Server is. If it is located internally in the same network as your deployment targets then you should see better transfer quality.

This topic was automatically closed 31 days after the last reply. New replies are no longer allowed.