Octopus Deploy isn't uploading the new version of my app

I have a process which has the following steps:

  1. Changes are pushed to the git repository on Bitbucket.
  2. TeamCity pulls the changes, performs a Maven build, then uploads the .war file to JFrog.
  3. TeamCity tells Octopus Deploy to create a release and deploy.
  4. Step 1 in Octopus Deploy downloads the artifact from JFrog to the local file system.
  5. Step 2 in Octopus Deploy is a PS script that is detailed here, which performs an FTP upload of the file to Azure.
  6. Azure takes the ROOT.war file, unpacks it, then turns the app on.
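The FTP upload in step 5 is handled by a PowerShell script linked above; as an illustration only, the same step could be sketched in Python with `ftplib`. The Azure hostname, credentials, and remote directory below are placeholders, not values from this thread:

```python
from ftplib import FTP_TLS  # used by the commented-out real connection below


def upload_war(ftp, local_path: str, remote_name: str = "ROOT.war") -> None:
    """Upload the built .war over FTP(S). `ftp` is any object exposing a
    storbinary(cmd, fileobj) method, e.g. a connected ftplib.FTP_TLS."""
    with open(local_path, "rb") as f:
        ftp.storbinary(f"STOR {remote_name}", f)


# Hypothetical usage against an Azure deployment FTP endpoint:
# ftp = FTP_TLS("waws-prod-xx-000.ftp.azurewebsites.windows.net")
# ftp.login("deploy-user", "deploy-password")
# ftp.cwd("site/wwwroot/webapps")
# upload_war(ftp, "target/amd-webapp-1.3.0-SNAPSHOT.war")
```

Passing the connection object in (rather than creating it inside the function) keeps the upload logic testable without a live server.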

The version of my application is 1.3.0-SNAPSHOT. I’ve made changes to the app without altering the version.

I’ve attempted to make some noticeable changes, such as adding a tag at the bottom of the home page (a JSP), and then looking for that tag downstream.

When I look for evidence of this change in various places, it looks like OD is the point of failure.

To gather evidence supporting that claim, I downloaded the war file from various locations, unpacked it, and inspected home.jsp for the added line.

  • From JFrog, the war file did contain the added line.
  • From the OD server local disk, the war file did not contain the added line.
  • From the ROOT.war file on Azure, the war file did not contain the added line.
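Since a .war file is just a zip archive, the check described above can be automated. This is a small sketch; the paths and marker string in the comment are hypothetical, not from the thread:

```python
import zipfile


def war_contains_line(war_path: str, member: str, needle: str) -> bool:
    """Return True if the given entry inside the .war (a zip archive)
    contains the expected marker line."""
    with zipfile.ZipFile(war_path) as war:
        with war.open(member) as f:
            return needle.encode("utf-8") in f.read()


# Hypothetical paths and marker; adjust to the copies you downloaded:
# war_contains_line("ROOT.war", "home.jsp", "<!-- build-marker v7 -->")
```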

I’m not sure if Octopus Deploy is using a cached version of the war file from somewhere, or what is happening. Somehow it’s getting an old war file, from somewhere, and uploading that to Azure.

What’s odd about this behavior is that the application runs fine; it just doesn’t contain the updates. What I’m about to try next is some mindless version bumping to see if that solves the problem, but I’d like to at least understand where Octopus Deploy is getting this war file from. It’s not JFrog (there isn’t even an old copy of the 1.3.0-SNAPSHOT build there), it’s not TeamCity (same), and it’s not being built fresh (the time a build takes would be very noticeable, and I don’t know where it would get the source anyway).

Hi Rodesha,

Thanks for getting in touch!

We do indeed cache files from external build repositories; they are stored at C:\Octopus\OctopusServer\PackageCache. As you haven’t bumped the version number, our cache-busting logic won’t trigger (there is no new version to acquire).
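This is not Octopus’s actual code, but a minimal sketch of why re-publishing new bits under the same version string never busts a cache keyed on package id plus version:

```python
import os


class PackageCache:
    """Illustrative only: a cache keyed by (package id, version).
    Re-publishing different bits under the same version is invisible to it."""

    def __init__(self, root: str):
        self.root = root

    def _path(self, package_id: str, version: str) -> str:
        return os.path.join(self.root, f"{package_id}.{version}.war")

    def acquire(self, package_id: str, version: str, download) -> str:
        path = self._path(package_id, version)
        if os.path.exists(path):      # cache hit: same id + version...
            return path               # ...even if the feed has newer bits
        with open(path, "wb") as f:   # cache miss: fetch from the feed
            f.write(download())
        return path
```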

You have probably already tested with a bumped version number (you indicated that was what you were trying next), which should trigger Octopus to acquire the new version of the package and use it for your deployment. If that isn’t the case, please let me know!

I hope that helps. Please let me know if there is anything else that I can assist with.


Thanks, Alex.

Is there a way to tell OD not to cache for a project build?

Our TeamCity configuration (or maybe Maven, I’m not entirely sure) currently writes an XML document named “maven-metadata.xml” right next to the war file, with contents as follows. Is there some way I can configure Octopus Deploy to read this file and decide whether or not to cache based on its contents? Or even read this file and use anything from it?

For example, there are several nodes with a “lastUpdated” or “updated” style integer; maybe OD could store that integer and compare it against the feed’s value to decide whether it should use the cached war file?

<?xml version="1.0" encoding="UTF-8"?>
<metadata modelVersion="1.1.0">
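Octopus doesn’t read this file, but the integer comparison suggested above could look like the sketch below. It assumes the standard Maven repository metadata layout, where a `<lastUpdated>` element (a `yyyyMMddHHmmss` integer) sits under `<versioning>`; the thread’s own file is only partially quoted, so this layout is an assumption:

```python
import xml.etree.ElementTree as ET


def last_updated(metadata_xml: str) -> int:
    """Pull the <lastUpdated> timestamp out of a maven-metadata.xml
    document (standard Maven layout assumed: metadata/versioning/lastUpdated)."""
    node = ET.fromstring(metadata_xml).find("./versioning/lastUpdated")
    return int(node.text)


def cache_is_stale(cached_ts: int, metadata_xml: str) -> bool:
    # Re-download whenever the feed's metadata is newer than what we cached.
    return last_updated(metadata_xml) > cached_ts
```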


As it turns out, there is. If you create a project variable named Octopus.Deployment.ForcePackageDownload and set it to True, Octopus will always download the package from the feed, regardless of whether the package already exists in the cache.

Hope that helps. Let me know if there is anything else that I can assist with.



Thanks, Alex. I added that as a project variable, but I think I may be doing something wrong, because it’s still deploying an old version of the app rather than downloading from the repository.

Here’s the output from the log:

Acquire packages (ran for 3 seconds)

May 14th 2018 15:38:18 Info  Acquiring packages
May 14th 2018 15:38:18 Info  Making a list of packages to acquire
May 14th 2018 15:38:18 Info  The package cache will be skipped because the variable ‘Octopus.Deployment.ForcePackageDownload’ has been set to ‘true’
May 14th 2018 15:38:19 Info  Package com.esha.eta:amd-webapp v1.3.0-SNAPSHOT was found in cache. No need to download.
May 14th 2018 15:38:21 Info  All packages have been acquired

The quick way I’ve been determining where the war file comes from is by looking at how long that step takes: if it’s around 4 seconds, it’s extremely unlikely the app was downloaded from the repository, which usually takes around 30 seconds.

The way I’m confirming it’s an old version is by downloading the war file from the repository to my local machine and inspecting its contents. Then I check both the file on the Octopus Deploy server and the one deployed to Azure; neither is an updated copy of the application.
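A faster way to compare copies from each location than unpacking every one is to hash them; identical digests mean identical archives. The file paths in the comment are hypothetical:

```python
import hashlib


def sha256_of(path: str) -> str:
    """Hash a file in chunks so large .war files aren't read into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


# Hypothetical local copies downloaded from each location:
# sha256_of("jfrog/amd-webapp-1.3.0-SNAPSHOT.war")
# sha256_of("octopus-server/amd-webapp-1.3.0-SNAPSHOT.war")  # stale if it differs
```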

Hi Rodesha,

I’m assisting Alex with your issue. It looks like this could be a hole in our “bypass package cache” logic, where it only bypasses the cache on the deployment target, not on the Octopus server itself.

When you go to deploy the release again, there is an option to force a download of the package from the feed. Could you change it from the default option to Re-download packages from feed? This should bypass the cache on the Octopus server as well.

This is just to confirm what I’m seeing in the code about forcing the package to be re-downloaded. If choosing that option works, I can prepare a fix for this issue and get it out in a future release.

Thank you and kind regards,


Choosing that option caused the download to happen and ignored the cache.

Thanks so much for your help. I think this solves exactly the problem I was attempting to solve.


Great to hear that you were able to get past that stumbling block!

Thank you and kind regards,
