Prevent extracted package cleanup on step failure

Hi,
Is there any way to prevent Octopus from deleting an extracted package when a step fails in guided failure mode? Often there are log files or staging files in the extracted package folder that I need to see to determine why the step failed and how to recover from it. Deleting the extracted package once a retry is executed or the step is ignored would be fine, but in guided failure mode it would be nice to leave it alone until an action is decided upon.

Thanks

Hello @Daniel.Guenter,

Thanks for reaching out!

What step are you using in your deployment that is deploying the package to your target?

Best regards,
Mark Butler

Hi,
I am using a script step running PowerShell with an extracted package reference.

Hello @Daniel.Guenter,

Thank you for sharing that with me.

Unfortunately, I don’t believe it is possible to stop Octopus from cleaning up that working directory after the step has completed. However, you can retrieve the original (unextracted) package from the Files directory on the target. You’d then be able to extract it yourself and view the contents.
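For example, something along these lines would extract a cached package for inspection. The paths are only illustrative: the default Tentacle package cache is usually C:\Octopus\Files, and the package file name will differ in your environment.

```powershell
# Example only: assumes the default Tentacle package cache at C:\Octopus\Files
# and a hypothetical package file name; adjust both paths for your environment.
$package = 'C:\Octopus\Files\MyApp.1.2.3.nupkg'
$dest    = 'C:\Temp\MyApp-inspect'

New-Item -ItemType Directory -Path $dest -Force | Out-Null

# Expand-Archive requires a .zip extension, so copy the package under a .zip name first.
$zip = Join-Path $dest 'package.zip'
Copy-Item -Path $package -Destination $zip
Expand-Archive -Path $zip -DestinationPath $dest -Force
Get-ChildItem -Path $dest -Recurse
```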

I hope this helps.

Best regards,
Mark Butler

Hi @mark.butler
The step has not completed yet, as it is in guided failure mode and is still waiting on guidance from the user.

Yes, I can get the original package from disk, but it won’t contain any of the logs or temporary files created in the extracted package directory.

If this is by design then perhaps the cleanup could be adjusted to occur after guidance has been received in guided failure mode.

Hello @Daniel.Guenter,

One of my colleagues shared that we do have a way to copy the working directory before it gets deleted, enabling you to inspect it during a guided failure.

You can create a project variable named Octopus.Calamari.CopyWorkingDirectoryIncludingKeyTo with its value set to a directory on the target where you would like the contents saved. You can then inspect that directory at your leisure; just note that you will need to clean it up manually after debugging.
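As a rough sketch, if you point the variable at C:\Temp\OctopusDebug (just an example path), you could then list the copied working directories on the target with something like:

```powershell
# Example only: assumes the project variable
#   Octopus.Calamari.CopyWorkingDirectoryIncludingKeyTo = C:\Temp\OctopusDebug
# has been set, so each run's working directory should be copied under that path.
Get-ChildItem -Path 'C:\Temp\OctopusDebug' -Directory |
    Sort-Object LastWriteTime -Descending |
    Select-Object FullName, LastWriteTime
```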

I hope this helps.

Best regards,
Mark Butler

Hi @mark.butler ,
That is very interesting information. I will give this a try and see if it can help solve my problem. Thanks for digging this up.

Hey @Daniel.Guenter,

No worries at all! I hope this helps. Feel free to reach back out if you have any more questions.

Best regards,
Mark Butler

Hi,
I’m going to leave a message here saying how I fixed this issue in case it helps others.

I moved from using a script step that references a package to a package step that has custom deployment scripts.

This has exactly the behaviour that I want. The custom installation directory is only ever cleaned when the step is executed or retried. I was also able to add a purge-exclusion filter that keeps the log folder from being cleaned.
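For reference, the filter is just the list of patterns excluded from purging on the package step’s custom installation directory feature. Mine looks roughly like the below; the folder name is specific to my setup and the exact glob syntax may need tweaking for yours:

```
Logs/**
*.log
```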

The only downside that I can see is that I will have to clean up these directories periodically myself or build that into my scripts.
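If anyone needs a starting point for that cleanup, something like this on a schedule would do; the root path and retention window are placeholders:

```powershell
# Example only: prune old deployment/debug folders under a placeholder root path,
# keeping anything modified within the last 14 days.
$root   = 'D:\Deployments\MyApp'
$cutoff = (Get-Date).AddDays(-14)

Get-ChildItem -Path $root -Directory |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -Recurse -Force
```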

This topic was automatically closed 31 days after the last reply. New replies are no longer allowed.