I’ve searched the forum a lot but couldn’t find an answer to my problem.
We are deploying to SharePoint with PS scripts. We have logging in our scripts.
The problem is that these logs are written to the scripts’ root folder during deployment, but the folder where the NuGet package is extracted (i.e. %Octopus%\Work<date stamp>) is deleted when the script fails or completes, so we cannot read the logs we generate there.
How can I prevent this folder from being deleted?
I’ve tried using fail-over mode, but the folder still gets deleted even though the deployment step is paused.
Thanks for reaching out. What I’d recommend is changing your script so it writes the logs to the path where Octopus deployed your code, instead of to the work folder.
There’s a deployment runtime variable in Octopus, $OctopusParameters['Octopus.Action[StepName].Package.CustomInstallationDirectory'], that you can use to get the deploy path. Make sure to replace the StepName part with the name of the step that deploys your scripts to the machine.
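As a rough sketch, that could look something like this in your deployment script (the step name "Deploy SharePoint Scripts" is just a placeholder here; use your actual step name):

```powershell
# Placeholder step name -- replace with the name of your package deployment step.
$deployPath = $OctopusParameters['Octopus.Action[Deploy SharePoint Scripts].Package.CustomInstallationDirectory']

# Build a timestamped log file path inside the deployed location,
# which survives the cleanup of the Octopus work folder.
$logFile = Join-Path $deployPath ("deploy-{0}.log" -f (Get-Date -Format 'yyyyMMdd-HHmmss'))

# Append log entries there instead of next to the extracted package.
"Starting SharePoint deployment..." | Out-File -FilePath $logFile -Append
```

That way the logs land next to the deployed code rather than in the temporary work folder.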
Another perfectly valid approach is to use a hardcoded path such as C:\Logs where you always store your deployment logs. Logs are usually something you want to keep around, and in a single place.
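A minimal sketch of that approach, assuming C:\Logs as the fixed location:

```powershell
# Fixed log location outside the Octopus work folder, so it is
# never touched by the post-deployment cleanup.
$logRoot = 'C:\Logs'

# Create the folder on first use if it does not exist yet.
if (-not (Test-Path $logRoot)) {
    New-Item -ItemType Directory -Path $logRoot | Out-Null
}

# One timestamped file per deployment keeps runs easy to tell apart.
$logFile = Join-Path $logRoot ("deploy-{0}.log" -f (Get-Date -Format 'yyyyMMdd-HHmmss'))
"Deployment started" | Out-File -FilePath $logFile -Append
```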
Hope that helps,