"The process cannot access the file" when Unzipping an archive in predeploy.ps1

Hi,

I am unpacking and repacking my cspkg file so I can do substitutions on the config files. This process is working for the most part, except now I want to add New Relic to my worker/website. This requires two MSIs and a .cmd file to be added to the role.

My predeploy unpacks the cspkg and then tries to unpack each role, but that isn’t working anymore… apparently because there are MSIs in the archive.

```
cannot access the file 'D:\Octopus Tentacle\Applications\UAT\Playup.Api\0.0.161-fb-3d55512\worker\approot\NewRelicAgent_x64_3.9.146.0.msi' because it is being used by another process."
At D:\Octopus Tentacle\Applications\UAT\Playup.Api\0.0.161-fb-3d55512\predeploy.ps1:19 char:2
+     [System.IO.Compression.ZipFile]::ExtractToDirectory($zipFile, $destination)
```

So this error is kind of ridiculous: prior to extracting, the script creates the folder (so there are NO existing files), then it unzips the archive and gets this crazy error message about a process already using a file it has just extracted... It's almost like it's unzipping the archive twice, simultaneously, but there's nothing in the output that would indicate that is happening.

My predeploy.ps1 is

```
# Load the assembly required (ZipFile lives in System.IO.Compression.FileSystem)
Add-Type -AssemblyName System.IO.Compression.FileSystem

function Unzip($zipFile, $destination)
{
    # Delete the destination folder if it exists
    if (Test-Path $destination) {
        Write-Host "Path Exists… Deleting $destination"
        Remove-Item $destination -Recurse
    }
    else {
        Write-Host "Path Does Not Exist… $destination"
    }

    # Create the destination folder
    New-Item -ItemType directory -Force -Path $destination

    # Unzip
    [System.IO.Compression.ZipFile]::ExtractToDirectory($zipFile, $destination)
}

function Get-ScriptDirectory()
{
    $Invocation = (Get-Variable MyInvocation -Scope 1).Value
    if ($Invocation.PSScriptRoot)
    {
        $Invocation.PSScriptRoot
    }
    elseif ($Invocation.MyCommand.Path)
    {
        Split-Path $Invocation.MyCommand.Path
    }
    else
    {
        $Invocation.InvocationName.Substring(0, $Invocation.InvocationName.LastIndexOf("\"))
    }
}

$path = Get-ScriptDirectory

# Unzips the cspkg file
Unzip "$path\PlayUp.Api.CloudService.ccproj.cspkg" "$path\azurePackage"

# Unzips the web role .cssx file that was contained in the cspkg file
Unzip (Get-Item (Join-Path -Path "$path\azurePackage" -ChildPath "PlayUp.Api.Web_*.cssx")) "$path\website"

#Unzip "$path\azurePackage\PlayUp.Api.Web.*.cssx" "$path\website"

# Unzips the worker role .cssx file that was contained in the cspkg file
Unzip (Get-Item (Join-Path -Path "$path\azurePackage" -ChildPath "PlayUp.Api.Media_*.cssx")) "$path\worker"
```


So this makes no sense to me. I can run the predeploy on my local machine without error but it always fails on Octopus Deploy.

I sometimes also had this process fail (before I added New Relic) with the same error on a different file (RemoteForwarder.msi, I think). That MSI is added automatically for remote desktop support.

I changed the extraction code to extract the files one by one and check for the existence of each file before trying to extract it. A process can't lock a file that doesn't exist, can it? Well, apparently it can…

```
$archive = [System.IO.Compression.ZipFile]::OpenRead($zipFile)
foreach ($file in $archive.Entries) {
    $destinationFileName = [System.IO.Path]::Combine($destination, $file.FullName)
    $destinationFilePath = [System.IO.Path]::GetDirectoryName($destinationFileName)
    [System.IO.Directory]::CreateDirectory($destinationFilePath)

    # Only extract the entry if the file doesn't already exist
    if (-not (Test-Path $destinationFileName)) {
        [System.IO.Compression.ZipFileExtensions]::ExtractToFile($file, $destinationFileName, $false)
    }
}
# Release the handle on the zip file
$archive.Dispose()
```

So this resulted in:

```
Exception calling "ExtractToFile" with "3" argument(s): "The process cannot access the file 'D:\Octopus Tentacle\Applications\UAT\Playup.Api\0.0.164-fb-bfed40d\website\approot\bin\NewRelicAgent_x64_3.9.146.0.msi' because it is being used by another process."
At D:\Octopus Tentacle\Applications\UAT\Playup.Api\0.0.164-fb-bfed40d\predeploy.ps1:28 char:4
+             [System.IO.Compression.ZipFileExtensions]::ExtractToFile($file, $destinationF ...
+    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], ParentContainsErrorRecordException
    + FullyQualifiedErrorId : IOException
Fatal: PowerShell script returned a non-zero exit code: 1
```

All I can say is WTF!?

Hi Sam,

It does sound like a strange issue. My hunch in this situation would normally be to check whether you have a virus scanner running on the machine - it’s possible that it takes locks on the files during the extraction process. The extraction might extract to a temp file and then attempt to rename while the file is locked by another process.
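If you want to confirm that theory, one option (just a suggestion, and it assumes you can drop the Sysinternals handle.exe tool somewhere on the Tentacle machine; the path below is hypothetical) is to search for open handles on the file at the moment the error occurs:

```
# Hypothetical example: handle.exe lists any processes holding open handles
# whose name contains the given string; -accepteula suppresses the EULA prompt
& "C:\Tools\handle.exe" -accepteula "NewRelicAgent_x64_3.9.146.0.msi"
```

If a scanner process shows up against the freshly extracted file, that would explain the IOException.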

Paul

You might also find that simply putting a retry around the attempt to extract solves the problem. In Octopus, we have a “custom installation directory” option where we copy files, and sometimes they might be locked. It isn’t nice, but we’ve found that simply trying, sleeping, and trying again a few times tends to make the extraction process more robust.
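Something along these lines (a rough, untested sketch; the function name and the retry/sleep counts are placeholders) is what I mean:

```
# Rough sketch: retry the extraction a few times if the file appears to be locked
function Extract-FileWithRetry($entry, $destinationFileName, $maxAttempts = 5)
{
    for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
        try {
            [System.IO.Compression.ZipFileExtensions]::ExtractToFile($entry, $destinationFileName, $false)
            return
        }
        catch {
            # Give up after the last attempt; otherwise sleep and try again
            if ($attempt -eq $maxAttempts) { throw }
            Write-Host "Extraction failed (probably a lock); retrying (attempt $attempt of $maxAttempts)..."
            Start-Sleep -Seconds 2
        }
    }
}
```

You would then call that from your per-file loop instead of calling ExtractToFile directly.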

Paul

Hi Paul, thanks for the tip. Our server guy would do something like install AV on our deployment server. I will investigate.

I thought about adding some retry logic, but it just didn't make sense: the file doesn't exist (the whole folder subtree doesn't exist), and the process that creates the file (from the archive) is the same process that complains the file is locked. I'll try it out anyway.

Paul, you were spot on with the AV. We disabled Trend and it worked straight away; we ran it a few more times and all passed. Then we turned Trend back on and it started failing right away.

Thanks!