Failure deploying NuGet package with a single large (4GB+) database file

Hi, we have recently run into an issue deploying our customer database package. We are using the ‘Deploy a Package’ step template to deploy a single database file which, when unpacked, is 4,847,284 KB.

An exception is thrown: ‘Unable to read beyond the end of the stream.’

The root of the problem appears to be SharpCompress, which reads the ZIP headers via BinaryReader.FillBuffer(Int32)/ReadUInt32; a file this large exceeds the 32-bit size limit those reads impose.
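For context (not part of the original report): the standard ZIP format stores entry sizes in 32-bit fields, so anything larger than roughly 4 GiB needs the ZIP64 extension. A quick sanity check against the size reported above, assuming 1 KB = 1024 bytes:

```python
# Hedged sketch: check whether the reported unpacked size fits in a
# 32-bit (ZIP32) size field. Sizes are taken from the post above.
size_kb = 4_847_284            # unpacked size reported in the post
size_bytes = size_kb * 1024    # convert KB to bytes
zip32_limit = 2 ** 32 - 1      # max value of an unsigned 32-bit field (~4.29 GB)

print(size_bytes)              # 4963618816 (~4.96 GB)
print(size_bytes > zip32_limit)  # True: the file cannot be described in ZIP32 headers
```

So the file is roughly 0.7 GB past what a UInt32 size field can represent, which is consistent with the header reader running off the end of the stream.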

We are running Octopus version 3.7.18.

System.IO.EndOfStreamException: Unable to read beyond the end of the stream.
12:27:21   Error    |         at System.IO.BinaryReader.FillBuffer(Int32 numBytes)
12:27:21   Error    |         at System.IO.BinaryReader.ReadUInt32()
12:27:21   Verbose  |         Adding journal entry:
12:27:21   Error    |         at SharpCompress.Common.Zip.StreamingZipHeaderFactory.<ReadStreamHeader>d__1.MoveNext()
12:27:21   Error    |         at SharpCompress.Reader.Zip.ZipReader.<GetEntries>d__6.MoveNext()
12:27:21   Error    |         at SharpCompress.Reader.AbstractReader`2.MoveToNextEntry()
12:27:21   Error    |         at Calamari.Integration.Packages.NuGet.NupkgExtractor.Extract(String packageFile, String directory, Boolean suppressNestedScriptWarning) in Z:\buildAgent\workDir\14ffc968155e4956\source\Calamari\Integration\Packages\NuGet\NupkgExtractor.cs:line 64

OctopusLog.txt (11 KB)

Hi,

Thanks for getting in touch.

There is an open issue to fix large package deployments and we have moved it to the top of our priorities: https://github.com/OctopusDeploy/Issues/issues/2811

Cheers,
Shane