Cannot access a closed Stream?

Hi guys,
Thanks for the feedback, @Jonas. Hopefully the NuGet guys get onto this ASAP, as it seems to be a fairly fundamental problem.
Cheers,
Rob

Been running with /m:1 for a week now and it seems to have fixed the issue with NuGet.
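
For anyone trying the same workaround: /m:1 is MSBuild's maximum-node-count switch forced to a single node, so projects no longer build in parallel. The solution name below is just a placeholder:

    msbuild MySolution.sln /m:1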

Hi,

I think /m:1 is only a workaround when building a single solution. We have 50+ solutions in our build, and we build as many of them as possible in parallel.

/Jan

I have the exact same issue in OctopusDeploy. My scenario is that TeamCity kicks off multiple deployments (in fact, releases that have an auto-deploy lifecycle) and consequently tries to pack several packages at the same time (via nuget pack, I believe). All 5 of them except the last one fail; retrying each of the 4 failed ones succeeds, which is a clear indication that the problem is parallel nuget pack processes interfering with each other.

Does anyone know of a workaround for this?
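
Purely as an illustration of the retry-until-success behaviour described above (not an official fix), a minimal C# sketch that shells out to nuget pack and retries on failure might look like the following; the nuspec name, retry count, and the assumption that nuget.exe is on the PATH are all placeholders:

    using System;
    using System.Diagnostics;

    class PackWithRetry
    {
        static void Main()
        {
            const int maxAttempts = 3; // placeholder retry count

            for (int attempt = 1; attempt <= maxAttempts; attempt++)
            {
                // Run "nuget pack" as an external process; MyPackage.nuspec is a placeholder
                // and nuget.exe is assumed to be on the PATH.
                var process = Process.Start(new ProcessStartInfo
                {
                    FileName = "nuget",
                    Arguments = "pack MyPackage.nuspec",
                    UseShellExecute = false
                });
                process.WaitForExit();

                if (process.ExitCode == 0)
                {
                    Console.WriteLine("Pack succeeded on attempt " + attempt);
                    return;
                }

                Console.WriteLine("Pack failed (exit code " + process.ExitCode + "), retrying...");
            }

            Console.WriteLine("Pack still failing after " + maxAttempts + " attempts.");
        }
    }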

This issue is still open.

Just a note that might shed some light on this: we had an issue with a completely unrelated library that also uses System.IO.Compression to uncompress ZIP packages, and on some client machines the same error occurs - ‘Cannot access a closed Stream’

This is due to the underlying file system not being able to write out the stream to IsolatedStorage (or another temp location).

The solution was related to the user account that the process ran under and the IsolatedStorage folder that Windows uses by default when the uncompressed size is too large to hold in memory. A third party tracked the issue down with Process Monitor (https://technet.microsoft.com/en-us/sysinternals/processmonitor), which showed the OS probing different folders to find a location where it could save the files; when it could not find one, that is when we saw the error.
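
A quick way to check whether the account in question can write to isolated storage at all is a small diagnostic probe like the one below; this is just a sketch (the file name is arbitrary), not part of NuGet itself:

    using System;
    using System.IO;
    using System.IO.IsolatedStorage;

    class IsolatedStorageProbe
    {
        static void Main()
        {
            try
            {
                // Open the user-scoped isolated storage store for this assembly
                // and attempt to write (then delete) a small throwaway file.
                using (var store = IsolatedStorageFile.GetUserStoreForAssembly())
                {
                    using (var stream = new IsolatedStorageFileStream("probe.tmp", FileMode.Create, store))
                    using (var writer = new StreamWriter(stream))
                    {
                        writer.WriteLine("probe");
                    }
                    store.DeleteFile("probe.tmp");
                }
                Console.WriteLine("Isolated storage is writable for this account.");
            }
            catch (IsolatedStorageException ex)
            {
                Console.WriteLine("Isolated storage is not writable: " + ex.Message);
            }
        }
    }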

The following article relates to a very similar issue with the same solution:

http://www.gemboxsoftware.com/support-center/kb/articles/23-isolatedstorageexception-when-writing-a-large-xslx-files