I had a problem that I fixed using these instructions:
[Quote]
Failed to process request. ‘Request Entity Too Large’.
The remote server returned an error: (413) Request Entity Too Large…
Interestingly, it's an IIS web server error, and after poking around with the server argument I noticed that removing the /api/v2/package path made the push work just fine.
These instructions helped me as well. We were encountering a strange error message when pushing packages, either from the command line or via OctoPack in the MSBuild arguments:
Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
But we only received these messages when packages exceeded a certain size. We could not determine whether it was our network or IIS until we hit this post.
Note: per the MS documentation on the two fields in the config file (maxRequestLength and maxAllowedContentLength), the first is in kilobytes and the second is in bytes.
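To make the unit difference concrete, here is a minimal web.config sketch setting both limits to 100 MB (the element names are the standard ASP.NET/IIS ones; the 100 MB value is just an illustration, pick whatever your feed needs):

```xml
<configuration>
  <system.web>
    <!-- maxRequestLength is in KILOBYTES: 102400 KB = 100 MB -->
    <httpRuntime maxRequestLength="102400" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is in BYTES: 104857600 bytes = 100 MB -->
        <requestLimits maxAllowedContentLength="104857600" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

If the two values don't agree, the smaller one wins, which is an easy way to end up with a 413 even after "raising the limit".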
Hello! Octopus 2.x uses a different hosting type, so this solution doesn't work, does it? We have a similar problem now: we can't upload a 300 MB package to our internal NuGet feed. How can this be fixed?
BTW we've tried adding the parameters mentioned above to the Octopus.server.exe.config file, but it hasn't helped us.
When we refer to IIS in this case, we mean our NuGet server, which runs on top of IIS, not Octopus. That's why the web.config changes worked. What type of NuGet host are you using? Are you using the internal NuGet server in Octopus? An external site like myget.org, or a commercial product like ProGet?
NuGet.exe implements a (default) 5 minute timeout for push operations.
When you call NuGet.exe on the command line, adding -Timeout 601 will extend this to 10 minutes and one second, etc.
You need to make sure the timeout value isn’t a multiple of 60, because of this NuGet bug: https://nuget.codeplex.com/SourceControl/network/forks/styx31/nuget/contribution/5969
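For example (the package name, feed URL, and API key here are placeholders, substitute your own):

```
nuget.exe push MyPackage.1.0.0.nupkg -Source http://myserver/nuget -ApiKey <your-api-key> -Timeout 601
```

Note the 601 rather than 600: per the bug linked above, the value shouldn't be a multiple of 60.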
Yes, the issue was not related to the 30 MB limit; it was solely a timeout issue. Why we hit the timeout is still unclear, but it looks like we had a bandwidth problem transferring data between AWS servers, perhaps because we used external IPs to upload packages. Once we switched to internal IPs, the problem went away.
Dmitry is correct (high-five): we do not impose any limit. But NuGet has a timeout when uploading the file, which for large files can make it seem like we do.
Generally if the network connection isn’t having major issues, adding the timeout flag will help.
I added the settings below to increase the package upload size to 50 MB, but I am still getting the same error. Is there any other setting I am missing? Could you please help me fix it?
Error:
Failed to process request. ‘Not Found’.
The remote server returned an error: (404) Not Found…
Thanks for getting in touch! A 404 error is likely to be a problem with the URL being wrong, not with timeouts. Could you show me what the URL looks like?
I am experiencing the same issue that Ram is having. The only thing is that it works for every package except my web project. It is ~40 MB, and I have upped the max request length to 300 and still no dice.
Again, I am getting this error:
The remote server returned an error: (404) Not Found
@ianpaulin @Rammaram note that when using NuGet.Server, the URLs for list/restore and for push operations are different. This is in the NuGet.Server documentation, but it is pretty brain-dead if you ask me.
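For anyone hitting this: on a default NuGet.Server deployment the two endpoints look roughly like this (the host name is hypothetical, and exact paths can vary by version, so check your own server's landing page):

```
# Feed URL used for list/restore:
http://myserver/nuget

# Push source -- nuget.exe typically appends api/v2/package itself,
# so pass the site root rather than the full package endpoint:
nuget.exe push MyPackage.1.0.0.nupkg -Source http://myserver/ -ApiKey <your-api-key>
```

That would also explain the observation at the top of this thread that removing /api/v2/package from the source URL made the push work: pointing -Source at the full package endpoint can double up the path and produce a 404.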