Upgrade from 2.6 to 3.3.24 - Stuck at Convert documents


I am testing the upgrade process on a test VM with a copy of our production database, which has a backup size of 752 MB.

The database import was taking a large amount of time and I came across the --maxage parameter. I set this to 60, but the upgrade process was still running after 73 hours. The VM had 4 GB RAM and 4 virtual processors (2.49 GHz); the process got to “Convert documents”, consumed all of the memory, and no further output was written to the screen or log file.

I am now trying the upgrade again, but have not set any task logs to migrate. The VM now has dynamic memory (up to 16 GB) and the same processors. The command line I’m using is:

"D:\Program Files\Octopus Deploy\Octopus\Octopus.Migrator.exe" migrate --instance "OctopusServer" --file "C:\Octopus\20160729-140413.octobak" --master-key "abcdefghijklmnopqrstuvwxyz"

The output to the screen is:

Octopus Deploy: Migrator version 3.3.22 (3.3.22+Branch.master.Sha.badb547cec1597b49d77721a7bd900490b35fcaf)
Beginning database upgrade
Fetching list of already executed scripts.
No new scripts need to be executed - completing.
Migrator assembly version 3.3.22+Branch.master.Sha.badb547cec1597b49d77721a7bd900490b35fcaf

Add destination documents to identity map
18 destination documents added
Step took 00:00:00s

Match source documents to destination documents or new Ids
Step took 00:21:06s

Identify attachments
Step took 00:02:07s

Translate compound IDs
Did not satisfy dependencies for projects-449 on variableset-projects-449-snap
Did not satisfy dependencies for projects-577 on variableset-projects-577-snap
Step took 00:00:14s

Convert documents

The OctopusMigrator log file has a lot of entries with debug information (no errors); the last few lines are:

2016-08-08 11:17:12.3311 1 DEBUG Translated DashboardConfiguration-users-324 to DashboardConfiguration-Users-30
2016-08-08 11:17:12.3467 1 DEBUG Updating destinationId from $DashboardConfiguration-users-355 to DashboardConfiguration-Users-34
2016-08-08 11:17:12.3623 1 DEBUG Translated DashboardConfiguration-users-355 to DashboardConfiguration-Users-34
2016-08-08 11:17:12.3623 1 INFO Step took 00:00:14s
2016-08-08 11:17:12.3623 1 INFO
2016-08-08 11:17:12.3623 1 INFO Convert documents

Is this normal? How long should I expect the upgrade process to take?

Many thanks,


Hi Graham,

Thanks for getting in touch! 752 MB is a fairly large amount of data for the migrator to process, and the process is very memory intensive: it needs to hold every record and its new matching ID in memory so it can correlate all of the new records it is creating.
The time taken depends on the number of records, so progress can be slow. You should find that even when it appears stalled, a new line is added to the log every few minutes.
Throwing more RAM at the problem does help, and limiting the amount of history to import via --maxage also helps. 16 GB should be enough.
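To illustrate why memory use grows with the number of records, here is a minimal sketch (in Python, and emphatically not the actual migrator code) of the identity-map idea described above: every source ID and its new destination ID must stay in memory for the whole run so that references inside later documents can be rewritten consistently.

```python
# Hypothetical sketch of an identity map, NOT the real Octopus migrator code.
# Every source-ID -> destination-ID pair is held in memory for the entire
# run, which is why memory use scales with the number of records imported.

def build_identity_map(source_ids, prefix="Projects"):
    """Assign each source document ID a new sequential destination ID."""
    return {src: f"{prefix}-{n}" for n, src in enumerate(source_ids, start=1)}

def translate_references(refs, identity_map):
    """Rewrite a document's references using the in-memory map."""
    return [identity_map[ref] for ref in refs]

old_ids = ["projects-449", "projects-577"]
id_map = build_identity_map(old_ids)
print(translate_references(["projects-577", "projects-449"], id_map))
# -> ['Projects-2', 'Projects-1']
```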

The command you are using is not limiting the age of the data to include, and it will also be importing the task logs, since that is the default; you have to tell the migrator not to import logs if that is what you want.
Changing both of these will speed up the process.

Let me know how you go.


Hi Vanessa,

Thank you for your reply.

The migration log file hasn’t been updated in 24 hours, but there are some entries in the OctopusServer log. Some of the entries are warnings stating that an API key is not valid.

What is the command I need to use to tell the migrator not to import the logs? I assumed that adding --include-tasklogs meant the logs would be imported, and that not specifying it meant they wouldn’t be included?

Should the migration log file still be updating? Is there any way to determine how far through the process it is?


Hi Graham,

Logs are imported by default. There is a --nologs option (this is the one you want) and an --onlylogs option (which imports or updates no other data and only converts logs), but by default the migrator will take the logs. The options are documented here: http://docs.octopus.com/display/OD/Migrating+data+from+Octopus+2.6+to+3.x
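For example, reusing the paths and placeholder master key from earlier in this thread (double-check the exact option spelling against the documentation page above), a run that limits history to 30 days and skips task logs would look something like:

```
"D:\Program Files\Octopus Deploy\Octopus\Octopus.Migrator.exe" migrate --instance "OctopusServer" --file "C:\Octopus\20160729-140413.octobak" --master-key "abcdefghijklmnopqrstuvwxyz" --maxage=30 --nologs
```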

If your migration log hasn’t updated, I would kill the process. There is no way to determine how far through it is; we have to read the file line by line, so we really don’t know.
Feel free to send through your migration log if you would like me to check the errors you are seeing.


Hi Vanessa,

As you suggested, I have killed the process and started the import process again using this command:

"D:\Program Files\Octopus Deploy\Octopus\Octopus.Migrator.exe" migrate --instance "OctopusServer" --file "C:\Octopus\20160729-140413.octobak" --master-key "abcdefghijklmnopqrstuvwxyz" --maxage=30 --nologs

The previous run, which had been going for 96 hours, still had nothing more written to its log file; the last few lines were:
2016-08-08 11:17:12.3623 1 INFO Step took 00:00:14s
2016-08-08 11:17:12.3623 1 INFO
2016-08-08 11:17:12.3623 1 INFO Convert documents

The process I have just started (with --maxage set to 30 and no logs) has reached the same point, and the log file looks similar; it hasn’t written anything in the last hour:
2016-08-11 10:13:29.7505 1 INFO Step took 00:00:00s
2016-08-11 10:13:29.7662 1 INFO
2016-08-11 10:13:29.7662 1 INFO Convert documents

Unfortunately, my company prohibits me from uploading log files containing any internal information. I understand that the forum post can be made private, but I am still unable to attach the log files.

How frequently should the log file be updated once it has reached the “Convert documents” stage?


Hi Graham,

Is the policy only about uploading, or can you email me the logs at support @ octopus dotcom?


We are stuck at the same position. Can anyone suggest how you resolved this? It’s stuck at:

Match source documents to destination documents or new Ids

Yes! I had to give the server 32 GB of RAM.

Hope that helps!

On 18 Oct 2016, at 19:31, jim <tender2+debe49a80e@tenderapp.com> wrote:

Hi Jim,

As Graham said, RAM is the solution. You should find that when it reaches that point, the RAM has hit its limit.
16 GB appears to be the minimum required to keep the migrator running, but more RAM will allow it to process faster.