Hey there and welcome to the forum!
I do think you’d be able to do this with Octopus, but the details on how would depend on a few factors.
Do you want to continue using the company’s sql program to do the syncing and script deployment (having Octopus call that program on your behalf)? Or are you looking to replace that program with a process in Octopus?
The solution will rely heavily on workers to run the database commands for you. The tricky part will be modelling the 300 databases in a way that makes it easy to run the same script across all of them. I have two thoughts on that.
The first one is to use Tenants and model each database as a tenant. I feel like this is a more complicated solution and not optimal if you always deploy to all 300 databases at the same time.
The second would be to model each database as a Cloud Region target, and have the script step that does the database work run on a worker, on behalf of the role assigned to those targets. This way you can have a single step in your process but still run it once per database, all 300 of them. The Cloud Region name is a bit of a misnomer: it was originally intended to represent different cloud regions for applications being deployed, but we’ve found more uses for them since they were added. The good news here is that Cloud Region targets do not count towards your license limit.
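If you go the Cloud Region route, you probably don’t want to click out 300 targets by hand, so you could script their creation against the Octopus REST API. Here’s a rough sketch of what that might look like. The database names, environment ID, and role are placeholders, and the payload shape (Cloud Regions using `CommunicationStyle: "None"`) is from memory, so please verify it against your server’s `/api` swagger docs before running anything:

```python
def cloud_region_payload(db_name, role="customer-db"):
    """Build a request body for POST /api/{space}/machines.
    Cloud Region targets use CommunicationStyle "None" (no agent to
    talk to) -- double-check this against your server's API docs."""
    return {
        "Name": db_name,
        "Roles": [role],                          # role your step targets
        "EnvironmentIds": ["Environments-1"],     # assumed environment id
        "Endpoint": {"CommunicationStyle": "None"},
    }

# One target per database; the naming scheme here is made up.
payloads = [cloud_region_payload(f"customer-db-{i:03d}") for i in range(1, 301)]

# Posting each payload would then look roughly like:
#   requests.post(f"{server}/api/Spaces-1/machines",
#                 headers={"X-Octopus-ApiKey": api_key},
#                 json=payload)

print(len(payloads))
```

Once the targets exist, a deployment to that role fans the single script step out across all 300 of them automatically.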
The exact scripts you run would depend on what tool you’re using to run them. We have many community step templates and samples around different DB deployment tools.
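Whatever tool you land on, the step body usually boils down to “run this script against this database.” As a sketch only: the `sqlcmd` invocation, server names, and database names below are illustrative, and you’d swap in your own tool (DbUp, Flyway, your company’s sql program, etc.) and pull the connection details from Octopus variables scoped to each target:

```python
import subprocess

def deploy_script(databases, script_path, dry_run=False):
    """Run one SQL script against each database via sqlcmd.
    With dry_run=True, just return the commands that would execute."""
    commands = []
    for db in databases:
        cmd = ["sqlcmd", "-S", db["server"], "-d", db["name"], "-i", script_path]
        commands.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)  # fail the deployment on error
    return commands

# Dry run: inspect the plan without touching any database.
planned = deploy_script(
    [{"server": "sql01", "name": "customer-db-001"},
     {"server": "sql01", "name": "customer-db-002"}],
    "changes.sql",
    dry_run=True,
)
for cmd in planned:
    print(" ".join(cmd))
```

With the Cloud Region approach the step runs once per target, so in practice the loop disappears and each run handles exactly one database from its target-scoped variables.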
I hope that helps get you started. If you need more info or want to schedule a call to discuss in detail, we can definitely arrange that.