I’ve defined two different roles, Web and DB. Obviously, different steps run on different roles.
When I deploy to an environment that has different physical machines for each role, I’m getting an error on one of the steps:
Import-Module : The specified module ‘WebAdministration’ was not loaded because no valid module file was found in any module directory.
I have defined a couple of Script Modules that I use to support various tasks. One of those modules imports WebAdministration because it works with IIS websites.
Even though none of the code executed on the DB server uses that module, the Script Module is still being evaluated there. And since the Web Site role isn’t installed on the DB server, WebAdministration isn’t available and we get that error.
Is there a way to bind Script Modules to specific steps or roles/machines, and not others? Or do my Script Modules need to support the “lowest common denominator” of any machine that runs any step?
Thanks for getting in touch! As you found, when you create a PowerShell module and use it in a project, we load it any time we are using PowerShell within that project. Unfortunately it’s not possible for us to know if a given module is going to be used by your script, and it would be a bad user experience to ask you to select the module for each place that we run PowerShell.
So when creating a module, it’s important for the module to run in all of those contexts. There are two ways you could make this work in this case:
First, instead of a module like this:
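(A minimal sketch of the problem shape; the function and site names are illustrative, not taken from your project.)

```powershell
# Hypothetical Script Module: Import-Module runs at module scope,
# so it executes on every machine that loads the module -
# including DB servers where WebAdministration doesn't exist.
Import-Module WebAdministration

function Remove-IisSite($siteName) {
    # The IIS:\ drive is provided by the WebAdministration module
    Remove-Item "IIS:\Sites\$siteName" -Recurse
}
```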
Write it like this:
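(Again an illustrative sketch, same hypothetical function as above: the import moves inside the function, so WebAdministration is only loaded on machines where the function is actually called.)

```powershell
function Remove-IisSite($siteName) {
    # Import deferred until the function runs, so merely loading
    # this Script Module on a DB server no longer fails.
    Import-Module WebAdministration
    Remove-Item "IIS:\Sites\$siteName" -Recurse
}
```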
Alternatively, when importing a module that may not exist, you could do this:
Import-Module WebAdministration -ErrorAction SilentlyContinue
This suppresses the error when the module can’t be found, so the Script Module still loads on machines without IIS. Hope that helps!
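If you’d rather not silently swallow the import failure, a variation (illustrative sketch, assuming the same hypothetical function) is to check whether the module is installed before importing, and fail with a clear message if an IIS function is called on a machine that doesn’t have it:

```powershell
# Only import WebAdministration where it is actually installed
if (Get-Module -ListAvailable -Name WebAdministration) {
    Import-Module WebAdministration
}

function Remove-IisSite($siteName) {
    # Guard: give a clear error instead of a confusing IIS:\ drive failure
    if (-not (Get-Module WebAdministration)) {
        throw "WebAdministration is not available on this machine"
    }
    Remove-Item "IIS:\Sites\$siteName" -Recurse
}
```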