Use NuGet to Share PowerShell Modules in Your Enterprise

NuGet is not just for developers! If you are an IT pro, you can use it too. NuGet is a relatively new tool from Microsoft that makes it easy to share and consume code. Microsoft is marketing it to developers as the way to share and use open source code in Visual Studio projects. It does a great job at this and is really starting to take off in the developer community. But we IT pros can take it and use it for PowerShell modules.

So why would you want to use NuGet to source your enterprise scripts?

  • Versioning – You can match your module version numbers to the NuGet package version.
  • Dependencies – A NuGet package can declare that it depends on another package, so the client checks whether that package is installed and, if not, installs it for you.
  • At least 10 other cool things that I haven’t discovered yet.

NuGet is a client-server application. You save a bunch of packages on a server, and you use the client to download and install those packages. Microsoft hosts a NuGet server that currently has around 1,500 published packages you can use.

Last week at TechEd, Scott Hanselman gave a talk on using NuGet in the enterprise. The quick version of the talk is that you can host your own internal NuGet server. In fact, all you really need to do is set up a file share. The other key point is that all you need to access the packages on a NuGet server is the NuGet command-line tool.

After seeing this talk, I was thinking this could be a great way to distribute and share PowerShell modules with the rest of my IT department. What I didn’t know was whether it would be easy to package up a module and then install it using NuGet. So I started playing. As an example, I am going to use a module I wrote to manage my work items in TFS.

The first thing to do is download the NuGet command-line tool. After you download it, be sure to unblock it, because it came from the Internet. I put the file in a directory in my Documents folder and created an alias to it.
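The setup might look something like this (a sketch; the paths and alias name are my own choices, and Unblock-File requires PowerShell 3 or later — on older versions, right-click the file, choose Properties, and click Unblock):

```powershell
# Assumes NuGet.exe has already been downloaded to this folder
$toolDir = "$HOME\Documents\Tools"

# Clear the "downloaded from the Internet" flag (PowerShell 3+)
Unblock-File "$toolDir\NuGet.exe"

# Make it callable as plain 'nuget'
Set-Alias nuget "$toolDir\NuGet.exe"
```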


Once that is set up, you can start creating a package. There are a couple things you need to do if you are using the command line version of Nuget.

First, you need to create a spec file. You can do this with the nuget spec command, which creates an XML file.
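Running it is a one-liner (the generated file name may vary with the version of the tool; recent versions default to Package.nuspec):

```powershell
nuget spec
# Should emit a .nuspec file full of placeholder metadata in the current directory
```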


There are all kinds of properties here that you can set. I am going to leave everything at the defaults for demo purposes, except that I am going to rip out the dependencies node and change the tag values to something more reasonable. This could obviously be automated very easily, but I simply used good old Notepad2.

Now the dependency node is gone and tags are updated.
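For reference, the trimmed-down file looks roughly like this (the id, description, and tags here are illustrative, not the exact values from my file):

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>tfs</id>
    <version>1.0</version>
    <authors>yourname</authors>
    <description>PowerShell module for managing TFS work items</description>
    <tags>PowerShell TFS Module</tags>
  </metadata>
</package>
```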


Once we have this, we can create the package, again using nuget.exe. I really should mention that there are conventions for structuring a package; you can read all about them in the NuGet documentation. I am going to completely ignore them, because all I really care about is the PowerShell module.

Nuget.exe has a pack command. You specify the directory to package up with the basepath parameter, along with the spec file and the output directory.
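Something along these lines (the paths are examples, not my actual layout):

```powershell
# Package everything under .\tfs using the spec file, drop the result in .\packages
nuget pack .\tfs.nuspec -BasePath .\tfs -OutputDirectory .\packages
```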


Now I have a package called tfs.1.0.nupkg. NUPKG is the extension for a NuGet package, but it is really just a zip file. In fact, you can rename it with a .zip extension and unzip the contents. In there are all the files for my module.
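If you want to poke around inside, a quick sketch (Expand-Archive requires PowerShell 5 or later; any zip tool works just as well):

```powershell
Rename-Item .\tfs.1.0.nupkg tfs.1.0.zip
# Unzip to see the module files inside (PowerShell 5+)
Expand-Archive .\tfs.1.0.zip -DestinationPath .\tfs-contents
```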

OK, so now that I have a package, I need to put it somewhere people can access it and pull it down with a NuGet client. There are tons of articles on how to create your own NuGet server.

If you just want to use a file share, you can do that. Or you can actually build a NuGet server web site. I guarantee that you won’t have to write any code, but you would have to install Visual Studio Express, or get a developer to build a quick app for you; the NuGet server itself ships as a NuGet package. With Visual Studio installed, I had my NuGet server up and running in under 10 minutes.

After you set up your NuGet server, there is a directory where you place your packages, appropriately named “Packages.” Here is that directory for my local instance of IIS. You can see that I have two packages up here: one for the DataONTAP module and the one we just created.


Here is the client listing the available packages from this source.
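In command form, that is just nuget list pointed at your source (the share path here is made up; an HTTP feed URL works the same way):

```powershell
# Ask the source which packages it has
nuget list -Source \\fileserver\NugetPackages
```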


For the sake of demonstration, I created a folder called c:\modules and used nuget.exe to install my module there.
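The install command looks like this (again, the source path is an example):

```powershell
# Pull the 'tfs' package down and unpack it under C:\modules
nuget install tfs -Source \\fileserver\NugetPackages -OutputDirectory C:\modules
```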


The only things wrong with this are that the folder name contains the version and the .nupkg file is still around. That can all be cleaned up pretty easily: manually, with PowerShell, or probably with some NuGet options I haven’t found yet.
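A quick PowerShell cleanup sketch, assuming the versioned folder layout shown above:

```powershell
# Strip the version from the folder name and drop the leftover package file
Rename-Item C:\modules\tfs.1.0 tfs
Remove-Item C:\modules\tfs\*.nupkg
```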

I will be testing this out this week to see what else I can come up with. I am looking forward to seeing how versioning works. Also, I want to wrap some of this functionality with advanced functions.

If you install Visual Studio and NuGet, you also get a set of PowerShell cmdlets to manage packages. However, as of right now that module is not a separate download. I’ll look into this and let you know what else I find. I am sure there will be at least one or two follow-up posts as I learn more about NuGet.

When Read-Host doesn’t quite cut it

Ninety percent of the time when you are writing PowerShell code, you can use parameters on advanced functions to get the data you need from a user. However, there are times when you may want a bit more control over the user experience. Out of the box, PowerShell provides a cmdlet called Read-Host.


From here you can use the variable in your code

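A typical use looks like this:

```powershell
# Prompt the user and capture the response
$name = Read-Host "Please enter your name"

# Use the variable later in your code
Write-Host "Hello, $name"
```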

This is cool, but what if you want to offer choices to the user, and customize the caption of the window in addition to the actual message? It turns out PowerShell has some messaging capabilities built in that are not exposed as cmdlets directly. If you have used -Confirm and -WhatIf on some cmdlets, you have probably seen this UI.


I thought it would be pretty cool to be able to use this functionality with my own custom choices, caption, and message, so I wrote a function called New-Choice that is up on PoshCode.
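The version on PoshCode is more complete, but the core of a function like this is just a wrapper around $host.UI.PromptForChoice. A minimal sketch (the parameter names here are mine, not necessarily those of the posted function):

```powershell
function New-Choice {
    param(
        [string]$Caption = 'Question',
        [string]$Message = 'Pick one',
        [string[]]$Choices = @('&Yes', '&No'),
        [int]$Default = 0
    )
    # Build the choice descriptions; '&' marks the hotkey character
    $descriptions = [System.Management.Automation.Host.ChoiceDescription[]] (
        $Choices | ForEach-Object {
            New-Object System.Management.Automation.Host.ChoiceDescription $_
        }
    )
    # Returns the index of the choice the user selected
    $host.UI.PromptForChoice($Caption, $Message, $descriptions, $Default)
}

# Example: custom caption and message, default choice is 'Yes'
$answer = New-Choice -Caption 'Restart' -Message 'Restart the service now?'
```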

Here are some examples of using the function.

In PowerShell.exe




And even in PowerGUI


In summary, this function provides a great way to deliver a rich user experience while maintaining control over the possible inputs the user can provide.

NetApp PowerShell Toolkit 1.4 Released! Get-NaHyperVHost

Last Friday, NetApp released version 1.4 of their PowerShell Toolkit, bringing the total to 501 cmdlets.


Their stuff just keeps getting better and better.

There are a couple of cmdlets I want to highlight because they were extremely useful for me the other day. We have several 8-to-10-node Hyper-V clusters, all using NetApp and iSCSI storage, and we have been moving VMs to faster disks on our NetApp. One challenge that can crop up is correlating which VMs in Hyper-V are stored on which volume or qtree on our NetApp.

We have a great ops guy who is super nitpicky about naming standards, and because of those standards we know exactly how everything lines up, at least for the VMs created in the last year or so. The problem is the legacy VMs in our development and test environments that don’t adhere to our standards. This is where Get-NaHyperV comes in to save the day. The cmdlet has actually been around for a while, but with this release it now supports clustered disks, which is exactly what we needed. In addition to getting info on our CSVs and the exact location of VHDs, we were also able to enumerate exactly which NetApp volume, qtree, and LUN the VM disk resources were associated with. Absolutely brilliant!


Here’s a screenshot of an example from the NetApp help for the cmdlet.


There is also a more generic cmdlet called Get-NaHostDisk, which does essentially the same thing for disks that are on the SAN but not necessarily associated with Hyper-V VMs. It can be used for clustered SQL Server or anything else that uses shared storage.
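A hedged sketch of how we run them (the controller name is made up, and I am assuming you have already loaded the toolkit and connected with Connect-NaController from the DataONTAP module):

```powershell
Import-Module DataONTAP
Connect-NaController filer01   # your NetApp controller name here

Get-NaHyperV     # maps Hyper-V VM disk resources to volume/qtree/LUN
Get-NaHostDisk   # same idea for other SAN disks (e.g. clustered SQL)
```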

I use these cmdlets nearly every day. I can’t tell you how much they have streamlined our processes and tooling for working with our storage. NetApp, keep up the good work!