January St. Louis PSUG meeting was a blast!

A couple of weeks ago I had the pleasure to attend another STL PSUG meeting. Mike Lombardi presented on “Getting Started with a Real Problem” and did a great job.

His scenario was someone who didn’t really know PowerShell at all and needed to troubleshoot a 3-server web farm where the nodes had different problems.

There were some technical difficulties with his lab setup (he used Lability, which was cool), but he stuck with it and we did all of the fixing in the scenario using a workstation rather than RDP’ing into the nodes.

The recording of the event (which was live-streamed) can be found here.

I will be presenting next month on writing functions that work with the pipeline.

–Mike

Debugging VisioBot3000

The Setup

Sometime around late August of 2016 (not long after the Windows 10 Anniversary Update), VisioBot3000 stopped working.  When I ran any of the examples in the repo, PowerShell hung whenever it tried to place a container on the page.

I had not made any recent changes to the code.  It failed on every box I had.

First attempts at debugging

So…I really get fed up with people who want to blame “external forces” for problems in their code.  When I found that none of the examples worked (though they obviously did when I wrote them), I figured that I must have done something stupid.

Hey!  I’m using Git!  Since I’ve got a history of 93 commits going back to March, I figured I could isolate the problem.

So…I reverted to a commit a few weeks earlier.  And it failed exactly the same way.

Back a few weeks before that.  No change.

Back to the first talk I gave at a user group….no change.

I gave up.

For several months.

Reaching out for help

After Thanksgiving, I posted a question on /r/Powershell explaining the situation.  I got one reply, suggesting that I watch things in ProcMon while debugging.  Seemed like a great thing to do.  When I got around to trying it, however, it didn’t show anything useful (at least to me…digging through the thousands of lines of output is somewhat difficult).

Making it Simple

Late last year, I thought I should come up with a minimal, verifiable example.  Rather than say “all of my code breaks”, I should be able to come up with the smallest possible example that breaks.  That meant including as little VisioBot3000 code as I could, to show that something’s up with Visio’s COM interface (or something like that).  So I went back to the slides I used when demonstrating Visio automation to the St. Louis PSUG back in March of 2016 and cobbled together an example:

 

# Minimal example: drive Visio directly through COM, no VisioBot3000 involved
$Visio = New-Object -ComObject Visio.Application
$doc = $Visio.Documents.Add('')
$page = $doc.Pages[1]
# Open the server stencil and drop a server shape on the page
$stencilPath = 'SERVER_U.VSSX'
$stencil = $Visio.Documents.OpenEx($stencilPath, 64)
$ServerMaster = $stencil.Masters['Server']
$bert = $page.Drop($ServerMaster, 5, 5)
# Open the built-in container stencil and drop a container around the server
$containerStencil = $Visio.Documents.OpenEx($Visio.GetBuiltinStencilFile(2, 2), 64)
$ContainerMaster = $containerStencil.Masters['Plain']
$container = $page.DropContainer($ContainerMaster, $bert)

And of course, that worked just fine on all of the boxes I had. That meant I had code that worked and code that didn’t work on the same box. Sounds like a great opportunity for debugging. I just needed to slowly change the working code until it didn’t work, right? That’s how my brain works, anyway.

Here’s what I came up with:

Import-Module c:\temp\VisioBot3000 -Force

New-VisioApplication  

New-VisioDocument C:\temp\VisioPrimitives1.vsdx 
$visio=get-visioapplication
$doc= $visio.ActiveDocument

Register-VisioStencil -Name Containers -BuiltIn Containers
Register-VisioStencil -path SERVER_U.VSSX -name Servers
Register-VisioShape -Name WebServer -From Servers -MasterName Server
Register-VisioContainer -Name Domain -From Containers -MasterName 'Plain'

$foo=$doc.Pages['Page-1']

New-VisioContainer -shape Domain -Label MyDomain -contents {
    New-VisioShape -master WebServer -Label PrimaryServer -x 5 -y 5
}

That code works, and it uses VisioBot3000 functions for just about everything. Notice the three variable assignments ($visio, $doc, and $foo). They have no logical effect on the diagram, and $foo is never used anywhere. However, if I leave out the line assigning to $foo, the script stops working.

Import-Module c:\temp\VisioBot3000 -Force

New-VisioApplication  

New-VisioDocument C:\temp\VisioPrimitives1.vsdx 
$visio=get-visioapplication
$doc= $visio.ActiveDocument

Register-VisioStencil -Name Containers -BuiltIn Containers
Register-VisioStencil -path SERVER_U.VSSX -name Servers
Register-VisioShape -Name WebServer -From Servers -MasterName Server
Register-VisioContainer -Name Domain -From Containers -MasterName 'Plain'

#Commenting out the following line makes the code hang when dropping the container
#$foo=$doc.Pages['Page-1']

New-VisioContainer -shape Domain -Label MyDomain -contents {
    New-VisioShape -master WebServer -Label PrimaryServer -x 5 -y 5
}

To add insult to injury, changing that line to either [void]$doc.Pages['Page-1'] or [void]$doc.Pages[1] also results in a script that doesn’t hang.

So…somehow accessing the Pages property of the current document is fixing the problem.

My brain hurts.

Anyone have any ideas about this?  I’d love to get some feedback on what in the world is going on.

 

–Mike

Where I’ve been for the last few months

As I mentioned in my previous posts, I kind of fell off the planet (blog-wise, at least) at the end of August. I had good intentions for finishing the year out strong.  There were three different items that contributed to my downfall.


First, I’ve been battling lots of different illnesses (none of them anything major) pretty much continually since early June.  For three entire months, I coughed all the time.  Right now, I can’t hear in one ear because of the fluid backed up there.  That ear has only been a problem for a few days, but the other one (which cleared out yesterday) had been full for three weeks.  Like I said, nothing major, no life-threatening conditions, but over time it wears you down.


Second, I broke down and bought a server.  I had been putting off this purchase, but some book royalty money came through and I pulled the trigger.  Buying it didn’t take long.  What has been interesting is learning to do all of the things that most of you sysadmins take for granted.  I’ve never really been a sysadmin, more of a developer/DBA/toolsmith who happens to really, really like a language which is designed for sysadmins.  So, I’ve been building Hyper-V hosts, lots of guests, building domains, joining domains, and trying to script as much as possible.  I’ve learned a lot and there’s still a lot to learn.  Most of it, though, is probably not stuff that I’ll blog about, because it’s really basic.  There might be a post or two coming, but it’s hard to say.


Third, and this one is the most “fun”, is that VisioBot3000 stopped working.  If you haven’t read my posts on automating Visio with PowerShell, VisioBot3000 is a module I wrote which allows you (among other things) to define your own PowerShell-based DSL to express diagrams in Visio.  By “stopped working” I mean that sometime around the end of August, trying to use Visio containers always caused the code to hang.

I am pretty good at debugging, so I tried the usual tricks.  I stepped through the code in the debugger.  The line of code that was hanging looked pretty innocent.  All of the variables had appropriate values.  But it caused a hang.  On my work laptop and my home laptop…two different OSes.  I tried reverting to an old commit…no luck.  I even tried copying code out of presentations I had done on VisioBot3000, and the results were the same.  I even posted on the PowerShell subreddit asking for ideas on how to debug.  The only suggestion was to use Sysinternals Process Monitor to follow the two processes and see if I could find what was causing the issue.  I tried that a week or two ago (sometime during the holidays) and guess what?  It started working on my work laptop.  Still doesn’t work on my home laptop, though, or on the VM I built and kept unpatched to see if a patch was the culprit.


Conclusion: I’m mostly better health-wise, am getting comfortable with the server, and VisioBot3000 is working somewhere, so I should be back on track with rambling about PowerShell.

–Mike

PowerShellStation 2017 Goals

Following up on yesterday’s post reviewing my progress on goals from 2016, I thought I’d try to set out some goals for the new year.  I’m going to organize them into three areas: Content, Community, and Technology.

Content Goals

  1. Write 100 posts.  I didn’t do so well with this last year, but this year will be different.  I’m not sure why I don’t write more often.  I enjoy writing and feel good about myself when I do it.  I’m going to try to be consistent with it as well, not having several months with no posts.
  2. Write a book.  I’ve written a couple of books with a publisher (here and here) and I think that was valuable experience.  I’m going to try to do it on my own.  That should enable me to keep the cost down.  I’m also going to try to do it a lot quicker (and maybe shorter) than the other books.  BTW…I’ve already started.
  3. Write a course.  I love to teach PowerShell, and I’ve got a lot of practice doing it at work.  I’m considering recording “lectures” for a course (like on Udemy).
  4. Edit/Contribute 50 topics on StackOverflow.com’s Documentation project for PowerShell.  It seems like a reasonable platform for information about PowerShell, and there are already a bunch of topics there ready to be filled in.

Community Goals

  1. Start a regional PSUG in Southwest Missouri.  I live there, so it’s silly for me to have to drive 3 hours to go to a user group meeting.  I don’t intend to stop going to those long-distance meetings altogether, but there are a lot of people in SWMO who don’t have a group.
  2. Continue Speaking.  If they’ll have me, I plan to continue speaking at local or regional user groups.  I’m also looking for “nearby”  SQL Saturdays, PowerShell Saturdays, or other settings.
  3. Continue the UG and teaching at work.  This one is pretty easy, but I don’t want to get distracted and let these fall apart.

Technology Goals

  1. Get handy with DSC and/or Chef.  I’m still scripting virtualization/provisioning “manually” (i.e. scripting the steps I’d do manually) rather than using a system to do that for me.  I wanted to do it that way so I would understand what goes on, but now that I know, I want to be out of that business.  DSC is almost certain to be part of the equation.  Chef might be, but that’s an open question (as are Packer, Vagrant, Ansible, etc.).
  2. Deploy operational tests with Pester/PoshSpec/OVF.  I see a lot of promise with these, but everything is single-machine focused.  Something like this looks like a good start, but needs some flexibility.
  3. Nano, Containers, or (flavor of the month).  This one is kind of a wildcard.  These two (Nano Server and containers) are new technology solutions that I understand at a surface level, but I don’t have a good idea why or where I would use them.  I’m not sure if I’ll dig into one of these two or something else that pops up during the year, but there will be an in-depth project.

Bonus Goal

If I can get good with DSC, I really want to be able to spin up an entire environment from scratch.  By that, I mean from scripts (and downloaded ISOs) I want to be able to create a DC (with certificate services) and a DSC pull server, and then deploy the servers for a lab and have them configure themselves via the pull server.  For more of a bonus, use the newly created certificate services server to handle the passwords properly in the DSC configs.  By the way, I’m aware of Lability and PS-AutoLab-Env.  They’re both awesome but not quite what I’m looking for here.

 Those ought to keep me busy for the year.  What are you planning to do/share/learn this year?  Write about it and post a link in the comments!

–Mike

PowerShellStation 2016 Goals Review

I did a goal review back on August 22, reporting some good progress on my yearly goals and some plans for the remainder of the year.  Somehow, I seem to have fallen off the earth since then.  I only posted twice since then, and both of those were in the week following the review.  I’ll be posting this week about what happened (spoiler alert…not much).

In the meantime, here’s how I did on my goals for 2016

  1. 100 posts. I only got to 35.  That’s kind of embarrassing.  On the plus side, I had some of my best months in the last year (January – 10 posts, April – 8 posts, August – 7 posts).  If I could keep that kind of momentum it would make a lot of difference.  On the down side, if you exclude those 3 months I only had 10 posts in the remaining 9 months.  That’s abysmal.
  2. Virtualization Lab. In my review I mentioned that the box I bought to do virtualization on was only at 16GB of RAM and I needed to bump it up.  Didn’t do that.  I also mentioned the possibility of buying an R710 off of eBay.  Did that.  Dual quad-core CPUs, 36GB of RAM, 8TB of storage (so far).  I’ve done more virtualization since I bought it (in October) than I had ever done before.  I can definitely say I got this goal accomplished!
  3. Configure Servers with DSC. Other than the talk I did at MidMo, I haven’t really done much DSC this year.  Now that I’ve got a solid lab machine, this is high on the list for 2017.
  4. PowerShell User Group. I’ve started a PSUG at work (I work for a software company, so there are hundreds of people using PowerShell) and we’ve had 3 meetings so far.  They’ve mostly been sharing news and what we’re working on, but it’s a good start.  Beginning to form a community there.  Also, I attended several (more than a dozen, fewer than 20) meetings of local-ish PSUGs in Missouri.  The two I know of are each a 3-hour drive one way, so that’s a challenge, but they’ve been great.  They both started this year, and I’ve tried to lend my support as much as I can.  I’ve spoken 6 or 7 times (I didn’t keep track) and had a great time at all of the meetings.
  5. Continue Teaching at Work.  Did lots of teaching.  I’d have to check the calendar to get a real total, but it was at least 10 days of teaching.
  6. Share more on GitHub. Really got into GitHub this year.  VisioBot3000, SQLPSX, POSH_Ado, etc. Next step: the PowerShell Gallery!
  7. Write more professional scripts. I think this will always be a goal of mine.  I’ve published a couple of checklists and try to be thoughtful about how to write better code as I’m writing it, but I often find myself writing “throwaway” code and cleaning it up later.  Need to eliminate as much of that first step as possible.
  8. Speak. I’ve spoken at 6 user group meetings this year and at 2 or 3 others in the past.  If you’ve got a UG within driving distance of SW Missouri (KS, NW Arkansas, Oklahoma), let me know…I really enjoy sharing what I’m doing as well as speaking on “general” PowerShell topics.  Also, it doesn’t need to be a PSUG…I’ve spoken at .NET and SQL groups as well.
  9. Encourage. Another perennial task.  I haven’t been as active in this as I have in the past.
  10. Digest. (from the goal review)

I get about 10 different paper.li daily digests either in email or on twitter. I don’t find a lot of value in them…they don’t seem to be curated for the most part, but I think adding another into the fray at this point would probably be lost. I’m going to skip this one this year…but keep it on the back burner.

I’ve been thinking about maybe doing something slightly different here.  Maybe a “module of the month” or “meet a PowerShell person” regular post.  Any suggestions?

Well…by my count I accomplished 6 (maybe 7) of the 10 goals from last year.  If you haven’t thought about what you’re going to try to accomplish this year, I highly recommend you do.  Remember, if you don’t know where you’re going, you might not like where you end up!  A concrete list of goals, shared with friends (or with the public), makes it easy to know whether you’re achieving them or have lost sight of them.

–Mike

Module Structure Preferences (and my module history)

Modules in the Olden Days

Back in PowerShell 1.0 days, there were no modules.  Uphill both ways.  In 1.0, we only had script files.  That meant you had a couple of choices when it came to deploying functions.  Either you put a bunch of functions in a .ps1 file and dot-sourced it, or you used parameterized scripts in place of functions.  I guess you could have put a function in a .ps1 file, but you’d still need to dot-source it, right?  Personally (at work), I had a bunch of .ps1 files, each devoted to a single “subject”.  I had one that dealt with db access (db_utils), one that interfaced with PeopleSoft (ps_utils), one for scheduled tasks (schtasks_utils), and so on.  You get the idea.
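
If you never lived through that era, “deploying” a library of functions looked roughly like this (the path and the function name are made up for illustration):

# Dot-source the .ps1 so its functions land in the current session
. C:\Scripts\db_utils.ps1

# Now anything db_utils.ps1 defined is available
Get-DbConnection -Server 'SQL01'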

PowerShell 2.0 – The Dawn of Modules

One of the easiest features to love in PowerShell 2.0 was support for modules (although remoting and advanced functions are cool, they’re not quite as easy).  There’s a $PSModulePath pointing to places to put modules in named folders, and in the folders you have .psm1 or .psd1 files.  There are other options (like .dll), but for scripts, these are what you run into.

Transitioning into modules started out easy for me:  I just changed the extensions of the .ps1 files to .psm1.  I had written functions (require and reload) which knew where the files were stored and handled dot-sourcing them.  You had to dot-source require and reload, but it was clear what was going on.  When modules were introduced, I changed the functions to look for .psm1 files and import them with Import-Module if they existed, and just carry on as before otherwise.
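
If you’re curious, require ended up roughly along these lines (a from-memory sketch, not the original code; the library path is made up):

function require {
    param([string]$Name)
    # Where the library files lived (path is hypothetical)
    $libraryPath = 'C:\PowerShellLib'
    $psm1 = Join-Path $libraryPath "$Name.psm1"
    if (Test-Path $psm1) {
        # A module version exists: import it
        Import-Module $psm1 -Force
    } else {
        # Fall back to dot-sourcing the old .ps1 file
        . (Join-Path $libraryPath "$Name.ps1")
    }
}

# Usage - note the leading dot, so anything dot-sourced inside the
# function ends up in the calling scope:
. require db_utils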

That’s Kind of Gross

Yep.  No module manifests, and dozens of .psm1 files in the same folder.  To make it worse, I wasn’t even using the PSModulePath, because the .psm1 files weren’t in proper named folders.  The benefit for me was that I didn’t have to change any code.  I let that go for several years.  Finally I broke down and put the files in proper folders and changed the code to stop using the obsolete require/reload functions and use Import-Module directly.  I still haven’t written module manifests for them.  I’m so bad.

What about new modules?

Good question!  For new stuff (written after 2.0 was introduced), I started with the same module structure:  a single .psm1 file with a bunch of functions in it.  I’d probably put an Export-ModuleMember *-* in there to make sure that any helper functions didn’t “leak”, but that was about it.  To be fair, I didn’t do a lot of module writing for quite a while, so this wasn’t a real hindrance.
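
For the record, that early layout looked something like this (a sketch; the function names are made up):

# MyStuff.psm1 - one file, several functions
function Get-Widget { param($Name) <# ... #> }
function Set-Widget { param($Name,$Value) <# ... #> }
function initializeCache { <# helper: no dash in the name, so it stays private #> }

# Export anything that looks like Verb-Noun; helpers without a dash don't leak
Export-ModuleMember -Function *-*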

Is there a problem with that?

No…there’s no problem with having a simple .psm1 script module containing functions.  At least from a technical standpoint.  Adding a module manifest is nice because you can document dependencies and speed up intellisense by listing public functions, but that’s extra.
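
If you do want a manifest, New-ModuleManifest will generate one for you (the names below are made up):

# Generate a manifest that documents dependencies and lists public functions
New-ModuleManifest -Path .\MyStuff\MyStuff.psd1 `
    -RootModule MyStuff.psm1 `
    -RequiredModules SomeDependency `
    -FunctionsToExport Get-Widget, Set-Widget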

The problem came when I wrote a module with a bunch of functions.  VisioBot3000 isn’t huge, but it has 38 functions so far.  At one point, the .psm1 file was over 1300 lines long.  That’s too much scrolling and searching in the file to be useful in my opinion.

What’s the recommended solution?

I’ve seen several posts recommending that each function should be broken out into a single .ps1 file and the .psm1 file should dot-source them all.  That definitely gets past the problem of having a big file.  But in my mind it creates a different problem.  The module directory (or sub-folder where the .ps1 files live) gets really big and it takes some work to find things.  Lots of opening and closing of files.  And the dot-sourcing operation isn’t free…it takes time to dot-source a large set of files.  Not a showstopper, but noticeable.
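
For reference, the loader in that pattern is usually just a .psm1 that dot-sources everything (a sketch of the common pattern, not actual VisioBot3000 code; the Functions folder name is a convention I’m assuming):

# MyModule.psm1 - nothing but a loader
$functionFiles = Get-ChildItem -Path (Join-Path $PSScriptRoot 'Functions') -Filter *.ps1
foreach ($file in $functionFiles) {
    . $file.FullName    # dot-source so each function lands in the module's scope
}
Export-ModuleMember -Function *-*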

My tentative approach

How I’ve started organizing my modules is similar to how I organized “modules” in the 1.0 era.  Back then, each file was subject-specific.  In VisioBot3000, I split the functions out based on noun.

[Screenshot: the VisioBot3000 source split into one .ps1 file per noun]

I still have relatively short source files, but now each file generally has a get/set pair, and if other functions use the same noun they’re there too.

I’ve found that I often end up editing several functions in the same file to address issues, enhancements, etc.  I think it makes sense from a discoverability standpoint as well.  If I were looking at the source, I’d find related functions grouped in the same file, rather than having to scan the directory for files with similar filenames.

Anyway, it’s what I’m doing.  You might be writing all scripts (no functions) and liking that.  More power to you.

Let me know what you think.

–Mike

VisioBot3000 Settings Import

It’s been a while since I last spoke about VisioBot3000.  I’ve got the project to a reasonably stable point…not quite feature complete but I don’t see a lot of big changes.

One of the things I found even as I wrote sample diagram scripts was that quite a bit of the script was taken up by things that would probably be done exactly the same way in most diagrams.  For instance, if you’re doing a lot of server diagrams, you will probably be using the exact same stencils and the same shapes on those stencils, with the same nicknames.  Doing so makes it a lot easier to write your diagram scripts because you’re developing a “diagram language” which you’re familiar with.

For reference, here’s an example script from the project (without some “clean up” at the beginning):

Diagram C:\temp\TestVisio3.vsdx  

# Define shapes, containers, and connectors for the diagram
Stencil Containers -From C:\temp\MyContainers.vssx 
Stencil Servers -From SERVER_U.vssx
Shape WebServer -From Servers -MasterName 'Web Server'
Container Location -From Containers -MasterName 'Location'
Container Domain -From Containers -MasterName 'Domain'
Container Logical -From Containers -MasterName 'Logical'
Connector SQL -Color Red -arrow 

#this is the diagram
Set-NextShapePosition -x 3.5 -y 7
Logical MyFarm {
    Location MyCity {
        Domain MyDomain  {
            WebServer PrimaryServer
            WebServer HotSpare

        }
    }
    Location DRSite {
        Domain MyDomain -name SiteB_MyDomain {
            Set-RelativePositionDirection Vertical
            WebServer BackupServer

        }
    }
}
SQL -From PrimaryServer -To BackupServer 
Hyperlink $BackupServer -link http://google.com

Complete-Diagram 

Lines 3-10 are all about setting up the standard shapes and stencils. The rest of the script is directly involved in the specific diagram I’m trying to build.

As an aside, one of the things I love about PowerShell is how much of the code I write is directly involved in solving the problem I’m working on. You don’t have to “code around PowerShell” very often.

Anyway, back to Visio.  Since a lot of the code is duplicated, I thought…how can I get that out of the diagram script?

My first thought was to write a function which took the name of an “include script” and somehow run the script.  The problem there is scoping.  Since the function is called (not dot-sourced) it couldn’t easily affect the global state, so shape, container, and connection functions wouldn’t get created.  That’s not useful.

My second thought was to allow an “include” script and have the user dot-source it, but that wouldn’t be very obvious to someone looking at the diagram script.  Also, they could put anything in it, and it would effectively be “hidden” (for instance, if they put code unrelated to stencils, shapes, etc.).
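
The scoping problem is easy to see with a toy example (VisioInclude.ps1 is hypothetical):

# If an include script is run from inside a function, its definitions die
# with the function's scope:
function Import-IncludeScript {
    param($Path)
    . $Path            # dot-sourced, but only into this function's scope
}
Import-IncludeScript .\VisioInclude.ps1
# anything VisioInclude.ps1 defined is already gone

# The user would have to dot-source it themselves for the definitions to stick:
. .\VisioInclude.ps1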

I finally decided that a .psd1 file (like a module manifest) would be the best approach.  I could structure the hashtable in it to have the information I needed to set up the stencils, shapes, and connections, and read it in using PowerShell.  I would just need to write a “dispatch” function which transformed the items in the hashtable into the correct function calls.

So…reading in a .psd1 file is interesting.  I’m using PowerShell 5.0, so one option (which I hadn’t heard of before) is Import-PowerShellDataFile.  This cmdlet was created specifically for this purpose.  It has a -Path parameter which points to the .psd1 file, and it returns a hashtable.  Nothing could be easier.  But…I really don’t want to pin the project to 5.0, especially just for this function.
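
In 5.0 it really is a one-liner (the path here is just an example):

# PowerShell 5.0+ only
$settings = Import-PowerShellDataFile -Path C:\temp\VisioSettings.psd1
$settings.Stencils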

Another option is Import-LocalizedData.  This one is a bit trickier, since it’s meant to read psd1 files in language-specific folders for purposes of localization.  To make it do what I want, I needed to split the path into a directory and filename, but then it does the trick.  This is a 4.0 cmdlet, which I guess I’m ok with.
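
Here’s roughly what that looks like (the path is just an example):

# Split the path so the cmdlet knows which directory and file to read,
# rather than hunting for culture-specific subfolders
$settingsPath = 'C:\temp\VisioSettings.psd1'
Import-LocalizedData -BindingVariable settings `
                     -BaseDirectory (Split-Path -Path $settingsPath) `
                     -FileName (Split-Path -Path $settingsPath -Leaf)
$settings.Stencils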

There are other options for reading .psd1 files.  Most of them involve using Invoke-Expression, which I try to avoid whenever I can.

And now that I’ve read those, I see the ArgumentToConfigurationDataTransformation parameter attribute, which is a 4.0 feature added for DSC.  It’s even better than what I wrote (using Import-LocalizedData).
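
If I’m reading it right, you put the attribute on the parameter and it does the file-to-hashtable conversion for you; something like this (an untested sketch based on how DSC uses it, so take it with a grain of salt):

function Import-VisioSettings {
    [CmdletBinding()]
    param(
        # The same transformation DSC applies to -ConfigurationData: pass a
        # path to a .psd1 and the parameter shows up as a hashtable
        [Microsoft.PowerShell.DesiredStateConfiguration.ArgumentToConfigurationDataTransformation()]
        [hashtable]$Settings
    )
    $Settings
}

Import-VisioSettings -Settings C:\temp\VisioSettings.psd1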

Time to re-write that function.

Any way you implement it, the function call in VisioBot3000 would look like this:

Import-VisioSettings -path <path to settings file>

and the settings file would look something like this:

@{
StencilPaths='c:\temp'

Stencils=@{Containers='C:\temp\MyContainers.vssx';
           Servers='SERVER_U.vssx'}

Shapes=@{WebServer='Servers','Web Server';
         DBServer='Servers','Database Server'
        }
Containers=@{Domain='Containers','Domain'
            }

Connectors=@{SQL=@{Color='Red';Arrow=$true}}

}
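
For what it’s worth, the “dispatch” part doesn’t need to be much more than a loop over each section of that hashtable.  Here’s a rough sketch (not the actual VisioBot3000 implementation, and it ignores StencilPaths):

function Import-VisioSettings {
    param([string]$Path)
    # However the .psd1 gets read (see the options above), $settings ends up a hashtable
    $settings = Import-LocalizedData -BaseDirectory (Split-Path $Path) -FileName (Split-Path $Path -Leaf)

    foreach ($name in $settings.Stencils.Keys) {
        Register-VisioStencil -Name $name -Path $settings.Stencils[$name]
    }
    foreach ($name in $settings.Shapes.Keys) {
        $from, $master = $settings.Shapes[$name]
        Register-VisioShape -Name $name -From $from -MasterName $master
    }
    foreach ($name in $settings.Containers.Keys) {
        $from, $master = $settings.Containers[$name]
        Register-VisioContainer -Name $name -From $from -MasterName $master
    }
    foreach ($name in $settings.Connectors.Keys) {
        $connectorParams = $settings.Connectors[$name]
        Connector $name @connectorParams
    }
}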

What do you think? Do you have anywhere you can use psd1 files to streamline your scripts?
–Mike

August 2016 Goal Review

Thought I’d take a minute and review the progress on my goals for the year.

  1. 100 posts. I’m at 32 (33 if you count this) so it’s going to take some dedication to make it. It’s already 10 more than last year, so that’s progress, but with 19 weeks left and 68 posts left, I’m going to have to beat 3 per week. I’m going to try. Might have to set a schedule (which would really help, but I’m not really wired that way).
  2. Virtualization Lab. I’ve got a couple of boxes running Hyper-V (client Hyper-V, but with nested virtualization I can get a lot closer). I’ve been playing with building servers, sysprep, differencing disks, etc. I feel like I’ve pretty much got this one covered. Need to jump the big box up to 32GB though. Thinking about getting a R710 off of eBay for a “next step” on this…maybe next year.
  3. Configure Servers with DSC. I’ve done some work with DSC both at home and at work, and gave a talk on DSC at the MidMO PSUG this month. Feel good about this, too.
  4. PowerShell User Group. First meeting at work is two days from now! I’ve also been to about a dozen meetings in Missouri (and posting about some of them). I’ve been honored to speak at 6 meetings so far as well, so this one is good. Bonus: I’ve started paperwork for reserving meeting space in the town I work (rather than 3 hours away), but haven’t scheduled anything yet.
  5. Continue Teaching at Work. I might not get 10 sessions in, but I think I’m already to 7. Good on this one.
  6. Share more on GitHub. VisioBot3000, SQLPSX, POSH_Ado, etc. Next step: PowerShellGallery!
  7. Write more professional scripts. I haven’t really checked, but I think this one is getting better.
  8. Speak. As I mentioned in #4, I’ve spoken at 6 user groups this year, so more than covered. If you’ve got a PSUG within driving distance of SW Missouri (KS, NW Arkansas, Oklahoma), let me know…I really enjoy sharing what I’m doing as well as speaking on “general” PowerShell topics.
  9. Encourage. I’ve probably let this one slide a bit. Note to self to comment more on other people’s posts.
  10. Digest. I get about 10 different paper.li daily digests either in email or on twitter. I don’t find a lot of value in them…they don’t seem to be curated for the most part, but I think adding another into the fray at this point would probably be lost. I’m going to skip this one this year…but keep it on the back burner.

Re-Thinking Positional Parameters

I mentioned in a previous post that I’ve recently changed my mind a bit about the Position parameter attribute. I guess technically it is the position parameter of the Parameter parameter attribute (i.e. there’s a parameter attribute called “Parameter” and it has a parameter called position). I don’t think you could come up with something much more difficult to correctly name.

Anyway, I’ve always had a low opinion of this particular parameter. Before I explain why, let’s review how it works.

If you have an advanced function, say Get-Stuff, you might see something like this:

Function Get-Stuff{
[CmdletBinding()]
Param([Parameter(Position=0)]$param1,
      [Parameter(Position=1)]$param2,
      [Parameter(Position=2)]$param3)
    
    #body of function
    write-output @{Param1=$param1;Param2=$param2;Param3=$param3}
}

The point of the Position= parameter (of the Parameter parameter attribute in the Param statement) is to say that if you use positional parameters in calling the function, you can expect that $param1 is in the first position (position 0), $param2 is in the second position (position 1) and so on.

The great part about it is that this code works the same without the Position= statements. Parameters (ignoring parameter sets) can be used in the order they’re listed in the function unless the PositionalBinding parameter of the CmdletBinding attribute is set to $false (which it isn’t, by default).
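
In other words, the call below binds exactly the same way whether or not those Position= entries are there:

# All three arguments bind by position under the default PositionalBinding behavior
Get-Stuff 1 2 3
# ...which is equivalent to:
Get-Stuff -param1 1 -param2 2 -param3 3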

I’ve seen a lot of code which, in a function that has 12 parameters, mechanically sets Position= from 0 to 11 in order. It’s not exactly confusing, but it’s certainly not clear why you would bother doing it since it’s not necessary. From that observation, I decided that there was no point to using the Position= setting at all.

Micah Battin spoke a few months ago at the STLPSUG meeting on PowerShell Functions and got me to reconsider my position (no pun intended).

I explained my reservations and he said something to the effect that you should turn off automatic positional arguments (PositionalBinding=$false in the CmdletBinding()) and then set the position for the first one or two parameters. Those are the ones that people will be specifying positionally anyway.

Have you ever seen someone call a cmdlet and list 12 parameters with no names? I would guess not. There’s no reason to expect that people will want to write useless code like that, so why should your function definition encourage it?  By only including positions for the one or two “essential parameters” you’re doing your part to make the caller’s code better.

I’m going to be editing my “Advanced Function” snippet to include PositionalBinding=$False and see where it takes me.
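
Here’s roughly what that pattern looks like, applied to the earlier example (only the first parameter stays positional):

Function Get-Stuff{
[CmdletBinding(PositionalBinding=$false)]
Param([Parameter(Position=0)]$param1,   # the one "essential" parameter
      [Parameter()]$param2,             # must be passed by name
      [Parameter()]$param3)             # must be passed by name

    write-output @{Param1=$param1;Param2=$param2;Param3=$param3}
}

Get-Stuff 'first' -param2 'second'      # fine
# Get-Stuff 'first' 'second'            # error - 'second' has nowhere to bind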

What do you think?

–Mike

Hyper-V issues with Windows 10 Anniversary Update

My main home computer is running Windows 10 and Hyper-V, and I was really looking forward to the anniversary update.  PowerShell Direct, nested virtualization, and containers all sound awesome.  I’ve played with them in a test box, but not on my main box.

So…I got home after work the day of the update, installed it (took an hour or so) and rebooted.

First problem…Hyper-V management services wouldn’t start.  This was a problem with a driver…took a bit to find it, but got it solved that night.

Second problem….none of my VMs would start.  Had to re-create the virtual switch and re-assign to the VMs.  Not a huge deal.
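
Re-creating the switch and re-attaching the VMs was roughly this (the switch and adapter names here are just examples):

# Re-create the external switch and point every VM's network adapter at it
New-VMSwitch -Name 'External' -NetAdapterName 'Ethernet' -AllowManagementOS $true
Get-VM | Get-VMNetworkAdapter | Connect-VMNetworkAdapter -SwitchName 'External'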

Third problem…new 2016 TP5 VMs wouldn’t boot.  This one took a bit longer, and I found the actual workaround in a github issue comment.  I knew there was a reason I got a hundred or so of those a day.  🙂

The ingredients for this third problem are:

  • 2016 TP5
  • Gen 2 VM
  • Secure Boot
  • Version 8.0 firmware

If you have a VM (let’s say it’s called TheBadVM) you can get it to boot with the following command (with the VM stopped of course):

Get-VM -Name TheBadVM | Set-VMFirmware -EnableSecureBoot Off -SecureBootTemplate MicrosoftUEFICertificateAuthority

Then, the VM will start.
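
If you want to verify the change took before starting the VM, Get-VMFirmware will show it:

Get-VMFirmware -VMName TheBadVM | Select-Object SecureBoot, SecureBootTemplate
# SecureBoot should now show as Off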

I hope this helps you. I figured this might be easier to find than the original (thanks to Ryan Yates!).

–Mike