Great Books for PowerShell Ideas

I get asked a lot about what PowerShell books people should be reading. The easy answer is, “It depends”.

If you’re looking for a tutorial book (or two) to get you started with PowerShell, the only answer I give is “Learn Windows PowerShell in a Month of Lunches”, followed by “Learn PowerShell Toolmaking in a Month of Lunches”. There are other good books in this space (including one I wrote), but these are by far the best I’ve found.

If you’re looking for a reference book, I generally recommend Bruce Payette’s “PowerShell in Action”. It has a new version coming out soon (April?) and I can hardly wait. Besides that book, “PowerShell in Depth” (by Jones, Hicks, and Siddaway) is also a safe bet.

If you’ve got the basics of PowerShell down, and are looking for ideas for how to do something, here are some books that aren’t mentioned as often, but are indispensable:

  1. PowerShell Cookbook (Lee Holmes)
  2. PowerShell Deep Dives (several)
  3. PowerShell for Developers (Doug Finke)

What are your book recommendations? Did I miss something essential?

-Mike

Some small PowerShell bugs I’ve found lately

I love PowerShell. Everyone who knows me knows that. Recently, though, I seem to be running into more bugs. I’m not complaining, because PowerShell does tons of amazing things and the problems I’m encountering don’t have a huge impact. With that said, here they are.

Pathological properties in Out-GridView

PowerShell has always allowed us to use properties with names that aren’t kosher. For instance, we can create an object that has properties with spaces and symbols in the name like this:

$obj=[pscustomobject]@{'test property #1'='hello'}

This capability is essential, since we often find ourselves importing a CSV file that we don’t have any control over. (As an exercise, look at the expanded CSV output from schtasks.exe.) To access those properties we can use quotes where most languages don’t allow them:

$obj.'test property #1'

Or we can use a variable (again, something most languages don’t make easy):

$prop='test property #1'; $obj.$prop
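When you don’t know any of the property names in advance, enumerating them through PSObject works too. A quick sketch (the property names here are made up for illustration):

```powershell
# Enumerate every property, however odd the names are
$obj = [pscustomobject]@{'test property #1' = 'hello'; 'another one!' = 'world'}
foreach ($p in $obj.PSObject.Properties) {
    '{0} = {1}' -f $p.Name, $p.Value
}
```

This is handy for CSV imports where the headers are a surprise.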

A friend called me last week with an interesting issue which turned out to be related to this kind of behavior. He had a SQL query which renamed output columns in “pathological” ways. When he piped the output of the SQL to Out-GridView, the ugly columns showed up in the output, but the columns were empty.

Here’s a minimal case to reproduce the issue:

[pscustomobject]@{'test property.'='hello'} | out-gridview

The problem here is that the property name ends with a dot. Here’s a UserVoice entry that explains that Out-GridView doesn’t like property names that end in whitespace, either. I added a comment about dots for completeness’ sake.
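Until that gets fixed, one workaround (a sketch; the replacement column name is arbitrary) is to rename the pathological column with a calculated property before piping to Out-GridView:

```powershell
# Rename the offending column (trailing dot) so Out-GridView can display it
[pscustomobject]@{'test property.' = 'hello'} |
    Select-Object @{Name = 'test property'; Expression = { $_.'test property.' }} |
    Out-GridView
```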

Formatting remote Select-String output

Another minor issue I’ve run into is that deserialized Select-String output doesn’t format nicely. The issue looks to be that the format.ps1xml for MatchInfo objects uses a custom ToString() method that doesn’t survive serialization, so you just get blank lines instead of any helpful output. The objects are intact, though; all of the properties are there. Using the output is fine; only the formatting is broken. Here’s a minimal example:

"hello`r`n"*6 | Out-File c:\temp\testFile.txt
write-host 'Local execution'
select-string -Path c:\temp\testfile.txt -Pattern hello -SimpleMatch  

write-host 'Remote execution'
invoke-command -ScriptBlock{ select-string -Path c:\temp\testfile.txt -Pattern hello -SimpleMatch} -ComputerName localhost   
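Until the formatting issue is fixed, a workaround sketch (same test file as above) is to select the interesting properties explicitly, since they all survive deserialization:

```powershell
# The deserialized MatchInfo objects keep their properties,
# so explicit formatting brings the output back
Invoke-Command -ScriptBlock {
    Select-String -Path c:\temp\testfile.txt -Pattern hello -SimpleMatch
} -ComputerName localhost |
    Format-Table Filename, LineNumber, Line
```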

I didn’t find anything close on UserVoice, so I posted a new entry.

Neither of these caused any real problem, but they were fun to dig into.

What bugs have you found lately? Have you reported them?

-Mike

February STLPSUG Meeting

I had the privilege of sharing again at the STLPSUG. February’s meeting was at Model Technologies, and Jason Rutherford was a great host.

I spoke on being a good citizen on the pipeline, both for output and input. Basically, best practices for pipeline output (which is fairly straightforward), and techniques for accepting pipeline input (including $input, filters, and parameter attributes).

The group was a bit more advanced than usual, which was cool. There was a lot of fun heckling (I’ll give you $5 if you put $input in the process block, for instance) and a lot of participation from everyone.

As usual, after the presentation the talk turned into a giant DevOps discussion.

If you live anywhere near St. Louis and haven’t attended one of these meetings, I highly recommend them. Mike Lombardi has done a great job keeping the group moving and focused.

You can find out about upcoming meetings on meetup.com.

P.S. My friend and co-worker Ian was able to come with me this time. Made the drive a lot more fun, and he had a good time, too.

January St. Louis PSUG meeting was a blast!

A couple of weeks ago I had the pleasure to attend another STL PSUG meeting. Mike Lombardi presented on “Getting Started with a Real Problem” and did a great job.

His scenario was someone who didn’t really know PowerShell at all and needed to troubleshoot a 3-server web farm where the nodes had different problems.

There were some technical difficulties with his lab setup (he used Lability, which was cool), but he stuck with it and we did all of the fixing in the scenario using a workstation rather than RDP’ing into the nodes.

The recording of the event (which was live-streamed) can be found here.

I will be presenting next month on writing functions that work with the pipeline.

–Mike

Debugging VisioBot3000

The Setup

Sometime around late August of 2016, VisioBot3000 stopped working.  It was sometime after the Windows 10 anniversary update, and I noticed that when I ran any of the examples in the repo, PowerShell hung whenever it tried to place a container on the page.

I had not made any recent changes to the code.  It failed on every box I had.

First attempts at debugging

So…I really get fed up with people who want to blame “external forces” for problems in their code.  When I found that none of the examples worked (though they obviously did when I wrote them), I figured that I must have done something stupid.

Hey!  I’m using Git!  Since I’ve got a history of 93 commits going back to March, I figured I could isolate the problem.

So…I reverted to a commit a few weeks earlier.  And it failed exactly the same way.

Back a few weeks before that.  No change.

Back to the first talk I gave at a user group….no change.

I gave up.

For several months.

Reaching out for help

After Thanksgiving, I posted a question on /r/PowerShell explaining the situation.  I got one reply, suggesting that I watch things in ProcMon while debugging.  Seemed like a great thing to do.  When I got around to trying it, however, it didn’t show anything useful (at least to me…digging through the thousands of lines of output is somewhat difficult).

Making it Simple

Late last year, I thought I should come up with a minimal, verifiable example.  Rather than saying “all of my code breaks”, I should be able to come up with the smallest possible example that breaks.  To that end, I wanted to include as little VisioBot3000 code as I could, and show that something’s up with Visio’s COM interface (or something like that).  With that in mind, I went back to the slides I used when demonstrating Visio automation to the St. Louis PSUG back in March of 2016 and cobbled together an example:

 

$Visio = New-Object -ComObject Visio.Application
$doc = $Visio.Documents.Add('')
$page = $doc.Pages[1]
$stencilPath = 'SERVER_U.VSSX'
$stencil = $Visio.Documents.OpenEx($stencilPath,64)
$ServerMaster = $stencil.Masters['Server']
$bert = $page.Drop($ServerMaster,5,5)
$containerStencil = $Visio.Documents.OpenEx($Visio.GetBuiltinStencilFile(2,2),64)
$ContainerMaster = $containerStencil.Masters['Plain']
$container = $page.DropContainer($ContainerMaster,$bert)

And of course, that worked just fine on all of the boxes I had. That meant I had code that worked and code that didn’t work on the same box. Sounds like a great opportunity for debugging. I just needed to slowly change the working code until it didn’t work, right? That’s how my brain works, anyway.

Here’s what I came up with:

Import-Module c:\temp\VisioBot3000 -Force

New-VisioApplication  

New-VisioDocument C:\temp\VisioPrimitives1.vsdx 
$visio=get-visioapplication
$doc= $visio.ActiveDocument

Register-VisioStencil -Name Containers -BuiltIn Containers
Register-VisioStencil -path SERVER_U.VSSX -name Servers
Register-VisioShape -Name WebServer -From Servers -MasterName Server
Register-VisioContainer -Name Domain -From Containers -MasterName 'Plain'

$foo=$doc.Pages['Page-1']

New-VisioContainer -shape Domain -Label MyDomain -contents {
		    New-VisioShape -master WebServer -Label PrimaryServer -x 5 -y 5
}

That code works, and uses VisioBot3000 functions for just about everything. Notice the three variable assignment lines. They don’t have any logical effect on the code. The $foo variable is not used. However, if I leave the line assigning to $foo out, it stops working.

Import-Module c:\temp\VisioBot3000 -Force

New-VisioApplication  

New-VisioDocument C:\temp\VisioPrimitives1.vsdx 
$visio=get-visioapplication
$doc= $visio.ActiveDocument

Register-VisioStencil -Name Containers -BuiltIn Containers
Register-VisioStencil -path SERVER_U.VSSX -name Servers
Register-VisioShape -Name WebServer -From Servers -MasterName Server
Register-VisioContainer -Name Domain -From Containers -MasterName 'Plain'

#Commenting out the following line makes the code hang when dropping the container
#$foo=$doc.Pages['Page-1']

New-VisioContainer -shape Domain -Label MyDomain -contents {
		    New-VisioShape -master WebServer -Label PrimaryServer -x 5 -y 5
}

To add insult to injury, changing that line to [void]$doc.Pages['Page-1'] or [void]$doc.Pages[1] also results in a script that doesn’t hang.

So…somehow accessing the Pages property of the current document is fixing the problem.

My brain hurts.

Anyone have any ideas about this?  I’d love to get some feedback on what in the world is going on.

 

–Mike

Where I’ve been for the last few months

As I mentioned in my previous posts, I kind of fell off the planet (blog-wise, at least) at the end of August. I had good intentions for finishing the year out strong.  There were three different items that contributed to my downfall.


First, I’ve been battling lots of different illnesses (none of them anything major) pretty much continually since early June.  For three entire months, I coughed all the time.  Right now, I can’t hear in one ear because of the fluid backed up there.  That ear has only been a problem for a few days, but the other one (which cleared out yesterday) had been full for three weeks.  Like I said, nothing major, no life-threatening conditions, but over time it wears you down.


Second, I broke down and bought a server.  I had been putting off this purchase, but some book royalty money came through and I pulled the trigger.  Buying it didn’t take long.  What has been interesting is learning to do all of the things that most of you sysadmins take for granted.  I’ve never really been a sysadmin, more of a developer/DBA/toolsmith who happens to really, really like a language which is designed for sysadmins.  So, I’ve been building Hyper-V hosts, lots of guests, building domains, joining domains, and trying to script as much as possible.  I’ve learned a lot and there’s still a lot to learn.  Most of it, though, is not stuff that I’ll probably blog about, because it’s really basic.  There might be a post or two coming, but it’s hard to say.


Third, and this one is the most “fun”, is that VisioBot3000 stopped working.  If you haven’t read my posts on automating Visio with PowerShell, VisioBot3000 is a module I wrote which allows you (among other things) to define your own PowerShell-based DSL to express diagrams in Visio.  By “stopped working” I mean that sometime around the end of August, trying to use Visio containers always caused the code to hang.

I am pretty good at debugging, so I tried the usual tricks.  I stepped through the code in the debugger.  The line of code that was hanging was pretty innocent-looking.  All of the variables had appropriate values.  But it caused a hang.  On my work laptop and my home laptop…two different OSes.  I tried reverting to an old commit…no luck.  I even tried copying code out of presentations I had done on VisioBot3000 and the results were the same.  I even posted on the PowerShell subreddit asking for ideas on how to debug.  The only suggestion was to use Sysinternals Process Monitor to follow the two processes and see if I could find what was causing the issue.  I tried that a week or two ago (sometime during the holidays) and guess what?  It started working on my work laptop.  Still doesn’t work on my home laptop, though, or the VM I built and didn’t allow to patch (to see if a patch was the culprit).


Conclusion: I’m mostly better health-wise, am getting comfortable with the server, and VisioBot3000 is working somewhere, so I should be back on track with rambling about PowerShell.

–Mike

PowerShellStation 2017 Goals

Following up on yesterday’s post reviewing my progress on goals from 2016, I thought I’d try to set out some goals for the new year.  I’m going to group them into 3 groups: Technology, Community, and Content.

Content Goals

  1. Write 100 posts.  I didn’t do so well with this last year, but this year will be different.  I’m not sure why I don’t write more often.  I enjoy writing and feel good about myself when I do it.  I’m going to try to be consistent with it as well, not having several months with no posts.
  2. Write a book.  I’ve written a couple of books with a publisher (here and here) and I think that was valuable experience.  I’m going to try to do it on my own.  That should enable me to keep the cost down.  I’m also going to try to do it a lot quicker (and maybe shorter) than the other books.  BTW…I’ve already started.
  3. Write a course.  I love to teach PowerShell, and I’ve got a lot of practice doing it at work.  I’m considering recording “lectures” for a course (like on Udemy).
  4. Edit/Contribute 50 topics on StackOverflow.com’s Documentation project for PowerShell.  It seems like a reasonable platform for information about PowerShell, and there are already a bunch of topics there ready to be filled in.

Community Goals

  1. Start a regional PSUG in Southwest Missouri.  I live there, so it’s silly for me to have to drive 3 hours to go to a user group meeting.  I don’t intend to stop going to those long-distance meetings altogether, but there are a lot of people in SWMO who don’t have a group.
  2. Continue Speaking.  If they’ll have me, I plan to continue speaking at local or regional user groups.  I’m also looking for “nearby”  SQL Saturdays, PowerShell Saturdays, or other settings.
  3. Continue the UG and teaching at work.  This one is pretty easy, but I don’t want to get distracted and let these fall apart.

Technology Goals

  1. Get handy with DSC and/or Chef.  I’m still scripting virtualization/provisioning “manually” (i.e. scripting the steps I’d do manually) rather than using a system to do that for me.  I wanted to do it that way so I would understand what goes on, but now that I know, I want to be out of that business.  DSC is almost certain to be part of the equation.  Chef might be, but that’s an open question (also, Packer, Vagrant, Ansible, etc.)
  2. Deploy operational tests with Pester/PoshSpec/OVF.  I see a lot of promise with these, but everything is single-machine focused.  Something like this looks like a good start, but needs some flexibility.
  3. Nano, Containers (flavor of the month).  This one is kind of a wildcard.  These two (Nano and containers) are new technology solutions that I understand at a surface level, but don’t have a good idea why or where I would use them.  I’m not sure if I’ll dig into one of these two or something else that pops up in the year, but there will be an in-depth project.

Bonus Goal

If I can get good with DSC, I really want to be able to spin up an entire environment from scratch.  By that, I mean from scripts (and downloaded ISOs) I want to be able to create a DC (with certificate services) and a DSC pull server, and then deploy the servers for a lab and have them configure themselves via the pull server.  For more of a bonus, use the newly created certificate services server to handle the passwords properly in the DSC configs.  By the way, I’m aware of Lability and PS-AutoLab-Env.  They’re both awesome but not quite what I’m looking for here.

 Those ought to keep me busy for the year.  What are you planning to do/share/learn this year?  Write about it and post a link in the comments!

–Mike

PowerShellStation 2016 Goals Review

I did a goal review back on August 22, reporting some good progress on my yearly goals and some plans for the remainder of the year.  Somehow, I seem to have fallen off the earth since then.  I only posted twice since then, and both of those were in the week following the review.  I’ll be posting this week about what happened (spoiler alert…not much).

In the meantime, here’s how I did on my goals for 2016

  1. 100 posts. I only got to 35.  That’s kind of embarrassing.  On the plus side, I had some of my best months in the last year (January – 10 posts, April – 8 posts, August – 7 posts).  If I could keep that kind of momentum it would make a lot of difference.  On the down side, if you exclude those 3 months I only had 10 posts in the remaining 9 months.  That’s abysmal.
  2. Virtualization Lab. In my review I mentioned that the box I bought to do virtualization on was only at 16GB of RAM and I needed to bump it up.  Didn’t do that.  I also mentioned the possibility of buying an R710 off of eBay.  Did that.  Dual quad-core CPUs, 36GB of RAM, 8TB of storage (so far).  I’ve done more virtualization since I bought it (in October) than I had ever done before.  I can definitely say I got this goal accomplished!
  3. Configure Servers with DSC. Other than the talk I did at MidMo, I haven’t really done much DSC this year.  Now that I’ve got a solid lab machine, this is high on the list for 2017.
  4. PowerShell User Group. I’ve started a PSUG at work (I work for a software company, so there are hundreds of people using PowerShell) and we’ve had 3 meetings so far.  They’ve mostly been sharing news and what we’re working on, but it’s a good start.  Beginning to form a community there.  Also, I attended several (more than a dozen, less than 20) meetings of local-ish PSUGs in Missouri.  The two I know of are each a 3-hour drive one way to get to, so that’s a challenge, but they’ve been great.  They both started this year, and I’ve tried to lend my support as much as I can.  I’ve spoken 6 or 7 times (I didn’t keep track) and had a great time at all of the meetings.
  5. Continue Teaching at Work.  Did lots of teaching.  I’d have to check the calendar to get a real total, but it was at least 10 days of teaching.
  6. Share more on GitHub. Really got into Github this year.  VisioBot3000, SQLPSX, POSH_Ado, etc. Next step: PowerShellGallery!
  7. Write more professional scripts. I think this will always be a goal of mine.  I’ve published a couple of checklists and try to be thoughtful about how to write better code as I’m writing it, but I often find myself writing “throwaway” code and cleaning it up later.  Need to eliminate as much of that first step as possible.
  8. Speak. I’ve spoken at 6 user group meetings this year and at 2 or 3 others in the past.  If you’ve got a UG within driving distance of SW Missouri (KS, NW Arkansas, Oklahoma), let me know…I really enjoy sharing what I’m doing as well as speaking on “general” PowerShell topics.  Also, it doesn’t need to be a PSUG…I’ve spoken at .NET and SQL groups as well.
  9. Encourage. Another perennial task.  I haven’t been as active in this as I have in the past.
  10. Digest. (from the goal review)

I get about 10 different paper.li daily digests either in email or on twitter. I don’t find a lot of value in them…they don’t seem to be curated for the most part, but I think adding another into the fray at this point would probably be lost. I’m going to skip this one this year…but keep it on the back burner.

I’ve been thinking about maybe doing something slightly different here.  Maybe a “module of the month” or “meet a PowerShell person” regular post.  Any suggestions?

Well…by my count I accomplished 6 (maybe 7) of the 10 goals from last year.  If you haven’t thought about what you’re going to try to accomplish this year, I highly recommend you do.  Remember, if you don’t know where you’re going, you might not like where you end up!  A concrete list of goals, shared with friends (or with the public), makes it easy to know whether you’re achieving them or have lost sight of them.

–Mike

Module Structure Preferences (and my module history)

Modules in the Olden Days

Back in PowerShell 1.0 days, there were no modules.  Uphill both ways.  In 1.0, we only had script files.  That means you had a couple of choices when it came to deploying functions.  Either you put a bunch of functions in a .ps1 file and dot-sourced it, or you used parameterized scripts in place of functions.  I guess you could have put a function in a .ps1 file, but you’d still need to dot-source it, right?  Personally (at work), I had a bunch of .ps1 files, each devoted to a single “subject”.  I had one that dealt with db access (db_utils), one that interfaced with PeopleSoft (ps_utils), one for scheduled tasks (schtasks_utils), and so on.  You get the idea.

PowerShell 2.0 – The Dawn of Modules

One of the easiest features to love in PowerShell 2.0 was support for modules (although remoting and advanced functions are cool, they’re not quite as easy).  There’s a $PSModulePath pointing to places to put modules in named folders, and in the folders you have .psm1 or .psd1 files.  There are other options (like .dll), but for scripts, these are what you run into.

Transitioning into modules for me started easy:  I just changed the extensions of the .ps1 files to .psm1.  I had written functions (require and reload) which knew where the files were stored, and handled dot-sourcing them.  You had to dot-source require and reload, but it was clear what was going on.  When modules were introduced, I changed the functions to look for psm1 files and import with Import-Module if they existed, and just carry on as before otherwise.

That’s Kind of Gross

Yep.  No module manifests, and dozens of .psm1 files in the same folder.  To make it worse, I wasn’t even using the PSModulePath, because the .psm1 files weren’t in proper named folders.  The benefit for me was that I didn’t have to change any code.  I let that go for several years.  Finally I broke down and put the files in proper folders and changed the code to stop using the obsolete require/reload functions and use Import-Module directly.  I still haven’t written module manifests for them.  I’m so bad.

What about new modules?

Good question!  For new stuff (written after 2.0 was introduced), I started with the same module structure:  single .psm1 file with a bunch of functions in it.  Probably put an Export-ModuleMember *-* in there to make sure that any helper functions don’t “leak”, but that was about it.  To be fair, I didn’t do a lot of module writing for quite a while, so this wasn’t a real hindrance.

Is there a problem with that?

No…there’s no problem with having a simple .psm1 script module containing functions.  At least from a technical standpoint.  Adding a module manifest is nice because you can document dependencies and speed up intellisense by listing public functions, but that’s extra.
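If you ever do want a manifest, it’s basically a one-liner to generate one; something like this (module and function names here are illustrative):

```powershell
# Generate a minimal manifest next to the .psm1
New-ModuleManifest -Path .\MyTools\MyTools.psd1 `
    -RootModule 'MyTools.psm1' `
    -FunctionsToExport 'Get-Thing','Set-Thing'
```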

The problem came when I wrote a module with a bunch of functions.  VisioBot3000 isn’t huge, but it has 38 functions so far.  At one point, the .psm1 file was over 1300 lines long.  That’s too much scrolling and searching in the file to be useful in my opinion.

What’s the recommended solution?

I’ve seen several posts recommending that each function should be broken out into a single .ps1 file and the .psm1 file should dot-source them all.  That definitely gets past the problem of having a big file.  But in my mind it creates a different problem.  The module directory (or sub-folder where the .ps1 files live) gets really big and it takes some work to find things.  Lots of opening and closing of files.  And the dot-sourcing operation isn’t free…it takes time to dot-source a large set of files.  Not a showstopper, but noticeable.
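The typical .psm1 for that one-function-per-file layout looks roughly like this (the “Functions” folder name is an assumption; conventions vary):

```powershell
# Dot-source every function file in the module's Functions folder
Get-ChildItem -Path (Join-Path $PSScriptRoot 'Functions') -Filter *.ps1 |
    ForEach-Object { . $_.FullName }
```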

My tentative approach

How I’ve started organizing my modules is similar to how I organized “modules” in the 1.0 era.  Back then, each file was subject-specific.  In VisioBot3000, I split the functions out based on noun.

(Screenshot: the VisioBot3000 source files, split out by noun.)

I still have relatively short source files, but now each file generally has a get/set pair, and if other functions use the same noun they’re there too.

I’ve found that I often end up editing several functions in the same file to address issues, enhancements, etc.  I think it makes sense from a discoverability standpoint as well.  If I was looking at the source, I’d find functions which were related in the same file, rather than having to look through the directory for files with similar filenames.

Anyway, it’s what I’m doing.  You might be writing all scripts (no functions) and liking that.  More power to you.

Let me know what you think.

–Mike

VisioBot3000 Settings Import

It’s been a while since I last spoke about VisioBot3000.  I’ve got the project to a reasonably stable point…not quite feature complete but I don’t see a lot of big changes.

One of the things I found even as I wrote sample diagram scripts was that quite a bit of the script was taken up by things that would probably be done exactly the same way in most diagrams.  For instance, if you’re doing a lot of server diagrams, you will probably be using the exact same stencils and the same shapes on those stencils, with the same nicknames.  Doing so makes it a lot easier to write your diagram scripts because you’re developing a “diagram language” which you’re familiar with.

For reference, here’s an example script from the project (without some “clean up” at the beginning):

Diagram C:\temp\TestVisio3.vsdx  

# Define shapes, containers, and connectors for the diagram
Stencil Containers -From C:\temp\MyContainers.vssx 
Stencil Servers -From SERVER_U.vssx
Shape WebServer -From Servers -MasterName 'Web Server'
Container Location -From Containers -MasterName 'Location'
Container Domain -From Containers -MasterName 'Domain'
Container Logical -From Containers -MasterName 'Logical'
Connector SQL -Color Red -arrow 

#this is the diagram
Set-NextShapePosition -x 3.5 -y 7
Logical MyFarm {
    Location MyCity {
        Domain MyDomain  {
            WebServer PrimaryServer
            WebServer HotSpare

        }
    }
    Location DRSite {
        Domain MyDomain -name SiteB_MyDomain {
            Set-RelativePositionDirection Vertical
            WebServer BackupServer
        }
    }
}
SQL -From PrimaryServer -To BackupServer 
Hyperlink $BackupServer -link http://google.com

Complete-Diagram 

Lines 3-10 are all about setting up the standard shapes and stencils. The rest of the script is directly involved in the specific diagram I’m trying to build.

As an aside, one of the things I love about PowerShell is how much of the code I write is directly involved in solving the problem I’m working on. You don’t have to “code around PowerShell” very often.

Anyway, back to Visio.  Since a lot of the code is duplicated, I thought…how can I get that out of the diagram script?

My first thought was to write a function which took the name of an “include script” and somehow run the script.  The problem there is scoping.  Since the function is called (not dot-sourced) it couldn’t easily affect the global state, so shape, container, and connection functions wouldn’t get created.  That’s not useful.

My second thought was to allow an “include” script and have the user dot-scope the include script, but that wouldn’t be very obvious to someone looking at the diagram script.  Also, they could put anything in that, and it would seem to be “hidden” (for instance, if they put code unrelated to stencils, shapes, etc.).

I finally decided that a .psd1 file (like a module manifest) would be the best approach.  I could structure the hashtable in it to have the information I needed to set up the stencils, shapes, and connections, and read it in using PowerShell.  I would just need to write a “dispatch” function which transformed the items in the hashtable into the correct function calls.

So…reading in a .psd1 file is interesting.  I’m using PowerShell 5.0, so one choice (which I hadn’t heard of) is Import-PowerShellDataFile.  This cmdlet was created for exactly this purpose.  It has a -Path parameter which points to the .psd1 file and returns a hashtable.  Nothing could be easier.  But…I really don’t want to pin the project to 5.0, especially just for this function.

Another option is Import-LocalizedData.  This one is a bit trickier, since it’s meant to read psd1 files in language-specific folders for purposes of localization.  To make it do what I want, I needed to split the path into a directory and filename, but then it does the trick.  This is a 4.0 cmdlet, which I guess I’m ok with.
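The split-and-import looks roughly like this (the parameter names are real; the settings path is hypothetical):

```powershell
# Import-LocalizedData wants a directory and a file name, not a full path
$settingsPath = 'C:\temp\VisioSettings.psd1'
Import-LocalizedData -BindingVariable settings `
    -BaseDirectory (Split-Path $settingsPath) `
    -FileName (Split-Path $settingsPath -Leaf)
$settings  # a hashtable built from the .psd1 contents
```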

There are other options for reading .psd1 files.  Most of them involve using Invoke-Expression, which I avoid whenever I can.

And now that I’ve read up on those, I see the ArgumentToConfigurationDataTransformation parameter attribute, which is a 4.0 feature added for DSC.  It’s even better than what I wrote (using Import-LocalizedData).
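As a sketch of that approach (assuming I have the full type name right, and with the function body elided), the attribute does the path-to-hashtable translation right in the parameter declaration:

```powershell
function Import-VisioSettings {
    param(
        # The attribute turns a .psd1 file path into a hashtable automatically
        [Microsoft.PowerShell.DesiredStateConfiguration.ArgumentToConfigurationDataTransformation()]
        [hashtable]$Settings
    )
    $Settings  # already a hashtable, whether a path or a hashtable was passed in
}
```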

Time to re-write that function.

Any way you implement it, the function call in VisioBot3000 would look like this:

Import-VisioSettings -path <path to settings file>

and the settings file would look something like this:

@{
StencilPaths='c:\temp'

Stencils=@{Containers='C:\temp\MyContainers.vssx';
           Servers='SERVER_U.vssx'}

Shapes=@{WebServer='Servers','Web Server';
         DBServer='Servers','Database Server'
        }
Containers=@{Domain='Containers','Domain'
            }

Connectors=@{SQL=@{Color='Red';Arrow=$true}}

}

What do you think? Do you have anywhere you can use psd1 files to streamline your scripts?
–Mike