VisioBot3000 Settings Import

It’s been a while since I last spoke about VisioBot3000.  I’ve got the project to a reasonably stable point…not quite feature complete but I don’t see a lot of big changes.

One of the things I found even as I wrote sample diagram scripts was that quite a bit of the script was taken up by things that would probably be done exactly the same way in most diagrams.  For instance, if you’re doing a lot of server diagrams, you will probably be using the exact same stencils and the same shapes on those stencils, with the same nicknames.  Doing so makes it a lot easier to write your diagram scripts because you’re developing a “diagram language” which you’re familiar with.

For reference, here’s an example script from the project (without some “clean up” at the beginning):

Diagram C:\temp\TestVisio3.vsdx  

# Define shapes, containers, and connectors for the diagram
Stencil Containers -From C:\temp\MyContainers.vssx 
Stencil Servers -From SERVER_U.vssx
Shape WebServer -From Servers -MasterName 'Web Server'
Container Location -From Containers -MasterName 'Location'
Container Domain -From Containers -MasterName 'Domain'
Container Logical -From Containers -MasterName 'Logical'
Connector SQL -Color Red -arrow 

#this is the diagram
Set-NextShapePosition -x 3.5 -y 7
Logical MyFarm {
    Location MyCity {
        Domain MyDomain  {
            WebServer PrimaryServer
            WebServer HotSpare

        }
    }
    Location DRSite {
        Domain MyDomain -name SiteB_MyDomain {
            Set-RelativePositionDirection Vertical
            WebServer BackupServer

        }
    }
}
SQL -From PrimaryServer -To BackupServer 
Hyperlink $BackupServer -link http://google.com

Complete-Diagram 

The stencil, shape, container, and connector definitions at the top of the script are all about setting up the standard shapes and stencils. The rest of the script is directly involved in the specific diagram I’m trying to build.

As an aside, one of the things I love about PowerShell is how much of the code I write is directly involved in solving the problem I’m working on. You don’t have to “code around PowerShell” very often.

Anyway, back to Visio.  Since a lot of the code is duplicated, I thought…how can I get that out of the diagram script?

My first thought was to write a function which took the name of an “include script” and somehow ran the script.  The problem there is scoping.  Since the function is called (not dot-sourced), it couldn’t easily affect the global state, so the shape, container, and connector functions wouldn’t get created.  That’s not useful.

My second thought was to allow an “include” script and have the user dot-source it, but that wouldn’t be very obvious to someone looking at the diagram script.  Also, they could put anything in that script, and it would be effectively “hidden” (for instance, if they put code unrelated to stencils, shapes, etc.).

I finally decided that a .psd1 file (like a module manifest) would be the best approach.  I could structure the hashtable in it to have the information I needed to set up the stencils, shapes, and connections, and read it in using PowerShell.  I would just need to write a “dispatch” function which transformed the items in the hashtable into the correct function calls.

So…reading in a .psd1 file is interesting.  I’m using PowerShell 5.0, so one option (which I hadn’t heard of before) is Import-PowerShellDataFile.  This cmdlet was created specifically for this purpose.  It has a -Path parameter which points to the .psd1 file, and it returns a hashtable.  Nothing could be easier.  But…I really don’t want to pin the project to 5.0, especially just for this function.
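On 5.0 it really is a one-liner.  Here’s a quick sketch (the path is just a made-up example):

# PowerShell 5.0+ only: reads the .psd1 file and returns its hashtable
$settings = Import-PowerShellDataFile -Path C:\temp\VisioSettings.psd1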

Another option is Import-LocalizedData.  This one is a bit trickier, since it’s meant to read .psd1 files in language-specific folders for localization purposes.  To make it do what I want, I needed to split the path into a directory and a filename, but then it does the trick.  It works back at least to 4.0, which I guess I’m ok with.
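Here’s roughly what that split looks like (again, the path is just an example):

# Split the path apart and let Import-LocalizedData read the file from that directory
$file = Get-Item C:\temp\VisioSettings.psd1
$settings = Import-LocalizedData -BaseDirectory $file.DirectoryName -FileName $file.Name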

There are other options for reading .psd1 files.  Most of them involve using Invoke-Expression, which I prefer to avoid whenever I can.

And now that I’ve read up on those, I see the ArgumentToConfigurationDataTransformation parameter attribute, which is a 4.0 feature added for DSC.  It’s even better than what I wrote (using Import-LocalizedData).

Time to re-write that function.

Any way you implement it, the function call in VisioBot3000 would look like this:

Import-VisioSettings -path <path to settings file>

and the settings file would look something like this:

@{
StencilPaths='c:\temp'

Stencils=@{Containers='C:\temp\MyContainers.vssx';
           Servers='SERVER_U.vssx'}

Shapes=@{WebServer='Servers','Web Server';
         DBServer='Servers','Database Server'
        }
Containers=@{Domain='Containers','Domain'
            }

Connectors=@{SQL=@{Color='Red';Arrow=$true}}

}
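To make the dispatch idea concrete, here’s a minimal sketch of how Import-VisioSettings could turn that hashtable into function calls.  This is not the actual VisioBot3000 code; it assumes the Stencil, Shape, Container, and Connector functions from the earlier example are already loaded, and it skips details like handling StencilPaths.

function Import-VisioSettings {
    param($Path)
    $file = Get-Item $Path
    $settings = Import-LocalizedData -BaseDirectory $file.DirectoryName -FileName $file.Name

    foreach ($name in $settings.Stencils.Keys) {
        Stencil $name -From $settings.Stencils[$name]
    }
    foreach ($name in $settings.Shapes.Keys) {
        # each value is a two-item array: stencil nickname, then master name
        Shape $name -From $settings.Shapes[$name][0] -MasterName $settings.Shapes[$name][1]
    }
    foreach ($name in $settings.Containers.Keys) {
        Container $name -From $settings.Containers[$name][0] -MasterName $settings.Containers[$name][1]
    }
    foreach ($name in $settings.Connectors.Keys) {
        $connectorArgs = $settings.Connectors[$name]    # e.g. @{Color='Red';Arrow=$true}
        Connector $name @connectorArgs
    }
}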

What do you think? Do you have anywhere you can use psd1 files to streamline your scripts?
–Mike

August 2016 Goal Review

Thought I’d take a minute and review the progress on my goals for the year.

  1. 100 posts. I’m at 32 (33 if you count this), so it’s going to take some dedication to make it. That’s already 10 more than last year, so there’s progress, but with 19 weeks left and 68 posts to go, I’m going to have to beat 3 per week. I’m going to try. I might have to set a schedule (which would really help, but I’m not really wired that way).
  2. Virtualization Lab. I’ve got a couple of boxes running Hyper-V (client Hyper-V, but with nested virtualization I can get a lot closer). I’ve been playing with building servers, sysprep, differencing disks, etc. I feel like I’ve pretty much got this one covered. I need to bump the big box up to 32GB, though. I’m thinking about getting an R710 off of eBay as a “next step” on this…maybe next year.
  3. Configure Servers with DSC. I’ve done some work with DSC both at home and at work, and gave a talk on DSC at the MidMO PSUG this month. Feel good about this, too.
  4. PowerShell User Group. The first meeting at work is two days from now! I’ve also been to about a dozen meetings in Missouri (and posted about some of them). I’ve been honored to speak at 6 meetings so far as well, so this one is good. Bonus: I’ve started the paperwork for reserving meeting space in the town where I work (rather than 3 hours away), but haven’t scheduled anything yet.
  5. Continue Teaching at Work. I might not get 10 sessions in, but I think I’m already at 7. Good on this one.
  6. Share more on GitHub. VisioBot3000, SQLPSX, POSH_Ado, etc. Next step: PowerShellGallery!
  7. Write more professional scripts. I haven’t really checked, but I think this one is getting better.
  8. Speak. As I mentioned in #4, I’ve spoken at 6 user groups this year, so more than covered. If you’ve got a PSUG within driving distance of SW Missouri (KS, NW Arkansas, Oklahoma), let me know…I really enjoy sharing what I’m doing as well as speaking on “general” PowerShell topics.
  9. Encourage. I’ve probably let this one slide a bit. Note to self to comment more on other people’s posts.
  10. Digest. I get about 10 different paper.li daily digests either in email or on Twitter. I don’t find a lot of value in them…they don’t seem to be curated for the most part, and I think adding another one into the fray at this point would probably get lost. I’m going to skip this one this year…but keep it on the back burner.

Re-Thinking Positional Parameters

I mentioned in a previous post that I’ve recently changed my mind a bit about the Position parameter attribute. I guess technically it is the position parameter of the Parameter parameter attribute (i.e. there’s a parameter attribute called “Parameter” and it has a parameter called position). I don’t think you could come up with something much more difficult to correctly name.

Anyway, I’ve always had a low opinion of this particular parameter. Before I explain why, let’s review how it works.

If you have an advanced function, say Get-Stuff, you might see something like this:

Function Get-Stuff{
[CmdletBinding()]
Param([Parameter(Position=0)]$param1,
      [Parameter(Position=1)]$param2,
      [Parameter(Position=2)]$param3)
    
    #body of function
    Write-Output @{Param1=$param1;Param2=$param2;Param3=$param3}
}

The point of the Position= parameter (of the Parameter parameter attribute in the Param statement) is to say that if you use positional parameters in calling the function, you can expect that $param1 is in the first position (position 0), $param2 is in the second position (position 1) and so on.
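For example, these two calls to the function above are equivalent:

Get-Stuff -param1 'a' -param2 'b' -param3 'c'
Get-Stuff 'a' 'b' 'c'    # 'a' binds to $param1, 'b' to $param2, 'c' to $param3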

The great part about it is that this code works the same without the Position= statements. Parameters (ignoring parameter sets) can be bound positionally, in the order they’re listed in the function, unless the PositionalBinding parameter of the CmdletBinding attribute is set to $false (which it isn’t, by default).

I’ve seen a lot of code which, in a function that has 12 parameters, mechanically sets Position= from 0 to 11 in order. It’s not exactly confusing, but it’s certainly not clear why you would bother doing it since it’s not necessary. From that observation, I decided that there was no point to using the Position= setting at all.

Micah Battin spoke a few months ago at the STLPSUG meeting on PowerShell Functions and got me to reconsider my position (no pun intended).

I explained my reservations and he said something to the effect that you should turn off automatic positional binding (PositionalBinding=$false in the CmdletBinding()) and then set the position for the first one or two parameters. Those are the ones that people will be specifying positionally anyway.

Have you ever seen someone call a cmdlet and list 12 parameters with no names? I would guess not. There’s no reason to expect that people will want to write useless code like that, so why should your function definition encourage it?  By only including positions for the one or two “essential parameters” you’re doing your part to make the caller’s code better.
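Here’s a sketch of that approach applied to the earlier example:

Function Get-Stuff{
[CmdletBinding(PositionalBinding=$false)]
Param([Parameter(Position=0)]$param1,
      [Parameter()]$param2,
      [Parameter()]$param3)

    # Get-Stuff 'a' still works, but Get-Stuff 'a' 'b' now fails;
    # $param2 and $param3 have to be passed by name
    Write-Output @{Param1=$param1;Param2=$param2;Param3=$param3}
}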

I’m going to be editing my “Advanced Function” snippet to include PositionalBinding=$False and see where it takes me.

What do you think?

–Mike

Hyper-V issues with Windows 10 Anniversary Update

My main home computer is running Windows 10 and Hyper-V, and I was really looking forward to the anniversary update.  PowerShell Direct, nested virtualization, and containers all sound awesome.  I’ve played with them in a test box, but not on my main box.

So…I got home after work the day of the update, installed it (took an hour or so) and rebooted.

First problem…Hyper-V management services wouldn’t start.  This turned out to be a problem with a driver…it took a bit to find, but I got it solved that night.

Second problem…none of my VMs would start.  I had to re-create the virtual switch and re-assign it to the VMs.  Not a huge deal.

Third problem…new 2016 TP5 VMs wouldn’t boot.  This one took a bit longer, and I found the actual workaround in a github issue comment.  I knew there was a reason I got a hundred or so of those a day.  🙂

The ingredients for this third problem are:

  • 2016 TP5
  • Gen 2 VM
  • Secure Boot
  • Version 8.0 firmware

If you have a VM (let’s say it’s called TheBadVM), you can get it to boot with the following command (with the VM stopped, of course):

Get-VM -Name TheBadVM | Set-VMFirmware -EnableSecureBoot Off -SecureBootTemplate MicrosoftUEFICertificateAuthority

Then, the VM will start.
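If you want to double-check the change, Get-VMFirmware will show the firmware settings (using the same made-up VM name):

# SecureBoot should now show as Off
Get-VM -Name TheBadVM | Get-VMFirmware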

I hope this helps you. I figured this might be easier to find than the original (thanks to Ryan Yates!).

–Mike

August Missouri User Group Update

It’s been a while since I last sent an update on Missouri user groups.  I missed the June meetings in St. Louis (Michael Greene talking about the release pipeline) and in Columbia (Josh Rickard talking about his anti-phishing toolkit).

In July, Mike Lombardi shared about setting up a private PowerShell Gallery at the STLPSUG meeting. At the July MidMo meeting in Columbia I spoke about validating connectivity in a firewalled environment and Josh Rickard talked briefly about DSC.

I followed up in Columbia a couple of weeks ago with a more in-depth DSC discussion and had a great time.

In a couple of days I’ll be in St. Louis talking about proxy functions and Michael Greene will be sharing as well.

I’m also starting a user group at work and in the initial steps of starting one closer to home.

It’s exciting to see all of this activity and enthusiasm in the Missouri PowerShell community.  We just need someone in the Kansas City area to get things going up north.

–Mike

Re-Learning

The Importance of Learning What You Already Know

A couple of months ago I went to a PowerShell user group meeting on a subject that I already knew really well.  Since it involved a 3-hour drive (one-way) I almost decided not to go.  I had a great time, though, and I thought I’d share some observations I made about the experience.

Starting with the conclusion:  don’t skip out on something (a book, a blog post, a meeting, a video, etc.) just because you’re familiar with the subject.

Now for the reasons:

You might not know it as well as you thought!

First, you might not understand the material as well as you thought.  You probably haven’t used every feature of everything that the speaker is talking about.  Reading an overview in a book (which is how I’ve learned a lot of stuff) is not nearly as useful as seeing someone demonstrate it in front of you. The user group meeting I referred to earlier was about functions, which I’ve used extensively, written about, and taught dozens of times.  I had read about the HelpMessage parameter attribute.  I don’t use it, though, and somehow I had gotten the wrong impression about how it worked.  Interestingly enough, someone else at the meeting thought it worked the same way I did.  We were both wrong.

You might have different opinions than the presenter!

A presentation will usually include some material that is opinion-based.  The simple matter of selecting topics to include conveys an opinion of what is important and what isn’t, for instance.  Trying to determine the opinion of the presenter can provide an opportunity for rich discussion.  If your opinion is different, politely asking why they think that way instead of the way you think can lead to some really good learning opportunities.  Again, in the case of this meeting, it was the Position parameter attribute.  I see it used all the time, and I generally think it’s overused.  The presenter had a different opinion and the ensuing discussion changed my mind.  I’m planning on writing the topic up as a post, so I won’t expand on it more here.   The point is that there’s more to the talk than the bullet points, and the “space between” can often be as educational as the explicit material.

Fine-tuning your understanding

Even if you do understand the “big picture” of the subject, there’s bound to be an angle you hadn’t thought of.  Listening to material that you generally understand gives you the freedom to pay attention to the details that you might miss if you’re trying to get a general understanding.   So, since I wasn’t worried about not grokking the material, I could pay attention to how the speaker was using functions, his naming conventions, etc.  Nuances that I might have missed as a first-time learner were readily available to me.

Encouraging sharing

The previous points were selfish; that is, they were direct benefits to you.  This point is more about being beneficial to the speaker and the community in general.  It’s not nearly as much fun or as rewarding to speak to a really small crowd, especially if you’ve spent a lot of time developing and organizing the material.  By just showing up, you’ve encouraged someone who might be deciding whether it’s worthwhile to share or not.

–Mike

SQLPSX Update

Ok. I finally pulled the trigger on a major update (structurally, at least) to SQLPSX. This is the first big change in about 5 years. If you missed the post from a couple of weeks ago warning about this, you might want to go back and read it.

In short, SQLPSX hadn’t been updated in a long time. The main downfall of that delay (besides not incorporating new SQL Server features) was that SQLPSX was still trying to load very old SMO DLLs. In the meantime, the SQL client tools team released an updated PowerShell module for SQL Server as part of the SSMS July 2016 update, named SQLServer, which was the name of one of the modules in SQLPSX. So it was time to do something.

I could have simply renamed the module (and the two functions in it) that collided with the official MS module.  I’ve done that, but I also made some other changes which I’ll explain now.

Some new content

As I mentioned before, Patrick Keisler contributed some code for dealing with mirroring and Central Management Servers, and updated the code to try to load newer SMO assemblies.

The SQLServer module is now called SQLPSXServer

I didn’t want to change the name much.  I also renamed Get-SQLDatabase and Get-SQLErrorLog to Get-SQLPSXDatabase and Get-SQLPSXErrorLog to avoid name collisions with the MS SQLServer module.  I do check to see whether the original functions exist, and if they don’t, I create aliases so you can still use the old names if you don’t load the SQLServer module.
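The check works roughly along these lines (a sketch of the idea, not the exact module code):

# Only recreate the old names if nothing else (like the MS SQLServer module) owns them
if (-not (Get-Command Get-SQLDatabase -ErrorAction SilentlyContinue)) {
    New-Alias -Name Get-SQLDatabase -Value Get-SQLPSXDatabase
}
if (-not (Get-Command Get-SQLErrorLog -ErrorAction SilentlyContinue)) {
    New-Alias -Name Get-SQLErrorLog -Value Get-SQLPSXErrorLog
}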

The SQLPSX installer is gone.

In the early days of PowerShell, an installer made a lot more sense.  People didn’t exactly know where to put things, what needed to be run, whether to modify the profile…lots of questions.  The community is a lot more comfortable with modules now, so I don’t think an installer is a benefit to the project.  It also slows the project down because we need to create a “build” of the installer with new code.  Since modules are xcopy-installable, there’s little benefit, in my opinion, to having to do a bunch of work every time we make a small change to the code.

The SQLPSX “super-module” is gone.

If you’ve used SQLPSX before, you might remember that there was a “parent” module which loaded all of the other modules, whether you wanted them loaded or not.  Particularly gross (to me, though I see the idea) is that it looked to see if you had Oracle DAC installed and imported the Oracle tools as well.  And the ISE tools if you were in the ISE.  The community is comfortable enough with modules now that simply having a set of modules you load when you want them makes more sense to me.  There’s very little overlap between the modules, so it’s likely that you will use them one at a time anyway.

The Oracle, MySQL and ISE Tools are gone.

This one might make people mad, though I hope not.  First of all, the ISE tools worked fine in 2.0, but not so much after that.  I haven’t had time (or interest, to be honest) to look at them, but I also didn’t find using the ISE as a SQL Editor to be a great experience.  If you want to grab the module(s) for ISE and update them, more power to you!

The Oracle and MySQL tools were always kind of fun to me.  They started out as cut/paste from adolib (the ADO.NET wrapper in SQLPSX), replacing the SQL Server data provider with the Oracle and MySQL providers.  Some extra work was done on them, and I don’t want to disparage that work.  But at its heart, SQLPSX is a SQL Server-based set of modules.  If you want to take OracleClient and run with it, that’s awesome and I hope it helps you.  Let me know, because I’ll probably end up using it myself at some point.

Some of the “odd” modules are gone

There were a few modules that didn’t really fit the SQL Server theme (WPK, a copy of a WPF toolkit distributed by Microsoft, and PerfCounters).  I’ve removed them from the main modules list as well.

TestScripts, Libraries, and Documentation are gone

The TestScripts were very out of date, I’m not sure how the libraries were used, and the documentation was old documentation for a single module.

Gone doesn’t really mean gone.

There’s a Deprecated folder with all of this stuff in it, so when we find something that I broke by removing it, we can put it back.

This isn’t quite a release yet.

So first of all, I haven’t changed the code in most of the modules, so if they don’t work, they probably didn’t work to start with.  If you find something that’s broken (or that you think might be broken), please add an issue to the project on GitHub, or if you feel comfortable with the code, send a pull request with a fix.  I have done some simple testing with adolib (which really is my only code contribution to the project) and SQLPSXServer (which I renamed).  Other than that, it’s open season.  I’ll probably let this bake for a few weeks before I start updating version numbers in the module manifests.

If you have questions about what’s going on or why I made the changes I did, feel free to reach out to me.  If you want to help with the project in any capacity, I’d love to hear from you.

Hopefully I didn’t step on too many toes.  If yours were stepped on, I apologize.  Let me know what I did and I’ll try to make it right.  The community works better if it communicates.

–Mike

You don’t need semicolons in multi-line hashtable literals.

This is not a world-changing topic, but I thought it was worth sharing.

If you have written hashtable literals on a single line, you’ve seen this before:

$hash=@{Name='Mike';Blog='powershellstation.com'}

Sometimes, it makes more sense to write the hashtable over several lines, especially if it has several items. I’ve always written them like this:

$hash=@{Name='Mike';
        Blog='powershellstation.com';
        Language='PowerShell'
}

I was watching Don Jones’ toolmaking sessions on YouTube, and what he did was slightly different.

$hash=@{Name='Mike'
        Blog='powershellstation.com'
        Language='PowerShell'
}

I watched and waited for him to get errors, but he’s Don, so he got it right.

I’m not sure how I missed this, but you don’t need semicolons in multi-line hashtable literals.
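The semicolon is only needed to separate entries that share a line, so you can even mix the two styles:

$hash=@{Name='Mike'; Blog='powershellstation.com'
        Language='PowerShell'
}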

–Mike

The future of SQLPSX

With the recent seismic shift in the SQL PowerShell tools landscape, I thought it would be a good idea to address the state and future of the SQLPSX project.

First of all, SQLPSX is not going away. There will always be some functions or scripts that don’t make it into the official modules installed with SQL Server. I’m very excited to see the first sets of changes to the official SQL client tools, and the energy in both the community and the MS team is encouraging. On the other hand, SQLPSX has been around for a long time and some people have grown accustomed to using it.

My plans for SQLPSX are the following:

  • Rename the SQLServer module to SQLPSXServer to avoid a conflict with the official SQLServer module
  • Remove the “main” SQLPSX module which loads the sub-modules
  • Move several modules to a “Deprecated” folder (SQLISE, OracleISE, WPK, ISECreamBasic)
  • Remove the installer…most people just do xcopy installs these days
  • Edit the CodePlex page to point here

There has been some activity on GitHub lately from a new contributor (Patrick Keisler), who has updated the SMO assembly loading as well as the other places assemblies are loaded. He also contributed a module for dealing with mirroring and with Central Management Servers (CMS). I’ve been talking with people on the SQL Server community Slack channel about getting some testing done (I don’t have a lot of different SQL Servers sitting around) and hope to have a new release this month. That will be the first real release in about 5 years!

If you want to know how you can get involved in SQLPSX, let me know.

–Mike

Custom objects and PSTypeName

A couple of weeks ago, Adam Bertram wrote a post which got me really excited. As an aside, if you’re not following Adam you’re missing out. He posts ridiculously often and writes high quality posts about all kinds of cool stuff. I’m writing about one such post.

Anyway, his post was about using the PSTypeName property of PSCustomObjects and the PSTypeName() parameter attribute to restrict function parameters based on a “fake” type. By “fake”, I mean that there isn’t necessarily a .NET type with a matching name, and even if there were, these objects wouldn’t be instances of that type. An example might help:

First, by including a special property called PSTypeName in the PSCustomObject, the object “becomes” that type.

$obj=[pscustomobject]@{PSTypeName='Mike';
                       DisplayName='My custom object'}
$obj | Get-Member

[Screenshot: PSTypeName1 – Get-Member output showing the type name ‘Mike’]

So now we have an object which thinks it is of type “Mike”.

Using the PSTypeName() parameter attribute, we can create a parameter which only accepts objects of that type!

function Get-DisplayName{
    param([PSTypeName('Mike')]$obj)
    $obj.DisplayName
 }

Calling this function with our object and a string shows that it only accepted our object.
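The calls look something like this (the string value is just an example):

Get-DisplayName -obj $obj              # returns 'My custom object'
Get-DisplayName -obj 'just a string'   # fails to bind the parameter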

[Screenshot: PSTypeNames2 – the call with the object succeeds, and the call with the string throws the error below]

The text of the error is as follows:

Get-DisplayName : Cannot bind argument to parameter ‘obj’, because PSTypeNames of the argument do not match the PSTypeName
required by the parameter: Mike.

It’s probably worth mentioning that we can’t just put [Mike] as the parameter type. PowerShell really wants “real” types in that situation.

So, now we can easily create our own types of objects without dealing with C# or PowerShell classes, and still get the “type” validated in a function parameter….Sweet!

But wait! Back in 2009, I wrote about modifying objects in order to get special formatting or special new members from PS1XML files, through PowerShell’s awesome Extended Type System. I did that by looking at the PSObject underlying the object (every object in PowerShell has one) and inserting a value into the PSTypeNames array.

It turns out that the PSTypeName() parameter attribute recognizes objects that are “customized” this way as well:

$obj2=Get-Service | Select-Object -First 1
$obj2.PSObject.TypeNames.Insert(0,'MIKE')
$obj2 | Get-Member

[Screenshot: PSTypeNames3 – Get-Member output showing ‘MIKE’ at the top of the type names]

Now that we have the object (which is really a ServiceController object, but we told it to pretend to be a Mike), let’s try it in the Get-DisplayName function we wrote above:
[Screenshot: PSTypeNames4 – Get-DisplayName run against the modified service object]
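That call is simply the following (ServiceController objects have a DisplayName property of their own, so the function happily returns it):

Get-DisplayName -obj $obj2    # returns the service's display name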

So, we can use the PSTypeName() parameter attribute for any object that has had its PSTypeNames collection fiddled with. That’s great for me, because I get a lot of objects from a database, so they’re really DataRow objects. I add an entry to PSTypeNames to help indicate what kind of objects they are (e.g. what table or query they came from), but until I found out about this I was stuck using ValidateScript() to check parameters.

What do you think? Is this something you can see a use for?

Let me know in the comments or on social media.

–Mike

P.S. Adam says that Kirk Munro was the original source of this information. He’s awesome too, so I’m not really surprised.
P.P.S…Kirk commented below that Oisín Grehan and Jason Shirk were his sources for this. Great teamwork in the PowerShell community!