Tuesday, September 18, 2012

Powershell - the good and the bad. The story of a survivor.

Today's topic is Powershell - its usefulness and weirdness.

It is Microsoft's super cool console, used to provide a command-line API for all of MS's big products like Office, SQL Server, etc. It is built on top of .NET, so you can take advantage of every .NET class you write right from the command line.

It has its advantages:
- using objects and not strings for pipelining commands
- the .net framework
- ease-of-use - with just a few commands you can open up the Powershell world.
- very readable
- good help about_xxx articles

Let's start with the
Good Stuff

Pipelining is a powerful feature. You can pass multiple result objects from one command to the next, just like this:

How do you delete everything inside a folder?
Get-ChildItem "c:\folder\" | Remove-Item -Force -Recurse #alias for Get-ChildItem is dir or ls
How do you see how to use a command? What are its parameters? What does it do?
Get-Help "Get-Item" -Full  #alias is man or help 
What are the properties/methods of a process?
Get-Process | Select-Object -First 1 | Get-Member #alias is gm

With just Get-Help (or man), Get-Member (or gm) and Get-Command (to list all available commands/functions/aliases), you can discover anything!
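For example, discovering every command related to processes and then reading up on one of them takes two lines (Stop-Process here is just an arbitrary pick):

```powershell
# List every command whose noun is Process (Get-Process, Stop-Process, ...)
Get-Command -Noun Process

# Then read the full help for one of them
Get-Help Stop-Process -Full
```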

You'll use pipelining all the time! It's really cool!

Some nice hints are:
- filtering a collection of items by a property. This returns all folders whose name ends with the string "project":
$folderList | ? { $_.Name -match "project$" }
which is a short way to write:
$folderList | Where-Object { $_.Name -match "project$" }

- doing a repetitive task on many objects. This displays the name and creation time of each folder:

$folderList = dir | ? { $_ -is [System.IO.DirectoryInfo] }
$folderList | % { Write-Host "$($_.Name) - $($_.CreationTime)" }
- errors
When there is an error, just inspect the automatic variable $Error (an array of error records) - for example, $Error[0..4] shows the last five errors.

The main usage of Powershell is doing a quick check on something in the Microsoft infrastructure, or writing a script to automate a routine, tedious task like creating a report or killing all processes that are currently not responding.
Overall, system administrators will love it.
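As an illustration of that last task, here is a minimal sketch (the -WhatIf switch is my addition so that a dry run does no harm):

```powershell
# Find all processes that are currently not responding...
$hung = Get-Process | Where-Object { -not $_.Responding }

# ...and stop them. -WhatIf only prints what would happen;
# remove it to actually kill the processes.
$hung | Stop-Process -WhatIf
```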

And here end the good things... If you try to use it for anything else, you'll hit many different obstacles:

Strange behaviour

Functions want to return something!

In fact, every call inside a function that produces an object, whose result is not saved in a variable, is automatically transferred to the function's output. OK, but when you have multiple such calls inside your function, their results are collected into an array, and here's the /unexpected behaviour/:

function myFunc {
    Get-Date      # not captured -> becomes output
    Get-Location  # not captured -> also becomes output
}
$b = myFunc
$b.GetType()  #it says Object[] - an array!

One way to stop this from happening is to enclose the call in brackets and [void] the result, like this:
 [void] (Get-Process)
Piping to Out-Null works as well.
The strange empty array

Comparing an empty array to $null with the comparison operator -eq (equals) returns neither $true nor $false... just... nothing.

This is partially explained by Powershell's ability to automatically flatten an array into its elements: when an array is on the left side of -eq, the operator acts as a filter over the elements, and for an empty array the result is simply empty. But there's no error either... or warning.

Try this:
@() -eq $null
$null -eq @()
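The safe patterns, then (my own workaround, not something the docs spell out): put $null on the left-hand side so the array is never flattened, or test the count:

```powershell
$list = @()

$null -eq $list       # False for an empty array, True only for a real $null
@($list).Count -eq 0  # wrapping in @() makes .Count reliable emptiness test
```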
Also, if you are used to separating function parameters with commas, you will have a hard time!

Using a comma (,) between arguments generates an array, which gets bound to the first parameter. Well, Powershell likes arrays a lot, and you will, too, but only after the initial struggle.
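A quick demonstration of the trap (Add-Numbers is a made-up function for the example):

```powershell
function Add-Numbers($a, $b) {
    "a = $a, b = $b"
}

Add-Numbers 1, 2   # WRONG: $a receives the array @(1, 2), $b stays empty
Add-Numbers 1 2    # RIGHT: parameters are separated by spaces
```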

Using it for plain programming

Unfortunately, when you want to use it to code features, or even small frameworks, you hit some obstacles:
- Visual Studio doesn't support it... at all! No other IDE offers a satisfying experience either
- PowerShell ISE isn't feature-rich enough
- the only suitable editor is Notepad++, with word completion and many macros along the way...
- no compile-time catching of bugs like unused variables, or undefined ones...

Unfinished or just plain broken features

Modules as Custom objects

As of Powershell v2, there is a new way to package commands (cmdlets) - modules (instead of snapins).

So you can import a module into the runspace, or import it as a CustomObject whose
methods are the functions declared in the .psm1 file (the module script file).

These CustomObject modules are very buggy when it comes to multiple imports and to generated exceptions... Trust me. The latter is very nasty...
Imagine writing a few modules which should use each other - common utils, etc. And then you have an exception. Now where exactly did the exception occur? Try to catch it with try-catch - the error record contains only the row/execution line of the module you are currently developing! Nothing about the inner exception or the error stack...!! Which is not useful at all.

In order to be productive with these modules, I ended up writing my own exception-stack-trace-building function which collects errors and line numbers, positions the stack trace correctly, and stops when it reaches another stack trace (a .NET one) which is descriptive enough... Well, I'm sure it could have been done better, but it works stably enough, and I like that it solved a problem which should have been handled by the Powershell team!
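That helper looked roughly like the sketch below - a reconstruction of the idea, not the original code, and the name Get-ErrorStackTrace is my own:

```powershell
function Get-ErrorStackTrace([System.Management.Automation.ErrorRecord] $errorRecord) {
    # Start with the script position Powershell itself recorded.
    $lines = @("at $($errorRecord.InvocationInfo.ScriptName):$($errorRecord.InvocationInfo.ScriptLineNumber)")

    # Walk the chain of inner exceptions, collecting their messages;
    # stop as soon as one carries a .NET stack trace, which is
    # descriptive enough on its own.
    $ex = $errorRecord.Exception
    while ($null -ne $ex) {
        $lines += $ex.Message
        if ($ex.StackTrace) { $lines += $ex.StackTrace; break }
        $ex = $ex.InnerException
    }
    $lines -join "`n"
}
```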

Also, nested imports fail from time to time with functions which cannot be found...
Seriously, I am not going to use modules as CustomObjects anymore... Too bad. It's almost like object-oriented programming... but with serious issues...

Powershell provides a nice API for remote calls - Invoke-Command for executing a single scriptblock, and Enter-PSSession for interactive mode. They are quite nicely thought out, but unfortunately, over the last few weeks I've come to the conclusion that they are quite fragile, and I have often observed the following results:
- OutOfMemory Error
- Stopped execution without errors
- WinRM was not configured correctly errors pointing to articles in MSDN which weren't specific enough.

So basically here are my findings about Powershell Remoting:

- complex requirements on the remote machine for remoting to work. Not a single full guide for this.

- adding snapins remotely on a machine with no Internet access takes 2 minutes! You have to turn off the Internet Options > Advanced > Security option "Check for publisher's certificate revocation", and then it passes in seconds.

- remoting stops without any error or logs. Hard to debug. It could be missing configuration/rights, it could be low memory

- slightly more complex remoting produces many errors. For example, starting a new resource-hungry process on the remote machine results in an OutOfMemory error. Solved by increasing the memory-per-shell property shown at the end of the article.

- inconsistent behaviour - different errors for unchanged code

- lack of documentation for some of the features - only a few blog posts saved me a few hours/days

A few tips here:

- the remoting script you run should be as light as possible and should just execute a file/script located on the remote machine. This way you can lower the remoting data transfers (logs, commands, variables, etc.)

- for complex, time-consuming tasks, use Invoke-Command with the -AsJob parameter. This way it creates an invisible background job, which takes even fewer resources. Actually, for my complex scenario, it proved to be the only working option...
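A sketch of that pattern - the server name and script path below are placeholders:

```powershell
# Kick off the heavy script on the remote machine as a background job.
$job = Invoke-Command -ComputerName "server01" `
        -ScriptBlock { & "C:\scripts\heavy-task.ps1" } `
        -AsJob

# Block until it finishes, then fetch whatever it produced.
Wait-Job $job | Out-Null
Receive-Job $job
```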

The memory-per-shell property mentioned above is read and raised like this (512 MB is just an example value):

Get-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 512 -Force


Don't get me wrong. Using Powershell is overall an easy and straightforward experience, but when you try something a bit more exotic... let's just say that the 'fun' starts there.

Best regards,
Leni Kirilov
