Saturday, September 29, 2012

PowerShell, WMI, rapidFailProtection, and Set-WmiInstance

Recently, I've had to administer IIS7+ with WMI. I was pretty comfortable reading WMI data with PowerShell, but the need to write data just hadn't come up. I figured that, as easy as it was in C#, it would be even easier with PowerShell.

For the most part, it is. There are plenty of examples of how to do it.
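The simple case looks something like this: a top-level, writable property can be set in a single call to Set-WmiInstance. This is a minimal sketch, assuming the IIS WMI provider (root\WebAdministration) is installed on the target; the pool name and the queueLength value are just illustrative.

```powershell
# Simple case: a top-level property on the WMI instance can be written
# directly with Set-WmiInstance. Assumes the root\WebAdministration
# provider; the pool name and value here are examples only.
$pool = Get-WmiObject -Namespace "root\WebAdministration" `
            -Class "ApplicationPool" `
            -Authentication PacketPrivacy |
        where { $_.Name -eq "DefaultAppPool" }

if( $null -ne $pool )
{
    Set-WmiInstance -InputObject $pool -Arguments @{ queueLength = 4000 }
}
```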

There are other settings that aren't so straightforward, though. The application pool's rapidFailProtection setting is the one that led me on my investigation. It's contained in an EmbeddedObject, which isn't really a WMI object; for example, it has no __PATH.

Setting it directly didn't work, and neither did Put().

The trick is to copy the EmbeddedObject into a temporary variable, call SetPropertyValue on that variable, and pass it back as an argument to Set-WmiInstance.

$creds = Get-Credential
$ComputerName = "target.example.com"  
$appPool = (Get-WmiObject -Namespace "root\WebAdministration" `
            -Class "ApplicationPool" `
            -ComputerName $ComputerName `
            -Credential $creds `
            -Authentication PacketPrivacy | `
            where { $_.Name -eq "DefaultAppPool" })
            
if( $null -ne $appPool )
{
    write-host $appPool.Name
    $failureNode = $appPool.Failure
    $failureNode.SetPropertyValue( "RapidFailProtection", $false )

    Set-WmiInstance `
        -InputObject $appPool `
        -Arguments @{ failure = $failureNode }
}

(Get-WmiObject -Namespace "root\WebAdministration" `
            -Class "ApplicationPool" `
            -ComputerName $ComputerName `
            -Credential $creds `
            -Authentication PacketPrivacy | `
            where { $_.Name -eq "DefaultAppPool" }).Failure

If you examine the WMI object you retrieved at the beginning of the script, you won't see the change. Refreshing from the source (or just looking on the server) shows the updated value.

Tuesday, June 21, 2011

LulzSec and the State of Security

Well, the skiddies sure have been busy, haven't they? From Sony's PlayStation Network to (supposedly) the entire UK 2011 Census, LulzSec has appeared in the news over and over again.

It was announced that a 19-year-old mastermind had been arrested in the UK. Well, that should do it, right?

Here are some thoughts about that.

  • It is pretty hard to claim that LulzSec even has a leader. They act more like a collective. Anyone can claim to be "LulzSec".
  • Most of the attacks have been easy. No "mastermind" is needed.
  • It is more likely that the active participants are among the least-sophisticated of hackers. They are just the noisiest.

Those aren't really important, though. What's important is that these vulnerabilities have always been there. LulzSec is only the first to admit that they got at this data. It is all too likely that groups or individuals who were more interested in the data than the publicity got there first.

It isn't as if there hasn't been ample discussion of, and warning about, security. I've been on the wrong side of expediency too often to believe that. The story is always the same: some variation of "we don't need to worry about that, we need to worry about shipping."

It is a shame that it all came to this, that it takes a bunch of kids using common tools to get past the denials and false assurances of these institutions in order to get the kind of attention that these issues deserve. There will be many innocent victims because institutions who have the resources and knowledge to be among the best failed their most basic function: keeping your money safe.

Saturday, July 10, 2010

Operations is as Important as Development

A couple of recent events have inspired me to write this post. The second was Reddit's needing help. The first was an event on my current contract.

My takeaway from Reddit's problem is brief: they finally figured out that just adding new features isn't enough. They have to get their house in order, and they have to take steps to keep it that way. To be fair, I don't know that the source of their problem was poor operational planning; in fact, it sounds like they devote a fair amount of time and effort to it.

The event at work was something of which I've seen way too much. A business wants to roll out a pretty sophisticated reporting application, high availability, super fast - you know, what everyone wants. My part of the whole thing was to determine what hardware and software resources to provide, and then build them out (it's a big enterprise shop, so there's a lot more "fun" associated with it).

It is easy to say "I want multiple web servers"; it is another thing to build for multiple web servers. There are a few ways to do it, and the direction chosen determines what to build.

This particular business not only hadn't really thought about it, they didn't want to think about it. Okay, fine, I can build for that, too - but it is gonna cost them. To the tune of about a half million dollars, just to start. With only a little bit of planning and forethought, by utilizing resources already in place, they could have gotten everything they wanted for a few thousand. That's a business decision - if the business can justify that much money, it's their budget, and who am I to complain?

While I may not complain, I can't help but notice that this doesn't really seem like smart business. In that particular case, though, I don't know enough about what is going on from the business side to make that kind of judgement.

However, I have been in enough situations where I did know enough about what is going on to be able to make that judgement. I've seen it enough that I don't want to be in a position where those decisions are made, or diminished. In fact, it is in my best interest to let these shops ignore this importance - because I charge top dollar for coming in and fixing it after the fact. If a shop wants to make a poor business decision which is good for my business...well, that's a good business decision on my part, isn't it?

If you're not bothered by the idea of getting yourself in such a tight bind that you have to shell out thousands of dollars for someone like me (and yes, thousands of dollars is accurate. Sometimes tens of thousands), then save yourself some time and stop reading.

System administration shouldn't be considered after the fact. It shouldn't be considered a necessary evil. It deserves as much prominence and dedication as any other part of the development process. An application isn't just the application that is installed or written - what it is running on is just as important.

System administration has its own demands, its own skill sets, its own schedule, and its own rhythm. This should be recognized up front, and it should be given first-class priority. I'm probably being repetitious at this point, but it seems like it really isn't appreciated.

So, I'll say it again: too many businesses, large and small, waste money and time because they don't recognize the importance of their operations, and treat them as afterthoughts. Too many businesses limit themselves, their options, their very potential, because they don't recognize this.

I started to write this with an eye towards specific steps, specific considerations, and a lot of ideas to think about for implementing this. As I wrote, though, it only became clearer that the first step was pointing out what doesn't seem obvious: operations are important. Learn that, know that, believe that, and live that, and you won't have to worry about paying someone like me a ridiculous amount of money when it is least convenient to you.

Ignore this, and that's fine with me. My contact info is to the right, and I sincerely hope to hear from you.

Monday, June 21, 2010

Exporting MongoDB with PowerShell

I'm just going to leave this right here. It fixes up the output from mongoexport, wrapping the documents in a JSON array suitable for tools like XMLSpy...

function Export-MongoCollection
{
    Param (
        [Parameter(Mandatory=$true)]
        [string]
        $DbName,

        [Parameter(Mandatory=$true)]
        [array]
        $CollectionNames
    )

    Process {
        $CollectionNames |
        foreach {
            # mongoexport emits one JSON document per line; capture the
            # lines and discard the progress chatter on stderr.
            $docs = C:\Path\To\Mongo\bin\mongoexport.exe `
                --db $DbName `
                --collection $_ `
                2>$null

            # Comma-separate the documents and wrap them in brackets so
            # the result is a single JSON array.
            $output = "[" + ( $docs -join "," ) + "]"

            $output | Out-File -FilePath "C:\Path\To\Data\$_.json"
        }
    }
}

Export-MongoCollection -DbName "reports" -CollectionNames @( "coll1", "coll2" )

Wednesday, June 2, 2010

PowerShell, MongoDB, and WMI

I had reason to write a short cmdlet that converts WMI objects to MongoDB docs. It ain't much, but it works so far:

 using System;
 using System.Collections;
 using System.Management;
 using System.Management.Automation;
 using MongoDB.Driver;

 [Cmdlet( VerbsData.ConvertTo, "MongoDoc" )]
 public class ConvertToMongoDoc : Cmdlet
 {
     [Parameter( Mandatory = true, ValueFromPipeline = true )]
     public PSObject InputObject { set; get; }

     protected override void ProcessRecord()
     {
         Document converted = (Document)WmiConvert( InputObject.BaseObject );
         this.WriteObject( converted );
     }

     protected object WmiConvert( object obj )
     {
         if( null == obj )
             return null;

         object newObj = obj;

         Type objType = obj.GetType();
         TypeCode objTypeCode = Type.GetTypeCode( objType );
         String objTypeName = objType.FullName;

         if( objType.IsGenericType )
             return obj;

         switch( objTypeCode )
         {
             case TypeCode.String:
                 return obj;
         }

         switch( objTypeName )
         {
             case "System.TimeSpan":
                 // Store TimeSpans as their tick count.
                 return ( (TimeSpan)obj ).Ticks;

             case "System.Int16":
             case "System.UInt16":
                 return System.Convert.ToInt32( obj );

             case "System.UInt64":
             case "System.UInt32":
                 return System.Convert.ToInt64( obj );

             case "System.Byte":
                 // Bytes are dropped rather than converted.
                 return null;

             case "MongoDB.Driver.Document":
                 return obj;
         }

         if( objType.IsArray )
         {
             // Convert each element of the array recursively.
             ArrayList aTmp = new ArrayList();
             foreach( var s in (object[])obj )
             {
                 aTmp.Add( WmiConvert( s ) );
             }
             return aTmp;
         }

         if( 0 == objTypeName.IndexOf( "System.Management." ) )
         {
             Document tmpdoc = null;

             switch( objTypeName )
             {
                 case "System.Management.ManagementObject":
                 case "System.Management.ManagementBaseObject":
                     // Nested WMI objects become nested documents, keyed by
                     // their __PATH and __SERVER system properties.
                     tmpdoc = new Document();
                     ManagementBaseObject mbo = (ManagementBaseObject)obj;
                     tmpdoc.Add( "WmiPath", mbo.SystemProperties["__PATH"].Value );
                     tmpdoc.Add( "WmiServer", mbo.SystemProperties["__SERVER"].Value );
                     AddPropsToDoc( tmpdoc, mbo.Properties );

                     newObj = tmpdoc;
                     break;

                 default:
                     break;
             }

             return newObj;
         }
         return newObj;
     }

     protected void AddPropsToDoc( Document doc, object obj )
     {
         foreach( var propData in (PropertyDataCollection)obj )
         {
             doc.Add( propData.Name, WmiConvert( propData.Value ) );
         }
     }
 }

Hope this helps.