Automate the Auditors Away – Veeam Backup Report v2

Earlier this week I had a piece published over on the SolarWinds Thwack forums titled Start Automating Your Career, where I point out, tongue in cheek, that automation has reached a fever pitch over the last few years. My hope is that by sharing a couple of actionable tips, more people can take their first steps towards scripting and automating.

Given that I was publicly offering advice to people on how to automate their mundane tasks away, I thought it was only fair to take my own advice. In other words, I was going to eat my own dog food.

In my Thwack piece I offer up three general steps that you can take to get started automating:

  • Pick a Framework
  • Find a Task
  • Use the Community

What you’ll find below is proof of my belief in this process, and how I leveraged these tips to automagic one of my problems away.

The Task and Framework

Audits are a fact of life in my role, and any audit requires a lot of document collection. Any time you can make that collection process easier, the auditors can expect a more consistent result and you can expect to pull out less hair. Audits happen with some regularity and auditors look for consistent data, so opportunities abound for standardizing and automating the process. In this case the task is fairly obvious: create a tool to efficiently capture the data necessary for audits. More specifically, I need to get backup documentation for our auditors.

Luckily, we use Veeam Backup and Replication (VBR) for our backups. The fine folks over at Veeam have provided a PowerShell module by default with VBR installations for at least the last several major versions of the product. Recently I used this framework to create a simple script that I describe in Veeam backup report via PowerShell. The script creates a little gap report, but that’s about it. It was a fun task, but I got feedback from several folks that took the form of “But what about…” This time around I decided to see how easily I could build a more extensible script that could have more utility moving forward.

So I’ve identified my task: automate data collection. I’ve also got a framework to work with, the Veeam PowerShell module, so I guess all that’s left is to use the community….

Invoke-vCommunity

A number of years back, I decided that I wanted to become more involved in the community, so I became a VMUG leader. Helping people and networking with like-minded individuals was intoxicating, and this past year I took a couple of opportunities to further engage, most recently with the Veeam Vanguards. I mention this group because it’s the Vanguards I reached out to for help with one item in this script.

I just couldn’t get a section of my report to work the way I wanted it to. I knew I was close, but this project was supposed to be fun, and beating your head on your own desk isn’t fun, so I asked for help. It’s not an easy thing to do, but making yourself vulnerable and opening yourself up often yields positive results.

Within minutes of posting my question in the Vanguard Slack, my friend Craig popped up and said “Hey, I know someone else who’s having troubles with this too!” Several more minutes go by, and lo and behold, here’s Craig with a KB article to help me out. Ultimately, the KB didn’t provide the fix, but that’s immaterial, as the community and the Vanguards were there to help! The other great thing about the vCommunity is that there are often opportunities to pay it forward. In the same spirit as Craig sharing the KB, I took a few minutes out of my day to share the resolution on the Veeam forums, on the off chance that someone else needs a helping hand.

The Script

Since I followed my tips, I should have a script, right?

I do! It’s a big one, so I’m going to break the script down into a few sections, and I’m including a link to a downloadable version at the bottom of this post.

The start

First and foremost, I want this script to be helpful and useful. I decided to make use of parameters to make the process easier to run from the command line. Parameters make the script easier for others to use, with some reasonable expectations of what’s going to happen. I also plan on using this myself, so I made it easy to just run by specifying defaults for many of the parameters. I’ll go another level deeper in a post on parameters very soon-ish.

I’ll also highlight that some of these parameters are aspirational. You’ll notice that some are commented out. That’s intentional, to highlight where I think this script could go next. To that end, if you try this script out and find it helpful, please let me know and I’ll continue development of it.

Some highlights

  • ReportType is mandatory, because that’s why we’re here.
  • Both OutputType and ReportType leverage Validation Sets to control input values.
  • The parameter set named VBRCred lumps together related parameters
### Define input parameter
[CmdletBinding(DefaultParametersetName='None')]
param(
    [Parameter(ParameterSetName='ReportType',Mandatory=$true)][ValidateSet('All','Backup','Copy','History','Gap')][string]$ReportType="All",
    #[Parameter(ParameterSetName='JobType',Mandatory=$false)][switch]$SingleJob,
    #[Parameter(ParameterSetName='JobType',Mandatory=$true)][string]$JobName,
    [string]$vCenterSvr="vcenter",
    [string]$VBRserver="VBR",
    [ValidateSet('HTML')][string]$OutputType='HTML',
    [Parameter(ParameterSetName='VBRCred',Mandatory=$false)][switch]$UseAlternateVBRCredentials,
    [Parameter(ParameterSetName='VBRCred',Mandatory=$true)][System.Management.Automation.PSCredential]$VBRCredential

)

The End or is it the Beginning?

Most of the fun stuff is in the middle of this script, so let’s get the end out of the way first. It’s like eating your salad before getting to the main course. Because I make heavy use of functions, the main routine is simple, clean and readable. Declare a bunch of things I’ll use, make sure the environment is ready and then get to it! You’ll note that I don’t comment everything, but I try to provide comments around the theme of a given section.

The magic in the main routine is the switch statement. If you’ll recall, ReportType was a mandatory parameter. That’s because the operation of this script revolves around the data that we’re gathering. Everything else is simply a supporting character.

#####Main
###Tasks for All, set the variables.
$VBRjobsHistoryArray=@()
$VBRjobsOverviewResults=@()
$ofs=";"
get-veeampluginstatus
connect-vbr

###Do the things based on the parameter things
switch ($ReportType){
  "All"{
    $GapResults=get-gapreport
    $VBRjobs=get-vbrjob
    foreach ($job in $VBRjobs){
      $VBRjobsOverviewResults+=Get-BackupJobOverview -inJob $job
      $VBRjobsHistoryArray+=Get-VBRJobHistoryOverview -injob $job
    }
    break
  }
  "Gap"{
    $GapResults=Get-GapReport
  }
  "History"{
    foreach ($job in get-vbrjob){
      $VBRjobsHistoryArray+=Get-VBRJobHistoryOverview -injob $job
    }
  }
}

###Make it Pretty. Oh so pretty
Build-Output

The Good Stuff!

I stated above that this project needed to be extensible. This script is broken up into chunks that you can run selectively. I also wanted the ability to add more functionality in the future, so putting all of the work into functions only makes sense. Here’s a breakdown of what each function does:

  • get-veeampluginstatus. The first function I wrote. How can you tell? I got sloppy with my capitalization. This entire script is predicated on using the VeeamPSSnapIn that’s part of the VBR install, so the obvious starting place is to verify that it’s installed and loaded. This and the connect-vbr function are just about getting ready to do work.
  • Get-BackupJobOverview. The first thing auditors want to know is the overview of what you’re doing with your backups. That’s what we’re doing here, creating a basic output for our friends. I really like using custom PowerShell objects, and you’ll see a few of them throughout this script. I have another post in the works on these nifty items, but it’s probably sufficient to point out that a custom PS object is created using the New-Object cmdlet and data is added to our custom object using the Add-Member cmdlet. You’ll see that I use the same technique in multiple places, which should make for a more readable product. Another reason to use custom PS objects: I made this script for ME and my team’s needs. By using a custom object, it becomes very easy to swap other data elements in and out to fit your needs, without refactoring the entire script.
  • Get-GapReport is the same content from Veeam backup report via PowerShell, only put into a function, so no reason to cover it again here.
  • get-scheduleoverview, along with Get-BackupJobOverview and Get-VBRJobHistoryOverview, is where I had a lot of fun; these are the most important parts of this script. In each instance I pass in a single Veeam backup job (CBackupJob) object. There are a ton of both properties (things that make up the object) and methods (things that you can do with the object), so in reality anything you can get out of the GUI, you can get out of the PowerShell objects. A couple of fun examples of how I put the VBR module to work:
    • In the Get-BackupJobOverview function I want to determine whether my backup job is a full backup or not. After tinkering around with my friend Get-Member, I realized that the Veeam backup job (CBackupJob) object is full of other objects, like the CDomBackupStorageOptions object, which contains… you guessed it, a property called EnableFullBackup. You can see how I drill down to the object in line 34 (also sampled immediately below).
        $JobHash | Add-Member -type NoteProperty -name FullBackup -Value $injob.BackupStorageOptions.EnableFullBackup


    • I also mentioned that there are a lot of methods made available to you from the VBR cmdlets. Honestly, most of the data is surfaced within the Job object itself, but if you want to scratch a little deeper… I make use of a couple of methods in the Get-VBRJobHistoryOverview function. On lines 145 and 146 I use the GetBackupStats() and GetDetails() methods respectively. This is the data that I need, but there’s a ton more you can do to fit your needs. As you can see from the statistics on this one object (another nested object), there’s WAY more that you can get busy with.
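To make the two techniques above concrete (poking at an object's members with Get-Member, and building custom objects), here's a self-contained sketch. The property names are made up for illustration, and a DateTime stands in for a CBackupJob, since the latter needs a live VBR connection:

```powershell
# Discovering members, the same way you'd explore a Veeam job object:
$sample = Get-Date
$sample | Get-Member -MemberType Method | Select-Object -First 5
$tomorrow = $sample.AddDays(1)   # once you know a method exists, just call it

# The custom-object pattern used throughout the script: start with an
# empty object, then bolt properties on with Add-Member.
$JobSummary = New-Object System.Object
$JobSummary | Add-Member -Type NoteProperty -Name Name    -Value 'NightlyBackup'
$JobSummary | Add-Member -Type NoteProperty -Name Enabled -Value $true

# PowerShell 3.0+ offers a terser equivalent that builds the same kind of object:
$JobSummary2 = [PSCustomObject]@{ Name = 'NightlyBackup'; Enabled = $true }
```

Either object form pipes cleanly into ConvertTo-Html or Export-Csv, which is a big part of why custom objects make such good report rows.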

The Dog Food

So what do you get out of this beautiful script? That’s what we’re here for, right, to see the proof in the pudding, errr dog food. At the moment there are three primary reports being created, all using the ConvertTo-Html cmdlet to make things look pretty. If you want to explore how the output is built, check out the Build-Output function.
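If you haven't used ConvertTo-Html before, here's a tiny sketch of the same fragment-then-assemble approach, with dummy rows and a throwaway output path standing in for the script's real report objects:

```powershell
# Two dummy report rows built as custom objects.
$rows = @(
    [PSCustomObject]@{ Job = 'Nightly'; LastResult = 'Success' }
    [PSCustomObject]@{ Job = 'Weekly';  LastResult = 'Warning' }
)

# -Fragment emits just the <table> (plus any -PreContent), so several
# fragments can be stitched into one page by a final ConvertTo-Html call.
$fragment = $rows | ConvertTo-Html -As Table -Fragment -PreContent '<h2>Demo Report</h2>' | Out-String
ConvertTo-Html -PostContent $fragment | Out-File "$env:TEMP\demo_report.html"
```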

The Backup Overview report tries to boil the most basic key elements of your backup job down into one table.


Similarly, the History Overview report tries to distill the most recent history of a given job into a digestible format.


And lastly, the Gap Report pulls a list of all VMs from the vCenter target and compares the list against your various VBR jobs, so that at a glance you can see which VMs are protected by which jobs.


The End

That’s the script in a nutshell. There’s a lot more that I could dig into here, so be on the lookout for some additional PowerShell posts soon.

I hope that if you’re using Veeam Backup and Replication, you start putting its deep PowerShell cmdlets to use soon. There’s a lot of power you can and should be taking advantage of there.

Lastly, I hope this demonstrates that by choosing a task and diving in, you too can start automating your problems away.

Get your very own pretty dog food script here!

function get-veeampluginstatus{
  if(! $(Get-PSSnapin -Name VeeamPSSnapin -Registered -ea SilentlyContinue) ){
    Write-Host 'This script requires the VeeamPSSnapIn to continue. Please install this and retry.'
    exit
  }
  elseif( ! $(Get-PSSnapin -name VeeamPSSnapIn -ea SilentlyContinue)){
    Add-PSSnapin -Name VeeamPSSnapIn
  }
}
function connect-vbr{
  $session=Get-VBRServerSession -ErrorAction SilentlyContinue -WarningAction SilentlyContinue
  if($session){
    Write-Host "You are already connected to Veeam Backup and Replication $($session.server). This process will continue using the existing session."
  }
  else{
    if($UseAlternateVBRCredentials){
      Connect-VBRServer -Server $VBRserver -Credential $VBRCredential
    }
    else{
      Connect-VBRServer -Server $VBRserver
    }
  }
  if (! $(Get-VBRServerSession -ErrorAction SilentlyContinue -WarningAction SilentlyContinue)){
    write-host "We were unable to connect to $VBRserver. This script cannot proceed without a connection and will now exit."
    exit
  }
}
function Get-BackupJobOverview($inJob){
  $JobHash=new-object system.object
  #$jobhash=$inJob.Name
  $JobHash | Add-Member -type NoteProperty -name Name -value $injob.Name
  $JobHash | Add-Member -type NoteProperty -name Enabled -value $injob.IsScheduleEnabled
  $JobHash | Add-Member -type NoteProperty -name JobType -value $(if ($injob.IsBackup){"Backup"}Elseif($injob.IsBackupSync){"Copy"})
  $JobHash | Add-Member -type NoteProperty -name FullBackup -Value $injob.BackupStorageOptions.EnableFullBackup
  $JobHash | Add-Member -type NoteProperty -name Description -value $injob.Description
  $JobHash | Add-Member -type NoteProperty -name Schedule -value $(get-scheduleoverview -injob $injob )
  $JobHash | Add-Member -type NoteProperty -name VMs -value $($injob.GetObjectsInJob()| Select-Object -Property name -ExpandProperty name|out-string)
  $JobHash | Add-Member -type NoteProperty -name Target -value $injob.TargetDir
  $JobHash | Add-Member -type NoteProperty -name RetentionCycles -value $injob.BackupStorageOptions.RetainCycles

  return $JobHash
}

function Get-GapReport{
  ### v3 not ready for targeted clusters yet
  ### $targetclusters=@("cl1","cl2")
  $GapJobArray =@()

  ###check if existing vCenter connections match entered
  if($global:DefaultVIServers -and ! ($vcentersvr -in $global:DefaultVIServers.name)){
    write-host "You are not connected to the host specified in the 'vCenterSvr'"
    write-host  "Press 'Y' to continue and disconnect from other sessions. Any other key will end this script. "
    write-host "Continue?  "
    $response = read-host
    if ( $response -ne "Y" ) { exit }
    Disconnect-VIServer * -Confirm:$false -Force
  }

  $null=Connect-VIServer $vcentersvr 

  ### Get a hash table from Veeam of all Jobs and member servers
  foreach($Gapjob in Get-VBRJob)
  {
    $GapJobHash=new-object system.object
    $GapVMs=$Gapjob.GetObjectsInJob() | Select-Object -Property name -ExpandProperty name
    $GapJobHash | Add-Member -type NoteProperty -name Name -value $Gapjob.Name
    $GapJobHash | Add-Member -type NoteProperty -name VMs -value $GapVMs
    $GapJobArray +=$GapJobHash
  }

  ###Get all Vm's in the target clusters. Iterate through hash table and if a job match add value to VMArray
  $GapSummaryArray =@()

  Foreach ($GapVM in get-vm){
      $GapVMArray=new-object system.object
      $vname=$GapVM.Name
      $GapVMArray|Add-Member -type NoteProperty -name VM -Value $vname

      for ($i=0; $i -lt $GapJobArray.count ;$i++){
        if($GapJobArray[$i].VMs.Count -gt 0){
          if($GapJobArray[$i].VMs -contains $vname ){
            $GapVMArray|Add-Member -type NoteProperty -Name $($GapJobArray[$i].name) -Value "enabled"
          }
          else{
            $GapVMArray|Add-Member -type NoteProperty -Name $($GapJobArray[$i].name) -Value "-"
          }
        }
      }
      $GapSummaryArray +=$GapVMArray
   }
  return $GapSummaryArray
}

function get-scheduleoverview($injob){

  $sched=$injob.ScheduleOptions

  #Daily
  if($Sched.OptionsDaily.enabled -eq $true){
    $ScheduleOverview="Daily; " + $Sched.OptionsDaily.DaysSrv + "; " + $Sched.OptionsDaily.TimeLocal.TimeofDay
  }

  #Monthly
  elseif($Sched.OptionsMonthly.enabled -eq $true){
    $ScheduleOverview="Monthly; " + $Sched.OptionsMonthly.DayNumberInMonth.ToString() + " "
    if($Sched.OptionsMonthly.Months.Count -ne 12){
      $ScheduleOverview+=$Sched.OptionsMonthly.Months.ToString()
    }
    if($Sched.OptionsMonthly.DayNumberInMonth -eq "OnDay"){
      $ScheduleOverview+=$sched.OptionsMonthly.DayOfMonth.ToString() + "; "
    }
    else{
      $ScheduleOverview+=$sched.OptionsMonthly.DayOfWeek.tostring() + "; "
    }

    $ScheduleOverview+=$Sched.OptionsMonthly.TimeLocal.TimeofDay.ToString()
  }
  #periodically
  elseif($sched.OptionsPeriodically.Enabled -eq $true){
    $ScheduleOverview="Periodically; Period " + $Sched.OptionsPeriodically.FullPeriod + " minutes; "
  }
  #continuous
  elseif($sched.OptionsContinuous.Enabled -eq $true){
    $ScheduleOverview="Continuous; ; "
  }

  return $scheduleoverview
}

function Get-VBRJobHistoryOverview($injob){

  $History=Get-VBRBackupSession | Where-Object {$_.origjobname -eq $injob.name}
  $History= $history |Sort-Object -Property CreationTime -Descending

  $name=$injob.name
  write-host $name

  $HistoryHash=new-object system.object
  $HistoryHash | Add-Member -type NoteProperty -name Name -value $injob.Name
  if ($History){
    $HistoryHash | Add-Member -type NoteProperty -name LastResult -value $history[0].Result
    $HistoryHash | Add-Member -type NoteProperty -name StartTime -value $history[0].CreationTime
    $HistoryHash | Add-Member -type NoteProperty -name EndTime -value $history[0].EndTime
    $HistoryHash | Add-Member -type NoteProperty -name BackupSize -value $($history[0].GetBackupStats()).BackupSize
    $HistoryHash | Add-Member -type NoteProperty -name Details -value $($history[0].GetDetails())

    $lastfive=@()
    For ($i=0; $i -lt [math]::Min(5, @($History).Count); $i++){ $lastfive+=$History[$i].Result }
    $HistoryHash | Add-Member -type NoteProperty -name LastFive -value $($lastfive|out-string)
  }else{
     $HistoryHash | Add-Member -type NoteProperty -name Details -value "No History Found"
  }
  write-host $HistoryHash

  return $HistoryHash
}
Function Build-Output{
  $OutputFile="VeeamBackupOverview_$(get-date -Format HHmm_ddMMMyy)."
  switch ($OutputType){
    "HTML"{
      $Header=
        @"
        <style>
          table {
          font-family: "Trebuchet MS", Arial, Helvetica, sans-serif;
          border-collapse: collapse;
          width: 100%;
          }
          th {
          padding-top: 12px;
          padding-bottom: 12px;
          text-align: left;
          background-color: #4CAF50;
          color: white;
          }
          td {border-width: 1px;
          padding: 3px;
          border-style: solid;
          border-color: black;}
        </style>
"@    ## Must remain left aligned; no whitespace allowed before the string terminator
      $OutputFile=$OutputFile+"html"
      $OverviewFrag= $VBRjobsOverviewResults | ConvertTo-Html -As Table -Fragment -PreContent '<h2>Overview Report</h2>' | Out-String
      $HistoryFrag= $VBRjobsHistoryArray | ConvertTo-Html -As Table -Fragment -PreContent '<h2>History Overview Report</h2>' | Out-String
      $GapFrag= $GapResults | ConvertTo-Html -As Table -Fragment -PreContent '<h2>Gap Report</h2>' | Out-String
      ConvertTo-Html -Head $Header -PostContent $OverviewFrag,$HistoryFrag,$GapFrag | Out-File $OutputFile
    }
  }
}

A Letter to the Veeam Vanguards

In 2000 one of my closest friends was getting married, so he and I went to NYC for a boys weekend. While we were there, we had a party and I made many friends for life and even met my future wife. The minute I got home from that trip, I sat down at my desk and wrote a letter to all my new friends. Unfortunately that letter was lost or otherwise destroyed (I’m looking at you Eric) and with the exception of notes to my son, rarely have I felt compelled to write a similar letter since. Until today.

I’ve been home from my first Veeam Vanguard Summit for about 24 hours now. I had to wait to pen this for two reasons. First, if I had penned it immediately when I got home, I might have ended up married to a number of Vanguards, and that’s not right or legal. Secondly, without 12 hours of sleep it would have read “Veeam Vangoiui7ioe79etgjl o87rnngdufi…”

I was pretty excited when I received my email in February acknowledging that I’d been accepted into the Veeam Vanguard program. When we first got together, I was struck by the lack of pressure. There was little expectation of promoting the brand or pressure to produce content. The message that I heard was, ‘we want to share with you and in return we’d like your frank feedback’. To be honest, at first I was skeptical and thought there was some BS in there. I was wrong.

The year continued, with periodic calls and webinars, some cool opportunities, and early access to info. The good, fun chatter on Slack should have tuned me in a little more as to what was to come.

For the Veeam Vanguard Summit, we went to Prague. Now, if I stopped here and just shared my thanks, it would have been an experience of a lifetime. Before I move on though, just a few words about Prague. I don’t know if I’ve ever been to a more magical city. The architecture is beyond amazing, the people welcoming, the food delicious and the beer plentiful. Anyone considering going should stop considering and just start saving their pennies. Prague should be at the top of most bucket lists and I truly hope that I get to visit again in the future.

But we weren’t there to sightsee; this was, after all, a tech program. I expect to share thoughts about the content specifics from the Summit in the future, so I won’t spend time diving into that here. What I will share about this event is that I’ve never seen such candor from a vendor. Any vendor will tell you about what’s awesome in their products, and there was definitely that, but the folks at Veeam also told us where things still need work. We got to see behind the curtains on where things are going. That’s pretty cool and again, if we stopped there, I’d still be appreciative. What really made the product discussions special, though, is that our feedback was actively solicited, and I know for a fact that feedback will make its way back into the product roadmap and development. Seriously, when can you, as a user, sit down with the head of development for a billion-dollar company, tell them your thoughts and concerns, then watch them hand that feedback directly to their teams? I now know the answer.

A most thoughtful gift, to cap a most amazing trip

For experiences like this, ranking and coming up with what’s “the best” is an exercise in futility. What makes this program most special, and what was the most warming and touching part of this experience, was the people. I’ve been involved in other programs, information exchanges, educational programs and the like for years. Only once before (luckily earlier this year; I’m blessed) have I ever experienced such camaraderie. From the Veeam-ers who made the event happen, to the Vanguards who participated, I’m so very grateful to have met you. As I told one of the SVPs at our amazing final event at Staropramen Brewery, I’ve never felt so welcomed into a community and, god willing, I’ll get to pay that forward in the future.

To Anton, Rick, Cade, Spiteri and the rest of the technical teams, the content was amazing. Thank you for letting us share our feedback on it. Bring on v10!

To the rest of the Veeam team, Aubrey, Rin, Chelsea, Kirsten, and anyone I may have forgotten, thank you for your support. An extra big thank you to Nikola; what an amazing event you helped put together. I really look forward to working with you again in the future.

Finally, to my fellow Vanguards, thank you for the camaraderie, friendship, laughs and welcoming.

To all of you, it was an amazing adventure. Until I see you next, Cheers!

PS. While they weren’t there, I’d be remiss if I didn’t thank my boss and employer. If I didn’t work for a such a fantastic person and organization, I wouldn’t be able to have experiences like this where I can learn, share and re-energize. You’re the best!

Hot Take from TFD19 – RPA with a Security First mindset

As someone who’s long been an advocate of automating the things, I got really excited when Automation Anywhere was announced as a presenter for Tech Field Day 19. For me it’s a great opportunity to learn about a rapidly growing market space, Robotic Process Automation (RPA). It’s also very timely for me professionally as the financial sector is a prime candidate for RPA and many conversations are being had around the efficiencies that can be derived from it.

That being said, automation isn’t necessarily a unicorn. What if you write a bad process? What if you’re a bad person? What do you tell the auditors? If you don’t at least think about these questions, you may find yourself automating your way to the unemployment line.

Just to set one more piece of context: I’ve been reading a lot lately about building a security program and the basics of a successful one. One idea keeps popping up time and time again: everyone should have a security mindset, including your developers. However, for many people security is an afterthought. So it was refreshing when, early on in the presentation, this lovely little slide popped up. The details are important, but more relevant is the fact that the folks at Automation Anywhere recognized that their platform is powerful (insert necessary Spider-Man quote here) and therefore security has to be considered from the outset.

Now, from my view, these items should be table stakes for any software in 2019. The reality is that for the majority of the software industry, features and functionality are prioritized over data security. I honestly am sitting here thinking through the various software presentations I’ve seen over the years that treated security as a central premise, rather than an afterthought, and I’m coming up empty. Given that RPA has the potential to be a giant attack vector for the bad guys, solace should be found in the fact that Automation Anywhere takes their responsibility to provide a secure solution seriously.

This security-first mindset was further demonstrated when five minutes were devoted to how you should properly promote code from dev to QA to prod. Taken alongside the fact that you can audit every action taken by every bot, under every user account context, the approach to security from the folks at Automation Anywhere is quite refreshing.

Many questions still exist for me, such as ‘how do you ensure resilience for your RPA solution?’ and ‘how easy are all of these controls to leverage? Can they be used at scale?’ Never mind that you always need to do your due diligence for any platform that you introduce into your environment. All that being said, I’m looking forward to getting my hands on the free community edition of Automation Anywhere’s RPA product suite.

Disclaimer. As a guest of Tech Field Day as a delegate, my accommodations and travel have been paid for. The words and thoughts expressed herein are mine alone. I have not been compensated for this post.

Veeam backup report via PowerShell

Here’s the fun thing about audits. That’s right, I said fun and audit in the same breath. The great thing about audits is that they are tedious and repetitive…

Which makes them great candidates for automating!

I have a task in front of me to document our backups. Thankfully we use Veeam, which means I get to PowerShell this bad boy! It’s an audit, so I could make this super simple and just pipe the output from Get-VBRJob to a CSV and call it a day. The problem with that approach is that it doesn’t provide any additional utility beyond the audit.

What would be useful, though, is if I could take all the servers in my target group, compare them against all of the jobs from Veeam, and output a pretty little CSV where you could tell at a glance where everything was. Here’s what I came up with:

function get-veeampluginstatus{
  if(! $(Get-PSSnapin -Name VeeamPSSnapin -Registered -ea SilentlyContinue) ){
    Write-Host "This script requires the VeeamPSSnapIn to continue. Please install this and retry."
    exit
  }
  elseif( ! $(Get-PSSnapin -name VeeamPSSnapIn -ea SilentlyContinue)){
    Add-PSSnapin -Name VeeamPSSnapIn
  }
}

get-veeampluginstatus

$targetclusters=@("cluster1","cluster2")
$JobArray =@()
$vcentersvr="vcenterserver"
$veeamsvr="vbrserver"

if(!$cred){$cred=get-credential}

Connect-VIServer $vcentersvr
Connect-VBRServer -Server $veeamsvr -Credential $cred

### Get a hash table from Veeam of all Jobs and member servers
foreach($job in Get-VBRJob)
{
  $JobHash=new-object system.object
  $vms=$job.GetObjectsInJob() | Select-Object -Property name -ExpandProperty name
  $JobHash | Add-Member -type NoteProperty -name Name -value $job.Name
  $JobHash | Add-Member -type NoteProperty -name VMs -value $vms
  $JobArray +=$JobHash
}

###Get all Vm's in the target clusters. Iterate through hash table and if a job match add value to VMArray
$SummaryArray =@()
foreach ($target in $targetclusters)
{
  foreach($VM in $(get-cluster $target|get-vm)){
    $VMArray=new-object system.object
    $vname=$(get-vm $vm).name
    $VMArray|Add-Member -type NoteProperty -name VM -Value $vname

    for ($i=0; $i -lt $JobArray.count ;$i++){
      if($JobArray[$i].VMs.Count -gt 0){
        if($JobArray[$i].VMs -contains $vname ){
          $VMArray|Add-Member -type NoteProperty -Name $($JobArray[$i].name) -Value "enabled"
        }
        else{
          $VMArray|Add-Member -type NoteProperty -Name $($JobArray[$i].name) -Value "-"
        }
      }
    }
    $SummaryArray +=$VMArray
  }
}

$SummaryArray | Export-Csv "veeam_jobs_$(get-date -Format dd_MM_yyyy).csv" -NoTypeInformation
lines 1-20 In the current format, this report is meant to be run in an ad-hoc fashion, so lines 1-20 are really just setting the scene.
One thing of note, I chose to target clusters in my script. This could very easily be altered to target any container objects in vSphere by altering lines 13 & 35.
lines 24-31 When building reports, I’m a fan of using custom objects, it’s just how I roll. Plus it’s readable and easy to consume. There’s a really good explanation of custom objects and how to use them here. In this case I build a custom object, $JobHash, to hold the couple of bits of info about each job. I’ll come back to this in a few.

Each job object that’s returned by Get-VBRJob has an associated method, GetObjectsInJob(). This method tells us which VMs are in the job. Since the VMs are returned as objects, I’m just selecting the name for use in the report.

Then the Add-Member cmdlet is used to add the job name and VM names to $JobHash. Finally, individual jobs are each added into $JobArray on line 30 before moving on.
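To see what that Select-Object -ExpandProperty step buys you, here's a sketch using stand-in objects, since GetObjectsInJob() needs a live VBR session:

```powershell
# Stand-ins for the objects GetObjectsInJob() returns.
$vmObjects = @(
    [PSCustomObject]@{ Name = 'web01' }
    [PSCustomObject]@{ Name = 'db01' }
)

# Without -ExpandProperty you'd get objects with a Name column; with it
# you get the bare string values, ready for -contains checks later.
$names = $vmObjects | Select-Object -ExpandProperty Name
$names -contains 'web01'   # True
```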

lines 35-55 The general premise of this whole section is to take the previously built array of jobs, conveniently named $JobArray, compare it to the list of VMs from our $targetclusters and build our output.
lines 38-40 If you remember, I said at the beginning that I want a visual report of all VMs and their job status. So regardless of job status, every VM gets an element in $VMArray from our friend Add-Member.
lines 42-52 For each VM that’s returned on line 37, we iterate through $JobArray and, based on whether a match is made, an entry is made for that VM-job combo in the $VMArray table, which is concatenated onto $SummaryArray before moving on to the next VM.
line 56 We’re done! Pump out $SummaryArray to a datestamped CSV file.

And here's what we get for output. It's a very simple report, but it hits exactly the mark I was aiming for: a way to see at a glance what is configured where. There are also several places where the code could be made more efficient, but this is one case where the destination matters far more than the journey.

But, Wait! There’s More! Toying with PowerShell write speed.

Why hello old friend. It’s been a while. Sometimes life happens…

… and then you remember you actually still have a blog!

I have a random task in front of me at work that involves a lot of text manipulation. Obviously I'm going to PowerShell the crap out of this, but as I start framing out the script I realize I'm going to be writing a lot of strings. There are no speed requirements for this operation, but because I'm weird like that, I wanted to figure out the fastest way to write a whole bunch of data to a file.

Here’s the wee little script I wrote to play around:

$arrDataToWrite = @()
$charSet = "aAbBcCdDeEfFgGhHiIjJkKlLmMnNoOpPqQrRsStTuUvVwWxXyYzZ0123456789".ToCharArray()

# Note: Get-Random -Count caps at the size of the input set (62 here),
# so the actual string length tops out at 62 characters per line
$StringLength = 400
$FileLength = 10000
$arrTestFile = "C:\temp\arraytestfile.txt"
$lineTestFile = "C:\temp\LineTestFile.txt"

#test one. Build array, output array in one shot.
Measure-Command {
  write-host "Test: create array of characters and output entire array"
  for($i=0; $i -le $FileLength; $i++){
    $arrDataToWrite+=$(-join $($charSet |get-random -Count $StringLength))
  }
  $arrDataToWrite |Out-File $arrTestFile
}

#test two. Iterate through, outputting as we go.
Measure-command{
  write-host "Test2: create characters and output inline"
  for($line=0; $line -le $FileLength; $line++){
      $(-join $($charSet |get-random -Count $StringLength))|Out-File $lineTestFile -Append
  }
}

Methodology was pretty simple.

  • First test: build one big array full of strings of random characters. After building the array, output the whole array to a file.
  • Second test: build strings of random characters and use Out-File to append each string as we loop.

The results, as you can see, are pretty clear. After several test runs of varying sizes, appending strings to the file in-line is pretty consistently ~3x slower than the array method. This makes sense when you think about the operations: in test two, every time I write to the file I have to open the file, write the data, and close the file. If I'm counting right, that's 3x the number of file operations… and that loop takes 3x longer…. huh…

Obviously it’s an over-simplification, but there are vastly more activities taking place in test two, hence the much longer execution time. Case closed, send this to print!
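One related aside of my own, not part of the original tests: if you truly must write inside a loop, holding a single StreamWriter open avoids the open/write/close cycle on every iteration.

```powershell
# Aside: one StreamWriter held open for the whole loop means one file
# open and one close, no matter how many writes happen in between
$path = Join-Path ([System.IO.Path]::GetTempPath()) 'streamtestfile.txt'
$sw = New-Object System.IO.StreamWriter($path)
for ($i = 0; $i -lt 100; $i++) {
    $sw.WriteLine("line $i")
}
$sw.Close()
```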

However, in the famous words of Ron Popeil: But wait! There's more!

I typically do a little digging around before posting thoughts. Call it pragmatism, call it worrying, call it imposter syndrome, but the reality is I just don't want to stick my foot in my mouth. So I'm reading a few articles when I come across one titled Slow Code: Top 5 Ways to Make Your PowerShell Scripts Run Faster from Ashley McGlone.

Sure enough, halfway down this informative article is a heading that reads "Appending to arrays". Well, that's exactly what we're working on here! So I give this technique a shot, and voilà, my fastest run just got 50% faster. Here's the code from test one, further optimized:

#But wait, there's more!
#test three: Build the dataset within the loop and assign it to the array in one shot
$arrTestFile2="C:\temp\arraytestfile_fast.txt"
Measure-Command{
  write-host "Test3: assign all data to array in one shot and output entire array"
  $arrDataToWrite_Fast = for($j=0; $j -le $FileLength; $j++){
    $(-join $($charSet |get-random -Count $StringLength))
  }
  $arrDataToWrite_Fast |Out-File $arrTestFile2
}

As Mr. McGlone states in the linked article, the for loop runs and stores data in memory. By assigning that for loop's output to the array in one shot, you have only one expensive array operation rather than 'N' array operations.
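If you want to skip native arrays entirely, a generic List is another way to dodge the per-iteration cost; this aside is mine, not from the linked article:

```powershell
# Aside: a generic List's Add() mutates in place, so there's no
# array re-allocation on each iteration like there is with +=
$list = New-Object 'System.Collections.Generic.List[string]'
for ($i = 0; $i -lt 100; $i++) {
    $list.Add("line $i")
}
```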

As I stated at the outset, there’s no real reason behind this activity other than learning and having fun. To that end, I hope you keep learning, keep scripting and keep having fun.

Oh and Ashley, if you should by chance come across this, thanks for your article and for helping me learn something new!

A Tale of Two Architects – A Review

I just finished two really great books on architecture that I really need to share with all of you. These two books were written for very different purposes and with very different voices, but I found them both to be enjoyable reads with educational content.

VDI Design Guide: A comprehensive guide to help you design VMware Horizon, based on modern standards

Johan van Amersfoort

The title doesn't exactly roll off the tongue, but honestly that might be my biggest issue with this book…

My team has a small but challenging VDI environment, so I placed an order the day this book was released and immediately tore into that lovely Amazon box when it arrived. It's been a running joke in the infrastructure and virtualization worlds for some time now that "this is the year of VDI!" Perhaps the fact that this prognostication hasn't come to pass is due to the complexity of running a well-functioning VDI environment. Seriously, just think about all the various components that make up a VDI environment: your core infrastructure (hypervisor/SAN/network/security/compute), the client, the delivery model, never mind what's actually in the guest! Johan does a really great job of walking through all of these components and so much more in his first book.

The style of the book emulates the VCDX design methodology. I am not, nor will I ever be, a VCDX, but I found his explanation of the methods to be much more engaging than in other tomes. What I mean by this is that some architecture books out there are extremely dogmatic and really just guides toward attaining a certification. Johan, on the other hand, does such a nice job walking the reader through the various design and architecture phases that I'd strongly consider giving this book to any burgeoning architect, whether they cared about VDI or not.

Now don't let me fool you into thinking this is all method and no meat, because that would be a tragedy. Like I mentioned at the outset, my team has some VDI challenges, and with the author's thorough and detailed dissection of all the various components of a VDI infrastructure, we had immediate technical takeaways. Johan walks through all the components that make up a VDI environment, providing his recommendations for why you may want to go in a specific direction, and just as importantly, why you may not!

I've been told on a couple of occasions that I have a unique voice when I write. Given that, I have to say I thoroughly enjoyed Mr. van Amersfoort's voice throughout this book. As was pointed out during his recent visit to the Datanauts podcast, reading this book is like sitting down with a colleague and chatting through a technical issue. That truly made it one of the most fun technical reads I can remember.

All in all, if you're interested in VDI or general architecture principles, do yourself a favor and pick up Johan van Amersfoort's first book.

You can find Johan at @vdidesignguide & @vhojan

IT Architect Series: The Journey

Melissa Palmer

I picked up this book solely because I've enjoyed Melissa's blog (https://vmiss.net) for some time. In the review above, I alluded to having read some bad architecture books, so I intentionally went at this book with no expectations. I have to come right out of the gate and say that this book was one of the most interesting technology books I've read, in that it talks about technology very little. The subtitle for this book reads "A guidebook for anyone interested in IT architecture", and a guidebook is really what Melissa gave us.

The premise of this book is to help anyone interested in technology, or a burgeoning IT practitioner, understand just what an architect is and what it takes to become one. I can speak for no one but myself and my observations over the past 20 or so years in IT, but it seems that many systems architects just kind of eventually land in that role. They get good in one area, and maybe good in another; after some time they end up being the smartest gal/guy in the room. This is not the book to help with that sort of endeavor, and I love it! In writing this book, Melissa provides a path, one that worked for her en route to VCDX, for taking a more active approach to becoming a solution provider. A sampling of the topics covered includes "Learning New Skills", "Infrastructure Areas of Expertise" and "Architectural Building Blocks". The format is more about the journey than a prescriptive roadmap. In fact, throughout the book the reader is encouraged to take a step back and see how the information shared fits within their role and worldview.

While I really enjoyed the approach and Melissa’s voice, my knock on this read is that it could use a copy edit. If you are someone who has ever joined in on the “On Premises” debate, please approach this book knowing that there is some small amount of errata present. As a wanton comma abuser, I’m certainly not throwing stones and I hope this doesn’t stop you from picking up the book; the content contained within absolutely makes up for any grammatical oopsies.

The primary content of the book clocks in at just under 200 pages. If you already are, or aspire to be, an architect, you are going to read technical guides that are way longer than this! Just like with her blog, Melissa's personality carries through this book. It's obvious that a passionate person wrote this piece in an effort to help others, all the while maintaining a sense of self. A perfect example: when discussing assumptions towards the end of the book, Melissa creates an analogy where she uses the word 'chicken' ten times in a paragraph. I literally laughed out loud, to which my wife responded, "Is your geek book amusing, dear?" Yes, yes it is.

Many IT practitioners discount some of the "softer" skills required in a business environment. It's in this vein where I think the book really shines. If you are someone who has a hard time communicating in either written or verbal form, you are probably going to have a hard time obtaining an architect-level role. Melissa spends a significant portion of the book emphasizing what these skills actually are, why you need them and tips on how to improve them. I'm thinking about getting a couple more copies of the book for some folks who could really use some self-reflection in this area…

Obviously anyone with aspirations of reaching an architect level would benefit from picking up this guide. If I were a college professor teaching folks what it was like to work in IT and to give them a broad perspective, I’d have them read this book. As someone who’s worked in an architectural role, I learned a number of things as well, meaning even seasoned IT pros can benefit from picking this up. Reading this book over the past few days, it became obvious that Melissa cares about people and the solutions they provide, so by that token perhaps we could all benefit from the reflective approach conveyed throughout this book.

You can find Melissa at @vmiss33 & @ITArchJourney

VMworld 2018 – FOMO? Never fear!


In just a few days, friends, colleagues, teachers, luminaries and thought leaders will be converging on Las Vegas for the biggest and best virtualization conference in the world. If you're in the same shoes as me, VMworld 2018 just isn't in the cards. Hearing that Tony Hawk, Run DMC, The Roots and Snoop would be part of it had me a bit bummed. However, it was when I heard that Malala would be participating in the general sessions that I turned that attitude around.

It was then that I realized there is still a wealth of ways to experience VMworld, even when you’re 2,638 miles away from Las Vegas, not that I’m counting or anything.

General Sessions

Like I alluded to above, it was seeing that Malala would be participating in the general sessions that helped turn my attitude around. The reason for this is that VMware makes an effort to broadcast the General Sessions live.

If you haven’t been to a major conference, these sessions are the reason why a lot of people refer to conferences as a “show”. It’s time for the heavy hitters, for the big production and for news to drop. The general sessions that I’ve attended tend to follow a pattern:

  • Day 1: State of the Union. Let's highlight our successes, broad industry trends and how we are positioned to respond to or, better yet, lead those trends.
  • Days 2-N: Thought leaders. Talk about growth and what the future holds. Not everything that you see at a tech conference will become reality. I feel like it's on these days that you see organizations testing the water to see how ideas and roadmaps feel to the various stakeholders.
  • Last day: Honestly, these are my favorite sessions. The show's almost over, some folks have already left town, and the people who are left are likely kind of burnt out. VMworld always saves something cool for those brave and/or hardy folks who are left standing on the last day.

Now, unfortunately, that final cool session is only for attendees. It's probably a good reason to start working on your budget justification for next year… For the Monday and Tuesday sessions, however, you'll want to set a calendar reminder to tune in at 9:00 AM PT for the general sessions live on VMworld.com.

vBrownBag Tech Talks

The vBrownBag talks are one of my favorite parts of VMworld. If you’re reading this blog, you already know about the crew, but if by some chance you don’t know… vBrownBag is a community of passionate people who want to share and facilitate sharing within the IT Infrastructure community.

Getting my feet wet at my first #vBrownBag session

The other cool part about vBrownBag is that they produce Tech Talks. These are short community sessions ranging from just a few minutes up to a half hour in length. You can check out my 2017 sessions on life as an SMB in a big enterprise world or PowerCLI for examples. (Go easy on me, I was nervous about my other sessions.) The whole point of vBrownBag is sharing, and the very cool people who produce the Tech Talks do a damn fine job of it. If you want to follow along live, you can check out the action on vbrownbag.com, or if you are unable to participate live, all sessions are posted to the vBrownBag YouTube channel, usually within an hour or so.

Community members coming together to share with each other. For everyone involved it’s a labor of love and how can you beat that?

VMware {code} Power Sessions

I am super excited about this new offering! And maybe a touch bummed that I'm not going to be participating… But just because I won't be presenting doesn't mean that I won't be following along. Similar to what the vBrownBag folks are doing, the VMware community team will be hosting expert-led presentations from community members, but with a focus on DevOps and developers. All the action will be live streamed via the {code} Facebook page. You can check out the entire line-up by searching for CODE sessions in the content catalog.

VMTN

Since we're talking about community, let's not forget about VMTN. The VMTN page is always a hotbed of activity during VMworld. I'm not sure why it's a secret, but nevertheless it is kind of the secret sauce to staying in the know during the show. If you want a place to participate in contests, watch live streams, and chime in with all of your community friends, then you'll want to head over to the VMTN page.

Bloggers

Holy crap! How can I forget the bloggers! While writing a blog post! Shame!

In my mind the blogosphere (is that still a term?) is the lifeblood of our vCommunity. It's where passionate people go to talk about the things that matter most: where we share our successes, our trials and all of the cool things we learn! What better place to do that than at VMworld? VMware has a really strong blog presence that's only gotten stronger over the past year or two. I'm obviously partial to the PowerCLI blog, but next week I'll be keeping an eye on the official VMworld Blog. If you can't make the general sessions, this is where the news will drop. I'll just leave that there…


Beyond the official blog, there are dozens of people blogging about nearly everything that happens at the show. As someone who’s live blogged a general session, I can tell you that the only reason someone would blog from a show is to share with others. Here’s a good place to start if you’re looking for some of the fine folks who’ll be attempting to document everything that happens in Vegas next week. Well, not everything…

Beam me up … me

Sorry (not sorry), horrible dad joke there.

Did you know you can drive a robot at VMworld? Seriously. OK, not a robot, but a BEAM. Don't know what a BEAM is? It's basically FaceTime mounted on a remote-control car. You can register to drive one of these super awesome RC devices around the VMTN space. How awesome is that?!


I'm sure there are more ways to experience VMworld if you're not there, but honestly I'm tired just writing this, let alone trying to sample all of the above options. No matter which way you go, there's definitely no need to fear missing out on VMworld from afar.

Start PowerShell’ing your Backups

VeeamON was such a great and educational show. For my small part of it, I wanted to share how you can automate the deployment and management of your infrastructure. Why would you want to automate? Seriously, did I really just ask that… Speed and standardization/predictability are the primary drivers for scripting via PowerShell. Or awesomeness. Yup, we'll stick with the fact that PowerShell is full of awesomeness.

I started my VeeamON presentation with an example of just how awesome you can become with your PowerShell scripts. In the first part of the video you see how long it takes a person to manually configure a job. Keep in mind this is someone who pretends to know what they are doing, and still errors happen. The second example shows how long it takes to implement the exact same job via script. So let's take a moment to parse the whys of PowerShell:

  • Speed. When I made this video, I had practiced each click. It still took 2 minutes and 45 seconds to create a backup job. Conversely, the script took 30 seconds to complete. That's for one machine. Now think about setting this up for 100 machines: manually you're looking at about four and a half hours; via script, 50 minutes. Yes, this is dirty math, as there are many factors that go into the equation, but you can't argue with the fact that scripting is more efficient, even factoring in the ~20 minutes I spent writing the script.
  • Standardization. I've worked with Veeam B&R for a number of years now and, as I mentioned above, I practiced the workflow to try to take any bias out of the equation. Still, I made an oops. We're human after all. As you'll see in the code below, by standardizing you can remove much of the human variable (bad pun intended) and produce a more consistent output.
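The dirty math from the Speed bullet, spelled out:

```powershell
# Back-of-the-napkin math: manual clicks vs. script, per job and at scale
$manualSeconds = (2 * 60) + 45   # 2:45 per job by hand
$scriptSeconds = 30              # 30 seconds per job via script
$machines      = 100

$manualHours   = $machines * $manualSeconds / 3600   # roughly 4.6 hours by hand
$scriptMinutes = $machines * $scriptSeconds / 60     # 50 minutes via script
```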

 

Without further ado, here's the code and descriptions of what's going on.

Lines 2-3: If the PSSnapin for Veeam isn't loaded, go ahead and load it. You're using ISE, right? So you probably want it in your editor session, as I talked about here.
Lines 6-27: When you stop and think about it, backup jobs are complex and have lots of options. This is an uber-simple example and still you've got 16 lines of configuration. By moving this off into a function you make your code both more readable and more repeatable. Thankfully the developers have used very intuitive naming conventions for the job options!
Line 32: Connect to the Backup and Replication server. Full disclosure: I set the $cred variable in a previous script and got lazy. Sorry!
The remaining lines: Finally, we're getting to the action. It's exciting! But it's also not: we set most of the options already. At this point all we need to do is execute the various VBR cmdlets.

  • Add-VBRViBackupJob
    Short and simple: create the backup job… Or is it? We create a job, but we also leverage:
    Find-VBRViEntity: Find the VMware entity to backup that we specified in our variables above.
    Get-VBRBackupRepository: Fairly self-explanatory, find the backup repository that the job will use.
  • Set-VBRJobOptions
    There is a flow to objects created via the VBR PowerShell cmdlets: create the object, then set options on the object. This is the latter. These are the options configured in our SetVariables function on lines 19-26.
  • Set-VBRJobAdvancedBackupOptions
    Do I need to explain what this does?
  • Set-VBRJobSchedule
    Again, pretty self-explanatory right?
  • Add-VBRViBackupCopyJob
    Remember when I screwed up in the video above? Why do you have to bring up those painful memories??? Anyway, moving on: this one creates a new copy job based on the previously set variables.

Each of the above cmdlets has a significant number of options to fit your environment. I'd encourage you to peruse the Veeam PowerShell Reference guide for additional options.

Here’s the code. We’ll cover more advanced workflows in our next post in this series. Stay tuned!

#VeeamON Simple Backup Job Demo
if ( ! (Get-PSSnapin -Name VeeamPSSnapIn -ErrorAction SilentlyContinue) ) {
  Add-PSSnapin VeeamPSSnapIn
}

function SetVariables {
  $Global:PWord = ConvertTo-SecureString -String "VMware1!" -AsPlainText -Force
  $Global:cred = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList "lab\Administrator", $PWord
  $Global:VBRserver = "VBR.lab.local"

  $Global:VBRBackupName = "Gold_Tier_Backup"
  $Global:VBRRepositoryName = "VBR_Local_Repository"
  $Global:VBRBackupEntity = "GoldTier"

  $Global:VBRBackupCopyName = "Gold_Copy"
  $Global:VBRReplRepositoryName = "VBR_Replication_Repository"

  #set JobOptions
  $Global:VBRJobOptions = Get-VBRJobOptions -Job $VBRBackupName
  $VBRJobOptions.JobOptions.RunManually = $false
  $VBRJobOptions.BackupStorageOptions.RetainCycles = 3
  $VBRJobOptions.BackupStorageOptions.RetainDays = 7
  $VBRJobOptions.BackupStorageOptions.EnableDeletedVmDataRetention = $true
  $VBRJobOptions.BackupStorageOptions.CompressionLevel = 6
  $VBRJobOptions.NotificationOptions.SendEmailNotification2AdditionalAddresses = $true
  $VBRJobOptions.NotificationOptions.EmailNotificationAdditionalAddresses = "test@test.com"
}

#Call SetVariables function
SetVariables

if ( ! (Get-VBRServer -Name $VBRserver) ) { Connect-VBRServer -Credential $cred -Server $VBRserver }
Start-Sleep -Seconds 5

$null = Add-VBRViBackupJob -Name $VBRBackupName -Entity (Find-VBRViEntity -Tags -Name $VBRBackupEntity) -BackupRepository (Get-VBRBackupRepository -Name $VBRRepositoryName)

$null = Set-VBRJobOptions -Job $VBRBackupName -Options $VBRJobOptions

$null = Set-VBRJobAdvancedBackupOptions -Job $VBRBackupName -EnableFullBackup $true -FullBackupDays Friday -FullBackupScheduleKind Daily

$null = Set-VBRJobSchedule -Job $VBRBackupName -DailyKind WeekDays -At "01:00"

$null = Add-VBRViBackupCopyJob -DirectOperation -Name $VBRBackupCopyName -Repository (Get-VBRBackupRepository -Name $VBRReplRepositoryName) -Entity (Find-VBRViEntity -Tags -Name $VBRBackupEntity)

 

Getting started with Veeam for PowerShell

Shame on me! Right after VeeamON 2018, life threw my family some major-league curveballs and I never had a chance to get my code shared out. Time to fix that…

For those of you who may be coming at this fresh, Veeam provides a PowerShell snap-in for configuring, maintaining and monitoring Backup and Replication. Simply choose to install the Veeam Backup and Replication console from the B&R .iso file, or follow the instructions in this KB article. When you launch the Backup and Replication console, you'll find a PowerShell menu option under the main console menu. The way I write, I really need an ISE, and as built you just get a plain PowerShell window rather than ISE. So I did a little sleuthing… and I mean a little. I typed the command Get-History and, lo and behold, the VBR shortcut fires off a PowerShell script located at:

“C:\Program Files\Veeam\Backup and Replication\Console\Install-VeeamToolkit.ps1”

It's always an interesting read to see how other people solve problems. Basically the script does a bunch of validation and then calls another script, C:\Program Files\Veeam\Backup and Replication\Console\Initialize-VeeamToolkit.ps1, which does more validation and sets up aliases, options, etc. Finally we see two things:

  • The script functionality is delivered by VeeamPSSnapIn
  • The functions Get-VBRCommand and Get-VBRToolkitDocumentation are defined in the Initialize-VeeamToolkit.ps1 script. You'll need another path if you want to make use of them, but I'm going to help you out there in a minute.

TL;DR: Add-PSSnapin VeeamPSSnapIn

Above I mentioned Get-VBRToolkitDocumentation. Interestingly, this function fires up the Veeam documentation at https://helpcenter.veeam.com/docs/backup/powershell/

It's a really good, comprehensive document set with some great examples, so I'd highly recommend taking the time to explore it.

Get-VBRCommand

This is an interesting one. It basically uses Get-Command to list out all of the VBR commands. If you drill into an individual command you get some more info about what's under the covers, but what's really interesting to me are the numbers coming out of this function. As of this writing, there are:

  • 510 individual cmdlets in the Veeam Backup and Replication snap-in
  • 27 Verbs
  • 259 Nouns

If you’re like me, that’s a pretty intimidating sample to tackle. But if you look at the data slightly differently, it gets much more manageable.

Starting with the verbs, we see that a fifth of the cmdlets are Gets, and when you combine those with Set and Add, over half of all the cmdlets from this snap-in are accounted for.
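In case you want to slice the data the same way, here's the grouping pattern I used. The cmdlet names below are a small hypothetical sample standing in for the snap-in's full list; the same pipeline works on real Get-Command output:

```powershell
# Sample cmdlet names standing in for the snap-in's full list of 510
$cmdlets = 'Get-VBRJob','Get-VBRServer','Get-VBRBackupRepository',
           'Set-VBRJobOptions','Set-VBRJobSchedule','Add-VBRViBackupJob'

# Group by verb, biggest buckets first; against the live snap-in you'd
# feed this from Get-Command and group on the Verb property instead
$byVerb = $cmdlets |
    Group-Object { ($_ -split '-')[0] } |
    Sort-Object Count -Descending
```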


Now when you turn to look at Nouns, the data is very different.


Wow… verbs are consolidated, but nouns are numerous. This is odd at first glance, but when you think about it, it totally makes sense. The Veeam B&R snap-in is meant to support a variety of storage/backup/infrastructure products, but the actions you can perform across those products are more or less consistent. This consistency is great for you as you get started on your way towards automating your infrastructure with PowerShell. We'll start going deeper into that infrastructure management in our next post. Stay tuned!

If(Code -ne ISE){DontFret}

I’m behind the times, as usual…

I've been reluctant to give up my PowerShell ISE with ISESteroids for years now, but I think it's finally time to get on board with VS Code. It's definitely a bit of a shift, so I thought I'd add my thoughts to the chorus of users who've made the switch from ISE to Visual Studio Code.

I have an odd sense of humor, so I thought it would be fun to use ISE to download and install its replacement. Yes, I know it would've been faster to simply grab it via browser, but did I mention my odd sense of humor…


$Uri = "https://go.microsoft.com/fwlink/?Linkid=852157"

$download = "$Home\Downloads\vscode_installer.exe"

Invoke-WebRequest -Uri $Uri -OutFile $download

# Inno Setup installer switches: /SP- suppresses the startup prompt
Start-Process -FilePath $download -ArgumentList "/SP-","/SILENT","/LOG"

So after allowing UAC to run the file, we have a base install of VS Code that we can launch by simply typing the command 'code' inside any Windows command interpreter.

Since I write pretty much exclusively in PowerShell, there are a couple of things I need to do right out of the gate to make this tool useful. First off, Code is meant to be portable and to fit many needs, so there isn't a ton installed out of the box. Code handles this conundrum via extensions. To add an extension, simply click Extensions in the Activity Bar. This opens the extension marketplace. In the marketplace, search for the desired extension, in this case PowerShell, and hit Install. Code will make you reload your session in order to use the newly added PowerShell features.

I'm almost exclusively going to be writing PowerShell, so I'd like this to be configured as best I can for that purpose. Step 1: make PowerShell the default terminal. We can do this a couple of ways, although it looks like the folks on the Code team may have changed the default behavior since the last time I looked. But I digress…

We can get to our user settings from File -> Preferences -> Settings; however, I want to use one of the powerful features of Code, the command palette. The command palette is a very dynamic and powerful tool in Code, but much has already been written about it, so no need to retread the same ground. After opening the palette with Ctrl+Shift+P, I simply start typing what I'm looking for, in this case "default shell", and IntelliSense figures out the rest. After selecting the setting I'd like to change, VS Code kindly offers me some suggestions. Sure enough, after selecting the PowerShell option for "Terminal: Select Default Shell" I see a new setting in my user settings JSON file.

 

Finally, when I go to check out my terminal, voilà: PowerShell is my default.

Next up, I want to make sure that my default language for VS Code is PowerShell. This time I manually edit my settings (File -> Preferences -> Settings) and add the line "files.defaultLanguage": "powershell".

The reason for this is that by default VS Code treats new files as plain text (*.txt, *.gitignore). I'm sure that's great for a lot of folks, but I use PowerShell in a day-to-day operational role. I'm not always writing code for reuse; often I'm writing and executing ad hoc scripts to be run via my friend F8. By changing the default language, when I start a new, untitled, yet-to-be-saved script, Code knows I'd like it interpreted as PowerShell, and it even puts the pretty PoSH icon in the tab for my script.
Oh, and you can also see that IntelliSense recognizes the language and provides context-specific assistance as well. BTW, for this configuration I found that I needed to restart Code completely, not just reload the window, for the defaultLanguage setting to take effect.

The last thing I need to customize to make this feel like home is to set "powershell.integratedConsole.focusConsoleOnExecute": false. Code's default behavior is to move focus from the script selection to the console on execution. If you've been using ISE, you're used to focus staying on the script selection; setting this switch to false replicates that ISE behavior.

At this point I have Code configured to feel familiar enough that I can start using it for some functions. If you just want to jump to the punchline, here’s my very simple settings.json file.

{
  "terminal.integrated.shell.windows": "C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe",
  "files.defaultLanguage": "powershell",
  "powershell.integratedConsole.focusConsoleOnExecute": false
}

As a long-time ISESteroids user, though, I'm having a really hard time making the shift from having a persistent Variables window to using the Code debugger. I'm not sure if I'm being a crotchety old man pounding away at the keyboard or if this will really be a roadblock to making the shift to Code permanent. I guess time will tell… In the meantime, if you're new to using this tool, I'd urge you to pay attention to the tutorial Microsoft provides on install of VS Code. It's quite a solid walkthrough and will help with some of the nuances of getting started.

If you have any additional tips and tricks on making the shift from ISE to Code, please reach out.

Until next time, happy writing.