Why do we vCommunity?

I went out for a lunch walk the other day to clear my head and listened to a really great episode of Freakonomics. The episode, “Honey, I Grew the Economy,” focused on the processes and motivators that drive innovation. The deep dive into those drivers in particular resonated with me; I saw parallels to how we approach IT as we kick off 2020. While that was super interesting, it wasn’t a giant leap for me to ask a similar question:

Why do we get engaged in the various technological communities and what do we derive from those communities?

This seemed like a fun opportunity to hear from some of our Technology Community members to understand: Why do we vCommunity?

To Learn

For me, first and foremost, vCommunity is about education. It stands to reason that it’s the most common entry point for people to engage. It makes sense: you need help on a topic, so you go looking around, and only then do you learn about the plethora of opportunities available to you. When I personally started down the virtualization path, training dollars were tight, so to improve my knowledge I had to look elsewhere, and as luck would have it, I found the Boston VMUG UserCon. That free one-day training gave me access to educational presentations, subject matter experts and hands-on training. It was a formative moment for me, and it opened my eyes to other avenues of learning. It also showed me that there were a ton of engaged people out there sharing via different mediums.

Stalwart of the vCommunity Kyle Ruddy had something similar to share regarding his introduction to the tech communities: “What I wasn’t ready for was the amount of blogs out there detailing issues I had run into. It was a complete lightbulb moment. From that point it was a gradual process of moving from reading blogs to creating blogs.”

Kyle also highlights a hallmark of the community: there tends to be a strong desire to contribute back in the form of blogs, videos, podcasts and a host of other mediums. That was, after all, the genesis for VirtualVT as well. If you look at the mechanics of sharing content, it takes time and effort. You can’t help but boost your knowledge whenever you’re contributing technical content, because you end up spending more time with the underlying technologies. Unintentional education? Knowledge osmosis? Whatever you call it, the brain gets bigger the more you feed it, and you have to keep feeding it if you’re creating content about technologies.

To Grow

Bolstering your knowledge via the educational opportunities the communities provide is definitely a path to bigger and better things, but I’ll let you in on a not-so-secret secret: getting involved and putting yourself out there as a contributing member of the community can be equally impactful on your career. Contributing is an idealistic endeavor, but it often has the side effect of building your brand at the same time. I met the MVP power couple of Dave and Cristal Kawula this past fall, and in a recent blog post Dave shares a bit about how he and Cristal started MVPDays and the impact their event has had on a specific community member: “…I talked him into doing his first presentation, which led to him speaking at user groups and conferences. Earlier this year he became a Microsoft MVP, and a few days ago, he actually accepted a position at Microsoft.”

The experience that Dave highlights is not unique to MVPDays. It crosses groups and goes to the heart of what Matt Heldstab (VMUG Board of Directors) shared with me recently: “The fantastic power of this vCommunity and its ability to elevate the careers of its members never ceases to amaze me.”

It’s Fun!

vDodgeBall, vSoccer and vPoker are just a few of the side-events that come to mind for ways that we like to enjoy the lighter side of our geekdom. One of my favorite events was when our 2017 vExpert party was held at the Pinball Museum in Las Vegas. The reality is that many of us work really hard, and through a plethora of events, engagement in the vCommunity can be a nice way to blow off some steam. My friend AJ Murry, with whom I co-led a local VMUG group, hits on this point: “In the vCommunity I have found my people. I have made lifelong friends. I have learned great things and shared amazing experiences.”

It’s all about the people

I mean, it’s referred to as the vCommunity for a reason! The one theme that comes up time and time again when talking about our tech communities is the value of our peer connections. We learn, we share and, when times are tough, we support each other. Nikola Pejkova, Veeam Vanguard Community Manager, highlights the value of these connections: “I love being part of a community because it enables its members to cooperate, strengthen and enrich their knowledge and learn from each other’s experiences.”

Everywhere I’ve gone and nearly everyone I’ve interacted with has been gracious with their knowledge and time. It’s a hallmark of these communities and a reason why there are so many deep bonds. You see it in every independent blog post, every community presentation and every response to a forum post. That’s the real magic of the vCommunity: We want to be there for each other and to collectively lift each other up!

So how do YOU get involved?

It’s an amazing thing being part of this community, but like many things in this world, it can be intimidating to get started. So what can YOU do?

Well, there’s no time like the present. User communities abound. Listen, I live in a rural state where my favorite urban legend says that there are more cows than people. If I can find a local community, so can you! Find one (Meetup can be an excellent starting point) and go, even if it’s not in your wheelhouse. Especially if it’s not in your wheelhouse! Go learn something new, meet some interesting people and hopefully have a good time!

Got something worthwhile to share? A blog is stupid easy to create these days. I wouldn’t be here if it wasn’t! Only a year or two ago, it was really hard to podcast or create video content, as the equipment required was cost-prohibitive for most. As we kick off 2020, there’s no reason not to share it loud and proud! And if creating online content isn’t enough for you, there are always conference CFPs (calls for papers) looking for passionate people to share their successes.

Do all of these options strike fear into your heart, but you’d still like to help others? Opportunities abound online as well. Helping someone solve a problem or answering a question in a forum benefits not just you and the poster, but future assistance seekers as well.

Whatever the avenue, just do it! If you’re still not convinced, I’d like to give Kyle the last word on why we vCommunity. “Now, why do I continue to be involved… over the years, I’ve found it extremely rewarding to share my experiences and knowledge, become a mentor, [and] … build up a number of friendships that exist still to this day.”

Thank you to my friends quoted here and to all my friends out and about in the communities for all that you do.

Getting started with Veeam Backups for Microsoft Office 365

The topic of data protection for cloud services seems to be surfacing a lot lately. I’ve debated with myself whether this is akin to buying a car and suddenly noticing just how many of that model are on the road, or whether, as we approach 2020, everyone is coming to the realization that even in the cloud you need to protect your assets and that not everything can be ephemeral.

The premise of the conversation is that with cloud, you push off ownership of the infrastructure/platform. But what about the data? Who is responsible for making sure that it’s protected? It’s a good fundamental question that there is a lot of FUD around. But cutting straight to the chase: it’s your responsibility to ensure that your data is protected. AWS alludes to this with their shared responsibility model, but other cloud providers bury this fact in their Ts & Cs. Gartner has a good whitepaper (behind a paywall, so not linked here) laying out that YOU are responsible for YOUR data, so you can’t presume that your cloud provider is protecting it for you.

With this in mind, it seemed like an opportune time to look at how we’re protecting our Office 365 assets. What I’d like to cover today is a brief exploration of Veeam Backup for Microsoft Office 365.

Getting up and Running

There’s little added value for an organization running its own email infrastructure, so there has been an explosion in usage of Office 365 over the past few years. The fine folks over at Veeam have recognized this reality and recently released v4 of the Veeam Backup for Microsoft Office 365 application. New options provide additional flexibility to cost-effectively store data in Azure Blob storage, in addition to performance enhancements and encryption options.

After downloading the installer, the first thing you might notice is the size of the download. It’s tiny! When you go through the installation and see that this is the sum total of the options available, it makes sense why the package is so small. About thirty seconds later you should have a pair of new applications available to you:

  • Veeam Backup for Office 365
  • Veeam Backup for Microsoft Office 365 PowerShell

Firing up the application for the first time, you’ll get a notice about installing a license. I have to say that as an SMB customer, I really appreciate what Veeam has been doing with sharing Community Editions. When you’re running a small shop, you still need to be able to protect your environment, and having access to a fully functional, enterprise-class product is extremely valuable. All of the home-labbers out there should also take note. If you’re really trying to test the solution out, you can get a free 30-day trial license for a fully featured test run of VBO. Before we move on to getting VBO configured, I’d like to point out that there is a PowerShell module available for VBO, so I’ll sprinkle a couple of those nuggets throughout this post… like how to install a license using Install-VBOLicense:
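Here’s a minimal sketch of that license install via the VBO PowerShell module (the module name and license path below are assumptions on my part, and parameters can vary between VBO versions, so check Get-Help Install-VBOLicense first):

```powershell
# Load the VBO module (installed alongside the product)
Import-Module Veeam.Archiver.PowerShell

# Install a license from a local file; the path is a placeholder
Install-VBOLicense -Path "C:\Licenses\veeam_vbo.lic"

# Confirm what's currently installed
Get-VBOLicense
```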


Getting Ready for our first backup

The first thing we need to do is add an organization that we’ll be backing up. The choices of organization available to us are: Microsoft Office 365, Hybrid and On-Premises. In this case I’m just going to be targeting a SharePoint instance that is fully in Office 365.

If you want to avoid the GUI you can also use the Add-VBOOrganization cmdlet.

Next you’ll have the opportunity to use the New-VBOOffice365ConnectionSettings cmdlet or the UI to configure your connection to the Office 365 instance. For the purposes of this blog, we’re just going to stick with Basic Authentication, but you should probably consider using Modern Authentication, as the MFA and enhanced security it provides are highly recommended. Regardless of which direction you go, please make sure to pay attention to the prerequisites guide, and particularly the section on permissions.
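For the PowerShell-inclined, a hedged sketch of those same two steps might look like this (the account is a placeholder, and the exact parameter names can differ between VBO versions, so verify with Get-Help before relying on it):

```powershell
# Build Basic Authentication connection settings for the tenant
# (Modern Authentication would use application credentials instead)
$cred = Get-Credential   # your Office 365 service account
$settings = New-VBOOffice365ConnectionSettings -Username $cred.UserName -Password $cred.Password

# Register the organization with VBO using those settings
Add-VBOOrganization -Office365ExchangeConnectionSettings $settings `
                    -Office365SharePointConnectionSettings $settings
```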

Backup Time!

That’s what we’re here for right? It’s backup time!

If you’re not into the UI, the Add-VBOJob cmdlet will get this done for you.

We’ve got an organization, so next we need a job. Click the “New Backup Job” button or right-click your newly added organization and select the option to “Add to backup job…” After the name and description, you have the option to choose what elements you’d like to back up on the “Select objects to back up” screen. You can get granular with users, groups, sites or organizations.

Again, to keep things simple, we’re just going to tackle the entire organization. If you’d like, on the next screen you can choose objects to exclude from the backup. After selecting the backup proxies and the backup repository, you finally have the option to schedule your awesome new Office 365 backups.
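If you’d rather skip the wizard entirely, a rough scripted equivalent might look like the sketch below (the tenant, repository and job names are hypothetical, and the schedule cmdlet’s parameters may differ by VBO version):

```powershell
# Look up the organization and a target repository
$org  = Get-VBOOrganization -Name "mytenant.onmicrosoft.com"   # placeholder tenant
$repo = Get-VBORepository -Name "Default Backup Repository"

# Define a simple daily schedule and create a job covering the whole organization
$schedule = New-VBOJobSchedulePolicy -Type Daily -DailyTime "22:00:00"
Add-VBOJob -Organization $org -Name "O365 Org Backup" -Repository $repo -SchedulePolicy $schedule
```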

That’s IT! Congratulations, in about 10 minutes you’ve managed to provide data protection to your Office 365 environment. I wish this post was longer, but the solution is simple and just works, so there you go! Easy backups of your cloud solution in a matter of minutes.

I wish you all a peaceful and joyous holiday season.

PS. For my fellow PowerShell fans, the guide for these modules is quite nice.

PPS. We come to you with a late-breaking update from my friend and fellow Vanguard Jim Jones! You can find a best practice guide for VBO at https://vbo.veeambp.com/. I’m not sure how I’ve missed this until now, but there you can find a whole host of guides on how to configure your backups according to best practices.

Automate the Auditors Away – Veeam Backup Report v2

Earlier this week I had a piece published over on the SolarWinds Thwack forums titled Start Automating Your Career, where I point out, tongue in cheek, that automation has reached a fever pitch over the last few years. My hope is that by sharing a couple of actionable tips, more people can take their first steps towards scripting and automating.

Given that I was publicly offering advice to people on how to automate their mundane tasks away, I thought that it was only fair if I took my own advice. Or in other words, I was going to eat my own dog food.

In my Thwack piece I offer up three general steps that you can take to get started automating:

  • Pick a Framework
  • Find a Task
  • Use the community

What you’ll find below is proof of my belief in this process, and how I leveraged these tips to automagic one of my problems away.

The Task and Framework

Audits are a fact of life in my role, and any audit requires a lot of document collection. Anytime you can make that collection process easier, the auditors can expect a more consistent result and you can expect to pull out less hair. Audits happen with some regularity and auditors look for consistent data, so opportunities abound for standardization and automation of the process. In this case the task is fairly obvious: create a tool to efficiently capture the data necessary for audits. More specifically, I need to get backup documentation for our auditors.

Luckily, we use Veeam Backup and Replication (VBR) for our backups. The fine folks over at Veeam have provided a PowerShell module by default with VBR installations for at least the last several major versions of the product. Recently I used this framework to create a simple script that I describe in Veeam backup report via PowerShell. The script creates a little gap report, but that’s about it. It was a fun task, but I got feedback from several folks that took the form of “But what about…” This time around I decided to see how easily I could build a more extensible script that could have more utility moving forward.

So I’ve identified my task: automate data collection. I’ve also got a framework to work with: the Veeam PowerShell module. So I guess all that’s left is to use the community…


A number of years back, I decided that I wanted to become more involved in the community, so I became a VMUG leader. Helping people and networking with like-minded individuals was intoxicating, and this past year I took a couple of opportunities to further engage, most recently with the Veeam Vanguards. I mention this group because it’s the Vanguards I reached out to for help with one item in this script.

I just couldn’t get a section of my report to work the way I wanted it to. I knew I was close, but this project was supposed to be fun, and beating your head on your own desk isn’t fun, so I asked for help. It’s not an easy thing to do, but making yourself vulnerable and opening yourself up often yields positive results.

Within minutes of posting my question in the Vanguard Slack, my friend Craig popped up and said, “Hey, I know someone else who’s having trouble with this too!” Several more minutes went by and, lo and behold, here’s Craig with a KB article to help me out. Ultimately, the KB didn’t provide the fix, but that’s immaterial, as the community and the Vanguards were there to help! The other great thing about the vCommunity is that there are often opportunities to pay it forward. In the same spirit as Craig sharing the KB, I took a few minutes out of my day to share the resolution on the Veeam forums, on the off chance that someone else needs a helping hand.

The Script

Since I followed my tips, I should have a script right?

I do! It’s a big one, so I’m going to break the script down into a few sections, and I’m including a link to a downloadable version at the bottom of this post.

The start

First and foremost, I want this script to be helpful and useful. I decided to make use of parameters to make the process easier to run from the command line. The use of parameters makes it easier for others to use, with some reasonable expectations of what’s going to happen. I also plan on using this myself, so I made it easy to just run by specifying defaults for many of the parameters. I’ll go another level deeper in a post on parameters very soon-ish.

I’ll also highlight that some of these parameters are aspirational. You’ll notice that some are commented out. That’s intentional, to highlight where I think this script could go next. To that end, if you try this script out and find it helpful, please let me know and I’ll continue development of it.

Some highlights

  • ReportType is mandatory, because that’s why we’re here.
  • Both OutputType and ReportType leverage Validation Sets to control input values.
  • The Parameter Set named VBRCred lumps related parameters together.
### Define input parameters
param(
    [Parameter(ParameterSetName='ReportType',Mandatory=$true)][ValidateSet('All','Backup','Copy','History','Gap')][string]$ReportType='All',
    [ValidateSet('HTML')][string]$OutputType='HTML',
    [Parameter(ParameterSetName='VBRCred',Mandatory=$false)][switch]$UseAlternateVBRCredentials
)


The End or is it the Beginning?

Most of the fun stuff is in the middle of this script, so let’s get the end out of the way first. It’s like eating your salad before getting to the main course. Because I make heavy use of functions, the main routine is simple, clean and readable. Declare a bunch of things I’ll use, make sure the environment is ready and then get to it! You’ll note that I don’t comment everything, but I try to provide comments around the theme of a given section.

The magic in the main routine is the switch statement. If you’ll recall, ReportType was a mandatory parameter. That’s because the operation of this script revolves around the data that we’re gathering. Everything else is simply a supporting character.

###Tasks for All: set the variables.

###Do the things, based on the parameter
switch ($ReportType){
    'All' {
        foreach ($job in $VBRjobs){
            $VBRjobsOverviewResults+=Get-BackupJobOverview -inJob $job
            $VBRjobsHistoryArray+=Get-VBRJobHistoryOverview -inJob $job
        }
        $GapResults=Get-GapReport
    }
}

###Make it Pretty. Oh so pretty
Build-Output

The Good Stuff!

I stated above that this project needed to be extensible. This script is broken up in chunks that you can run selectively. I also wanted the ability to add more functionality in the future, so putting all of the work in functions only makes sense. Here’s a breakdown of what each function does:

  • get-veeampluginstatus. The first function I wrote. How can you tell? I got sloppy with my capitalization. This entire script is predicated on using the VeeamPSSnapIn that’s part of the VBR install, so the obvious starting place is to verify that it’s installed and loaded. This and the connect-vbr function are just about getting ready to do work.
  • Get-BackupJobOverview. The first thing auditors want to know is an overview of what you’re doing with your backups. That’s what we’re doing here: creating a basic output for our friends. I really like using custom PowerShell objects, and you’ll see a few of them throughout this script. I have another post in the works on these nifty items, but it’s probably sufficient to point out that a custom PS object is created by using the New-Object cmdlet, and data is added to our custom object by using the Add-Member cmdlet. You’ll see that I use the same technique in multiple places, which should make for a more readable product. Another reason to use custom PS objects: I made this script for ME and my team’s needs. By using a custom object, it becomes very easy to swap other data elements in and out to fit your needs, without refactoring the entire script.
  • Get-GapReport is the same content from Veeam backup report via PowerShell, only put into a function, so no reason to cover it again here.
  • get-scheduleoverview, along with Get-BackupJobOverview and Get-VBRJobHistoryOverview, were where I had a lot of fun, and they are the most important parts of this script. In each instance I pass in a single Veeam backup job (CBackupJob) object. There are a ton of both properties (things that make up the object) and methods (things that you can do with the object), so in reality anything you can get out of the GUI, you can get out of the PowerShell objects. A couple of fun examples of how I put the VBR module to work:
    • In the Get-BackupJobOverview function I want to determine whether my backup job is a full backup or not. After tinkering around with my friend Get-Member, I realized that the Veeam backup job (CBackupJob) object is full of other objects, like the CDomBackupStorageOptions object, which contains… you guessed it, a property called EnableFullBackup. You can see how I drill down to the object on line 34 (also sampled immediately below):
        $JobHash | Add-Member -type NoteProperty -name FullBackup -Value $injob.BackupStorageOptions.EnableFullBackup


    • I also mentioned that there are a lot of methods made available to you from the VBR cmdlets. Honestly, most of the data is surfaced within the job object itself, but if you want to scratch a little deeper… I make use of a couple of methods in the Get-VBRJobHistoryOverview function. On lines 145 and 146 I use the GetBackupStats() and GetDetails() methods respectively. This is the data that I need, but there’s a ton more you can do to fit your needs. As you can see from the statistics on this one object (another nested object), there’s WAY more that you can get busy with.
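If you want to poke around these nested objects yourself, Get-Member is the tool for the job. A quick exploratory session (the job name below is a placeholder) might look like:

```powershell
# Grab a job and list everything it exposes
$job = Get-VBRJob -Name "Daily-Prod"        # hypothetical job name
$job | Get-Member -MemberType Property,Method

# Drill into a nested object for a single setting
$job.BackupStorageOptions.EnableFullBackup

# Call a method on the most recent session for that job
$last = Get-VBRBackupSession |
  Where-Object {$_.OrigJobName -eq $job.Name} |
  Sort-Object CreationTime -Descending |
  Select-Object -First 1
$last.GetBackupStats()
```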

The Dog Food

So what do you get out of this beautiful script? That’s what we’re here for, right? To see the proof in the pudding, errr, dog food. At the moment there are three primary reports being created, all using the ConvertTo-Html cmdlet to make them look pretty. If you want to explore how the output is built, check out the Build-Output function.

The Backup Overview report tries to boil the most basic key elements of your backup job down into one table.


Similarly, the History Overview report tries to distill the most recent history of a given job into a digestible format.


And lastly, the Gap Report pulls a list of all VMs from the vCenter target and compares the list against your various VBR jobs, so that at a glance you can see which VMs are protected by which jobs.


The End

That’s the script in a nutshell. There’s a lot more that I could dig into here, so be on the lookout for some additional PowerShell posts soon.

I hope that if you’re using Veeam Backup and Replication, you start putting its deep PowerShell cmdlets to use soon. There’s a lot of power you can and should be taking advantage of there.

Lastly, I hope this demonstrates that by choosing a task and diving in, you too can start automating your problems away.

Get your very own pretty dog food script here!

function get-veeampluginstatus{
  if(! $(Get-PSSnapin -Name VeeamPSSnapin -Registered -ea SilentlyContinue)){
    Write-Host 'This script requires the VeeamPSSnapIn to continue. Please install this and retry.'
    exit
  }elseif(! $(Get-PSSnapin -Name VeeamPSSnapin -ea SilentlyContinue)){
    Add-PSSnapin -Name VeeamPSSnapin
  }
}
function connect-vbr{
  $session=Get-VBRServerSession -ErrorAction SilentlyContinue -WarningAction SilentlyContinue
  if($session){
    Write-Host "You are already connected to Veeam Backup and Replication $($session.server). This process will continue using the existing session."
  }elseif($UseAlternateVBRCredentials){
    Connect-VBRServer -Server $VBRserver -Credential $VBRCredential
  }else{
    Connect-VBRServer -Server $VBRserver
  }
  if(! $(Get-VBRServerSession -ErrorAction SilentlyContinue -WarningAction SilentlyContinue)){
    Write-Host "We were unable to connect to $VBRserver. This script cannot proceed without a connection and will now exit."
    exit
  }
}
function Get-BackupJobOverview($inJob){
  $JobHash=new-object system.object
  $JobHash | Add-Member -type NoteProperty -name Name -value $injob.Name
  $JobHash | Add-Member -type NoteProperty -name Enabled -value $injob.IsScheduleEnabled
  $JobHash | Add-Member -type NoteProperty -name JobType -value $(if ($injob.IsBackup){"Backup"}Elseif($injob.IsBackupSync){"Copy"})
  $JobHash | Add-Member -type NoteProperty -name FullBackup -Value $injob.BackupStorageOptions.EnableFullBackup
  $JobHash | Add-Member -type NoteProperty -name Description -value $injob.Description
  $JobHash | Add-Member -type NoteProperty -name Schedule -value $(get-scheduleoverview -injob $injob )
  $JobHash | Add-Member -type NoteProperty -name VMs -value $($injob.GetObjectsInJob()| Select-Object -Property name -ExpandProperty name|out-string)
  $JobHash | Add-Member -type NoteProperty -name Target -value $injob.TargetDir
  $JobHash | Add-Member -type NoteProperty -name RetentionCycles -value $injob.BackupStorageOptions.RetainCycles

  return $JobHash
}

function Get-GapReport{
  ### v3 not ready for targeted clusters yet
  ### $targetclusters=@("cl1","cl2")
  $GapJobArray=@()

  ###check if existing vCenter connections match what was entered
  if($global:DefaultVIServers -and ! ($vcentersvr -in $global:DefaultVIServers.name)){
    write-host "You are not connected to the host specified in 'vCenterSvr'."
    write-host "Press 'Y' to continue and disconnect from other sessions. Any other key will end this script."
    $response = read-host "Continue?"
    if ( $response -ne "Y" ) { exit }
    Disconnect-VIServer * -Confirm:$false -Force
  }

  $null=Connect-VIServer $vcentersvr

  ### Get a hash table from Veeam of all Jobs and member servers
  foreach($Gapjob in Get-VBRJob){
    $GapJobHash=new-object system.object
    $GapVMs=$Gapjob.GetObjectsInJob() | Select-Object -Property name -ExpandProperty name
    $GapJobHash | Add-Member -type NoteProperty -name Name -value $Gapjob.Name
    $GapJobHash | Add-Member -type NoteProperty -name VMs -value $GapVMs
    $GapJobArray+=$GapJobHash
  }

  ###Get all VMs in the target clusters. Iterate through the hash table; if a job matches, mark the VM as protected
  $GapSummaryArray=@()

  Foreach ($GapVM in get-vm){
    $GapVMArray=new-object system.object
    $vname=$GapVM.name
    $GapVMArray|Add-Member -type NoteProperty -name VM -Value $vname

    for ($i=0; $i -lt $GapJobArray.count; $i++){
      if($GapJobArray[$i].VMs.Count -gt 0){
        if($GapJobArray[$i].VMs -contains $vname){
          $GapVMArray|Add-Member -type NoteProperty -Name $($GapJobArray[$i].name) -Value "enabled"
        }
        else{
          $GapVMArray|Add-Member -type NoteProperty -Name $($GapJobArray[$i].name) -Value "-"
        }
      }
    }
    $GapSummaryArray+=$GapVMArray
  }
  return $GapSummaryArray
}

function get-scheduleoverview($injob){
  $Sched=$injob.ScheduleOptions
  $ScheduleOverview=""

  if($Sched.OptionsDaily.enabled -eq $true){
    $ScheduleOverview="Daily; " + $Sched.OptionsDaily.DaysSrv + "; " + $Sched.OptionsDaily.TimeLocal.TimeofDay
  }
  elseif($Sched.OptionsMonthly.enabled -eq $true){
    $ScheduleOverview="Monthly; " + $Sched.OptionsMonthly.DayNumberInMonth.ToString() + " "
    if(! ($Sched.OptionsMonthly.Months.Count -eq 12)){
      $ScheduleOverview+=$($Sched.OptionsMonthly.Months|Out-String) + "; "
    }
    if($Sched.OptionsMonthly.DayNumberInMonth -eq "OnDay"){
      $ScheduleOverview+=$Sched.OptionsMonthly.DayOfMonth.ToString() + "; "
    }
    else{
      $ScheduleOverview+=$Sched.OptionsMonthly.DayOfWeek.ToString() + "; "
    }
  }
  elseif($Sched.OptionsPeriodically.Enabled -eq $true){
    $ScheduleOverview="Periodically; Period " + $Sched.OptionsPeriodically.FullPeriod + " minutes; "
  }
  elseif($Sched.OptionsContinuous.Enabled -eq $true){
    $ScheduleOverview="Continuous; ; "
  }
  return $ScheduleOverview
}

function Get-VBRJobHistoryOverview($injob){
  $History=Get-VBRBackupSession | Where-Object {$_.OrigJobName -eq $injob.Name}
  $History=$History | Sort-Object -Property CreationTime -Descending

  $HistoryHash=new-object system.object
  $HistoryHash | Add-Member -type NoteProperty -name Name -value $injob.Name
  if ($History){
    $HistoryHash | Add-Member -type NoteProperty -name LastResult -value $History[0].Result
    $HistoryHash | Add-Member -type NoteProperty -name StartTime -value $History[0].CreationTime
    $HistoryHash | Add-Member -type NoteProperty -name EndTime -value $History[0].EndTime
    $HistoryHash | Add-Member -type NoteProperty -name BackupSize -value $($History[0].GetBackupStats()).BackupSize
    $HistoryHash | Add-Member -type NoteProperty -name Details -value $($History[0].GetDetails())

    $lastfive=@()
    For ($i=0; $i -lt 5 -and $i -lt $History.Count; $i++){ $lastfive+=$History[$i].Result }
    $HistoryHash | Add-Member -type NoteProperty -name LastFive -value $($lastfive|Out-String)
  }
  else{
    $HistoryHash | Add-Member -type NoteProperty -name Details -value "No History Found"
  }
  return $HistoryHash
}
Function Build-Output{
  $OutputFile="VeeamBackupOverview_$(get-date -Format HHmm_ddMMMyy)."
  switch ($OutputType){
    'HTML'{
      $OutputFile+="html"
      $Header=@"
<style>
table { font-family: "Trebuchet MS", Arial, Helvetica, sans-serif; border-collapse: collapse; width: 100%; }
th { padding-top: 12px; padding-bottom: 12px; text-align: left; background-color: #4CAF50; color: white; }
td { border-width: 1px; padding: 3px; border-style: solid; border-color: black; }
</style>
"@    ## Must remain left aligned, no whitespace allowed before string terminator
      $OverviewFrag=$VBRjobsOverviewResults | ConvertTo-Html -As Table -Fragment -PreContent '<h2>Overview Report</h2>'
      $HistoryFrag=$VBRjobsHistoryArray | ConvertTo-Html -As Table -Fragment -PreContent '<h2>History Overview Report</h2>'
      $GapFrag=$GapResults | ConvertTo-Html -As Table -Fragment -PreContent '<h2>Gap Report</h2>'
      ConvertTo-Html -Head $Header -PostContent $OverviewFrag,$HistoryFrag,$GapFrag | Out-File $OutputFile
    }
  }
}

A Letter to the Veeam Vanguards

In 2000 one of my closest friends was getting married, so he and I went to NYC for a boys’ weekend. While we were there, we had a party, and I made many friends for life and even met my future wife. The minute I got home from that trip, I sat down at my desk and wrote a letter to all my new friends. Unfortunately, that letter was lost or otherwise destroyed (I’m looking at you, Eric), and with the exception of notes to my son, rarely have I felt compelled to write a similar letter since. Until today.

I’ve been home from my first Veeam Vanguard Summit for about 24 hours now. I had to wait to pen this for two reasons. First, if I penned it immediately when I got home, I may end up married to a number of Vanguards and that’s not right or legal. Secondly, without 12 hours of sleep it would have read “Veeam Vangoiui7ioe79etgjl o87rnngdufi…”

I was pretty excited when I received my email in February acknowledging that I’d been accepted into the Veeam Vanguard program. When we first got together, I was struck by the lack of pressure. There was little expectation of promoting the brand or pressure to produce content. The message that I heard was, ‘we want to share with you, and in return we’d like your frank feedback’. To be honest, at first I was skeptical and thought there had to be some BS in there somewhere. I was wrong.

The year continued with periodic calls and webinars, some cool opportunities and early access to info. The good, fun chatter on Slack should have tuned me in a little more as to what was to come.

For the Veeam Vanguard Summit, we went to Prague. Now, if I stopped here and just shared my thanks, it would have been an experience of a lifetime. Before I move on though, just a few words about Prague. I don't know if I've ever been to a more magical city. The architecture is beyond amazing, the people welcoming, the food delicious and the beer plentiful. Anyone considering going should stop considering and just start saving their pennies. Prague should be at the top of most bucket lists and I truly hope that I get to visit again in the future.

But we weren't there to sightsee; this was, after all, a tech program. I expect to share thoughts about the content specifics from the Summit in the future, so I won't spend time diving into that here. What I will share about this event is that I've never seen such candor from a vendor. Any vendor will tell you about what's awesome in their products, and there was definitely that, but the folks at Veeam also told us where things still need work. We got to see behind the curtain on where things are going. That's pretty cool and again, if we stopped there, I'd still be appreciative. What really made the product discussions special, though, is that our feedback was actively solicited and I know for a fact that feedback will make its way back into the product roadmap and development. Seriously, when can you, as a user, sit down with the head of development for a billion-dollar company, tell them your thoughts and concerns, then watch them hand that feedback directly to their teams? I now know the answer.

A most thoughtful gift, to cap a most amazing trip

For experiences like this, ranking and coming up with what's "the best" is an exercise in futility. Still, the thing that makes this program special, and what was the most warming and touching part of this experience, was the people. I've been involved in other programs, information exchanges, educational programs and the like for years. Only once before (luckily earlier this year, I'm blessed) have I ever experienced such camaraderie. From the Veeam-ers who made the event happen, to the Vanguards who participated, I'm so very grateful to have met you. As I told one of the SVPs at our amazing final event at Staropramen Brewery, I've never felt so welcomed into a community and, God willing, I'll get to pay that forward in the future.

To Anton, Rick, Cade, Spiteri and the rest of the technical teams, the content was amazing. Thank you for letting us share our feedback on it. Bring on v10!

To the rest of the Veeam team, Aubrey, Rin, Chelsea, Kirsten, and anyone I may have forgotten, thank you for your support. An extra big thank you to Nikola; what an amazing event you helped put together. I really look forward to working with you again in the future.

Finally, to my fellow Vanguards, thank you for the camaraderie, friendship, laughs and welcoming.

To all of you, it was an amazing adventure. Until I see you next, Cheers!

PS. While they weren't there, I'd be remiss if I didn't thank my boss and employer. If I didn't work for such a fantastic person and organization, I wouldn't be able to have experiences like this where I can learn, share and re-energize. You're the best!

Hot Take from TFD19 – RPA with a Security First mindset

As someone who’s long been an advocate of automating the things, I got really excited when Automation Anywhere was announced as a presenter for Tech Field Day 19. For me it’s a great opportunity to learn about a rapidly growing market space, Robotic Process Automation (RPA). It’s also very timely for me professionally as the financial sector is a prime candidate for RPA and many conversations are being had around the efficiencies that can be derived from it.

That being said, automation isn't necessarily a unicorn. What if you write a bad process? What if you're a bad person? What do you tell the auditors? If you don't at least think about these questions, you may find yourself automating your way to the unemployment line.

Just to set one more piece of context, I've been reading a lot lately about building a security program and the basics of a successful one. One idea keeps popping up time and time again: everyone should have a security mindset, including your developers. However, for many people security is an afterthought. So it was refreshing when, early in the presentation, this lovely little slide popped up. The details are important, but more relevant is the fact that the folks at Automation Anywhere recognized that their platform is powerful (insert necessary Spider-Man quote here) and therefore security has to be considered from the outset.

Now from my view, these items should be table stakes for any software in 2019. The reality is that, across the majority of the software industry, features and functionality are prioritized over data security. I'm honestly sitting here thinking through the various software presentations I've seen over the years that treated security as a central premise rather than an afterthought, and I'm coming up empty. Given that RPA has the potential to be a giant attack vector for the bad guys, solace should be found in the fact that Automation Anywhere takes its responsibility to provide a secure solution seriously.

This security-first mindset was further demonstrated when five minutes were devoted to how you should properly promote code from dev to QA to prod. Taken alongside the fact that you can audit every action taken by every bot, under every user account context, the approach to security from the folks at Automation Anywhere is quite refreshing.

Many questions still exist for me, such as 'how do you ensure resilience for your RPA solution?', 'how easy are all of these controls to leverage?' and 'can they be used at scale?' Never mind that you always need to do your due diligence for any platform that you introduce into your environment. All that being said, I'm looking forward to getting my hands on the free community edition of Automation Anywhere's RPA product suite.

Disclaimer: As a Tech Field Day delegate, my accommodations and travel were paid for. The words and thoughts expressed herein are mine alone. I have not been compensated for this post.

Veeam backup report via PowerShell

Here's the fun thing about audits. That's right, I said 'fun' and 'audit' in the same breath. The great thing about audits is that they are tedious and repetitive…

Which makes them great candidates for automating!

I have a task in front of me to document our backups. Thankfully we use Veeam, which means I get to PowerShell this bad boy! It’s an audit, so I could make this super simple and just pipe the output from Get-VBRJob to a CSV and call it a day. The problem with that approach is that it doesn’t provide any additional utility beyond the audit.

What would be useful, though, is if I could take all the servers in my target group, compare them against all of the jobs from Veeam and output a pretty little CSV where you could tell at a glance where everything was. Here's what I came up with:

function Get-VeeamPluginStatus{
  if( ! $(Get-PSSnapin -Name VeeamPSSnapIn -Registered -ea SilentlyContinue) ){
    Write-Host "This script requires the VeeamPSSnapIn to continue. Please install this and retry."
  }
  elseif( ! $(Get-PSSnapin -Name VeeamPSSnapIn -ea SilentlyContinue) ){
    Add-PSSnapin -Name VeeamPSSnapIn
  }
}
Get-VeeamPluginStatus

$JobArray = @()

Connect-VIServer $vcentersvr
Connect-VBRServer -Server $veeamsvr -Credential $cred

### Get an array from Veeam of all jobs and their member servers
foreach($job in Get-VBRJob){
  $JobHash = New-Object System.Object
  $vms = $job.GetObjectsInJob() | Select-Object -ExpandProperty Name
  $JobHash | Add-Member -Type NoteProperty -Name Name -Value $job.Name
  $JobHash | Add-Member -Type NoteProperty -Name VMs -Value $vms
  $JobArray += $JobHash
}

### Get all VMs in the target clusters. Iterate through the job array and record a match or a dash for each job
$SummaryArray = @()
foreach($target in $targetclusters){
  foreach($VM in $(Get-Cluster $target | Get-VM)){
    $VMArray = New-Object System.Object
    $vname = $(Get-VM $VM).Name
    $VMArray | Add-Member -Type NoteProperty -Name VM -Value $vname

    for($i=0; $i -lt $JobArray.Count; $i++){
      if($JobArray[$i].VMs.Count -gt 0){
        if($JobArray[$i].VMs -contains $vname){
          $VMArray | Add-Member -Type NoteProperty -Name $($JobArray[$i].Name) -Value "enabled"
        }
        else{
          $VMArray | Add-Member -Type NoteProperty -Name $($JobArray[$i].Name) -Value "-"
        }
      }
    }
    $SummaryArray += $VMArray
  }
}

$SummaryArray | Export-Csv "veeam_jobs_$(Get-Date -Format dd_MM_yyyy).csv" -NoTypeInformation
In its current form, this report is meant to be run in an ad-hoc fashion, so the opening section of the script is really just setting the scene.
One thing of note: I chose to target clusters in my script. This could very easily be altered to target any container object in vSphere by changing the $targetclusters variable and the Get-Cluster call.
When building reports, I'm a fan of using custom objects; it's just how I roll. Plus, it's readable and easy to consume. There's a really good explanation of custom objects and how to use them here. In this case I build a custom object, $JobHash, to hold the couple of bits of info about each job. I'll come back to this in a few.

Each job object that's returned by Get-VBRJob has an associated method, GetObjectsInJob(). This method tells us which VMs are in the job. Since the VMs are returned as objects, I'm just selecting the name for use in the report.

Then the Add-Member cmdlet is used to add the job name and VM names to $JobHash. Finally, each individual job is added into $JobArray before moving on.
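As an aside, newer PowerShell versions offer a terser way to build the same kind of record. This is a hedged alternative sketch, not what the script above uses; `$job` stands in for one object returned by Get-VBRJob:

```powershell
# Alternative sketch: the [PSCustomObject] accelerator (PowerShell 3.0+)
# builds the same job record without repeated Add-Member calls.
$JobHash = [PSCustomObject]@{
    Name = $job.Name
    VMs  = $job.GetObjectsInJob() | Select-Object -ExpandProperty Name
}
```

It's functionally equivalent, and generally both faster and easier to read than chained Add-Member calls.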

The general premise of the next section is to take the previously built array of jobs, conveniently named $JobArray, compare it to the list of VMs from our $targetclusters and build our output.
If you remember, I said at the beginning that I want a visual report of all VMs and their job status. So regardless of job status, every VM gets an element in $VMArray courtesy of our friend Add-Member.
For each VM returned from the target clusters, we iterate through $JobArray and, based on whether or not a match is made, an entry is made for that VM-job combo in $VMArray, which is concatenated onto $SummaryArray before moving on to the next VM.
And we're done! Pump out $SummaryArray to a datestamped CSV file.

And here's what we get for output. It's a very simple report, but it hits exactly the mark I was aiming for: a way to see at a glance what is configured where. There are also several places where the code could be made more efficient, but this is one case where the destination matters far more than the journey.

But, Wait! There’s More! Toying with PowerShell write speed.

Why hello old friend. It’s been a while. Sometimes life happens…

… and then you remember you actually still have a blog!

I have a random task in front of me at work where I have to do a lot of text manipulation. Obviously I'm going to PowerShell the crap out of this, but as I start framing out the script I realize I'm going to be writing a lot of strings. There are no speed requirements for this operation, but because I'm weird like that, I wanted to figure out the fastest way to write a whole bunch of data to a file.

Here’s the wee little script I wrote to play around:


$charSet = ('abcdefghijklmnopqrstuvwxyz0123456789' * 12).ToCharArray()  # character pool (example value)
$StringLength = 400
$FileLength = 10000
$arrTestFile = 'arrtest.txt'    # output paths (example values)
$lineTestFile = 'linetest.txt'
$arrDataToWrite = @()

#test one. Build array, output array in one shot.
Measure-Command {
  Write-Host "Test: create array of characters and output entire array"
  for($i=0; $i -le $FileLength; $i++){
    $arrDataToWrite += $(-join $($charSet | Get-Random -Count $StringLength))
  }
  $arrDataToWrite | Out-File $arrTestFile
}

#test two. Iterate through, outputting as we go.
Measure-Command {
  Write-Host "Test2: create characters and output inline"
  for($line=0; $line -le $FileLength; $line++){
    $(-join $($charSet | Get-Random -Count $StringLength)) | Out-File $lineTestFile -Append
  }
}

Methodology was pretty simple.

  • First test: build one big array full of strings of random characters. After building the array, output the whole array to file
  • Second test: build strings of random characters and use Out-File to append the strings as we loop

The results, as you can see, are pretty clear. After several test runs of varying sizes, appending strings to the file in-line is pretty consistently ~3x slower than the array method. This makes sense when you think about the operations. In test two, every time I write to the file I have to open the file, write the data and close the file. If I'm counting right, that's 3x the number of file operations… and that loop takes 3x longer… huh…

Obviously it’s an over-simplification, but there are vastly more activities taking place in test two, hence the much longer execution time. Case closed, send this to print!

However, in the famous words of Ron Popeil: But wait! There's more!

I typically do a little digging around before posting thoughts. Call it pragmatism, call it worrying, call it imposter syndrome, but the reality is I just don't want to stick my foot in my mouth. So I'm reading a few articles when I come across one titled Slow Code: Top 5 Ways to Make Your PowerShell Scripts Run Faster from Ashley McGlone.

Sure enough, halfway down this informative article is a heading that reads "appending to arrays". Well, that's exactly what we're working on here! So I gave this technique a shot and, voila, my fastest run just got 50% faster! Here's the code from test one, further optimized:

#But wait, there's more!
#test three: Build the dataset within the loop and assign it to the array in one shot
$arrTestFile2 = 'arrtest2.txt'   # output path (example value)
Measure-Command {
  Write-Host "Test3: assign all data to array in one shot and output entire array"
  $arrDataToWrite_Fast = for($j=0; $j -le $FileLength; $j++){
    $(-join $($charSet | Get-Random -Count $StringLength))
  }
  $arrDataToWrite_Fast | Out-File $arrTestFile2
}

As Mr. McGlone states in the linked article, the For loop runs and stores its output in memory. By assigning that For loop's output to the array in one shot, you incur only one expensive array operation rather than 'N' array operations.
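If you really wanted to rescue the append-as-you-go approach, the repeated open/write/close cycle can also be avoided by holding the file open yourself. Here's a hedged sketch using the .NET System.IO.StreamWriter class; I didn't benchmark this as part of the tests above, and it assumes $charSet, $StringLength and $FileLength are defined as in the earlier script:

```powershell
# Sketch: keep one StreamWriter open for the whole loop so each write
# avoids Out-File's per-call open/write/close cycle.
$writer = [System.IO.StreamWriter]::new("$PWD\streamtest.txt")
try {
    for ($k = 0; $k -le $FileLength; $k++) {
        # WriteLine appends one string to the already-open file handle
        $writer.WriteLine((-join ($charSet | Get-Random -Count $StringLength)))
    }
}
finally {
    $writer.Close()   # always release the file handle
}
```

The try/finally wrapper matters here: if the loop throws, the file handle still gets released.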

As I stated at the outset, there’s no real reason behind this activity other than learning and having fun. To that end, I hope you keep learning, keep scripting and keep having fun.

Oh and Ashley, if you should by chance come across this, thanks for your article and for helping me learn something new!

A Tale of Two Architects – A Review

I just finished two great books on architecture that I really need to share with all of you. These two books were written for very different purposes and with very different voices, but I found them both to be enjoyable reads with educational content.

VDI Design Guide: A comprehensive guide to help you design VMware Horizon, based on modern standards

Johan van Amersfoort

The title doesn't exactly roll off the tongue, but honestly that might be my biggest issue with this book…

My team has a small but challenging VDI environment, so I placed an order the day this book was released and immediately tore into that lovely Amazon box when it arrived. It's been a running joke in the infrastructure and virtualization worlds for some time now that "this is the year of VDI!" Perhaps the fact that this prognostication hasn't come to pass is due to the complexity of running a well-functioning VDI environment. Seriously, just think about all the various components that make up a VDI environment: your core infrastructure (hypervisor/SAN/network/security/compute), the client, the delivery model, never mind what's actually in the guest! Johan does a really great job of walking through all of these components, and so much more, in his first book.

The style of the book emulates the VCDX design methodology. I am not, nor ever will be, a VCDX but I found his explanation of the methods to be much more engaging than in other tomes. What I mean by this is that there are some architecture books out there that are extremely dogmatic and really are just guides towards attaining a certification. Johan on the other hand does such a nice job walking the reader through the various design and architecture phases that I’d strongly consider giving this book to any burgeoning architect, whether they cared about VDI or not.

Now don't let me fool you into thinking this is all method and no meat, because that would be a tragedy. Like I mentioned at the outset, my team has some VDI challenges, and with the author's thorough and detailed dissection of all the various components of a VDI infrastructure, we had immediate technical takeaways. Johan walks through all the components that make up a VDI environment, providing his recommendations for why you may want to go in a specific direction, and just as importantly why you may not!

I've been told on a couple of occasions that I have a unique voice when I write. Given that, I have to say I thoroughly enjoyed Mr. van Amersfoort's voice throughout this book. As was pointed out during his recent visit to the Datanauts podcast, reading this book is like sitting down with a colleague and chatting through a technical issue. It truly made this one of the most fun technical reads I can remember.

All in all, if you're interested in VDI or general architecture principles, do yourself a favor and pick up Johan van Amersfoort's first book.

You can find Johan at @vdidesignguide & @vhojan

IT Architect Series: The Journey

Melissa Palmer

I picked up this book solely because I've enjoyed Melissa's blog (https://vmiss.net) for some time. In the review above, I alluded to having read some bad architecture books, so I intentionally went at this book with no expectations. I have to come right out of the gate and say that this was one of the most interesting technology books I've read, in that it talks about technology very little. The subtitle for this book reads "A guidebook for anyone interested in IT architecture", and a guidebook is really what Melissa gave us.

The premise of this book is to help anyone interested in technology, or a burgeoning IT practitioner, understand just what an architect is and what it takes to become one. I can speak for no one but myself and my observations over the past 20 or so years in IT, but it seems that many systems architects just kind of eventually land in the role. They get good in one area, and maybe good in another, and after some time they end up being the smartest gal/guy in the room. This is not a book about that kind of accidental career path, and I love it! In writing this book, Melissa provides a path, one that worked for her en route to VCDX, on how to take a more active approach to becoming a solution provider. A sampling of the topics covered includes "Learning New Skills", "Infrastructure Areas of Expertise" and "Architectural Building Blocks". The format is more about the journey than a prescriptive roadmap. In fact, throughout the book, the reader is encouraged to take a step back and see how the information shared fits within their role and worldview.

While I really enjoyed the approach and Melissa’s voice, my knock on this read is that it could use a copy edit. If you are someone who has ever joined in on the “On Premises” debate, please approach this book knowing that there is some small amount of errata present. As a wanton comma abuser, I’m certainly not throwing stones and I hope this doesn’t stop you from picking up the book; the content contained within absolutely makes up for any grammatical oopsies.

The primary content of the book clocks in at just under 200 pages. If you already are, or aspire to be, an architect, you are going to read technical guides that are way longer than this! Just like with her blog, Melissa's personality carries through this book. It's obvious that a passionate person wrote this piece in an effort to help others, all while maintaining a sense of self. A perfect example: when discussing assumptions toward the end of the book, Melissa creates an analogy in which she uses the word 'chicken' ten times in a paragraph. I literally laughed out loud, to which my wife responded, "Is your geek book amusing, dear?" Yes, yes it is.

Many IT practitioners discount some of the "softer" skills required in a business environment. It's in this vein where I think the book really shines. If you are someone who has a hard time communicating in either written or verbal form, you are probably going to have a hard time obtaining an architect-level role. Melissa spends a significant portion of the book emphasizing what these skills actually are, why you need them and tips on how to improve them. I'm thinking about getting a couple more copies of the book for some folks who could really use some self-reflection in this area…

Obviously anyone with aspirations of reaching an architect level would benefit from picking up this guide. If I were a college professor teaching folks what it was like to work in IT and wanted to give them a broad perspective, I'd have them read this book. As someone who's worked in an architectural role, I learned a number of things as well, meaning even seasoned IT pros can benefit from picking it up. Reading this book over the past few days, it became obvious that Melissa cares about people and the solutions they provide, so by that token perhaps we could all benefit from the reflective approach conveyed throughout this book.

You can find Melissa at @vmiss33 & @ITArchJourney

VMworld 2018 – FOMO? Never fear!


In just a few days, friends, colleagues, teachers, luminaries and thought leaders will converge on Las Vegas for the biggest and best virtualization conference in the world. If you're in the same shoes as me, VMworld 2018 just isn't in the cards. Hearing that Tony Hawk, Run DMC, The Roots and Snoop would be part of it had me a bit bummed. However, it was when I heard that Malala would be participating in the general sessions that I turned that attitude around.

It was then that I realized there is still a wealth of ways to experience VMworld, even when you’re 2,638 miles away from Las Vegas, not that I’m counting or anything.

General Sessions

Like I alluded to above, it was seeing that Malala would be participating in the general sessions that helped turn my attitude around. The reason for this is that VMware makes an effort to broadcast the General Sessions live.

If you haven’t been to a major conference, these sessions are the reason why a lot of people refer to conferences as a “show”. It’s time for the heavy hitters, for the big production and for news to drop. The general sessions that I’ve attended tend to follow a pattern:

Day 1 – State of the Union. Highlight the successes, the broad industry trends and how the company is positioned to respond to, or better yet, lead those trends.
Day 2-N – Thought leaders. Talk about growth and what the future holds. Not everything that you see at a tech conference will become reality. I feel like these are the days where you see organizations testing the water to see how ideas and roadmaps feel among the various stakeholders.
Last day – Honestly, these are my favorite sessions. The show's almost over, some folks have already left town and the people who remain are likely kind of burnt out. VMworld always saves something cool for those brave and/or hardy folks who are left standing on the last day.

Now, unfortunately, that final cool session is only for attendees. It's probably a good reason to start working on your budget justification to attend next year… For the Monday and Tuesday sessions, however, you'll want to set a calendar reminder to tune in at 9:00 AM PT for the general sessions live on VMworld.com.

vBrownBag Tech Talks

The vBrownBag talks are one of my favorite parts of VMworld. If you’re reading this blog, you already know about the crew, but if by some chance you don’t know… vBrownBag is a community of passionate people who want to share and facilitate sharing within the IT Infrastructure community.

Getting my feet wet at my first #vBrownBag session

The other cool part about vBrownBag is that they produce Tech Talks. These are short community sessions ranging from just a few minutes up to a half hour in length. You can check out my 2017 sessions on life as an SMB in a big enterprise world or PowerCLI for examples. (Go easy on me, I was nervous about my other sessions.) The whole point of vBrownBag is sharing, and the very cool people who produce the Tech Talks do a damn fine job of it. If you want to follow along live, you can check out the action on vbrownbag.com, or if you're unable to participate live, all sessions are posted to the vBrownBag YouTube channel, usually within an hour or so.

Community members coming together to share with each other. For everyone involved it’s a labor of love and how can you beat that?

VMware {code} Power Sessions

I am super excited about this new offering! And maybe a touch bummed that I won't be participating… But just because I won't be presenting doesn't mean that I won't be following along. Similar to what the vBrownBag folks are doing, the VMware community team will be hosting expert-led presentations from community members, but with a focus on DevOps and developers. All the action will be live streamed via the {code} Facebook page. You can check out the entire line-up by searching for CODE sessions in the content catalog.


Since we're talking about community, let's not forget about VMTN. The VMTN page is always a hotbed of activity during VMworld. I'm not sure why it's a secret, but nevertheless it is kind of the secret sauce to staying in the know during the show. If you want a place to participate in contests, watch live streams and chime in with all of your community friends, then you'll want to head over to the VMTN page.


Holy crap! How can I forget the bloggers! While writing a blog post! Shame!

In my mind, the blogosphere (is that still a term?) is the lifeblood of our vCommunity. It's where passionate people go to talk about the things that matter most; where we share our successes, our trials and all of the cool things we learn about! What better place to do that than at VMworld? VMware has a really strong blog presence that's only gotten stronger over the past year or two. I'm obviously partial to the PowerCLI blog, but next week I'll be keeping an eye on the official VMworld Blog. If you can't make the general sessions, this is where the news will drop. I'll just leave that there…


Beyond the official blog, there are dozens of people blogging about nearly everything that happens at the show. As someone who’s live blogged a general session, I can tell you that the only reason someone would blog from a show is to share with others. Here’s a good place to start if you’re looking for some of the fine folks who’ll be attempting to document everything that happens in Vegas next week. Well, not everything…

Beam me up … me

Sorry (not sorry), horrible dad joke there.

Did you know you can drive a robot at VMworld? Seriously. Ok, not a robot, but a BEAM. Don’t know what a BEAM is? It’s basically FaceTime mounted on a remote control car. You can register to drive one of these super awesome RC devices around the VMTN space. How awesome is that?!?


I’m sure that there are more ways to experience VMworld if you’re not there, but honestly I’m tired just writing this, let alone trying to sample all of the above options. No matter which way you go, there is definitely no fear about being able to make the most of VMworld from afar.

Start PowerShell’ing your Backups

VeeamOn was such a great and educational show. For my small part of it, I wanted to share how you can automate the deployment and management of your infrastructure. Why would you want to automate? Seriously, did I really just ask that… Speed and standardization/predictability are the primary drivers for scripting via PowerShell. Or awesomeness. Yup, we'll stick with the fact that PowerShell is full of awesomeness.

I started my VeeamOn presentation with an example of just how awesome you can become with your PowerShell scripts. In the first part of the video you see how long it takes a person to manually configure a job. Keep in mind this is someone who pretends to know what they are doing, and still errors happen. The second example shows just how long it takes to implement the exact same job via script. So let's take a moment to parse the whys of PowerShell:

  • Speed. When I made this video, I had practiced each click. It still took 2 minutes and 45 seconds to create a backup job. Conversely, the script took 30 seconds to complete. This is for one machine. Now think about setting this up for 100 machines… manually you're looking at roughly four and a half hours; via script, 50 minutes. Yes, this is dirty math, as there are many factors that go into the equation, but you can't argue with the fact that scripting is more efficient, even factoring in the ~20 minutes I spent writing the script.
  • Standardization. I've worked with Veeam B&R for a number of years now and, as I mentioned above, I practiced the workflow to try to take any bias out of the equation. Still, I made an oops. We're human after all. As you'll see in the code below, by standardizing you can remove much of the human variable (bad pun intended) and produce a more consistent output.


Without further ado, here's the code and a description of what's going on.

First, if the Veeam PSSnapIn isn't loaded, we go ahead and load it. You're using ISE, right? So you probably want it in your editor session, as I talked about here.
Next comes the SetVariables function. When you stop and think about it, backup jobs are complex and have lots of options. This is an uber-simple example and it still takes over a dozen lines of configuration. By moving this off into a function you make your code both more readable and repeatable. Thankfully, the developers have used very intuitive naming conventions for the job options!
Then we connect to the Backup & Replication server. Full disclosure: I set the $cred variable in a previous script and got lazy. Sorry!
Finally, we get to the action. It's exciting! But it's also not: we set most of the options already, so at this point all we need to do is execute the various VBR cmdlets.

  • Add-VBRViBackupJob
    Short and simple, create the backup job… Or is it????  We create a job, but we also leverage:
    Find-VBRViEntity: Find the VMware entity to backup that we specified in our variables above.
    Get-VBRBackupRepository: Fairly self-explanatory, find the backup repository that the job will use.
  • Set-VBRJobOptions
    There is a flow to objects created via the VBR PowerShell cmdlets: create the object, then set options on the object. This is the latter. These are the options configured in our SetVariables function.
  • Set-VBRJobAdvancedBackupOptions
    Do I need to explain what this does?
  • Set-VBRJobSchedule
    Again, pretty self-explanatory right?
  • Add-VBRViBackupCopyJob
    Remember when I screwed up in the video above? Why do you have to bring up those painful memories??? Anyway, moving on: this one creates a new copy job based on the previously set variables.

Each of the above cmdlets has a significant number of options to fit your environment. I'd encourage you to peruse the Veeam PowerShell Reference guide for additional options.

Here’s the code. We’ll cover more advanced workflows in our next post in this series. Stay tuned!

#VeeamOn Simple Backup Job Demo
if( ! $(Get-PSSnapin -Name VeeamPSSnapIn -ea SilentlyContinue) ){
  Add-PSSnapin VeeamPSSnapIn
}

function SetVariables{
  $Global:PWord = ConvertTo-SecureString -String "VMware1!" -AsPlainText -Force
  $Global:cred = New-Object -TypeName "System.Management.Automation.PSCredential" -ArgumentList "lab\Administrator", $PWord

  #environment-specific names (example values; adjust for your lab)
  $Global:VBRserver = "veeam.lab.local"
  $Global:VBRBackupName = "Demo-Backup"
  $Global:VBRBackupCopyName = "Demo-BackupCopy"
  $Global:VBRBackupEntity = "Demo-Tag"
  $Global:VBRRepositoryName = "Default Backup Repository"
  $Global:VBRReplRepositoryName = "Copy Repository"

  #set JobOptions
  $Global:VBRJobOptions = Get-VBRJobOptions -Job $VBRBackupName
  $VBRJobOptions.JobOptions.RunManually = $false
  $VBRJobOptions.BackupStorageOptions.RetainCycles = 3
  $VBRJobOptions.BackupStorageOptions.RetainDays = 7
  $VBRJobOptions.BackupStorageOptions.EnableDeletedVmDataRetention = $true
  $VBRJobOptions.BackupStorageOptions.CompressionLevel = 6
  $VBRJobOptions.NotificationOptions.SendEmailNotification2AdditionalAddresses = $true
  $VBRJobOptions.NotificationOptions.EmailNotificationAdditionalAddresses = "test@test.com"
}

#Call SetVariables function
SetVariables

if( ! $(Get-VBRServer -Name $VBRserver) ){ Connect-VBRServer -Credential $cred -Server $VBRserver }
sleep 5

$null = Add-VBRViBackupJob -Name $VBRBackupName -Entity $(Find-VBRViEntity -Tags -Name $VBRBackupEntity) -BackupRepository $(Get-VBRBackupRepository -Name $VBRRepositoryName)

$null = Set-VBRJobOptions -Job $VBRBackupName -Options $VBRJobOptions

$null = Set-VBRJobAdvancedBackupOptions -Job $VBRBackupName -EnableFullBackup $true -FullBackupDays Friday -FullBackupScheduleKind Daily

$null = Set-VBRJobSchedule -Job $VBRBackupName -DailyKind WeekDays -At 01:00

$null = Add-VBRViBackupCopyJob -DirectOperation -Name $VBRBackupCopyName -Repository $(Get-VBRBackupRepository -Name $VBRReplRepositoryName) -Entity $(Find-VBRViEntity -Tags -Name $VBRBackupEntity)