Script to get the binary differential replication status of all SCCM packages

Problem

An SCCM environment I was looking at had a few hundred application packages. I needed to find out which had "Binary Differential Replication" enabled, to get to the bottom of some bandwidth issues that didn't add up.

This option is set under "Data Source".

screenshot1

I figured there would just be a true or false variable for this check box, so I ran Get-CMPackage -Id "XXXXXXX" on a package I knew had it enabled. There wasn't.

I thought maybe the value was nested in another field somewhere, so I found a package where Binary Differential wasn't enabled and saved the output of Get-CMPackage -Id "XXXXXX" to a variable/object. I then enabled the feature, saved the output of the same command to another variable/object and compared the two.

Enabling the Binary Differential option changed the PkgFlags value. Some digging around on the internet led me to other people trying to accomplish something similar; the solutions I came across varied in approach but all seemed to use a WMI filter.

I wanted a PowerShell-native solution. Microsoft kindly documents the WMI class the other solutions were using, here on MSDN:

Hexadecimal (Bit) | Description
0x04000000 (26)   | USE_BINARY_DELTA_REP. Marks the package to be replicated by distribution manager using binary delta replication.

It turns out PkgFlags is just a UInt32 bit field which I should be able to check against pretty easily, and the actual PkgFlags value can be obtained with Get-CMPackage.
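As a quick sanity check of the bit arithmetic (using a made-up PkgFlags value, not one from a real package):

```powershell
# Hypothetical PkgFlags value with the binary delta bit (0x04000000) set
$PkgFlags = 0x05000200

# ORing the mask in changes nothing if the bit is already set
$PkgFlags -eq ($PkgFlags -bor 0x04000000)   # True

# Clearing the bit makes the same test return False
$Cleared = $PkgFlags -band (-bnot 0x04000000)
$Cleared -eq ($Cleared -bor 0x04000000)     # False
```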

Solution

My solution to the problem is to take a list of all packages in the environment and check each "PkgFlags" value with a bitwise operation against the value in the MSDN table.

$Status = @()
Get-CMPackage | ForEach-Object {
    $CMPackage = New-Object System.Object

    $CMPackage | Add-Member -Type NoteProperty -Name Name -Value $_.Name
    $CMPackage | Add-Member -Type NoteProperty -Name Manufacturer -Value $_.Manufacturer
    $CMPackage | Add-Member -Type NoteProperty -Name PackageID -Value $_.PackageID

    #Check for USE_BINARY_DELTA_REP/0x04000000 (26)
    if ($_.PkgFlags -eq ($_.PkgFlags -bor 0x04000000)) {
        "Binary Delta Replication Bit Enabled {0} {1}" -f $_.PackageID, $_.Name
        $CMPackage | Add-Member -Type NoteProperty -Name USE_BINARY_DELTA_REP -Value "True"
    } else {
        "Binary Delta Replication Bit Disabled {0} {1}" -f $_.PackageID, $_.Name
        $CMPackage | Add-Member -Type NoteProperty -Name USE_BINARY_DELTA_REP -Value "False"
    }

    $Status += $CMPackage

} #EndForEach

$Status | Out-GridView

How it works

The first operation is to create a new, empty array called $Status; this will hold the complete list of results as we run through the available packages.

$Status = @()

We then get a list of all packages in the environment; for each package we run some checks and create a new object containing the name, manufacturer and package ID.

Get-CMPackage | ForEach-Object {
    $CMPackage = New-Object System.Object

    $CMPackage | Add-Member -Type NoteProperty -Name Name -Value $_.Name
    $CMPackage | Add-Member -Type NoteProperty -Name Manufacturer -Value $_.Manufacturer
    $CMPackage | Add-Member -Type NoteProperty -Name PackageID -Value $_.PackageID

To determine whether the Binary Differential feature is enabled, we perform a bitwise OR (-bor) on the "PkgFlags" attribute with the hex value 0x04000000 from the MSDN table; if the result equals the original value, the bit was already set. We then add another attribute to the object created above containing "True" or "False".

#Check for USE_BINARY_DELTA_REP/0x04000000 (26)
if ($_.PkgFlags -eq ($_.PkgFlags -bor 0x04000000)) {
    "Binary Delta Replication Bit Enabled {0} {1}" -f $_.PackageID, $_.Name
    $CMPackage | Add-Member -Type NoteProperty -Name USE_BINARY_DELTA_REP -Value "True"
} else {
    "Binary Delta Replication Bit Disabled {0} {1}" -f $_.PackageID, $_.Name
    $CMPackage | Add-Member -Type NoteProperty -Name USE_BINARY_DELTA_REP -Value "False"
}

Finally, we add the object for this package to the $Status array created in the first step; once every package has been checked, this will contain the full list.

$Status += $CMPackage
}#EndFor

We can then just output this to a grid view (or to CSV or similar).

$Status | Out-GridView 
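If a file is preferred over the grid view, the same array can be exported instead (the path here is just an example):

```powershell
# Write the results to CSV instead of a grid view
$Status | Export-Csv -Path "C:\Temp\BDR-Status.csv" -NoTypeInformation
```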

PowerShell Framework Module: Connect-Office365

One of the most common things I use PowerShell for is Office 365; this requires the modules to be installed and a connection to one of the Office 365 sessions: Skype, Exchange or the Security & Compliance Center.

Follow this project on GitHub

It's a fairly common process, but including it in a script that is portable and performs environmental checks before running can be a bit of a pain. Having this log to file and behave consistently in any script with Office 365 interaction is very useful to me.

This script simply puts some belts and braces around importing the Office 365 modules and connecting to the service(s), in line with the PowerShell Template.

Setup

How does it work?

The module can be called for each service (Connect-ExOnline, Connect-SFBOnline, Connect-SCCOnline) or as a single command to connect all services (Connect-Office365). Assuming you go with the connect-all-services option, it works something like this.

Connect-Office365 calls a number of other functions: first Connect-ExOnline to connect to most Office 365 interfaces, then Connect-SFBOnline for the Skype for Business interface, and finally Connect-SCCOnline for the Security and Compliance service.

Because each can run independently, the first task each sub-function completes is to check whether credentials have been provided and the initial connection has been made to the MSOL service. If not, the script prompts for credentials and attempts to connect.

The specific PSSession for the function currently being called is then created, allowing access to any of the cmdlets for that service.
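The credential check and session creation each sub-function performs look roughly like this (a simplified sketch, not the actual module code; `$Script:O365Cred` is an illustrative variable name):

```powershell
# Prompt for credentials only once, however the sub-functions are called
if (-not $Script:O365Cred) {
    $Script:O365Cred = Get-Credential -Message "Enter your Office 365 admin credentials"
    Connect-MsolService -Credential $Script:O365Cred
}

# Then build the service-specific session, e.g. for Exchange Online
$Session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $Script:O365Cred -Authentication Basic -AllowRedirection
Import-PSSession $Session
```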

screenshot1

All this activity is recorded and stored in the application logs folder, a successful run should look something like this:

***************************************************************************************************
Started logging at [10/11/2017 18:27:27].
***************************************************************************************************
***************************************************************************************************
[10-11-2017 18:27:27] Loading Module: Connect-AD.ps1
[10-11-2017 18:27:27] Loading Module: Connect-Office365.ps1
[10-11-2017 18:27:27] Loading Module: Get-ADFunctions.ps1
[10-11-2017 18:27:27] Loading Module: Get-O365Licenses.ps1
[10-11-2017 18:27:27] Loading Module: Set-O365Licenses.ps1
[10-11-2017 18:27:27] Loading Module: Set-SkypeProfile.ps1
[10-11-2017 18:27:27] 	Importing AD Modules
[10-11-2017 18:27:27] Checking for ActiveDirectory Module
[10-11-2017 18:27:27] 		ActiveDirectory module is already loaded
[10-11-2017 18:27:27] 	Starting Connect-Office365
[10-11-2017 18:27:27] 	Connecting to Exchange Online
[10-11-2017 18:27:28] 		Enter your Office 365 admin credentials
[10-11-2017 18:27:36] 		Creating Exchange Online PS session
[10-11-2017 18:27:40] 			Exchange Online PS session built, connecting...
[10-11-2017 18:27:51] 				Connected
[10-11-2017 18:27:52] 	Connecting to Security & Compliance Center Online
[10-11-2017 18:27:52] 		Creating Security and Compliance Center PS Session
[10-11-2017 18:27:54] 			Security and Compliance Center PS session built, connecting...
[10-11-2017 18:28:01] 				Connected
[10-11-2017 18:28:01] 	Connecting to Skype For Business Online
[10-11-2017 18:28:02] 		SkypeOnlineConnector module loaded
[10-11-2017 18:28:02] 		Creating Skype PS session
[10-11-2017 18:28:02] 			Automatic Skype For Business endpoint discovery failed, trying to use manual 'AdminDomain'
[10-11-2017 18:28:19] 			Skype PS session built, connecting...
[10-11-2017 18:28:27] 				Connected
...

Many to one mailmerge aka Manager mail merge

Sometimes it's necessary to email an individual about multiple people. Sometimes it's necessary to email loads of people about loads of people.

I needed a way to email managers about staff in their team who were receiving new equipment. As this was multiple people in multiple teams with multiple managers it was a bit out of the scope of what mail merge is designed to handle.

Follow this project on GitHub

The script takes a "source.csv" laid out like this:

ID   | Name        | Device   | ManagerID | ManagerName | ManagerEmail
1001 | Joe Bloggs  | iPhone 6 | 2001      | A           | [email protected]
1002 | Steve Jones | iPhone 5 | 2001      | A           | [email protected]
1003 | Dan Smith   | iPhone 6 | 2002      | B           | [email protected]

For each unique manager in this list ("A" and "B") a new email will be created based on an HTML template "_template.htm", addressed to the manager and containing a formatted table of their staff.

screenshot1

Using the table above, manager "A" will receive an email about employees 1001 & 1002, and manager "B" about employee 1003.
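The grouping itself can be sketched with Group-Object (assuming the column names above; the real script may differ):

```powershell
# Read the source data and group rows by manager
$Source = Import-Csv -Path ".\source.csv"

foreach ($Manager in ($Source | Group-Object -Property ManagerID)) {
    $Staff = $Manager.Group          # all rows for this manager
    $Email = $Staff[0].ManagerEmail  # same value for every row in the group
    # An HTML table of $Staff would be built and injected into _template.htm here
    "{0} staff member(s) for {1}" -f $Staff.Count, $Email
}
```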

Setup

To Do/Extension

  1. Automatic sending - The initial use case for this script needed it to be sent by Outlook at a given time

  2. More options!

  3. Inline processing - This wasn't required initially, but it might be useful to be able to accept the source data table/object via a pipe or argument

PowerShell Framework

Writing code that can be maintained by multiple people always requires some sort of structure. A lot of the frameworks I have seen are overly complicated for what are still essentially quite small scripts, so I decided to build my own.

My requirements were for a small, extensible framework I could use for quick scripts. All I needed was a simple framework so all my scripts share a similar format, work in a similar way, can be moved around without too much fuss and can be easily debugged by someone else should they have issues.

Follow this project on GitHub

To start off I asked myself what I want:

1. Simplicity: This needs to be simple to read, reproduce and trace issues in.

2. Troubleshooting: Tracing a fault in a script that ran automatically in the background and failed should be just as easy as stepping through the code. Logging will be important.

3. Modular: I want to be able to write other modules that I can just slot in when needed - reusable parts from other scripts without having to rewrite bits. Things like a module for connecting to Office 365, or a suite of functions I use often.

4. Consistency: Scripts I wrote two weeks ago can be difficult to follow; one I wrote last year will be unrecognisable. Structure and framework will be important: I should be able to pick up any script and roughly follow it because they all look and feel the same.

Structure


I’ve tried to keep things separated so they are easy to locate, easy to modify and easy to follow.

The "Verb-Driver.ps1" file is the workhorse of the framework; this is the file you run to execute the script or schedule as a task.

The driver will set up the script environment by calling functions or initializing things.

Flow

Firstly, the config file is loaded. The Config.ps1 file by default contains the log folder, naming structure and file-name format; it should also contain any variables used every time the script is run.

This could be done with command-line arguments; however, if I have to enter a particular argument every time I run a script, that's a waste of time. If the script runs as a scheduled task, it's easier to update the config file than the scheduled task.

Secondly, the logging module is loaded and initialised; this creates the log folder and a log file. If the folder and log file already exist, the log file is appended to.

Next, any other modules should be loaded (in the template’s default state the “Sample-Functions.ps1”).

If any of these three sections fails for any reason, the script will terminate to avoid causing any potential damage.

Once the script is initialised and ready to perform custom actions, the main script block can run, safe in the knowledge that modules are loaded, the script is logging and the config file is loaded.
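The initialisation flow above can be sketched like this (Config.ps1 and Sample-Functions.ps1 are from the template; the module paths and error handling are a simplified illustration):

```powershell
# 1. Load the config file (log paths, script-wide variables)
try {
    . "$PSScriptRoot\Config.ps1"

    # 2. Initialise logging - creates the folder/file, or appends if it exists
    . "$PSScriptRoot\Modules\Logging.ps1"

    # 3. Load any other modules the script needs
    . "$PSScriptRoot\Modules\Sample-Functions.ps1"
}
catch {
    # If any part of initialisation fails, stop before doing any damage
    Write-Error "Initialisation failed: $_"
    exit 1
}

# The main script block runs here, with config, logging and modules available
```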

The main script block is where the specific code for the project/tasks goes, this may be processing data from the config file, processing CSV files and other data and exporting some sort of result.

When writing a main script block, use the following structure to write messages to the log file:

Log-Write -LogPath $Script:LogPath -LineValue "A message"

In its default state these messages are written to the console as well as the file. This can be altered in the config file by changing the $Script:LoggingDebug variable to $false.

The resultant log file will look something like this:

pshLog

The output on the PowerShell console will show the same information

pshLog

Once the Main script block has completed, any session variables will be cleaned up and the log file finished off.

pshLog

In future I would like to include some sort of unit testing, perhaps with Pester or something similar, but I am still learning this so it might be a while.

The logging script is cobbled together and cribbed from scripts found online. I want to rewrite it to be more efficient for how I use it. I would also like to add a syslog option, which would be useful for scripts run on a schedule.

NOTE: PowerShell versions below 4 might have issues using the framework.

PowerShell Framework Module: Connect-AD

One of the most common things I use PowerShell for requires the ActiveDirectory module. In itself, this isn't an issue: I know I have it installed and that it will import automatically. The problem comes when sending or transferring the script to someone else.

Follow this project on GitHub

If the device running the script doesn't have some modules installed or available, the script will fall over.

This script simply puts some belts and braces around importing the ActiveDirectory module, in line with the PowerShell Framework.

Setup

How does it work?

The script will check for an RSAT installation
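That check can be sketched like this (a simplified version of the idea, not the module's exact code):

```powershell
# Check the ActiveDirectory module (part of RSAT) is available before importing
if (Get-Module -Name ActiveDirectory) {
    # Already loaded - nothing to do
}
elseif (Get-Module -ListAvailable -Name ActiveDirectory) {
    Import-Module ActiveDirectory
}
else {
    # No RSAT/AD module on this device - stop rather than fall over later
    throw "ActiveDirectory module not found; install RSAT first"
}
```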

Flow

The logs produced should look like this

*************************************************************
Started logging at [11/10/2017 15:46:01].
*************************************************************
*************************************************************

[11/10/2017 15:46:01] Loading Module: Connect-AD
[11/10/2017 15:46:01] 	Importing AD Modules
[11/10/2017 15:46:01] Checking for ActiveDirectory Module
[11/10/2017 15:46:01] 		ActiveDirectory module is already loaded
...

Script to flash populate an Active Directory lab!

When you need to test Active Directory in a lab with sample users, creating sufficiently realistic test accounts is a time-consuming and tedious process. There are a few quick scripts for creating something similar, but many of them only create basic users which don't emulate a production environment very well.

Follow this project on GitHub

This script will create users with the following attributes:

Setup

LAB.local
     \---Company
         \---Users

How does it work?

The script will create a user account for each user in the “FakeUserData.csv” file with the following information:

Each user is created with a random company, a random department and a random role within that department.

As a final task the script will run back through all the AD accounts in the $BaseOU organizational unit and, for each of the companies set in $Companies, find each department and assign everyone a random manager from the same department in the same company.
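That final pass can be sketched like this ($BaseOU and $Companies are variable names from the script; the rest is an illustrative simplification):

```powershell
# For each company/department, give everyone a random manager from the same group
foreach ($Company in $Companies) {
    $CompanyUsers = Get-ADUser -SearchBase $BaseOU `
        -Filter "Company -eq '$Company'" -Properties Department

    foreach ($Dept in ($CompanyUsers | Group-Object -Property Department)) {
        # Pick one random person in the department as everyone's manager
        $Manager = $Dept.Group | Get-Random
        $Dept.Group | Set-ADUser -Manager $Manager.DistinguishedName
    }
}
```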

If you want the environment to use a different SAM account name format, like "first.last", you will need to update line 73 from:

New-AdUser -SamAccountName $User.Username -Name $UserFullName -Path $UserOUPath -AccountPassword $UserPassword -Enabled $True `

to something like this:

New-AdUser -SamAccountName $NewUsername -Name $UserFullName -Path $UserOUPath -AccountPassword $UserPassword -Enabled $True `

Credit

The information stored in "FakeUserData.csv" was provided by fakenamegenerator.com and contains around 600 random users dotted around Europe in the following countries: