Jul 14

Time to get Visual

Using PowerShell gives you a simple text-based console to run your commands from, but sometimes you want to jazz it up a bit or make it easier for non-techie folks to run your cmdlets.  So how do we make it purtty?

PowerShell allows the use of Windows Forms for displaying and entering information. "But what do I need to make it soft and GUI?" I hear you ask.  As it happens, not a lot…

  Add-Type -AssemblyName System.Windows.Forms
  $form = New-Object Windows.Forms.Form
  $draw = $form.ShowDialog()

And that’s it. No really it is…

Well, ok, maybe I did mislead a little, but this code snippet will start you on your way to creating GUI applications using PowerShell.

Forming your output

Ok, so maybe I jumped the gun a bit; there are ways to simply put your list information out into a GUI format.  We should all know by now (and if you don't, read my earlier posts) that you can display a list of your users with this cmdlet:

  Get-MsolUser

This will in turn produce this output:

[Screenshot: Get-MsolUser console output]

But with the simple addition of a pipe in your command line and an output cmdlet:

  Get-MsolUser|Out-GridView

The output changes to look like this:

[Screenshot: Get-MsolUser output in Out-GridView]

Much purrtier 😀

Using a Windows GUI brings its own rewards: you also get the ability to set filters to shorten your list if you have a large tenancy.
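If your tenancy really is large, you can also trim the list in the pipeline before it ever reaches the grid.  A minimal sketch (UserPrincipalName, DisplayName and IsLicensed are standard properties on the MSOnline user object):

  # Show only licensed users, with just the columns we care about
  Get-MsolUser | Where-Object {$_.IsLicensed -eq $true} |
    Select-Object UserPrincipalName, DisplayName, IsLicensed |
    Out-GridView -Title "Licensed Users"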

Doing something worthwhile with it all

So that's all well and good, but how do I put this together to make something usable?

This script will set up a form that gives you a Connect button and three further buttons which allow you, once connected, to query your Users, Groups and Registered Domains, all in a purtty Windows style:

Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName System.Drawing

# Main window for the GUI
$form = New-Object Windows.Forms.Form
$form.Size = New-Object Drawing.Size @(400,200)
$form.StartPosition = "CenterScreen"
$form.Text = "PowerShell-O365 GUI"

function LoadForm()
{
    # Outer layout: stacks the Connect button above the "Show" group box
    $layout = New-Object "System.Windows.Forms.FlowLayoutPanel"
    $layout.FlowDirection = "TopDown"
    $layout.Dock = "Fill"
    $layout.AutoSize = $true

    # Group box holding the three query buttons
    $panel = New-Object "System.Windows.Forms.GroupBox"
    $panel.Text = "Show"
    $panel.AutoSize = $true

    $flow = New-Object "System.Windows.Forms.FlowLayoutPanel"
    $flow.Dock = "Fill"
    $flow.AutoSize = $true

    # Connect button - signs in to the Office 365 tenancy
    $btn = New-Object System.Windows.Forms.Button
    $btn.add_click({Connect-MsolService})
    $btn.Text = "Connect"
    $layout.Controls.Add($btn)

    # Query buttons - each pipes its cmdlet to Out-GridView
    $btn = New-Object System.Windows.Forms.Button
    $btn.add_click({Get-MsolUser|Out-GridView -Title "Users"})
    $btn.Text = "Users"

    $flow.Controls.Add($btn)

    $btn = New-Object System.Windows.Forms.Button
    $btn.add_click({Get-MsolGroup|Out-GridView -Title "Groups"})
    $btn.Text = "Groups"

    $flow.Controls.Add($btn)

    $btn = New-Object System.Windows.Forms.Button
    $btn.add_click({Get-MsolDomain|Out-GridView -Title "Registered Domains"})
    $btn.Text = "Domains"

    $flow.Controls.Add($btn)

    $panel.Controls.Add($flow)
    $layout.Controls.Add($panel)
    $form.Controls.Add($layout)

}


LoadForm

$drc = $form.ShowDialog()

So this is a good start, I would say:

[Screenshot: the PowerShell-O365 GUI form]

There is nothing stopping you using the same simple functionality to create forms to add, edit and update users and groups, or to wrap any other functionality that can be implemented on the command line.

The full script can be accessed on GitHub as PowerShellO365GUI.ps1.
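If you wanted to take the add/edit idea further, here is a rough, untested sketch of a "New User" button that could be dropped into the LoadForm function above; the text box variables are purely illustrative, but New-MsolUser with -DisplayName and -UserPrincipalName is the cmdlet doing the work:

# Illustrative only - prompts for a display name and UPN, then creates the account
$txtName = New-Object System.Windows.Forms.TextBox   # display name
$txtUpn  = New-Object System.Windows.Forms.TextBox   # user principal name

$btnNew = New-Object System.Windows.Forms.Button
$btnNew.Text = "New User"
$btnNew.add_click({
    New-MsolUser -DisplayName $txtName.Text -UserPrincipalName $txtUpn.Text |
        Out-GridView -Title "New User"
})

$flow.Controls.Add($txtName)
$flow.Controls.Add($txtUpn)
$flow.Controls.Add($btnNew)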

Jun 28

Using PowerShell to back up an SQL 2016 database and a file structure

I had a requirement recently to put together a single PowerShell script that backed up both an SQL database and a folder of pictures that had new files added to it daily.

I have broken the following PowerShell script down into sections to make it easier to explain the component parts.

Firstly I needed to import the SQL Server PowerShell module (sqlps) to allow scripting of the SQL commands.

Import-Module "sqlps" -DisableNameChecking

This is installed from the SQL 2016 media.
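A quick way to check that the module is actually available on the machine before running the script:

Get-Module -ListAvailable -Name sqlps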

I then set up the script variables to make them easier to change after the script had been written.  I could have used parameters for this, but decided against it as the values weren't going to change that often (there is a quick sketch of the parameter approach after the variable block below).  I will cover using parameters more extensively in a later post.

#Set Script Variables
$backupDrive = "E:"
$subfolder = "BackupSubFolder"
$logfolder = "$($backupDrive)\$($subfolder)"
$DBbfolder = "$($backupDrive)\$($subfolder)\DBBackup"
$filebfolder = "$($backupDrive)\$($subfolder)\SystemBackup"
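For comparison, if you did want parameters, the same settings could sit in a param() block with the current values as defaults; a quick sketch (remember param() has to be the first executable statement in the .ps1 file, so it would go above the Import-Module line):

#Alternative: expose the settings as script parameters with defaults
param(
    [string]$backupDrive = "E:",
    [string]$subfolder = "BackupSubFolder"
)
$logfolder = "$($backupDrive)\$($subfolder)"
$DBbfolder = "$($backupDrive)\$($subfolder)\DBBackup"
$filebfolder = "$($backupDrive)\$($subfolder)\SystemBackup"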

As I was using an external USB hard drive, the first job was to get the system to check that the backup drive was actually plugged in and had registered a drive letter.

#function to check the drive exists
function Drive-Exists
{
  (New-Object System.IO.DriveInfo($backupDrive)).DriveType -ne 'NoRootDirectory'
}

The next step was to back up the database.  I wanted to use a unique date/time stamp for the filename, so I created a variable from the system time stamp using the Get-Date cmdlet:

#function to backup the database
function BackupDatabase
{
  $dt = get-date -format yyyyMMddHHmmss
  $bfil = "$($DBbfolder)\MyDB_db_$($dt).bak"
  Backup-SqlDatabase -InputObject 'MYSQLServer\MySQLInstance' -Database 'MyDatabase' -BackupFile $bfil | out-file $logfolder\backuplog.txt -Append
}

Please ensure that the instance name passed to the -InputObject parameter reflects the instance to be backed up, and that the -Database parameter names a database that exists on that instance.
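If you are not sure of the exact database name, Invoke-Sqlcmd (available once sqlps is loaded) can list what the instance can see; the server\instance below is just the placeholder used in the script:

#List the databases on the instance to confirm the name used in -Database
Invoke-Sqlcmd -ServerInstance 'MYSQLServer\MySQLInstance' -Query 'SELECT name FROM sys.databases'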

Now to pull it all together and add a little error checking:

#Start Live Script
if(Drive-Exists)
{
#Backup the SQL database
  if(Test-Path $DBbfolder)
  {
    BackupDatabase
  }
  else
  {
    Write-Host "The folder $($DBbfolder) does not exist on the $($backupDrive) drive"
  }
}
else
{
  Write-Host "Please plug in the USB drive and ensure it is identified as $($backupDrive)"
}

I use the Test-Path cmdlet to ensure the destination folder for the backup exists.  It is used twice in the script: once for the DB backup folder and once for the file backup folder.

Finally I used Robocopy with the /MIR switch to back up the pictures to the external USB drive:

#Backup the data files (ROBOCOPY)
if(Test-Path $filebfolder)
{
  $fromPath = "C:\MyInitialLocation"
  $toPath = "$($filebfolder)\MyFinalLocation"
  Write-Host "Copying Files to drive $($backupDrive)"
  ROBOCOPY $fromPath $toPath /MIR /W:5 | out-file $logfolder\backuplog.txt -Append
  Write-Host "Backup successful"
  pause
  exit
}
else
{
  Write-Host "The folder $($filebfolder) does not exist on the $($backupDrive) drive"
}

#End Script
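One thing worth knowing about Robocopy is that exit codes 0 to 7 all indicate some flavour of success, while 8 and above mean something went wrong, so the script above prints "Backup successful" regardless of the outcome.  If you wanted to be stricter, a small optional tweak is to check $LASTEXITCODE after the copy:

#Optional: treat Robocopy exit codes of 8 or higher as a failure
ROBOCOPY $fromPath $toPath /MIR /W:5 | out-file $logfolder\backuplog.txt -Append
if($LASTEXITCODE -ge 8)
{
  Write-Host "Robocopy reported errors - check $($logfolder)\backuplog.txt"
}
else
{
  Write-Host "Backup successful"
}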

I have attached the complete script in a zip file below:

Click to download

 

Jun 21

Getting started with Office 365 and PowerShell

Before trying to use PowerShell with Office 365, certain libraries need to be installed on the client machine.  These libraries depend on the OS; for Windows 7, additional libraries are required before proceeding.

 

Windows 7 / Windows 7 SP1:
  - Windows Management Framework
  - Microsoft Online Services Sign-In Assistant for IT Professionals RTW
  - Windows Azure Active Directory Module for Windows PowerShell

Windows 8.1 / Windows 10:
  - Microsoft Online Services Sign-In Assistant for IT Professionals RTW
  - Windows Azure Active Directory Module for Windows PowerShell

If you are using Windows 10 you will most likely receive the following error:

“In order to install Windows Azure Active Directory Module for Windows PowerShell, you must have Microsoft Online Services Sign-In Assistant version 7.0 or greater installed on this computer”

If this happens you should make the following changes to your registry using Regedit:

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSOIdentityCRL]
 "MSOIDCRLVersion"="7.250.4551.0"
 "MSOIDCRLVersion_Default"="7.250.4303.0"

REMEMBER – changing your registry can seriously damage the health of your PC; if you are not comfortable using the tool, find someone who is.
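If you would rather stay in PowerShell than open Regedit, the same two values can be written with Set-ItemProperty from an elevated prompt (this assumes the MSOIdentityCRL key already exists, which it should do once the Sign-In Assistant is installed):

#Run from an elevated PowerShell session
$key = "HKLM:\SOFTWARE\Microsoft\MSOIdentityCRL"
Set-ItemProperty -Path $key -Name "MSOIDCRLVersion" -Value "7.250.4551.0"
Set-ItemProperty -Path $key -Name "MSOIDCRLVersion_Default" -Value "7.250.4303.0"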

To download these libraries, use the following links:

Windows Management Framework

Microsoft Online Services Sign-In Assistant for IT Professionals RTW

Windows Azure Active Directory Module for Windows PowerShell

Once these libraries are installed (in this order) you are ready to start using PowerShell to manipulate your Office 365 tenancy.

Connecting to your tenancy

Before using PowerShell with Office 365 you need to connect to your tenancy:

Open your “Windows Azure Active Directory Module for Windows PowerShell” application from your desktop and enter the following command line:

Connect-MsolService

You will be prompted for a login; here you need to enter the details you use to access your tenancy.
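If you would rather use an ordinary PowerShell window than the dedicated shortcut, the module the shortcut loads can usually be imported by name (MSOnline) before connecting:

Import-Module MSOnline
Connect-MsolService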

First PowerShell cmdlet

The first PowerShell cmdlet I used retrieves a list of users:

Get-MsolUser

This will produce a list of the current users on your Office 365 tenancy:

[Screenshot: Get-MsolUser console output]

This gives you each user's principal name, their display name and their current licensed status.
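If you want just those three columns in a tidier console view, you can pull them out explicitly; the property names are the standard ones on the MSOnline user object:

Get-MsolUser | Select-Object UserPrincipalName, DisplayName, IsLicensed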

And so it begins, the first steps in PowerShell and Office 365.