Monday, 24 March 2014

Testing SharePoint Email Alerts in UAT and DEV Environments

Introduction

This post describes installing an SMTP server (SMTP4Dev) that can be used to capture all email sent from a SharePoint farm, without delivering the email to the recipient's mailbox.

This is handy to have set up in a Development or UAT (User Acceptance Testing) environment, where you want to test or analyse email alerts sent from SharePoint without actually having the emails delivered to the end user.

The two scenarios presented here are:

1. Installing and configuring SMTP4Dev on a SharePoint server (or other server) that doesn’t already have a process listening on port 25 (e.g. the IIS SMTP service)
2. Installing and configuring SMTP4Dev on a SharePoint server that already has a service listening on port 25 (e.g. the IIS SMTP service)

Introduction to SMTP4Dev

SMTP4Dev is a console application used to receive email via SMTP. Email received by SMTP4Dev can be inspected or deleted. However, SMTP4Dev does not deliver email to a destination mailbox. It can listen on any port (the default port is 25), and will accept email while it's running. It can be configured for anonymous or authenticated connections.

SMTP4Dev is great for being able to capture and analyse emails sent in a UAT or Development environment, without needing an email infrastructure (like Exchange and Outlook).

Download SMTP4Dev here: http://smtp4dev.codeplex.com/

Installing SMTP4Dev

Installing and running SMTP4Dev is very simple. Follow these steps:
  1. Download SMTP4Dev from codeplex (http://smtp4dev.codeplex.com/) and save the zip file to the local file system. E.g. C:\Tools\smtp4dev.zip
  2. Extract the zip file's contents to the same location
  3. Double click the smtp4dev.exe
  4. Done!

When SMTP4Dev is opened, it will start listening on Port 25 by default.



If there is already another process using port 25 (e.g. the IIS SMTP service), SMTP4Dev will show an error message about a socket address (the configured port is already in use).



Configure SharePoint Outbound email with SMTP4Dev running on Port 25

If no other process is using port 25, SMTP4Dev will listen on the default SMTP port (25). Once started, it will accept email on port 25 using anonymous authentication.

To finish the configuration, set the Outbound Email server for the farm to the FQDN (Fully Qualified Domain Name) of the SharePoint application server running SMTP4Dev.

  1. Open the Central Admin site
  2. Click on System Settings
  3. Click on Configure outgoing e-mail settings
  4. Enter the FQDN of the SharePoint application server that is running SMTP4Dev into the Outbound SMTP Server textbox.
  5. Set the From address and Reply-to address
  6. Click OK to save the settings.
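The same farm setting can also be applied from the SharePoint Management Shell. This is a minimal sketch only; the server name and e-mail addresses below are example values (assumptions), and the settings are applied via the Central Administration web application's UpdateMailSettings method.

```powershell
# Sketch: set the farm's outbound e-mail settings via PowerShell.
# The server name and addresses are example values (assumptions).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Get the Central Administration web application
$ca = Get-SPWebApplication -IncludeCentralAdministration |
    Where-Object { $_.IsAdministrationWebApplication }

# UpdateMailSettings(SMTP server, from address, reply-to address, codepage)
# 65001 is the UTF-8 codepage
$ca.UpdateMailSettings("sp13app01.mydomain.com", "sharepoint@mydomain.com", "noreply@mydomain.com", 65001)
```

This applies the same settings as the Central Admin steps above, which is handy when scripting UAT environment builds.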

Configure SharePoint Outbound email with SMTP4Dev running on a Custom Port

If there is already another process listening on port 25, you will need to configure SMTP4Dev to listen on another port. Configuring SMTP4Dev to listen on another port is easy. However, SharePoint will only send outbound email to an SMTP server listening on port 25, so additional configuration is required to get this working.

Configuring SMTP4Dev to listen on a custom port might happen in a scenario where all of the SharePoint servers in a UAT or Development farm already have the IIS SMTP service installed (for inbound email).
Consider the following scenario:

  • The SharePoint application server that SMTP4Dev is installed on has the host name SP13App01.mydomain.com
  • The SharePoint Application server, SP13App01.mydomain.com, already has the IIS SMTP service configured (on Port 25) for Inbound Email (functionality that allows people to email a SharePoint folder).
  • The IIS SMTP service is configured to receive email for the SharePoint farm's domain alias (e.g. production-sharepoint.mydomain.com)
  • The Outbound email SMTP server for the farm is set to sp13app01.mydomain.com
  • The IIS SMTP server on sp13app01 is configured to relay email for all remote domains to the SMTP smart host going.nowhere.local, over port 19876.
  • There is a DNS entry in the hosts file on sp13app01 for going.nowhere.local that resolves to the localhost IP address 127.0.0.1
  • SMTP4Dev is configured to listen on Port 19876



In this scenario, any email sent to an address in the production-sharepoint.mydomain.com domain will be picked up by one of the SharePoint servers running the IIS SMTP service (on Port 25) and saved into the IIS mail Drop folder (for SharePoint to process). If an email sent to the SMTP server isn't addressed to the production-sharepoint.mydomain.com domain, the IIS SMTP server will:

  • (WFE servers): discard the email
  • (Application server): forward (relay) the message to the SMTP smart host.

An SMTP smart host is an SMTP server that will accept email for any domain from a source SMTP server (the sending SMTP server), and forward that email to a destination SMTP server responsible for the email's domain.

In practice, the IIS SMTP service on the SharePoint WFE servers should never receive email addressed to a foreign domain. However, the IIS SMTP service on the SharePoint application server will receive email addressed to the production-sharepoint.mydomain.com domain, as well as emails sent from the SharePoint farm itself. This is because the SharePoint server sp13app01.mydomain.com has been set as the outgoing email server for the farm.

In this scenario, SMTP4Dev is configured as the Smarthost. It will receive all emails sent to the SharePoint Application server’s IIS SMTP service (running on Port 25) that are destined for a foreign domain (e.g. a list alert configured to send new list item alerts to a domain user).
To configure SharePoint for this type of scenario, follow these steps:
  1. Configure SMTP4Dev to listen on a custom port
  2. Add a host alias to the hosts file of the server running SMTP4Dev
  3. Configure the IIS SMTP service on the SharePoint application server to relay all email destined for a foreign domain to the host alias configured in step 2
  4. Configure the SharePoint application server as the outbound SMTP server for the farm.

1. Configure SMTP4Dev to listen on a custom port

  1. Open SMTP4Dev
  2. Click Options
  3. From the Options dialog, click the Server tab
  4. Change the Port Number to a custom port value between 1025 and 65000. In the example, we use port 19876



    Note: Whatever port you choose, make sure no other process is listening on that port. To check this, you can use the NETSTAT command and pipe the results to the FIND command. For example, to check if a process is listening on port 19876, use the following command at a command prompt.

    netstat -a | find ":19876"

2. Add a host alias to the host file

Each SharePoint server in the farm is configured to send outbound email to a single SharePoint application server. The IIS SMTP service on the SharePoint application server is configured to forward (relay) all e-mail for foreign domains to a Smart Host. The "Smart Host" is actually an SMTP service (SMTP4Dev) running on the same server, listening on a different port.

When you configure the Smart Host in the IIS SMTP service, the UI prevents you from adding the hostname of the current server as the Smart Host.

To work around this limitation, add an alias to the hosts file on the SharePoint application server. The alias can be anything (though it shouldn’t be a hostname used anywhere else), but the IP address must be set to the local server (127.0.0.1).
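If you script your environment builds, the hosts file entry can be added with PowerShell. The helper below is a hypothetical sketch (not part of the original steps); it takes the hosts file path as a parameter so you can test it against a copy before touching the real file (which also requires an elevated prompt).

```powershell
# Hypothetical helper: appends a loopback alias to a hosts file.
# Defaults to the Windows hosts file; pass -HostsPath to target another file.
function Add-HostsAlias {
    param(
        [Parameter(Mandatory=$true)][string]$Alias,
        [string]$IPAddress = "127.0.0.1",
        [string]$HostsPath = "$env:windir\System32\drivers\etc\hosts"
    )
    # Skip the add if the alias is already present in the file
    $existing = Get-Content -Path $HostsPath -ErrorAction SilentlyContinue
    if (-not ($existing -match [regex]::Escape($Alias))) {
        Add-Content -Path $HostsPath -Value "$IPAddress`t$Alias"
    }
}

# Example (run from an elevated prompt when targeting the real hosts file):
# Add-HostsAlias -Alias "going.nowhere.local"
```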

In this example, the alias is set to going.nowhere.local


3. Configure IIS SMTP service to relay email to a Smart Host

Configure the IIS SMTP service (on the SharePoint application server) to forward email (for remote domains) to itself.
  1. Open the IIS 6.0 Manager console
  2. Expand the local server
  3. Expand the SMTP Virtual Server
  4. Right click the virtual server, and click Properties
  5. Click on the Access tab
  6. Configure the Relay settings
    Note: Be careful configuring relay settings. Ensure that you restrict the list of servers (IP Addresses) allowed to use this server as an open (unauthenticated) SMTP relay.


  7. From the example we've been using, we are going to allow the two SharePoint WFE servers to relay through this SMTP server unauthenticated.
  8. After configuring the SMTP Relay settings, you need to configure the Smart Host that all foreign email will be forwarded to. Click on the Delivery tab.
  9. Click Outbound connections
  10. In the Outbound Connections dialog, enter the TCP port that SMTP4Dev is listening on. In the example, this is port 19876
  11. Click Ok to save the changes.
  12. From the Delivery tab, click Advanced.
  13. In the Advanced Delivery dialog, set the Smart Host. This will be the alias name you added to the hosts file. In the example, we used going.nowhere.local
  14. Click Ok to save the changes.
  15. This completes the configuration of the IIS SMTP service.

In Summary

  1. The IIS SMTP Service on all SharePoint servers is configured to accept emails sent to SharePoint web applications. In the examples, the SharePoint servers accept email sent to the production-sharepoint.mydomain.com domain. E.g. an email addressed to myshareddocumentlibrary@production-sharepoint.mydomain.com
  2. The IIS SMTP Service on the SharePoint application server is configured to forward all foreign email (that is, email sent to other domains) to an SMTP Smart Host (hosted on the same server), called going.nowhere.local, over port 19876.
  3. The SMTP service listening on port 19876 is SMTP4Dev.  SMTP4Dev will receive all email sent to it, so that it can be viewed. E-mail will never reach the mailbox of the intended recipient, which is the behaviour we want.

4. Configure the Outbound email settings for the farm

The final step is to configure the farm to send all outbound email to the SharePoint application server. After doing this, the IIS SMTP service on the SharePoint application server will receive all outbound email, and forward it on to the testing SMTP service, SMTP4Dev (listening on port 19876).

Thursday, 6 March 2014

Get a List of Fields in a Site Collection that are using a Managed Metadata TermSet

Ever wondered how many fields are referencing a Managed Metadata Termset? It's going to be a long and boring job using the Web UI to click through every web... and every list in every web... and every field in every list, looking for all the fields referencing a particular termset. Just writing that in a sentence was long enough!

This is the sort of job where PowerShell really shines!

The example below demonstrates creating a script (with a number of functions) to recurse through a site collection, creating a report of all the fields using a termset.

Just take me to the Microsoft TechNet Gallery, so I can download the script:  Find all SPFields that are using a Managed Metadata TermSet

The basic PowerShell used to check a field is:

$termSetId = "e07cab2f-ef85-473e-a4a7-1104b5daf192"            
$field = (Get-SPWeb "http://mdysp13").Lists["Documents"].Fields["Country"]            
if($field.GetType().Name -eq "TaxonomyField"){            
 if($field.TermSetId.ToString() -eq $termSetId){            
  Write-Host "Houston, we have a match!" -foregroundcolor darkyellow;            
 }            
}

Or, for a collection of fields:

$fieldCollection = (Get-SPWeb "http://mdysp13").Lists["Documents"].Fields            
$termSetId = "e07cab2f-ef85-473e-a4a7-1104b5daf192"            
foreach($field in $fieldCollection)            
{            
 if($field.GetType().Name -ne "TaxonomyField"){            
  continue;            
 }            
 if($field.TermSetId.ToString() -ne $termSetId){            
  continue;            
 }            
 #if we get to here, we have a match!            
}

I hear you say, "That's awesome Matt, but where the hell do I get the Taxonomy TermSet ID from?!"

Well, that's quite easy.

$w = Get-SPWeb "http://mdysp13";                        
$tsession = Get-SPTaxonomySession -Site $w.Site;                        
$tsession.GetTermSets("Countries",1033) | FT Name,ID
#Or, if you want to get a term set based on the SPWeb's default language ID            
$tsession.GetTermSets("Countries",$w.Language) | FT Name,ID

Pretty cool huh?

If you want to get a list of all the termsets, then you can write a simple function to return all the termsets as a list.

function List-AllTermSets{            
 [CmdletBinding()]            
  Param(             
    [parameter(Mandatory=$true, ValueFromPipeline=$true)][Microsoft.SharePoint.SPWeb]$web            
   )            
 $termSetInfo = New-Object psobject            
 $termSetInfo | Add-Member -MemberType NoteProperty -Name "Store" -value ""
 $termSetInfo | Add-Member -MemberType NoteProperty -Name "StoreId" -value ""
 $termSetInfo | Add-Member -MemberType NoteProperty -Name "Group" -value ""
 $termSetInfo | Add-Member -MemberType NoteProperty -Name "GroupId" -value ""
 $termSetInfo | Add-Member -MemberType NoteProperty -Name "TermSet" -value ""
 $termSetInfo | Add-Member -MemberType NoteProperty -Name "TermSetId" -value ""
             
 $tsession = Get-SPTaxonomySession -Site $web.Site;            
 $tstores =  $tsession.TermStores;             
 $list = @();            
 foreach($tstore in $tstores)            
 {            
  $tgroups = $tstore.Groups;            
  foreach($tgroup in $tgroups)            
  {            
   $tsets = $tgroup.TermSets;            
   foreach($tset in $tsets)            
   {            
    $tinfo = $null;            
    $tinfo = $termSetInfo | Select-Object *;            
    $tinfo.Store = $tstore.Name;            
    $tinfo.StoreId = $tstore.ID;            
    $tinfo.Group = $tgroup.Name;            
    $tinfo.GroupId = $tgroup.ID;            
    $tinfo.TermSet = $tSet.Name;            
    $tinfo.TermSetId = $tSet.ID;            
    $list += $tinfo;            
   }            
  }             
 }            
 return $list;            
}

So, what if I want all of this scripted? A function I can call that generates a report. Well, prepare to roll up your sleeves and poise your fingers over the Ctrl+C key combo!

We need a couple of functions for this, performing the following tasks:

1. A function to get a list of all the taxonomy (managed metadata) fields in a field collection referencing a termset
2. A function to call that will report on all the taxonomy (managed metadata) fields in the web, the web's lists, and the web's subwebs that are referencing a given termset.

I've outlined each function below. If you'd rather just download the script, download it from the Microsoft TechNet Gallery here: Find all SPFields that are using a Managed Metadata TermSet

1. Get a list of all the fields (in a field collection) using a termset

function Get-FieldsUsingTermSet            
{            
 [CmdletBinding()]            
  Param(             
    [parameter(Mandatory=$true, ValueFromPipeline=$true, Position=1)][Microsoft.SharePoint.SPFieldCollection]$fieldCollection,            
    [parameter(Mandatory=$true, Position=2)][Microsoft.SharePoint.Taxonomy.TermSet]$TermSet            
   )            
 $MetadataField = New-Object psobject            
 $MetadataField | Add-Member -MemberType NoteProperty -Name "ParentListUrl" -value ""
 $MetadataField | Add-Member -MemberType NoteProperty -Name "ParentListTitle" -value ""
 $MetadataField | Add-Member -MemberType NoteProperty -Name "FieldTitle" -value ""
 $MetadataField | Add-Member -MemberType NoteProperty -Name "FieldId" -value ""            
             
 $matches = @();            
 foreach($field in $fieldCollection)            
 {            
  if($field.GetType().Name -ne "TaxonomyField"){            
   continue;            
  }            
  if($field.TermSetId.ToString() -ne $TermSet.Id.ToString()){continue;}            
  $tf = $MetadataField | Select-Object *;            
  $tf.ParentListUrl = $field.ParentList.ParentWeb.Url;            
  $tf.ParentListTitle = $field.ParentList.Title;            
  $tf.FieldTitle = $field.Title;            
  $tf.FieldId = $field.ID;            
  $matches += $tf;            
 }            
 return $matches;            
}

2. A parent function to bring it together, with options for recursively checking subwebs, or searching just web-level fields

function Get-ManagedMetadataFieldUses            
{            
 [CmdletBinding()]            
  Param(             
    [parameter(Mandatory=$true, ValueFromPipeline=$true, Position=1)][Microsoft.SharePoint.SPWeb]$web,            
    [parameter(Mandatory=$true, Position=2)][Microsoft.SharePoint.Taxonomy.TermSet]$TermSet,
    [parameter(Mandatory=$false, Position=4)][switch]$Recurse,            
    [parameter(Mandatory=$false, Position=5)][switch]$WebLevelFieldsOnly            
   )             
             
 $matches = @();             
 $matches += Get-FieldsUsingTermSet $web.Fields $TermSet;            
             
 if($WebLevelFieldsOnly -eq $false)            
 {            
  foreach($list in $web.Lists)            
  {            
   $matches += Get-FieldsUsingTermSet $list.Fields $TermSet            
  }            
 }            
             
 if($Recurse)            
 {            
  foreach($subweb in $web.Webs)            
  {            
   #Switch parameters can't be bound positionally, so pass them explicitly
   $matches += Get-ManagedMetadataFieldUses $subweb $TermSet -Recurse:$Recurse -WebLevelFieldsOnly:$WebLevelFieldsOnly;
  }            
 }            
             
 return $matches            
}

Examples of using the script to create some reports.

1. Download the script from here:
2. Save the script somewhere. "C:\Temp" is a good place!
3. If you haven't already, set the PowerShell execution policy to Bypass (this will allow you to import all PowerShell scripts)

Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope CurrentUser

4. Import the script into PowerShell.

Import-Module C:\Temp\Get-ManagedMetadataFieldUses.ps1

5. Run a few commands to get a termset to report on. In this example, I get a termset called "Countries"

#Get the SPWeb object            
$w = Get-SPWeb http://mdysp13;            
#Get the taxonomy session used by the SPWeb's site            
$tsession = Get-SPTaxonomySession -Site $w.Site;            
#Get all the TermSets with the name "Countries", and the web's default Language ID
$termSets = $tsession.GetTermSets("Countries",$w.Language)            
#Display the TermSets found            
$termSets | FT @{Label="Group";Expression={($_.Group).Name}},Name,ID            
#Select the first TermSet            
$termSet = $termSets[0]



6. Call the Get-ManagedMetadataFieldUses function, and store the results in the $matchingFields variable.

$matchingFields = Get-ManagedMetadataFieldUses -web $w -TermSet $termSet -Recurse

Do some reporting!!

Display all of the results in the raw format.

$matchingFields | FT



Display all of the results, grouping them by the Site. This view of the data will show you how many fields in each site (or web) are referencing the termset.

$matchingFields | Group-Object ParentListUrl



This improves on the previous command, displaying all of the results, grouping them by the Site. In this view, all the fields are listed, grouped under the site they belong to.

$matchingFields | Group-Object ParentListUrl | Select -ExpandProperty Group  | Format-Table -GroupBy ParentListUrl



Finally, group the objects into a Hash Table. This will allow you to directly reference a web URL, to a get a list of fields in that web that reference the termset.

$hashTable = $matchingFields | Group-Object ParentListUrl -AsHashTable -AsString
$hashTable."http://mdysp13" | FT ParentListTitle,FieldTitle,FieldId -AutoSize



And "even more finally", you can export your results to a CSV file for further analysis!

$matchingFields | Export-CSV  -Path C:\temp\fieldreport.csv -NoTypeInformation -Delimiter "`t"

Download the full script from the Microsoft TechNet Gallery here: Find all SPFields that are using a Managed Metadata TermSet


Friday, 31 January 2014

Create 2000 Unique Domain Accounts with Profile Photos for a Development SharePoint Environment

Introduction

I like to have a development environment that is as close to a production environment as possible. Having a realistic development (or staging) environment helps business users visualise what an end product (or solution) will look like when deployed.

The following PowerShell (and accompanying name files) demonstrates creating 2000 unique Active Directory domain accounts, including setting different locations, departments, phone numbers and gender (male or female). Each domain account has a photo uploaded to Active Directory. Finally, SharePoint User Profile synchronisation is configured, to import the users and their photos.

This article makes use of name files from scrapmaker.com, and people pictures from fotolia.com

Download the full Script from the Microsoft TechNet Gallery


PowerShell script to create 2000 Active Directory users with Profile Pictures

Process

  1. Download the name files (female names, male names, surnames)
  2. Format the documents using a notepad editor
  3. Import the name files into PowerShell
  4. Create a custom PSObject for holding people information
  5. Use the name files to create 2000 unique users (1000 males and 1000 females)
  6. Download 1000 male and 1000 female photos from fotolia
  7. Create the 2000 users (using the Active Directory PowerShell module) and upload the photos for each new user account
  8. Configure SharePoint User Profile Synchronisation to import the users from Active Directory, including each user's thumbnail photo
  9. Run Update-SPProfilePhotoStore to create the profile photo variations
  10. Index the user profiles and view the people results in SharePoint Search

1. Download the name files


To complete this exercise, you need a list of female names, male names and surnames. I searched the internet, and quickly came across a site called scrapmaker, which offers various lists, including the name lists I was after. They can be downloaded here: http://scrapmaker.com/dir/names

2. Formatting the documents


Open each name file, and remove any header information. The file should only contain a single column of names.
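This cleanup can also be scripted. The function below is a small sketch (the file paths in the example are the ones used later in this post); it drops blank lines and any header or comment lines, leaving each file as a single column of names.

```powershell
# Sketch: normalise a downloaded name file to one name per line.
function Format-NameFile {
    param([Parameter(Mandatory=$true)][string]$Path)
    # Keep only non-blank lines that aren't comment/header lines
    $names = @(Get-Content -Path $Path |
        Where-Object { $_.Trim().Length -gt 0 -and -not $_.Trim().StartsWith("#") })
    Set-Content -Path $Path -Value $names
}

# Example (paths from this post's examples):
# Format-NameFile -Path "C:\Temp\NameDb\femalefirstnames.txt"
# Format-NameFile -Path "C:\Temp\NameDb\malefirstnames.txt"
# Format-NameFile -Path "C:\Temp\NameDb\surnames.txt"
```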

3. Import the name files into PowerShell


Here we import the three name files into PowerShell variables. Each variable will hold a collection of names that will be used to create the random male and female users.

$ffn = Get-Content "C:\Temp\NameDb\femalefirstnames.txt"            
$mfn = Get-Content "C:\Temp\NameDb\malefirstnames.txt"            
$ln = Get-Content "C:\Temp\NameDb\surnames.txt"

4. Create a custom PSObject for holding people information


Here we create a new psobject that will hold all of the user properties we will be randomly generating.

$userobject = New-Object psobject            
$userobject | Add-Member -MemberType NoteProperty -Name "FirstName" -value "" 
$userobject | Add-Member -MemberType NoteProperty -Name "LastName" -value "" 
$userobject | Add-Member -MemberType NoteProperty -Name "SamAccount" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "Location" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "Country" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "CountryCode" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "City" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "Mobile" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "DDI" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "Ext" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "Department" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "JobTitle" -value ""
$userobject | Add-Member -MemberType NoteProperty -Name "Gender" -value ""

5. Use the name files to create 2000 unique users (approximately 1000 males and 1000 females)


Here we use the name files to create 2000 "users". Each user created is added to a userobject (defined in the above step). The PowerShell Get-Random cmdlet is used to randomly select values for names and locations.

The script is commented, providing additional information.

#Create an array to hold the user objects            
$users = $null;            
$users = @();            
#Get the number of names in each name file.            
$ffnCount = $ffn.Count;            
$mfnCount = $mfn.Count;            
$lnCount = $ln.Count;            
#Set a base number that will be used when creating each user's SAM account.            
$sabase = 1000;            
#Get the TextInfo class. This will be used with the name files to change the casing of names to title case. E.g. john will be changed to John.            
$cI = Get-Culture;            
$tI = $cI.TextInfo;            
$i=1;            
#Create 2000 random users            
while($i -le 2000){               
 #Create a new user object            
 $nu = $userobject | Select-Object *;            
 #Set a random index value for the last name            
 $lnIndex = Get-Random -minimum 0 -maximum ($lnCount -1);            
 #Make sure the row (in the last names array) contains a value            
 while($ln[$lnIndex].Length -eq 1)            
 {            
  $lnIndex = Get-Random -minimum 0 -maximum ($lnCount -1);            
 }            
 #Set the last name, using Title casing            
 $nu.LastName = $tI.ToTitleCase($ln[$lnIndex].ToLower());            
 #Create a unique value for the SAM Account            
 $nu.SamAccount = ([String]::Format("u{0}",$sabase));            
 #Randomly select the gender. Note: Get-Random's -maximum is exclusive, so use 2 to return 0 or 1            
 $gender = Get-Random -minimum 0 -maximum 2;            
 #Set a random index value for the female name             
 $ffnIndex = Get-Random -minimum 0 -maximum ($ffnCount -1);            
 #Set a random index value for the male name            
 $mfnIndex = Get-Random -minimum 0 -maximum ($mfnCount -1);            
 if($gender -eq 0){            
  #Make sure the row (in the male names array) contains a value            
  while($mfn[$mfnIndex].Length -eq 1)            
  {            
   $mfnIndex = Get-Random -minimum 0 -maximum ($mfnCount -1);            
  }            
  #Set the forename, using Title casing            
  $nu.FirstName = $tI.ToTitleCase($mfn[$mfnIndex].ToLower());            
  #Set the gender            
  $nu.Gender = "Male";            
 }            
 else{            
  #Make sure the row (in the female names array) contains a value            
  while($ffn[$ffnIndex].Length -eq 1)            
  {            
   $ffnIndex = Get-Random -minimum 0 -maximum ($ffnCount -1);            
  }            
  #Set the forename, using Title casing            
  $nu.FirstName = $tI.ToTitleCase($ffn[$ffnIndex].ToLower());            
  #Set the gender            
  $nu.Gender = "Female";            
 }            
 #Use a random number to set the location of the user.            
 $li = Get-Random -minimum 0 -maximum 100;            
 if($li -le 25){$nu.Location = "Melbourne";$nu.City="Melbourne";$nu.Country="Australia";$nu.CountryCode="AU";}            
 if($li -gt 25 -and $li -le 40){$nu.Location = "Hong Kong";$nu.City="Hong Kong";$nu.Country="Hong Kong";$nu.CountryCode="HK";}            
 if($li -gt 40 -and $li -le 80){$nu.Location = "London";$nu.City="London";$nu.Country="England";$nu.CountryCode="UK";}            
 if($li -gt 80){$nu.Location = "New York";$nu.City="New York";$nu.Country="United States of America";$nu.CountryCode="US";}            
 #Set the users phone numbers using the unique base number $sabase            
 $nu.DDI = ([String]::Format("555-{0}",$sabase));            
 $nu.Ext = ([String]::Format("{0}",$sabase));            
 $nu.Mobile = ([String]::Format("07555-66{0}",$sabase));             
 #Set the Department of the user            
 if($i -le 20){$nu.Department = "Executive"};            
 if($i -gt 20 -and $i -le 100){$nu.Department = "Middle Management";$nu.JobTitle="Manager";};            
 if($i -gt 100 -and $i -le 200){$nu.Department = "Accounts";$nu.JobTitle="Accountant";};            
 if($i -gt 200 -and $i -le 250){$nu.Department = "Marketing";$nu.JobTitle="Marketing Executive";};            
 if($i -gt 250 -and $i -le 400){$nu.Department = "Sales";$nu.JobTitle="Salesman";};            
 if($i -gt 400 -and $i -le 450){$nu.Department = "Information Technology";$nu.JobTitle="IT Support";};             
 if($i -gt 450 -and $i -le 475){$nu.Department = "Human Resources";$nu.JobTitle="HR Support";};            
 if($i -gt 475 -and $i -le 575){$nu.Department = "Engineering";$nu.JobTitle="Engineer";};            
 if($i -gt 575 -and $i -le 675){$nu.Department = "Supervisors";$nu.JobTitle="Supervisor";};            
 if($i -gt 675 -and $i -le 875){$nu.Department = "Team Leaders";$nu.JobTitle="Team Leader";}            
 if($i -gt 875){$nu.Department = "Manufacturing"};            
 $users += $nu;            
 Write-Host "Added"$nu.FirstName $nu.LastName            
 $sabase++;            
 $i++;            
}            

6. Download 1000 male and 1000 female photos from fotolia


This part of the script uses the Internet Explorer object to search for photos on the Fotolia.com site.

Photos are downloaded from the Fotolia site by specifying a search query in the URL query string. Each page of results has approximately 100 images (search results). The Download-PhotoFromFotolia function takes a URL parameter, an Internet Explorer object, a base id (used to create unique file names) and a directory path to save the images to. If there are no images returned from the search query, -1 is returned from the function.


function Download-PhotoFromFotolia{            
 [CmdletBinding()]            
  Param(              
    [parameter(Mandatory=$true, ValueFromPipeline=$true)][String]$Url,            
    [parameter(Mandatory=$true)][object]$InternetExplorer,            
    [parameter(Mandatory=$true)][object]$BaseImageId,            
    [parameter(Mandatory=$true)][object]$FileDirectoryPath            
   )             
 #Get the page (of search results) using the provided URL            
 $InternetExplorer.Navigate($Url)            
 while ($InternetExplorer.ReadyState -ne 4)                        
 {            
  Write-Host "Downloading page. Please wait..." -foregroundcolor DarkYellow;            
  sleep -Milliseconds 500                        
 }             
 Write-Host "Getting a collection of images.";            
 #Get a collection of all the images on the page            
 $images = $InternetExplorer.Document.getElementsByTagName("img")            
 Write-Host "Getting all of the portrait thumbnails.";            
 #Get a collection of all the images with a src attribute that starts with http://t - this will be the collection of thumbnail photos from the search results.            
 #By examining the search page using Internet Explorer tools, you can see that all of the thumbnail photos are on a CDN network, starting with the URL http://t1, or http://t2            
 #Other images on the page (logos, etc), have a src attribute starting with http://s            
 $imageCollection = $images | ?{$_.src -like "http://t*"}              
 $wc = new-object System.Net.WebClient            
 #Download each image and save it to the specified directory            
 foreach($image in $imageCollection )            
 {            
  Write-Host "Downloading"$image.Src -foregroundcolor DarkYellow            
  $filePath = ([String]::Format("{0}\{1}.jpg",$FileDirectoryPath,$BaseImageId));            
  try            
  {            
   $wc.downloadfile($image.Src,$filePath);            
   Write-Host "Successfully downloaded"$image.Src"to $filePath" -foregroundcolor Green            
   $BaseImageId++;            
  }            
  catch            
  {             
   Write-Host "Failed to download"$image.Src"to $filePath. Error:"$_.Exception.Message -foregroundcolor Red            
  }              
 }             
 #Return the baseimageid (which has been incremented), if images were found on the page.
 #If no images were found, return -1
 if($imageCollection.Count -eq 0)            
 {return -1}            
 else            
 {return $BaseImageId;}            
}

This image illustrates how to determine what to filter the images on. This is important, so that only the thumbnail photos are downloaded.


This section of code sets the directories to save the images to. It then calls Download-PhotoFromFotolia repeatedly to download 1000 female and 1000 male images (2000 in total).

$page = 1;            
$baseImageId = 1;            
#Get a reference to Internet Explorer            
$ie = New-Object -ComObject "InternetExplorer.Application"             
#Set the directories for storing the female and male images (create the directories if they don't exist)            
$femailPhotoDirectory = "C:\temp\femalephotos";            
if((Test-Path -Path $femailPhotoDirectory) -eq $false){New-Item -Path $femailPhotoDirectory -ItemType Directory}            
$mailPhotoDirectory = "C:\temp\malephotos";            
if((Test-Path -Path $mailPhotoDirectory) -eq $false){New-Item -Path $mailPhotoDirectory -ItemType Directory}            
#Set the baseImageId - this is used to create a unique file name for each photo downloaded            
$baseImageId = 1;            
#Attempt to download 1000 female images. The search query specifies orientation = square, contenttype = photo, using the keyword "woman"            
#The Download-PhotoFromFotolia function returns -1 if there are no more images to download.            
while($baseImageId -ge 1 -and $baseImageId -le 1000)            
{              
 $baseImageId = Download-PhotoFromFotolia -Url ([String]::Format("http://au.fotolia.com/search?colors=&filters%5Bage%5D=all&filters%5Bcollection%5D=all&filters%5Bhas_releases%5D=true&filters%5Borientation%5D=square&filters%5Bmax_price_xs%5D=all&filters%5Bmax_price_x%5D=&filters%5Bcontent_type%3Aphoto%5D=1&ca=3000000&cca=20000000&k=woman&offset={0}", $baseImageId)) -InternetExplorer $ie -BaseImageId $baseImageId -FileDirectoryPath $femailPhotoDirectory             
}            
#Reset the baseimageid            
$baseImageId = 1;            
#Attempt to download 1000 male images. The search query specifies orientation = square, contenttype = photo, using the keyword "man"            
#The Download-PhotoFromFotolia function returns -1 if there are no more images to download.            
while($baseImageId -ge 1 -and $baseImageId -le 1000)            
{              
 $baseImageId = Download-PhotoFromFotolia -Url ([String]::Format("http://au.fotolia.com/search?colors=&filters%5Bage%5D=all&filters%5Bcollection%5D=all&filters%5Bhas_releases%5D=true&filters%5Borientation%5D=square&filters%5Bmax_price_xs%5D=all&filters%5Bmax_price_x%5D=&filters%5Bcontent_type%3Aphoto%5D=1&ca=3000000&cca=20000000&k=man&offset={0}", $baseImageId)) -InternetExplorer $ie -BaseImageId $baseImageId -FileDirectoryPath $mailPhotoDirectory             
}            
#Close the Internet Explorer application            
$ie.Quit();

7. Create the 2000 users (using the Active Directory PowerShell module) and upload the photos for each new user account


The following code iterates through the array of user objects, passing the user object and a profile photo path to the Add-ActiveDirectoryUser function to create Active Directory user accounts.

$femalepictureIndex = 1;            
$femaleMaxPhotos = (Get-ChildItem -Path $femailPhotoDirectory  -Filter *.jpg).Count            
$malepictureIndex = 1;            
$maleMaxPhotos = (Get-ChildItem -Path $mailPhotoDirectory  -Filter *.jpg).Count            
foreach($u in $users)            
{             
 if($u.Gender -eq "Female")            
 {            
  if($femalepictureIndex -gt $femaleMaxPhotos){$femalepictureIndex=1;}            
  $filePath = ([String]::Format("{0}\{1}.jpg",$femailPhotoDirectory,$femalepictureIndex))              
  Add-ActiveDirectoryUser $u $filePath             
  $femalepictureIndex++;            
 }            
 else            
 {            
  if($malepictureIndex -gt $maleMaxPhotos){$malepictureIndex=1;}            
  $filePath = ([String]::Format("{0}\{1}.jpg",$mailPhotoDirectory,$malepictureIndex))            
  Add-ActiveDirectoryUser $u $filePath             
  $malepictureIndex++;            
 }             
}

The Add-ActiveDirectoryUser function uses the custom user object to get the values for creating the new user account.

The function calls Ensure-OUExists to test if the destination OU exists. If it doesn't, it's created.

Finally, if the profilePhotoFilePath isn't null or empty, the function calls the Add-PhotoToUserAccount function to upload the profile photo for the user.

function Add-ActiveDirectoryUser{            
 Param(              
   [parameter(Mandatory=$true, ValueFromPipeline=$true)][object]$userObject,            
   [parameter(Mandatory=$false)][object]$profilePhotoFilePath            
  )             
 #Use single quotes so PowerShell doesn't try to expand $isS3cure as a variable
 $password = ConvertTo-SecureString -String '1HopeThi$isS3cure' -AsPlainText -Force
 $path = ([String]::Format("OU={0},OU={1},OU=Locations,DC=PANTS,DC=COM",$userObject.Department.Trim(),$userObject.City.Trim()));            
 Ensure-OUExists $path             
 $currentUser = $null;            
 try            
 {            
  $currentUser = Get-ADUser $userObject.SamAccount -ErrorAction:SilentlyContinue;
  Write-Host "User"([String]::Format("{0} {1}",$userObject.FirstName,$userObject.LastName))"exists." -foregroundcolor Green;            
 }            
 catch            
 {            
  Write-Host "User"([String]::Format("{0} {1}",$userObject.FirstName,$userObject.LastName))"does not exist. The user will be created." -foregroundcolor DarkYellow;            
 }             
 if($currentUser -eq $null){            
  New-ADUser -UserPrincipalName $userObject.SamAccount -SamAccountName $userObject.SamAccount -Name $userObject.SamAccount -City $userObject.City -AccountPassword $password  -Surname $userObject.LastName -OfficePhone $userObject.DDI -MobilePhone $userObject.Mobile -GivenName $userObject.FirstName -Division $userObject.Department -Department $userObject.Department  -Enabled $true  -OtherAttributes @{'ipPhone'=$userObject.Ext;'physicalDeliveryOfficeName'=$userObject.Location;'employeeType'=$userObject.Gender;'co'=$userObject.Country} -EmployeeID $userObject.SamAccount -Path $path -DisplayName ([String]::Format("{0} {1}",$userObject.FirstName,$userObject.LastName)) -Title $userObject.JobTitle -Country $userObject.CountryCode;            
  $currentUser = Get-ADUser $userObject.SamAccount -ErrorAction:SilentlyContinue;
  Write-Host "Created"([String]::Format("{0} {1}",$userObject.FirstName,$userObject.LastName)) -foregroundcolor Green;            
 }             
 if(([String]::IsNullOrEmpty($profilePhotoFilePath))-eq $false)            
 {            
  Add-PhotoToUserAccount $currentUser $profilePhotoFilePath            
  Write-Host "Added profile picture for"([String]::Format("{0} {1}",$userObject.FirstName,$userObject.LastName)) -foregroundcolor Green;
 }             
}

The Ensure-OUExists function

function Ensure-OUExists{            
 Param(              
   [parameter(Mandatory=$true, ValueFromPipeline=$true)][object]$path               
  )            
 $ou = $null;            
 $domain = get-addomain            
 if($domain.DistinguishedName -eq $path){return;}            
 try            
 {            
  $ou = Get-ADOrganizationalUnit -Identity $path -ErrorAction SilentlyContinue;            
 }            
 catch            
 {             
  Write-Host "OU $path does not exist." -foregroundcolor DarkYellow;            
 }             
 if($ou -eq $null){            
  $ouParent = [String]$path.Substring($path.IndexOf(",")+1);            
  $ouName = [String]$path.Substring(0,$path.IndexOf(",")).Replace("OU=","");            
  Ensure-OUExists $ouParent            
  New-ADOrganizationalUnit -Name $ouName -Path $ouParent            
  Write-Host "Created OU: $path" -foregroundcolor Green;              
 }            
}
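
For example, calling the function with a nested path (a hypothetical OU, using the same PANTS.COM domain as the rest of this post) creates each missing level from the top down, thanks to the recursive call:

Ensure-OUExists "OU=IT,OU=Sydney,OU=Locations,DC=PANTS,DC=COM"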

The Add-PhotoToUserAccount function

function Add-PhotoToUserAccount            
{            
 Param(              
   [parameter(Mandatory=$true, ValueFromPipeline=$true)][Microsoft.ActiveDirectory.Management.ADAccount]$Identity,            
   [parameter(Mandatory=$true)][object]$fileName            
  )            
 try            
 {            
  $Identity | Set-ADUser -Replace @{thumbnailPhoto=([byte[]](Get-Content $fileName -Encoding byte))}            
  Write-Host "Set $fileName as the picture for"$Identity.Name"" -foregroundcolor Green;            
 }            
 catch            
 {            
  Write-Host "Failed to set $fileName as the picture for"$Identity.Name -foregroundcolor DarkYellow;            
 }             
}

8. Configure SharePoint User Profile Synchronisation to import the users from Active Directory, including each user's thumbnail photo


In this step, we configure the User Profile Application to import the user accounts from Active Directory.

The Locations OU (which contains all of the user accounts in sub-OUs) is selected as the source for importing user accounts.


We also need to configure the User Profile properties, to ensure the Active Directory thumbnailPhoto is imported with the user accounts into the SharePoint User Profile store.


Edit the Picture profile property.


Import the Active Directory thumbnailPhoto attribute to the Picture profile property.



Finally, we need to start full profile synchronisation.
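
A full synchronisation can also be started from PowerShell. The following is a sketch only, assuming a SharePoint 2013 farm and the My Site host URL (http://ms13) used elsewhere in this post; run it from the SharePoint Management Shell:

#Sketch: start a full profile synchronisation from PowerShell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$site = Get-SPSite "http://ms13"
$context = Get-SPServiceContext $site
$upcm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileConfigManager($context)
#Passing $true requests a full (rather than incremental) synchronisation
$upcm.StartSynchronization($true)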



9. Run Update-SPProfilePhotoStore to create the profile photo variations


After the profile synchronisation has completed, run Update-SPProfilePhotoStore to create the profile photo variations.

Update-SPProfilePhotoStore -CreateThumbnailsForImportedPhotos $true -MySiteHostLocation "http://ms13"

10. Index the user profiles and view the people results in SharePoint Search


After the Update-SPProfilePhotoStore cmdlet completes (this can take a while), run a full crawl against the user profile store (from the SharePoint Search Service Application).
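
The full crawl can also be kicked off from the SharePoint Management Shell. This is a minimal sketch, assuming the default content source name "Local SharePoint sites" (adjust this to whichever content source includes the profile store's sps3:// start address in your farm):

#Sketch: start a full crawl and wait for it to finish
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$ssa = Get-SPEnterpriseSearchServiceApplication
$cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
$cs.StartFullCrawl()
while ($cs.CrawlStatus -ne "Idle")
{
 Write-Host "Crawling. Please wait..." -foregroundcolor DarkYellow
 Start-Sleep -Seconds 30
}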


After the full crawl has completed, view the search results!