Marc Lognoul's IT Infrastructure Blog

Cloudy with a Chance of On-Prem


SharePoint: Yet another warm-up script

Slow start-up of ASP.Net applications after an application pool recycle, and of SharePoint in particular, is a recurring issue reported by customers. Keeping the application “warm” with a background job that regularly requests pages is a quick and simple solution. While there are already a few scripts available on the Internet, I wanted to share my own, which is fully implemented in PowerShell. A few comments though:

  • This script, like (all or most of?) the others, does not take care of farms with multiple WFEs: when it resolves the name of a web application, it does so using DNS and might therefore point to another server… I’ll shortly publish another script that solves that limitation
  • To ease IIS log analytics, the script sends a specific user-agent string, which makes it easy to exclude those requests from reports
  • The script only triggers the execution of default pages (default.aspx) located directly under the trailing / of the URL. The next version of the script will also solve that limitation
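
The core of the approach can be sketched as follows. This is a minimal, unhardened sketch: the user-agent string is my own choice, and it only fetches the default page of each alternate URL, exactly as noted above.

```powershell
# Minimal warm-up sketch: request the default page of every web application
# in the local farm with a distinctive user-agent string
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null
$spWebApps = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.WebApplications
foreach ($spWebApp in $spWebApps)
{
    foreach ($spAltUrl in $spWebApp.AlternateUrls)
    {
        $WebClient = New-Object System.Net.WebClient
        $WebClient.Credentials = [System.Net.CredentialCache]::DefaultCredentials
        # The custom user-agent eases exclusion of warm-up hits from IIS log reports
        $WebClient.Headers.Add("User-Agent", "SP-WarmUp-Script")
        # Note: name resolution goes through DNS and may hit another WFE
        $WebClient.DownloadString($spAltUrl.IncomingUrl + "/default.aspx") | Out-Null
    }
}
```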

Feel free to download it from this location:

Have you seen Clint Eastwood’s latest (and probably best?) movie Gran Torino? A perfect drama/comedy mix with great though lesser-known actors.

Gran Torino poster

And cut!

SharePoint 2007: Using PowerShell to Retrieve all Servers in a Farm

I often have to perform maintenance or configuration activities on all servers that are members of a SharePoint farm, such as archiving or parsing logs, recycling IIS application pools… While you could simply store all the servers’ names in a plain text file, I always prefer a more dynamic way and, as usual now, PowerShell will do the job.

Begin with loading the SharePoint assembly:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
Then connect to the farm:

$spFarm = [Microsoft.SharePoint.Administration.SPFarm]::Local

And finally populate an array with all servers:

$spServers = $spFarm.Servers|Select Farm,DisplayName,Address,Role

Now, the problem with this array is that it also contains entries that are not real servers but rather “network names”, such as the SMTP host for mail-out or the SQL database server name and instance. If you want to retrieve only the servers where SharePoint is installed and running, use this filter:

$arrServers = $spFarm.Servers|Where {$_.Role -ne 'Invalid' -and $_.Status -eq 'Online'}|Select Farm,DisplayName,Address,Role

Note: the Role will help you distinguish a WFE from an application server, for example. The Address is the real host name used for network connectivity, while the DisplayName is, as its name suggests, the text displayed in Central Administration.
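
To illustrate, you can loop through the filtered array and display one line per server:

```powershell
# Display each farm server with its address and role
foreach ($spServer in $arrServers)
{
    Write-Host ("{0} ({1}) - {2}" -f $spServer.DisplayName, $spServer.Address, $spServer.Role)
}
```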

And cut!

PS: My “assistant director” Anthony (the Minghella of PowerShell, the Hopkins of automation or, should I say, the Zuiker of scripting ;)) has finally picked up his digital pen to shoot… ahem, I mean write, his first post. Welcome online Antho!


SharePoint: Uploading a file to a document Library using PowerShell

There are multiple methods to achieve this but, since today is Friday and I’d like to watch the making of Cloverfield before midnight :), here are two quick’n simple methods:

Method 1: Using WebDav and NET USE

Conditions: this method will only work if SharePoint is running on port 80, does not use form-based authentication, and if the WebClient service is running on your client PC (on servers, it is set to manual startup mode by default!)
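
If the WebClient service happens to be stopped, you can start it on the fly before mapping the drive:

```powershell
# Start the WebDAV redirector service (requires local administrator rights)
net start WebClient
```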

Start PowerShell and enter the command

net use Z: "http://WSSSERVER/Site01/DocLibName/"

Then simply use the copy command:

copy c:\demofile.txt Z:

Say what? Use PowerShell for that? Couldn’t it work with a standard Windows command prompt? Yes it can 🙂

Method 2: Using a standard HTTP “PUT”

Conditions: SharePoint should not be configured to use form-based authentication, although the code could be adapted to support it.

Let’s use a function to tidy up the code…

Function UploadToSPDocLib($LocalPath,$spDocLibPath)
{
    $WebMethod = "PUT"
    $UploadFullPath = $spDocLibPath + $(Split-Path -Leaf $LocalPath)
    $WebClient = New-Object System.Net.WebClient
    $WebClient.Credentials = [System.Net.CredentialCache]::DefaultCredentials
    $WebClient.UploadFile($UploadFullPath, $WebMethod, $LocalPath)
}

UploadToSPDocLib "c:\demofile.txt" "http://WSSSERVER/Site01/DocLibName/"

While I still have other interesting ways to upload files to SharePoint, they will be the subject of another post…

And cut!


SharePoint: Diagnostic Logging (aka ULS) Quick Summary

All products based on SharePoint technologies come with a built-in logging engine named the Unified Logging System (ULS). It allows applications and related components (from Microsoft or third parties) to log activity to the Windows Application event log and/or to a log file on each server running SharePoint.

Log Location

The log files are, by default, located under C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\LOGS; the file names always start with a prefix consisting of the name of the server they were generated on: <servername>-<timestamp>.log.

Depending upon their configuration, some events may also be logged in the Windows Application event log.


To change the location of the log files, the following PowerShell script can be used:

$SPDiagnosticsService = [Microsoft.SharePoint.Administration.SPDiagnosticsService]::Local
$SPDiagnosticsService.LogLocation = "G:\Logs"

To change the number of log files to be maintained, you can set the “LogsToKeep” property:

$SPDiagnosticsService.LogsToKeep = 24

Beware: as soon as the Update() method is invoked, the log files above the value specified will be deleted!
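
For completeness, persisting either of the changes above is done by calling Update() on the diagnostics service:

```powershell
# Persist the LogLocation/LogsToKeep changes made above
$SPDiagnosticsService.Update()
```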

Note: this corresponds to the setting stored in the registry at the following location: HKLM:\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\12.0\WSS. You might therefore be tempted to edit it directly, but Microsoft discourages doing so and recommends using the SharePoint API instead.

To list/set the verbosity level of each component, STSADM can be used:

To list all levels (including hidden ones): stsadm.exe -o listlogginglevels [-showhidden]

To set the level for a given category: stsadm.exe -o setlogginglevel [-category <CategoryName | Manager:CategoryName[;…]>] {-default | -tracelevel <trace level setting> [-windowslogginglevel <Windows event log level setting>]}
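
As an example (the category name below is illustrative; run listlogginglevels first to see the actual names on your farm), raising the trace level for a single category looks like:

```powershell
# Illustrative category name; check stsadm -o listlogginglevels output first
stsadm.exe -o setlogginglevel -category "Timer" -tracelevel Verbose
```

And stsadm.exe -o setlogginglevel -default restores the default levels.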

More information:

Log File Format

The log files expose the following fields:

  • Timestamp: equivalent to the “TimeGenerated” field in the Application event log
  • Process: the image name of the process logging its activity, followed by its process ID (PID) between parentheses. Interestingly, IIS worker processes may also log their activity; they are then logged under w3wp.exe
  • TID: the ID of the thread that logged the event
  • Area: maps to the “Source” field in the Application event log
  • Category: maps to the “Category” field in the Application event log
  • EventID: a unique internal event ID
  • Level: the severity of the event
  • Message: the logged message itself
  • Correlation: may contain a link to the EventID of another logged event
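
Since the files are tab-delimited with a header row, PowerShell can turn them into objects for ad-hoc queries. A hedged sketch (the file name is illustrative, and the headers are trimmed because the raw file pads them with spaces):

```powershell
# Parse a ULS log file into objects and filter on the Level field
$ulsFile = "C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\LOGS\SERVER01-20090314-1135.log"
$ulsLines = Get-Content $ulsFile
# First line is the header; trim trailing padding from each column name
$ulsHeader = $ulsLines[0].Split("`t") | ForEach-Object { $_.Trim() }
$ulsEntries = $ulsLines[1..($ulsLines.Length - 1)] | ConvertFrom-Csv -Delimiter "`t" -Header $ulsHeader
$ulsEntries | Where-Object { $_.Level -eq "Critical" } | Select-Object Timestamp, Area, Category, Message
```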


There are multiple ways to analyze ULS logs, such as:

Some Log Parser Queries applicable to ULS Logs

More Information

And cut!


SharePoint: STSADM -o migrateuser to PowerShell

In the early days of SharePoint, changing user accounts to reflect changes in Active Directory after a domain migration or split was a nightmare.
With WSS2/SPS2003, and assuming you had compiled the excellent SPSUTIL tool, the situation was much better though not perfect (Anthony, the other Windows Director, actually suffered a lot from this in a previous job; who did not?).

With the WSS3/MOSS era, STSADM now comes with the built-in operation “-o migrateuser” to do the job. So why bother turning this command into PowerShell? Simply because it greatly eases automation, since you can write custom scripts that scan AD, parse XML or CSV and finally update your SharePoint content DBs accordingly. The code is incredibly simple, just like the command is:

First, load the assemblies as usual:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

Then get a farm object:
$spFarm = [Microsoft.SharePoint.Administration.SPFarm]::Local

And finally use the MigrateUserAccount method:

$spFarm.MigrateUserAccount("OLDDOMAIN\OLDUSERNAME", "NEWDOMAIN\NEWUSERNAME", $false)

The first parameter is the original domain name and user name
The second parameter is the new domain name\user name
And the third is a Boolean indicating whether SID history on the new user account should be used or not.

Keep in mind that while this code will modify the “Account” attribute of a user throughout the farm, it will not change the “Name” attribute. If you wish to change this name and you’re lucky enough to operate MOSS, you can rely on the profile import process (see my previous post for some PowerShell automation). If, on the other hand, you run WSS3, you’ll have to go through extra code that iterates over all your sites and modifies the contents of the list named “User Information List” in each of them. The PowerShell to retrieve the user to update would look like:
$spUser = $spWeb.SiteUsers["NEWDOMAIN\NEWUSERNAME"]
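
A hedged sketch of the WSS3 fix-up for a single site collection (URL, login and display name are illustrative; you would wrap this in a loop over all site collections):

```powershell
# Update the display name stored in a site collection's "User Information List"
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null
$spSite = New-Object Microsoft.SharePoint.SPSite("http://WSSSERVER/Site01")
$spWeb = $spSite.RootWeb
$spUser = $spWeb.SiteUsers["NEWDOMAIN\NEWUSERNAME"]
# The hidden list lives at the root web of each site collection
$spUserInfoList = $spWeb.Lists["User Information List"]
$spUserItem = $spUserInfoList.GetItemById($spUser.ID)
$spUserItem["Title"] = "New Display Name"
$spUserItem.Update()
$spWeb.Dispose()
$spSite.Dispose()
```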

A good movie match for this post could be Face/Off directed by John Woo, the "unofficial inventor" of slow motion.

Face/Off Poster

And cut!


SharePoint 2007: Updating User Profile using PowerShell

Sometimes AD is not the preferred source for profile information. Sometimes you feel like the BDC is really a pain to configure. Sometimes a colleague of yours comes up with a plain CSV file which contains the perfect information to update your MOSS user profiles… Time for Jason to recover his identity 😉

First, let’s load the necessary assemblies:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server")

Then instantiate the server context and the profile manager
$spSSP = [Microsoft.Office.Server.ServerContext]::GetContext("MySSPName")
$spUserManager = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($spSSP)

Note: if you have no clue about the name of the SSP, you can check in the Central Admin web site or retrieve it from an existing web application bound to it.

Now that you have a profile manager ready, you have to create a profile object for each profile you wish to update. Since the code would eventually fail if no profile exists, you can also test whether it exists first:

$spUserManager.UserExists("THREADSTONE\jbourne")

If it returns “True”, you can go further; if not, you can skip that user or instead create a new profile by executing:
$spUserManager.CreateUserProfile("THREADSTONE\jbourne")

Then instantiate a profile object for that user:
$spUserProfile = $spUserManager.GetUserProfile("THREADSTONE\jbourne")

And populate one or more properties:
$spUserProfile["Department"].Value = "Secret Services"

If you’re clueless about the list of properties you can update, simply retrieve them with the following command:
$spUserManager.Properties|SELECT displayName,Name

Once you’re done, save the changes using the Commit() method (and not Update()!):

$spUserProfile.Commit()
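
Putting it all together, an end-to-end run driven by a colleague’s CSV file might look like this (file name and column names “Account” and “Department” are illustrative):

```powershell
# Update MOSS user profiles from a CSV with columns "Account" and "Department"
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server") | Out-Null
$spSSP = [Microsoft.Office.Server.ServerContext]::GetContext("MySSPName")
$spUserManager = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($spSSP)
foreach ($Row in (Import-Csv .\profiles.csv))
{
    # Create the profile first if it does not exist yet
    if (!$spUserManager.UserExists($Row.Account))
    {
        $spUserManager.CreateUserProfile($Row.Account) | Out-Null
    }
    $spUserProfile = $spUserManager.GetUserProfile($Row.Account)
    $spUserProfile["Department"].Value = $Row.Department
    $spUserProfile.Commit()
}
```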

And cut!

Greets to W. N. who inspired this post 😉


PowerShell: Reading an Excel Sheet using ADO.Net

Nothing new here, just a repost by popular demand… 
Many examples on the Internet show how to use the Office Automation COM object to achieve this. But under some circumstances, this is not possible because Excel is simply not installed locally.

Let’s instantiate the objects we need:
$OleDbConn = New-Object "System.Data.OleDb.OleDbConnection"
$OleDbCmd = New-Object "System.Data.OleDb.OleDbCommand"
$OleDbAdapter = New-Object "System.Data.OleDb.OleDbDataAdapter"
$DataTable = New-Object "System.Data.DataTable"

Set the connection string and connect. Please pay attention to the syntax; otherwise you’ll get cryptic errors such as “Could not find installable ISAM”. Also, the file should not be locked exclusively.
$OleDbConn.ConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Users\lognoulm\Desktop\servercfg.xls;Extended Properties=""Excel 8.0;HDR=YES"""

Then open the connection and, optionally, check that it is open by displaying the “State” property:

$OleDbConn.Open()
$OleDbConn.State

Now let’s construct a SQL query. Syntax for Excel is a little special, look at the end of this post for external references.
$OleDbCmd.Connection = $OleDbConn
$OleDbCmd.CommandText = "SELECT * FROM [Sheet1$]"

Then set the Adapter object
$OleDbAdapter.SelectCommand = $OleDbCmd

And then fill the DataTable object with the results:

$OleDbAdapter.Fill($DataTable)

If everything went fine, the command above returns the number of rows present in the DataTable object. To display the “raw” contents, just enter:

$DataTable

To show the first line (aka row), use this: $DataTable.Rows[0]
And how to display a given field in that row? Just use the field header. In my XLS, one header is for example “Name”:

$DataTable.Rows[0]["Name"]
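
Iterating over every row and reading a field by its header (assuming a header named “Name”, as in the example above) is then straightforward:

```powershell
# Print the "Name" field of each row in the sheet
foreach ($Row in $DataTable.Rows)
{
    Write-Host $Row["Name"]
}
```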

More information can be found here:

Note: this was not tested with XLSX files but with a standard XLS instead
Note 2: a DataSet can be used instead of the DataTable object, but I prefer the latter for its greater simplicity (this is meant for sysadmins, not developers 🙂)
Note 3: I blogged about reading, but you can update and insert too; see the references above for details

And cut!

PowerShell: Sending an SMTP Mail (With a goodie)

Nothing new here, just a repost by popular demand… 

We need an SMTP Client object and a Mail Message object:

$SmtpClient = New-Object System.Net.Mail.SmtpClient

$MailMessage = New-Object System.Net.Mail.MailMessage

Then a basic configuration of their properties (the addresses are illustrative; adapt them to fit your environment’s needs)

$SmtpClient.Host = "mysmtpserver.mycompany.local"

$mailmessage.From = "notification-services@mycompany.local"

$mailmessage.To.Add("admin@mycompany.local")

$mailmessage.Subject = "Daily Uptime Reports"

$mailmessage.Body = "Server01.mycompany.local: 99.999% uptime"

And now, the special touch that makes Outlook display the message with a “bell” icon à la SharePoint notification:

$mailmessage.Headers.Add("message-id", "<3BD50098E401463AA228377848493927-1>")

All you need is to add 3BD50098E401463AA228377848493927 before the actual message ID and separate them with a dash. And then send the message:

$SmtpClient.Send($mailmessage)


If you want your email to be even more eye-candy, enable HTML and add HTML markup to the body before sending:

$mailmessage.IsBodyHtml = $true

You can even read the body from a file using Get-Content:

$mailmessage.Body = Get-Content .\report.html
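
Note that Get-Content returns an array of lines; if line breaks matter in the resulting body, piping through Out-String rebuilds a single string first:

```powershell
# Out-String joins the lines back into one string, preserving line breaks
$mailmessage.Body = (Get-Content .\report.html | Out-String)
```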

Find more information over here:

And cut!