SharePoint 2013 Setup – SQL Server Has An Unsupported Version

This is actually a fairly common error to run into, and the scenario is pretty straightforward. I hit it running SharePoint 2013 on Windows 2008 R2 SP1 with all the update goodness installed, and SQL Server 2008 R2 SP1 with CU 6 plus the rest of the required prerequisites. You run the SP config wizard, get to the point where you specify the database server / instance, and boom, you run into:

SQL server at domain\instance has an unsupported version [some version here]

Which is fine, pretty easy error. The part that might not make a ton of sense, or didn’t to me anyway, is that I was at 10.50.2817.0 and it was complaining about 10.50.1600.1. Clearly I am ahead on the version increment, so I shouldn’t hit that particular problem. But wait, it gets weirder: you can select one instance on the server and it will work, yet select a different instance on the same server and it might fail!

The problem isn’t related to SharePoint, but to how SQL Server processes cumulative updates. Long story short, whenever you have multiple instances on a SQL box you need to reapply the patches after creating the new instance; for whatever reason it won’t automatically update, which is annoying, IMHO. So in essence, if you run into this, what I did to fix it was reapply SP1, CU4 and CU11.
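If you want to confirm what build each instance is actually at before the config wizard bites you, you can just ask each instance directly. Here is a rough sketch using sqlcmd; the server and instance names are placeholders for your own environment:

```shell
:: Check the actual build and patch level of each instance -- the server
:: and instance names below are placeholders for your own environment.
sqlcmd -S SQLSERVER01 -E -Q "SELECT SERVERPROPERTY('ProductVersion'), SERVERPROPERTY('ProductLevel')"
sqlcmd -S SQLSERVER01\SHAREPOINT -E -Q "SELECT SERVERPROPERTY('ProductVersion'), SERVERPROPERTY('ProductLevel')"
```

If the two instances report different builds, that is exactly the mismatch the wizard is choking on; rerun the patch installers and they will pick up the unpatched instance.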


Free Software – SharePoint Kerberos Buddy – Detect And Repair Kerberos Issues

The next best thing to a SharePoint security consultant! Kind of.

Kerberos authentication with SharePoint, and some of the middle-tier issues it presents (particularly when examining orthodox double-hop scenarios), can become both arduous and, frankly, redundant to fix. A lot of the time a SharePoint consultant is presented with a remediation project where the Kerberos environment is malformed, and the specific issue can be attributed to a wide variety of components on different machines with different roles. As such, it makes sense to provide some level of automation for the detection of such misconfiguration, since the actions taken for detection are pretty rhythmic. Since Kerberos is a pretty black-and-white technology (it either works or it doesn’t!), pushing recommendations for a fix based on a large set of data captured from all subscribed tiers is pretty feasible. Obviously it is not going to be 100%, since every environment is particular, but it can be pretty close and still give relevant, useful advice.

The data capture itself presents a fundamental issue, since the machines involved in a SharePoint/Kerberos mixed-tier environment can potentially be exclusive (meaning not on the same box, so authentication flows between them). Delegation problems can arise on four main tiers (it can be a lot more, but this is from a bare-bones BI environment perspective with full break-out of roles): the client, the SharePoint WFE, the SSRS machine (obviously when not in integrated mode), and the SSAS machine. The latter two become vital when implementing a business intelligence solution where PowerPivot and PerformancePoint are relevant and certain services like SSAS are off-loaded. Once you consider account break-out to isolate services to particular identities, the delegation scheme becomes even more complex. So there are a lot of things to consider, and a lot of things that can go wrong; therefore all tiers have to be considered and data provided from each:

After examining what I feel is a reasonable breadth of Kerberos issues over the past few years, there are a lot of common things that can be automatically checked, and solution advice that can be automatically written to help solve those issues. Some of these are as small as tolerated machine time differences, and others as complex as port checks for clustered or load-balanced SSAS instances in a SharePoint/BI environment. For example, consider 20 of the things that the tool will automatically check:

  1. Client Internet Explorer Settings
  2. Client Delegations
  3. Machine Time Differentiation Between Tiers
  4. Proper Domain Trusts
  5. OLE DB Provider (SSRS, SSAS) Types And Versions
  6. Required Data Warehouse Instance 
  7. Provider Versioning Checks
  8. Malformed Provider Strings
  9. HTTP Host Name Checks
  10. IP Address Conflicts
  11. Duplicate SPNs
  12. Malformed SPNs (Both Those That Are Causing Errors As Well As Unnecessary Ones)
  13. SharePoint Application Pool Account Delegation
  14. Authentication Provider Types
  15. Configuration Files (SSRS, SSAS, SharePoint)
  16. Connection String Verification
  17. Named Instance Prerequisites
  18. SQL Browser Settings
  19. Cluster/Port Resolution
  20. Kerberos.dll Bug-Fix Existence
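A couple of these checks (numbers 11 and 12) you can approximate by hand today with the setspn utility that ships with the OS; a rough sketch, where the account name is a placeholder:

```shell
:: Search the domain for duplicate SPN registrations
:: (setspn -X is available on Windows Server 2008 and later).
setspn -X

:: List the SPNs currently registered against a given service account --
:: the account name here is a placeholder for your app pool identity.
setspn -L CONTOSO\spAppPool
```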

So it’s pretty robust, and that is just a subset of the checks; there are others that I am forgetting, and I am sure I will be adding onto it at a later time as well.

When you first open the tool, you will be presented with the primary analysis screen, which offers very few enabled controls. However, notice the fine icon use that is sorta relevant to the word Kerberos, though I think it’s actually just three dogs looking in different directions. A majority of the interface buttons will be disabled since no analysis files are present yet; first the tool must be run on all relevant tiers within the environment.

SharePoint Kerberos Main Form

Holistic analysis cannot occur unless properly formatted *.sharepointkerberos files (nothing fancy about the file type, just the name I chose when serializing with a BinaryFormatter) have been generated on all tiers. As you will see shortly, when present these files enable the Analyze option in the primary analysis form.

First, you select which role the machine you are running the SharePoint Kerberos Buddy on is targeting: the Client, SharePoint WFE, stand-alone SSRS, or SSAS role. (It is possible to run remotely, but that introduces a bunch of possible problems that are not currently handled within the tool.) This is accomplished by selecting the Configure Profiling button at the top of the application, which displays a new setup screen to adjust role targeting. Once this button is selected, you will be presented with the Kerberos Delegated Environment Role Selection screen:

For the Client option, no further information is required; however, do not run the tool on the SharePoint WFE *as the client test*. Meaning, collect the data for the client analysis results on a domain-authenticated client machine that is used to access SharePoint during normal business operations (the tool will tell you if it detects it is running under a local account). Kerberos testing for the delegated machine will heavily skew the results if collected on the SharePoint machine. Once the IE Client option is selected, simply select the Initialize button. This will close the current configuration form and enable the Client Results button on the main form.

Selecting the Client Results button will display the collated, formatted results in a separate form:

For the SharePoint tier, a URL must be provided. Best case scenario, your most commonly accessed URL is also the default path in your SharePoint AAM settings, and that is the one you should use here. The application doesn’t really dig it when the AAM settings are all over the map. Since your SPN registration will follow certain service protocols based on the bindings you configured for SharePoint, this has to correspond to the appropriate URL. Select the Configure Profiling option and enter the URL in the SharePoint URL box:
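For reference, the SPN-to-URL coordination boils down to the app pool account carrying an HTTP SPN for the host name in that default AAM URL; a hedged example, with the host and account names as placeholders:

```shell
:: Register an HTTP SPN matching the host name of the default AAM URL.
:: setspn -S (Windows Server 2008 and later) checks for duplicates before
:: adding; the host and account names here are placeholders.
setspn -S HTTP/portal.contoso.com CONTOSO\spAppPool
```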

Click the Initialize button, and the SharePoint Results button will become enabled. Then you can view the holistic results in the common informational display form:

For a stand-alone SSRS instance, the selection will be provided for you, and the SSAS settings require a well-formed data source connection string for Kerberos to function. Once the role is selected and the prerequisite information is populated, simply click the Initialize button.

At this point, I would imagine that someone is wondering what the hell the Initialize button is actually doing. All it is doing is generating files that you will have to bring over into one cohesive Analysis directory:

These files also feed the pooling of information from the ad-hoc result button clicks. For example, the final warehouse analysis gathers the required information from the aforementioned files and provides the following:

Once the data capture is initialized, a series of prompts will be visible while the SharePoint Kerberos Buddy is collecting information, both from custom routines within itself and from a bunch of applications that are distributed natively with the OS and associated server platforms. This information is written to standard output for informational purposes, but more importantly it is dumped into the Analysis directory located in the program installation folder.

These files contain the information required for final analysis to occur, and are why the Analyze button in the application’s primary interface is not available until all requisite files are present.

The file types that are created are pretty standard; the important ones are built as the custom file type *.sharepointkerberos. Complete analysis might not be required if the exact error is caught beforehand by examining the ad-hoc output from the tool, which may or may not point to your direct error (usually it won’t).

After the tool is run, the MSFT reference articles will be pooled and a list of the potential issues (informational messages, warnings, and operation-blocking errors) will be written out. These will have both a suggested resolution and a link to the MSDN/TechNet support site, which verifies that this is indeed a practical action to take. I really wanted to make the hyperlinks clickable, but I forgot the whole time that I was using a TextBox control instead of a RichTextBox; that will probably be the first thing I change around.

If you want to read more about why you are experiencing the error, or the brief resolution path I am suggesting through the tool isn’t enough, just follow the MSDN link! The tool is available on the following CodePlex site:


Programmatically Using Windows Search Service 4 to Search Network Shares

Having had to search for this for hours and combine about 50 different approaches, I decided to write a small blog post on using Windows Search Service 4 to search remote shares. The first question you probably have is: why the hell would I use Windows Search Service when I can just use SharePoint search?

There are a couple reasons.

First and foremost is the consideration that TDE (Transparent Data Encryption) might introduce unacceptable overhead for a 2010 environment that requires encryption for data-at-rest (all data in computer storage, excluding data that is traversing a network or temporarily residing in computer memory to be read or updated). While practical performance numbers depend on several factors, you can expect between 3% and 30% overhead: 3% being the low end for low-I/O, low-usage, high-performance scenarios, and 30% for the other end of the spectrum.

Second is cost. An Enterprise license of SQL is not cheap. For one off applications where security is a concern this can be a deal breaker. Such applications generally mandate that a separate, orphaned database instance be used. Thus the cost incursion might be something that a company cannot absorb.

And there are other reasons…but I get diverted!

So if your documents are being indexed on a remote share, the objects that are of most importance are:

CSearchManager – Provides methods for controlling the search service.
CSearchCatalogManager – It’s… you know, the catalog manager. It manages a search catalog such as the SystemIndex.
CSearchQueryHelper – Allows a conversion from Advanced Query Syntax (AQS) to a SQL query, so you aren’t pulling your hair out writing your multiple query types.

To query the data source, you are going to use:

OleDbConnection – Provides a unique connection to the search data source.
OleDbCommand – Represents a SQL statement to execute against the search data source.
OleDbDataReader – Used to read the search data quickly.

Alrighty, now let’s talk about how all these things are going to play together.

First things first, you have to create an object to build a collection up of, which you can hydrate with the properties that you are interested in. Something like:


public class Result
{
    public string FullPath { get; set; }
    public int Rank { get; set; }
}


Nothing fancy about this; we are just creating a holding type to populate a strongly typed collection with our relevant return values, so we can bind it to some display vehicle at some other point in the application. The reason we need it is because it is going to be part of our method return type. Since there is nothing fancy about the properties, we just throw in auto-properties.

Now onto the searching part of the application, using those particular, aforementioned objects. The first thing to do is hydrate a reference to the search service instance by creating a new CSearchManager object, off of which a new CSearchCatalogManager can be spawned using the CSearchManager.GetCatalog method, which in turn points to the local SystemIndex. Using the new CSearchCatalogManager object, the GetQueryHelper method can be invoked to create a new CSearchQueryHelper to start query specifications with its numerous properties. CSearchQueryHelper.GenerateSQLFromUserQuery is awesome because you can simply pass in the string parameter that represents the content being searched; then, to grab the remote index, we slightly modify the query content before using it.


queryHelper.QueryWhereRestrictions = string.Format("AND (\"SCOPE\" = 'file://{0}/{1}')", ServerName, Share);




selectCommand = selectCommand.Replace("FROM \"SystemIndex\"", string.Format("FROM \"{0}\".\"SystemIndex\"", ServerName));



// Requires a COM reference to Microsoft.Search.Interop (Windows Search SDK).
using System.Collections.Generic;
using System.Data.OleDb;
using Microsoft.Search.Interop;

private const string ServerName = "<Your Server Name>";
private const string Share = "<Your Network Share Name>";

public List<Result> SearchFolder(string strContent)
{
    var items = new List<Result>();
    if (string.IsNullOrEmpty(strContent))
        return items;

    var manager = new CSearchManager();
    CSearchCatalogManager catalogManager = manager.GetCatalog("SystemIndex");

    CSearchQueryHelper queryHelper = catalogManager.GetQueryHelper();
    queryHelper.QueryContentLocale = 1033;
    queryHelper.QueryMaxResults = 100;
    queryHelper.QueryKeywordLocale = 1033;
    queryHelper.QuerySelectColumns = "System.ItemPathDisplay,System.Search.Rank,System.ItemNameDisplay";
    queryHelper.QueryWhereRestrictions = string.Format("AND (\"SCOPE\" = 'file://{0}/{1}')", ServerName, Share);
    queryHelper.QuerySorting = "System.ItemPathDisplay ASC";
    queryHelper.QueryContentProperties = "System.Search.Contents,System.ItemNameDisplay";

    string selectCommand = queryHelper.GenerateSQLFromUserQuery(strContent);
    // Point the query at the remote index instead of the local SystemIndex.
    selectCommand = selectCommand.Replace("FROM \"SystemIndex\"", string.Format("FROM \"{0}\".\"SystemIndex\"", ServerName));

    using (var conn = new OleDbConnection(queryHelper.ConnectionString))
    {
        conn.Open();
        using (var command = new OleDbCommand(selectCommand, conn))
        using (OleDbDataReader wdsResults = command.ExecuteReader())
        {
            while (wdsResults.Read())
            {
                items.Add(new Result
                {
                    FullPath = wdsResults.GetString(0),
                    Rank = wdsResults.GetInt32(1)
                });
            }
        }
    }
    return items;
}


And that’s it! Hope I save someone some headaches!