Validation of DateTime in UK format in C# MVC

In a recent project I was using the Bootstrap datetime picker from here:

And had it on my view like this:

@Html.TextBoxFor(m => m.EmailDate, "{0:dd/MM/yyyy HH:mm}", new { @class = "form-control txtModal" })

@Html.ValidationMessageFor(model => model.EmailDate)

Every time I tried to submit the form I would get a jQuery validation error telling me that EmailDate was not a valid date. There is a lot of material relating to this problem on Stack Overflow, but much of it is erroneous or misleading. Many answers suggest using data annotations on the model to set the display format, but that only helps a little, and only if you use an EditorFor rather than a TextBoxFor helper in the view; it does not get round the validation problem.

The essential problem to overcome is that by default jQuery validation expects a date to be in US format (mm/dd/yyyy). So, we have to extend the validation logic to allow for UK date formats (dd/mm/yyyy). Once we have done that, we need to cater for the time element that may have been selected using the datetimepicker. The solution below works for a datepicker or a datetimepicker and can be altered to match any date format, as it uses regular expressions. Add the following to your view's script section:

$.validator.addMethod("date", function (value, element) {
    var stamp = value.split(" ");
    // Validate the date part (dd/mm/yyyy or dd-mm-yyyy)
    var validDate = /^(0?[1-9]|[12][0-9]|3[01])[\/\-](0?[1-9]|1[012])[\/\-]\d{4}$/.test(stamp[0]);
    // Validate the time part if present (HH:mm or HH:mm:ss); a plain datepicker supplies no time
    var validTime = stamp.length < 2 || /^(([0-1]?[0-9])|([2][0-3])):([0-5]?[0-9])(:([0-5]?[0-9]))?$/.test(stamp[1]);
    return this.optional(element) || (validDate && validTime);
}, "Please enter a valid date and time.");

We split the datetime value into two parts, apply a regex to each part separately, and then combine the results so that neither an invalid date nor an invalid time slips through.
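If you want to sanity-check the two regular expressions outside the validator plumbing, here is a minimal standalone sketch of the same split-and-test logic (the function name isValidUkDateTime is mine, not part of the plugin):

```javascript
// The same two patterns used in the validator above.
var datePattern = /^(0?[1-9]|[12][0-9]|3[01])[\/\-](0?[1-9]|1[012])[\/\-]\d{4}$/;
var timePattern = /^(([0-1]?[0-9])|([2][0-3])):([0-5]?[0-9])(:([0-5]?[0-9]))?$/;

function isValidUkDateTime(value) {
    var stamp = value.split(" ");
    var validDate = datePattern.test(stamp[0]);
    // The time part is optional, so a plain datepicker value still passes.
    var validTime = stamp.length < 2 || timePattern.test(stamp[1]);
    return validDate && validTime;
}

console.log(isValidUkDateTime("31/12/2015 23:59")); // true  - UK date with time
console.log(isValidUkDateTime("31/12/2015"));       // true  - date only
console.log(isValidUkDateTime("12/31/2015 10:00")); // false - US ordering rejected
```

Note that the date pattern only checks digit ranges, so impossible dates such as 31/02/2015 still pass; full calendar validation would need a Date round-trip.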

Creating a cross site request using jQuery, Ajax and C# MVC

I recently had to create an add-in that allowed dynamic searching, sometimes known as typeahead. This was to be used on our core workflow application, which was developed in Classic ASP and is still running happily. Given the size of the data set and the performance required, I developed the add-in in C#/MVC; it returns a JSONP data set to the calling application, which renders it as a dynamic list, any item of which can then be clicked to use as a shortcut.

There were a couple of challenges with this approach:

  • How to deal with CORS
  • How to fetch the data with AJAX

In order to deal with the CORS problem, I decided to use JSONP (JSON with Padding). To get my controller to return a correctly formatted list it was necessary to create a custom action result class derived from the built-in JsonResult class – taken from this fine blog:

My controller can then use Linq to SQL to create the data list and return it via:

JsonpResult result = new JsonpResult(amodel);
return result;

The next part is to create the AJAX element so that it does not fall foul of CORS. There is a lot of information regarding this on the Internet, but it is not always brought together in a way that makes sense. The key part is to include "callback=?" in the URL. I also needed to pass other data items – in my case through HTML5 data attributes, as there was a version for our live system and our test system. My jQuery ended up looking like this:

    $(document.body).on('keyup', '#search', function () {
        var search = $('#search').val();
        var datafield = $(this);
        var domain = datafield.data('domain');
        var analytics = datafield.data('analytics');
        var associate = datafield.data('associate');
        var url = analytics + '/Search/GetCases?callback=?';
        // Only hit the server if the search string is at least 3 characters long.
        if (search.length >= 3) {
            $.getJSON(url, { q: search, associate: associate, domain: domain }, function (data) {
                var output = '<ul>';
                $.each(data, function (key, val) {
                    if (val.Url == '#') {
                        output += '<li>' + val.CaseData + '</li>';
                    } else {
                        output += '<li><a href="' + val.Url + '">' + val.CaseData + '</a></li>';
                    }
                });
                output += '</ul>';
                $('#results').html(output); // render into whichever element holds the results
            });
        }
    });
You can see that I have appended ?callback=? to the URL being passed to the getJSON call, as well as passing in additional data items. I could have used a standard AJAX call with type: GET, in which case I could have used:


dataType: 'jsonp',

jsonp: 'callback'

The callback function can then be named explicitly and used in place of the anonymous function (data) handler.
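Under the hood, JSONP is just a dynamically added script tag: the server wraps its JSON payload in a call to the function named by the callback parameter. A minimal simulation of the mechanics (handleCases, the sample payload and the parameter names are illustrative, not from the original app):

```javascript
// Build the URL the browser will request via the injected <script> tag.
function buildJsonpUrl(baseUrl, callbackName, params) {
    var query = Object.keys(params).map(function (key) {
        return encodeURIComponent(key) + "=" + encodeURIComponent(params[key]);
    }).join("&");
    return baseUrl + "?callback=" + callbackName + "&" + query;
}

var url = buildJsonpUrl("/Search/GetCases", "handleCases", { q: "smi", domain: "live" });
console.log(url); // /Search/GetCases?callback=handleCases&q=smi&domain=live

// What the server sends back: not bare JSON, but a script that calls the named callback.
var serverResponse = 'handleCases([{"CaseData":"Smith v Jones","Url":"#"}])';

function handleCases(data) {
    console.log(data[0].CaseData); // Smith v Jones
}

// When the script tag's payload executes, the named callback is invoked with the data.
eval(serverResponse);
```

jQuery hides all of this behind callback=? (or jsonpCallback for an explicit name), generating and cleaning up the callback function for you.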

With all those bits wired up to my search box, and a bit of CSS to style the returned data, I ended up with an incredibly performant search function.

I plan to continue to extend my C#/MVC server app to offer a number of other centralised features – AJAX file uploads, document encryption and decryption, and image manipulation – all of which are very difficult to do in Classic ASP.

Problem with a linked server to MySQL

I was recently faced with an issue whereby, following a failover in our clustered environment, the failover node had a slightly different setup for the ODBC connection part of the linked server. The problem manifested itself in the error message "Commands out of sync; you can't run this command now" when attempting to insert a record into the MySQL database.

The resolution is to open the ODBC connector and, on the Cursors settings page, make sure that the Forward Only Cursor option is un-ticked.

Moving SQL Server datafiles

I’ve recently had to work with our IT department to change the layout of the SQL Server instance created when they installed Service Manager – at the time of the install the SAN was offline, so everything ended up being dumped on the C:\ drive. After some very poor performance from Service Manager, they asked me to look at why it was running so slowly. Documented below are the steps necessary to move database and log file locations in case you are presented with the same issues.

  1. Get the current physical and logical locations

USE master;

SELECT name AS LogicalFileName, physical_name AS FileLocation, state_desc AS Status
FROM sys.master_files
WHERE database_id = DB_ID('ServiceManager');

which gave me:

LogicalFileName  FileLocation                                                                            Status
SM_DATA          C:\Program Files\Microsoft SQL Server\MSSQL11.SCSMR2\MSSQL\DATA\ServiceManager.mdf      ONLINE
SM_LOG           C:\Program Files\Microsoft SQL Server\MSSQL11.SCSMR2\MSSQL\DATA\ServiceManagerlog.ldf   ONLINE

2. Take the database offline:

USE master;

ALTER DATABASE ServiceManager SET OFFLINE;

3. Move the data and log files physically to their new locations using the Windows OS.

4. Use ALTER DATABASE to MODIFY the filename for each file that has been moved – only one file can be changed at a time:

USE master;

ALTER DATABASE ServiceManager
MODIFY FILE (NAME = SM_DATA,
FILENAME = 'E:\Microsoft SQL Server\MSSQL11.SCSMR2\MSSQL\DATA\ServiceManager.mdf'); -- New file path

USE master;

ALTER DATABASE ServiceManager
MODIFY FILE (NAME = SM_LOG,
FILENAME = 'F:\MSSQL11.SCSMR2\MSSQL\DATA\ServiceManagerlog.ldf'); -- New file path

5. Bring the database back online:

USE master;

ALTER DATABASE ServiceManager SET ONLINE;
6. Verify the new physical locations:

USE master;

SELECT name AS FileName, physical_name AS CurrentFileLocation, state_desc AS Status
FROM sys.master_files
WHERE database_id = DB_ID('ServiceManager');

Why I’d like to be mentored by Paul Randall

My knowledge of SQL Server and my approach to learning about it is rather like this WordPress site…started but very much a “work in progress”!

Having started as an Oracleite and now a SQL Serverite, I’ve had to dump chunks of memory and try and fill the holes with new knowledge and learn some painful lessons along the way. But where and how to start a logical and methodical approach to learning? Years ago it was possible to know a lot about the product as its boundaries were well defined. But now, there are so many strands to the product that it is impossible to be master of all.

I like to learn, but my approach is a bit scattergun… what to focus on, and in what order to learn things? I'm hoping that Paul can help bring order to the chaos. I have some ideas, but they may be a bit off-piste.

I’m a great fan of audio books – either downloaded from my county library or bought online. I’ve worked my way through a number of long series (Aubrey/Maturin, Hornblower, Lord of the Rings, Sharpe etc) in my hourly commute to work. Many of the books have been recommendations in Paul’s blog. Not too sure I’d like to spend the hour listening to a broadcast about SQL Server though….

SharePoint 2010 and IE10

Following on from my previous post about problems with setting the master page content type to a value other than IE=8, I found some more useful information. The post I read actually referred to a problem with Silverlight after someone changed the content type to IE=Edge, so I wondered if the same solution applied. To read the original blog click here.

The originally released version of ASP.NET 2.0 (which SharePoint runs on top of) contains a bug in the 'browser definition file', which reads the UserAgent HTTP header sent by users' browsers to detect which browser is in use. It cannot interpret the new UserAgent string sent by IE 10. This has the knock-on effect that SharePoint also doesn't know what the user's browser is, hence all the JavaScript problems.
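The failure mode can be illustrated with a couple of regular expressions (the patterns below are simplified stand-ins, not the actual expressions from the browser definition files):

```javascript
// An IE 10 UserAgent string (abbreviated):
var ua = "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; Trident/6.0)";

// A pattern that, like the pre-hotfix definitions, only anticipates a single-digit major version:
var singleDigit = /MSIE [1-9]\./;
console.log(singleDigit.test(ua)); // false - IE 10 goes unrecognised

// A pattern that accepts multi-digit versions, as the updated definitions do:
var multiDigit = /MSIE \d+\./;
console.log(multiDigit.test(ua)); // true
```

Because the match fails, the browser falls through to a generic "unknown" definition with downlevel capabilities, which is why the JavaScript-heavy parts of SharePoint break.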


First of all, check that this is indeed the problem in your environment by firing up IE 10, pressing F12 to bring up the developer tools, and forcing the User Agent string to IE 8.

All we need to do now is apply the hotfix to ASP.NET which updates the browser definition files (see KB2600100).

Correlation 951335c3-feed-427a-b1c3-79bc2d1e0b7f

I was using the Amreim Twitter web part quite happily when it suddenly stopped working in the middle of January 2015. On checking the Amreim site I found I was not the only one. After a few days the recommendation was to change the supported doc type in the SharePoint master page from IE=8 to a higher version (I chose IE=Edge). This was necessary because Twitter removed support for older IE versions in an update. The good news was that my Twitter feeds started showing up on my intranet page again. The bad news was that many simple operations – adding users to groups etc. – resulted in a correlation error. On checking the logs, it referred to not knowing about the xsd element. On Googling that, I found it was related to SharePoint not handling the move to IE=Edge properly. My current workaround is to do the admin tasks using Firefox, Chrome or Safari while I determine the best setting for the Twitter feeds to work and the error to disappear. I'll update at some point.

Unable to open Document Library in Explorer View

If you are unable to open a document library in Explorer view and see the error "Your client does not support opening this list with Windows Explorer", follow these steps:

  1. Add your site to the trusted sites list in your browser:
  • Start Internet Explorer.
  • On the Tools menu, click Internet Options, and then click the Security tab.
  • Click Trusted sites, and then click Sites.
  • In the Trusted sites dialog box, type the URL (http://servername:port) of your site in the Add this website to the zone box, and then click Add.
  • Click Close, and then click OK.
  2. Make sure the WebClient service is started.
  3. If you are using the server as a client, install the Desktop Experience feature (from Server Manager, add the feature and then reboot).

Removing features from a content database in SharePoint 2010 using PowerShell

The great thing about the Health Analyzer in SharePoint 2010 is that it will report on a number of potential issues with the server farm, which may cause a problem later whilst applying a cumulative update or service pack. Resolving these issues in advance will help to prevent an update failing when you run the SharePoint Configuration Wizard.

One of these problems may occur when a solution is removed from the farm before the corresponding features were deactivated from site collections and sites. The Health Analyzer will place this issue in the “Configuration” category with the title “Missing server side dependencies”.

The error message reported will look similar to this one:

[MissingFeature] Database [WSS_Content] has reference(s) to a missing feature: Id = [d3fc1457-1a7b-46b1-a049-1fbef0db7415], Name = [AE Documents Rollup Web Part], Description = [Rolls up all new site Documents], Install Location = [AEDocsRollupWebpart]. The feature with Id d3fc1457-1a7b-46b1-a049-1fbef0db7415 is referenced in the database [WSS_Content], but is not installed on the current farm. The missing feature may cause upgrade to fail. Please install any solution which contains the feature and restart upgrade if necessary.

As shown above, this message reports a content database name (WSS_Content) and feature ID (d3fc1457-1a7b-46b1-a049-1fbef0db7415), but not the sites or site collections where the feature exists. In addition, even if you did know where the feature was activated, it would not appear anywhere in the UI for you to deactivate, because the solution has been removed from the farm.

Using the PowerShell script developed by Phil Childs you can not only locate the feature in the farm but also remove it permanently.

Open PowerShell in Admin mode and paste the following:

Add-PSSnapin -Name Microsoft.SharePoint.PowerShell

function Remove-SPFeatureFromContentDB($ContentDb, $FeatureId, [switch]$ReportOnly)
{
    $db = Get-SPDatabase | where { $_.Name -eq $ContentDb }
    [bool]$report = $false
    if ($ReportOnly) { $report = $true }

    # Check every site collection, and every site within it, for the feature
    $db.Sites | ForEach-Object {
        Remove-SPFeature -obj $_ -objName "site collection" -featId $FeatureId -report $report
        $_ | Get-SPWeb -Limit all | ForEach-Object {
            Remove-SPFeature -obj $_ -objName "site" -featId $FeatureId -report $report
        }
    }
}

function Remove-SPFeature($obj, $objName, $featId, [bool]$report)
{
    $feature = $obj.Features[$featId]

    if ($feature -ne $null) {
        if ($report) {
            write-host "Feature found in" $objName ":" $obj.Url -foregroundcolor Red
        }
        else {
            try {
                $obj.Features.Remove($feature.DefinitionId, $true)
                write-host "Feature successfully removed from" $objName ":" $obj.Url -foregroundcolor Red
            }
            catch {
                write-host "There has been an error trying to remove the feature:" $_
            }
        }
    }
    else {
        #write-host "Feature ID specified does not exist in" $objName ":" $obj.Url
    }
}

To identify where the feature is installed on the farm use:

Remove-SPFeatureFromContentDB -ContentDB "WSS_Content" -FeatureId "d3fc1457-1a7b-46b1-a049-1fbef0db7415" -ReportOnly

To remove the feature from the farm use:

Remove-SPFeatureFromContentDB -ContentDB "WSS_Content" -FeatureId "d3fc1457-1a7b-46b1-a049-1fbef0db7415"