Windows 10 Upgrade on Surface Pro 3 – Sending email issue FIXED

I very rarely blog about anything other than SQL Server, but with so many of the SQL community using SP3s I felt this one might actually be helpful to some.

After playing around in a VM for a while, I decided to upgrade my SP3 to Windows 10, which comes as a free upgrade. I made the schoolboy error of believing my charger was actually charging the unit during the upgrade, and the battery ran flat 30% of the way through. To my surprise, it fired back up and carried on from that point with no issues whatsoever! I honestly thought that was it and a factory reset would be needed.

I won’t go on about what’s changed, what I like and what I don’t like; you can read plenty of that around on the internet, and it’s out of the scope of this blog post.

However, I did discover one issue which I know several others have hit when upgrading their SP3. After the upgrade I could no longer send emails via Outlook (IMAP). I could send from every other device, so I knew it was specific to the upgrade. I managed to find a couple of others on the Windows forums with the same issue, and they provided the solution, which I thought I would share.

The fix is actually very simple: open an elevated (Run as administrator) command prompt, run sfc /scannow and wait ~10 minutes.

The System File Checker will scan your Windows system files for corruption and attempt to repair them; below is the output from my run:

[Screenshot: sfc /scannow output showing corrupt files found and successfully repaired]

I won’t go into the details of the log file, but the repaired file was related to the “Multilingual User Interface” (MUI) files, which are translation files used to support different languages within Windows. Mine in particular was mlang.dll.mui.
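If you want to see what sfc actually repaired on your own machine, the detail goes into the CBS log, where the System File Checker tags its entries with [SR]. A minimal PowerShell sketch to pull those entries out, assuming the default log location:

#Filter the System File Checker entries out of the CBS log (may need an elevated prompt to read it)
Select-String -Path "$env:windir\Logs\CBS\CBS.log" -Pattern '\[SR\]' |
 Select-Object -ExpandProperty Line |
 Out-File "$env:USERPROFILE\Desktop\sfcdetails.txt"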

So, after a mild panic, the fix was relatively straightforward… once I knew what the problem actually was!

SQLNorthEast Usergroup 2015 dates announced (preliminary)

Mike and I have been extremely busy over the Xmas period and we’ve finally sorted dates for the 2015 instalment of the SQLNorthEast SQL Server UserGroup (@SQLNE) in Newcastle. The great news is that after much negotiation we have agreement in principle to use the same venue for all our events, which is fantastic news!

Please see www.sqlne.com for info on our next meeting and registration. Due to sqlpass website restrictions we cannot display all the dates for 2015, but a quick search on Eventbrite will give you the relevant details.

The dates are as follows:

Feb: Tue 10th (with Chris Adkin – double session)
March: Tue 24th (with Erin Stellato from SQLSkills and Peter Shaw)
April: Tue 28th (with Neil Hambly)
June: Tue 2nd (with Steve Powell)
July: Tue 7th (with Annette Allen)
Sept: Tue 8th (TBC)
Nov: Tue 24th (TBC)

As you can see, we have already lined up a number of fantastic speakers, including a special remote session from the world-class SQLskills Principal Consultant Erin Stellato!

2014 was a great year for us, and off the back of the success of our second SQLRelay we’re going to be sending out a survey to find out a bit more about the needs and wants of our delegates, to help us shape our content for 2015. Our attendees cover a very mixed bag of experience levels as well as SQL Server areas, so any help or ideas we can get are greatly appreciated. If you wish to complete it now, you can find it at SurveyMonkey.

If you have any thoughts or ideas on how we can improve then please get in touch via email or Twitter and we’ll endeavour to incorporate them.

Merge CSV files – quick PowerShell snippet

During a bit of work I’ve been doing this evening for SQLRelay 2014, I used something from my arsenal of PowerShell scripts which I thought I’d share because I love its simplicity :) It’s nothing big and fancy, but it is extremely useful. Tasked with merging a large number of CSV files, there (as always) is a quick and easy way to do it with PowerShell:

#Read every CSV in the current folder and write the combined rows out to a single file
Get-ChildItem *.csv |
ForEach-Object {Import-Csv $_} |
Export-Csv -NoTypeInformation WhateverYouWantToCallTheFile.csv

There are ways to make this a little more dynamic, which I will update the post with in the coming weeks; a rough sketch of one option is below.
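As a starting point, here’s a minimal sketch of a more dynamic version. It assumes all the source files share the same columns, and the parameter names ($SourceFolder, $OutputFile) are just illustrative, not from the original snippet:

param (
 [string]$SourceFolder = '.',
 [string]$OutputFile = 'Merged.csv'
)

#Merge every CSV in $SourceFolder, taking care not to re-import the output file itself
Get-ChildItem -Path $SourceFolder -Filter *.csv |
 Where-Object { $_.Name -ne (Split-Path $OutputFile -Leaf) } |
 ForEach-Object { Import-Csv $_.FullName } |
 Export-Csv -NoTypeInformation -Path $OutputFile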

2013 in review

The WordPress.com stats helper monkeys prepared a 2013 annual report for this blog.

Here’s an excerpt:

The concert hall at the Sydney Opera House holds 2,700 people. This blog was viewed about 12,000 times in 2013. If it were a concert at Sydney Opera House, it would take about 4 sold-out performances for that many people to see it.

Click here to see the complete report.

Output SQL Server data from multiple tables to tab-delimited text files using PowerShell

I had a request this morning for something I thought was actually very simple:

Client: “Can you extract all data for these particular tables including column headers to a tab delimited .txt file?”
Chris: “Sure, no problem, I’ll just run bcp querying sys.tables using a COALESCE loop to output the statements”
Client: “Top stuff, let me know when it’s done”

So, away I went generating my script, which took a matter of minutes, and ran it… where are the column headers? Bugger, I’d forgotten that bcp doesn’t output column headers without doing some funky stuff: creating a header record in a separate file and merging that with the file of data.

With this in mind I knew creating an SSIS package (or using Export Data to generate one – very manual unless I delved into the realms of BIML) could do this, but I thought I’d have a look at PowerShell invoking sqlcmd.

Again, this all seemed to be going very well until I came to outputting the data to a tab-delimited .txt file. As far as I’m aware, PowerShell does not have an Export-Txt, so I had to look into how I could use Export-Csv to output a tab-delimited .txt rather than comma-separated, and found the parameter -Delimiter "`t" – excellent!!! Added this in and ran the script… and the first row consisted of “#TYPE System.Data.DataRow” – wtf!?

A quick search on my search engine of choice showed that there is a parameter you can pass in to remove this from the export: -NoTypeInformation.

Ran it again with -NoTypeInformation and everything worked as expected, apart from all the column headers and data having quotes (") around them, which was not part of the requirement. Unfortunately (as far as I know) there is no switch or parameter that turns this off, so I had to change the Export-Csv to ConvertTo-Csv and run a -replace of '"' with '' (an empty string), which managed to do the trick.

I’ve included the script below which can be tailored to your needs:

#Requires the SQL Server PowerShell extensions (sqlps) for Invoke-SqlCmd
$server = 'ServerInstanceHere'
$database = 'DBNameHere'
$path = 'c:\work\ToDelete\'
$query = "SELECT name FROM sys.tables WHERE name in (
 'TableNameHere_1',
 'TableNameHere_2'
 --etc etc
 )"

#Get list of table names to output data
$Tables = Invoke-SqlCmd -Query $query -Database $database -ServerInstance $server
foreach ($Table in $Tables)
{
 $TableName = $Table["name"]
 #Build the per-table query here, once $TableName actually has a value
 $queryToOut = "SELECT * FROM $TableName"
 Write-Host -ForegroundColor Green "Creating File $TableName.txt"
 #ConvertTo-Csv (as opposed to Export-Csv) so the quotes can be replaced before writing
 Invoke-SqlCmd -Query $queryToOut -Database $database -ServerInstance $server |
  ConvertTo-Csv -NoTypeInformation -Delimiter "`t" |
  ForEach-Object { $_ -replace '"', '' } |
  Out-File "$path$TableName.txt"
 #Or, if the quotes are acceptable, swap the last three pipeline stages for:
 #Export-Csv -NoTypeInformation -Delimiter "`t" -Path "$path$TableName.txt"
}
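One caveat with the blanket replace above: it strips every double quote, including any that are genuinely part of the data. If that matters, an illustrative variant (my own sketch, assuming no fields contain embedded tabs) is to strip only the quotes that ConvertTo-Csv wraps around each field:

#Strip only the field-wrapping quotes: the leading/trailing pair and those hugging each tab
ForEach-Object { ($_ -replace '^"|"$', '') -replace '"\t"', "`t" }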

Apologies for the formatting, but the PowerShell script tag doesn’t seem to format it the way I want it to :(

SQL Server NorthEast – New Usergroup!!

SQLBits in Nottingham was where it all began. A short conversation with Richard Douglas (@SQLRich) on whether there were any plans for a SQL usergroup in Newcastle, and whether there was scope for me to set one up, quickly moved on to conversations with Chris Testa-O’Neill (@ctesta_oneill) and eventually Jonathan Allen (@fatherjack). This became quite a lengthy chat about the ins, outs, ups and downs of setting up such a thing.

This was back in May. Four months of venue hunting, speaker negotiations and marketing led to the first ever SQL Usergroup in Newcastle – #sqlnortheast :)

The schedule was set for Gavin Campbell and Neil Hambly to make their merry way up to the north east and give two fantastic sessions. So with the venue sorted, speakers sorted, food sorted and attendees sorted, we were all set. Boooooom! Then the bombshell hit: a few days before the UG, Neil announced he couldn’t make it :(

With the first SQL UG in the north east hanging in the balance, up stepped a very good friend of mine, Chris McGowan (@ckwmcgowan), who was willing to make the trip from Manchester at such short notice and save the day! With 18 people registered, this was about 17 more than I was expecting. With no initial indication of the level of interest we could generate in the north east, I was over the moon with the uptake. The interaction within the group was phenomenal, and there was such a broad range of knowledge and skills, ranging from hardcore SQL internals DBA types to developers to Azure – it made for great conversations.

So, September 3rd came along and, surprisingly, all seemed to go well. Both speakers turned up on time, the food turned up and, most importantly, 15 people turned up on the evening, which was gobsmacking. All in all it turned out to be an extremely good evening/night. A few beers with everyone afterwards on the Quayside led to far too many beers with Chris and Gavin back at the hotel bar – it wasn’t a pretty sight the next morning!

What an experience, and judging from the excellent all-round feedback received from the attendees, it’s something they hope will continue.

Michael Robson (@heymiky) and I are currently trying to work out dates for next year and organise speakers. We’ve had a bit of a break from the UG due to taking on a leg of SQLRelay in Newcastle (November 25th), but we do have a “SQL on the Lash” evening session set up for December to end the year on a high.

I’ll be reporting back on how it all goes with SQLRelay and with any further announcements on dates for the SQLNorthEast UG in 2014.


Making sure your Triggers fire when they should

As some of you may be aware, triggers are not my favourite thing in the world, but like most things they do have their place.

Whilst onsite with one of my clients, one of their processes fires a trigger on insert which ultimately runs an SSRS subscription to email a report. All sounding fairly feasible so far. However, this process is also used as part of an overnight batch, which runs a separate insert statement (actually another stored procedure in another job step) instead of the “onDemand” insert. Ok, still doesn’t sound like too much of an issue.

Now, they started experiencing occasional failures of this job during the day, with the error relating to the SSRS subscription job being called while it was already running. Interesting: in theory this should never happen, because the process runs the jobs either as part of the batch process or as the one-off onDemand insert.

Stepping through the process led me to an AFTER INSERT trigger. Upon opening it I spotted the issue straight away. It’s something that, as I’ve found over the years as a consultant, a lot of DBAs and developers fail to understand (from MSDN):

These triggers fire when any valid event is fired, regardless of whether or not any table rows are affected. This is by design.

So, the issue was that step 3 ran a procedure which ultimately ran an insert statement for the onDemand insert, while step 4 ran a procedure to insert for the overnight batch process. As it happens, step 4 had no records to insert, but it still fired the trigger and ran the SSRS subscription again! There are a number of ways to fix this, but I’ve tended to stick with a basic check of the “inserted” table for results, RETURNing out if no records are there to process.

I’ve supplied a bit of test code below for people to try this out.

Let’s create a test table and an audit table:

USE tempdb
GO

IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[TestTable]') AND type in (N'U'))
DROP TABLE [dbo].[TestTable]
GO
CREATE TABLE [dbo].[TestTable]
(
	TestTableID INT IDENTITY(1,1),
	TestTableDescr VARCHAR(20)
)
GO

IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[AuditTrigger]') AND type in (N'U'))
DROP TABLE [dbo].[AuditTrigger]
GO
CREATE TABLE [dbo].[AuditTrigger]
(
	AuditTriggerID INT IDENTITY(1,1),
	AuditTriggerDescr VARCHAR(20),
	DateCreated DATETIME
)
GO

INSERT INTO dbo.TestTable (TestTableDescr)
VALUES ('Test1'), ('Test2'), ('Test3');

SELECT * FROM dbo.TestTable;

Now let’s create the trigger with no checking:

USE [TempDB]
GO

IF  EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[trTestTable]'))
DROP TRIGGER [dbo].[trTestTable]
GO

CREATE TRIGGER [dbo].[trTestTable] ON [dbo].[TestTable]
   AFTER INSERT
AS 
BEGIN

	--Log the fact the trigger fired
	INSERT INTO [dbo].[AuditTrigger] (AuditTriggerDescr, DateCreated)
	SELECT 'Trigger Fired', GETDATE()
	
END
GO

Test Inserting a record that exists:

--Valid Insert
INSERT INTO dbo.TestTable (TestTableDescr)
SELECT TestTableDescr
FROM dbo.TestTable
WHERE TestTableDescr = 'Test1';

SELECT  *
FROM    [dbo].[AuditTrigger];

Test Inserting a record that doesn’t exist:

--Not a Valid Insert
INSERT INTO dbo.TestTable (TestTableDescr)
SELECT TestTableDescr
FROM dbo.TestTable
WHERE TestTableDescr = 'Test4';

SELECT  *
FROM    [dbo].[AuditTrigger];

You’ll now see that there are 2 entries in the AuditTrigger table, because the trigger fired even though no records were actually valid to insert.

So, let’s amend the trigger to check for valid inserts:

USE [TempDB]
GO

IF  EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[trTestTable]'))
DROP TRIGGER [dbo].[trTestTable]
GO

CREATE TRIGGER [dbo].[trTestTable] ON [dbo].[TestTable]
   AFTER INSERT
AS 
BEGIN
	
	--Check to see if any records were inserted
	IF NOT EXISTS (SELECT 1 FROM INSERTED) 
		RETURN 
		
	--Log the fact the trigger fired
	INSERT INTO [dbo].[AuditTrigger] (AuditTriggerDescr, DateCreated)
	SELECT 'Trigger Fired', GETDATE()
	
END
GO

and test the inserts again:

Test Inserting a record that exists:

--Valid Insert
INSERT INTO dbo.TestTable (TestTableDescr)
SELECT TestTableDescr
FROM dbo.TestTable
WHERE TestTableDescr = 'Test2';

SELECT  *
FROM    [dbo].[AuditTrigger];

Test Inserting a record that doesn’t exist

--Not a Valid Insert
INSERT INTO dbo.TestTable (TestTableDescr)
SELECT TestTableDescr
FROM dbo.TestTable
WHERE TestTableDescr = 'Test4';

SELECT  *
FROM    [dbo].[AuditTrigger];

No record will have been inserted with the final insert statement!

Let’s clean up our tempdb:

USE [TempDB]
GO

--Clean up
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[TestTable]') AND type in (N'U'))
DROP TABLE [dbo].[TestTable]
GO
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[AuditTrigger]') AND type in (N'U'))
DROP TABLE [dbo].[AuditTrigger]
GO

Hopefully this helps clear up the misconception that triggers only fire when records are actually inserted :)

As per usual, I’d like to hear people’s thoughts/experiences on this topic.
