SQLBits in Nottingham was where it all began. A short conversation with Richard Douglas (@SQLRich) about whether there were any plans for a SQL user group in Newcastle, and whether there was scope for me to set one up, quickly moved on to conversations with Chris Testa-O'Neill (@ctesta_oneill) and eventually Jonathan Allen (@fatherjack). This became quite a lengthy chat about the ins, outs, ups and downs of setting up such a thing.
This was back in May. Four months of venue hunting, speaker negotiations and marketing led to the first ever SQL user group in Newcastle – #sqlnortheast.
The schedule was set up for Gavin Campbell and Neil Hambly to make their merry way up to the north east and give two fantastic sessions. So with venue sorted, speakers sorted, food sorted and attendees sorted, we were all set. Boooooom! Then the bombshell hit: a few days before the UG, Neil announced he couldn't make it.
With the first SQL UG in the north east hanging in the balance, up stepped a very good friend of mine, Chris McGowan (@ckwmcgowan), who was willing to make the trip from Manchester at such short notice and save the day! With 18 people registered, this was about 17 more than I was expecting. With no initial indication as to the level of interest we could generate in the north east, I was over the moon with the uptake. The interaction within the group was phenomenal, and there was such a broad range of knowledge and skills – from hardcore SQL internals DBA types to developers to Azure – that it made for great conversations.
So, September 3rd came along and surprisingly all seemed to be going well. Both speakers turned up on time, the food turned up and, most importantly, 15 people turned up on the evening, which was gobsmacking. All in all it turned out to be an extremely good evening / night. A few beers with everyone afterwards on the Quayside led to far too many beers with Chris and Gavin back at the hotel bar – not a pretty sight the next morning!
What an experience, and judging from the excellent all-round feedback received from the attendees, this is something they hope will continue.
Michael Robson (@heymiky) and I are currently trying to work out dates for next year and organise speakers. We've had a bit of a break from the UG due to taking on a leg of SQL Relay in Newcastle (November 25th), but we do have a “SQL on the Lash” evening session set up for December to end the year on a high.
I'll be reporting back on how it all goes with SQL Relay, along with any further announcements on dates for the sqlnortheast UG in 2014.
A few pics from the session:
Just a very quick blog today regarding an issue that has arisen with one of my clients. During integration it became apparent that one table in particular was failing during the weekly consistency checks, with the following error output:
DBCC results for 'sys.extended_index_1696529623_384000'.
There are 313423 rows in 1627 pages for object "sys.extended_index_1696529623_384000".
DBCC results for 'schema.Table'.
There are 312246 rows in 12192 pages for object "schema.Table".
Msg 0, Level 11, State 0, Line 0
A severe error occurred on the current command. The results, if any, should be discarded.
Msg 0, Level 20, State 0, Line 0
A severe error occurred on the current command. The results, if any, should be discarded.
A bit of background. The server is running SQL Server 2008 R2 SP1 CU2 and the database in question is still at compatibility level 90 (SQL Server 2005). The table in question has a spatial index on a Geography column.
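For anyone seeing a similar internal index name in their CHECKDB output, the object_id is embedded in the name, so it can be mapped back to the owning table. A quick sketch, using the id 1696529623 from the error above:

```sql
-- Map the internal name sys.extended_index_1696529623_384000
-- back to the user table it supports (1696529623 is the object_id)
SELECT OBJECT_SCHEMA_NAME(1696529623) AS schema_name,
       OBJECT_NAME(1696529623)        AS table_name;

-- Confirm the table carries a spatial index
SELECT name AS index_name, type_desc
FROM sys.indexes
WHERE object_id = 1696529623
  AND type_desc = 'SPATIAL';
```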
So, how do we fix this? Well there’s a couple of options.
- Change the compatibility to 100
- Install SQL Server 2008R2 SP1 CU3…
This is a documented issue (kb 2635827) and the fix can be found on Microsoft’s Support Pages.
FIX: Access violation when you run a DBCC CHECKDB command against a database that contains a table that has a spatial index in SQL Server 2008 or in SQL Server 2008 R2
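If patching to CU3 has to wait for a maintenance window, the compatibility level change is a one-liner. A sketch against a placeholder database name, [MyDatabase]:

```sql
-- Option 1: move the database off compatibility level 90 (SQL Server 2005)
ALTER DATABASE [MyDatabase] SET COMPATIBILITY_LEVEL = 100;

-- Verify the change
SELECT name, compatibility_level
FROM sys.databases
WHERE name = N'MyDatabase';
```

Worth remembering that a compatibility level change can alter query behaviour, so it needs testing in its own right rather than being dropped straight into production.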
As to which fix we deploy, well, that's for tomorrow's fun and games.
I finally pulled my finger out and took the last exam of the MCSE: SQL Server 2012 – Data Platform certification last week, and passed with a score of 876, which is pretty respectable in my opinion.
The exam consisted of 3 sections:
So those of you who are clever enough to do the maths will have worked out that's a grand total of 45 questions. I'm pleased to say that once again the quality of the questioning was up to Microsoft's usual standard – not in terms of difficulty, but more in terms of complete irrelevance!
I'm bound by NDA so can't go into specifics, but I'll try to provide an example of this complete and utter irrelevance without giving the game away:
Q: What is your favourite holiday destination?
A (Select one of the following):
Hmmmm, bit of a tricky one here. From what I can tell, none of the answers bear any relevance whatsoever to the question… I suppose I'll have to take a random guess and hope that the answer I select is the one they've set as being correct!
It was just a good job I wasn't on the threshold of pass/fail, as this could potentially have been the difference. Judging from the actual question it should have been a simple answer, but I suppose I'll never know whether my random guess worked out :(. I just hope the comments I left regarding this issue are taken up and fixed so others don't have the same problem.
All in all I felt comfortable throughout the exam, but I do know for a fact that my knowledge of CLR assemblies has slipped significantly and I need a refresher.
So that's it. SQL Server 2012 MCSE done and dusted and no more exams… well, for the next 3 years at least. So what's next? I'm not sure. There are the MCSM exams, but I have to be honest: I've got no formal certifications in SQL Server 2005/2008, yet I know for a fact that I have far more knowledge and experience of their features than I do of SQL Server 2012's, which I suppose backs up my original issue with these exams. Yes, I studied, and yes, I did learn a lot from doing so, but in no way, shape or form would I class myself as an “Expert” in SQL Server 2012 – I don't believe anyone could!
As per usual, any thoughts or comments are welcome.
Well, I finally got around to completing the MCSA aspect of the SQL Server 2012 certification, and I'm pleased to say I passed with flying colours. As some of you may be aware, I managed to nab and pass 3 of the beta exams (70-461, 70-462 and 70-465) back in April and decided to see the MCSE through.
I really wasn't sure how this exam was going to go, as I've been working a lot recently with MDS 2012 and SSIS 2008; I revised the new 2012 features but went in with no real expectations. The exam consisted of 55 questions, again ranging from multiple guess and “select the 3 things you'd do, in order” to a feature I'd not seen before: a drag-and-drop exercise building an SSIS control flow, which I thought was nifty.
The area I thought I'd struggle on was DQS, but in fact I found that aspect relatively simple; the difficult area for me was the “select the 3 things you'd do, in order” questions relating to the new project deployment model in SSIS 2012. I've done a fair bit of “tinkering” with this over the last few months, but it's obvious I'm not as proficient as I thought, as I found it difficult with certain questions to get my head around what the answers were suggesting. I obviously did OK in this area (according to the score sheet), but at the time I was sweating a bit.
For anyone wanting hints and tips: I obviously can't go into detail, but I'd definitely brush up on the new features of SSIS 2012!
Oh, and for anyone wanting to know, the pass mark is 700 – none of the beta exams told you this, and I know some have said it was actually 800…
Now on to 70-464 – Developing Microsoft SQL Server 2012 Databases – to complete the SQL Server 2012 MCSE certification!
I wouldn't class myself as an expert in SSIS, but I certainly know my way around, and I came across something today which I thought I'd share. As with a lot of things there are “many ways to skin a cat”, and I won't go into all of them at the moment; what I will concentrate on is updating columns in a table where the data has changed in the source.
One of the projects I'm currently working on requires this very process, and when I set about doing so I created a T-SQL MERGE statement to do the business. However, the question was raised as to why I didn't use SSIS's built-in Slowly Changing Dimension (SCD) component. I didn't really have an answer other than personal preference, but I decided to delve into it a bit further and compare the performance of each method.
As a test, I created source and target tables, each with an ID and a Name column:
USE TempDB;

IF EXISTS (SELECT * FROM sys.objects
           WHERE object_id = OBJECT_ID(N'dbo.iSource') AND type IN (N'U'))
    DROP TABLE dbo.iSource;

CREATE TABLE dbo.iSource
(
    ID   INT,
    Name VARCHAR(100)
);

IF EXISTS (SELECT * FROM sys.objects
           WHERE object_id = OBJECT_ID(N'dbo.iTarget') AND type IN (N'U'))
    DROP TABLE dbo.iTarget;

CREATE TABLE dbo.iTarget
(
    ID   INT,
    Name VARCHAR(100)
);
and populated it with some dummy data:
INSERT INTO dbo.iSource (ID, Name)
SELECT TOP 10000
       ROW_NUMBER() OVER (ORDER BY t.object_id) AS rownumber,
       'Name_' + CONVERT(VARCHAR(4), ROW_NUMBER() OVER (ORDER BY t.object_id))
FROM sys.tables t
CROSS JOIN sys.stats s;

INSERT INTO dbo.iTarget (ID, Name)
SELECT TOP 10000
       ROW_NUMBER() OVER (ORDER BY t.object_id DESC) AS rownumber, -- Done in descending order
       'Name_' + CONVERT(VARCHAR(4), ROW_NUMBER() OVER (ORDER BY t.object_id))
FROM sys.tables t
CROSS JOIN sys.stats s;

SELECT ID, Name FROM iSource;
SELECT ID, Name FROM iTarget;
So we now have a source and target table with different Names and we’ll look to update the iTarget table with the information coming from iSource.
Method 1 – MERGE Statement
MERGE dbo.iTarget AS target
USING
(
    SELECT ID, Name
    FROM dbo.iSource
) AS source (ID, Name)
ON (target.ID = source.ID)
WHEN MATCHED AND target.Name <> source.Name THEN
    UPDATE SET Name = source.Name
WHEN NOT MATCHED THEN
    INSERT (ID, Name)
    VALUES (source.ID, source.Name);
Running this method in SSMS for simplicity, Profiler output just 2 rows (Batch Starting and Batch Completed), with a CPU time of 125 ms and a duration of 125 ms, and it updated 6678 records. Top stuff, as expected.
Method 2 – SSIS SCD Component
I rebuilt the tables to put them back to where we started and set about creating the same thing with the SCD component, setting ID as the business key and Name as a changing attribute, and not setting inferred members. Below is a screenshot of the outcome:
I cleared down Profiler and ran the SSIS package, and the outcome was quite astounding.
Profiler output 13456 rows, including 6678 rows of queries like this:
exec sp_executesql N'SELECT [ID], [Name] FROM [dbo].[iTarget] WHERE ([ID]=@P1)',N'@P1 int',8
as well as 6678 rows of queries similar to this:
exec sp_execute 1,'Name_3304',3304
Total duration: 37 seconds (yes, that's seconds, not ms!)… and this is on a table of only ~7k rows!
Well, I'll be damned: the SCD component basically runs a cursor, looping through each record, checking for a match on ID and updating that record if one is found. I can't actually believe that Microsoft have built a component which performs in this way.
So, to answer the question asked – “why didn't I use SSIS's built-in Slowly Changing Dimension (SCD) component?” – I now have a definitive answer: it doesn't perform!
I'm sure the SCD component has its place, but for my requirements and the datasets I'm working on, I think I'll stick with MERGE for now…
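For completeness, here's another of those “many ways to skin a cat”: the same synchronisation written as a separate set-based UPDATE and INSERT against the demo tables above. A sketch only – I haven't profiled this one alongside the other two:

```sql
-- Update rows whose Name differs in the source
UPDATE t
SET    t.Name = s.Name
FROM   dbo.iTarget AS t
JOIN   dbo.iSource AS s ON s.ID = t.ID
WHERE  t.Name <> s.Name;

-- Insert rows present in the source but missing from the target
INSERT INTO dbo.iTarget (ID, Name)
SELECT s.ID, s.Name
FROM   dbo.iSource AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.iTarget AS t WHERE t.ID = s.ID);
```

Like MERGE, both statements are set-based, so I'd expect performance in the same ballpark as Method 1 rather than the row-by-row pattern the SCD component produced.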
NOTE: This was done on SQL Server 2008 R2 Developer Edition running on Windows 7 Ultimate. I'm not sure whether SQL Server 2012 has improved SCD performance, but I'll leave that for another day.
Being a 2011 finalist, I felt I should try to rally all those who truly are exceptional to get their nominations in, and quick sharp, as the closing date is fast approaching.
I was lucky enough to be nominated for this award last year and wasn't going to follow it through, as I felt I didn't really stand a chance. But when I sat and thought about it: if someone is willing to think of you as being exceptional at what you do, enough so to nominate you, then why not? What's the worst that can happen?!
The level of talent out there is phenomenal, and the 4 guys I was up against last year are up there with the best in the world. Don't let that put you off, though. I feel that this award is very much focused towards those in the USA, and not many actually make it through to the finals from the UK (Kevan Riley – Blog / Twitter – and myself are, I think, the only two!), so I think we need to give it a bigger push this year and try to get more than one finalist from the UK.
If you haven't been nominated by one of your peers then nominate yourself – there's no rule saying you can't, and in fact Redgate encourage it.
Get entered, get the questions answered and cross your fingers!