SQLGeordie's Blog

Helping the SQL Server community……where I can!

Developing Microsoft SQL Server 2012 Databases (70-464) – My Thoughts — October 30, 2012

Developing Microsoft SQL Server 2012 Databases (70-464) – My Thoughts

I finally pulled my finger out and took the last exam of the MCSE SQL Server 2012 – Data Platform certification last week and passed with a score of 876 which is pretty respectable in my opinion 😉

The exam consisted of 3 sections:

  • Section 1 – 6 scenario-based questions
  • Section 2 – 7 scenario-based questions
  • Section 3 – 32 generic questions

So, for those of you clever enough to work it out, that’s a grand total of 45 questions. I’m pleased to say that once again the quality of the questioning was to Microsoft’s usual standard, not in terms of difficulty but more in terms of complete irrelevance!

I’m bound by NDA so can’t go into specifics, but I’ll try and provide an example of this complete and utter irrelevance without giving the game away:

    Q: What is your favourite holiday destination?

    A (Select one of the following):

  • Green
  • Green with a bit of Yellow
  • Green with a bit of Blue
  • Sky Blue with Pink dots

Hmmmm, bit of a tricky one here. From what I can tell, none of the answers bear any relevance whatsoever to the question…….hmmmmm……..suppose I’ll have to take a random guess and hope that the answer I select is the one they’ve set as being correct!!!

It was just a good job I wasn’t on the threshold of pass/fail as this could potentially have been the difference. From the actual question it should have been a simple answer, but I suppose I’ll never know whether my random guess worked out :(. I just hope the comments I left regarding this issue are taken up and fixed so others don’t have the same problem.

All in all I felt comfortable throughout the exam, but I do know for a fact that my knowledge of assemblies/CLR has slipped significantly and I need a refresher.

So that’s it. SQL Server 2012 MCSE done and dusted and no more exams……well, for the next 3 years at least. So what’s next? Not sure. There are the MCSM exams but, to be honest, I’ve got no formal certifications in SQL Server 2005/2008, yet I know for a fact that I have far more knowledge and experience of their features than I do of SQL Server 2012’s, which I suppose backs up the original issue I have with these exams. Yes I studied and yes I did learn a lot from doing so, but in no way, shape or form would I class myself as an “Expert” in SQL Server 2012 – I don’t believe anyone could!

    As per usual, any thoughts or comments are welcome.

    Implementing a Data Warehouse with Microsoft SQL Server 2012 exam (70-463) – My Thoughts — September 13, 2012

    Implementing a Data Warehouse with Microsoft SQL Server 2012 exam (70-463) – My Thoughts

Well, I finally got around to completing the MCSA aspect of the SQL Server 2012 certification and I’m pleased to say I passed with flying colours. As some of you may be aware, I managed to nab and pass 3 of the Beta exams (70-461, 70-462 and 70-465) back in April and decided to see the MCSE through.

I really wasn’t sure how this exam was going to go as I’ve been working a lot recently with MDS 2012 and SSIS 2008. I revised the new 2012 features but went in with no real expectations. The exam consisted of 55 questions, again ranging from multiple guess and “select the 3 things you’d do, in order”, to a new feature I’ve not seen before: a drag n drop facility for building an SSIS control flow, which I thought was nifty.

The area I thought I’d struggle on was DQS but in fact I found that aspect relatively simple; the difficult area for me was the “select the 3 things you’d do in order” questions relating to the new Project Deployment area of SSIS 2012. I’ve done a fair bit of “tinkering” with this over the last few months but it’s obvious I’m not as proficient as I thought, as with certain questions I found it difficult to get my head around what the answers were suggesting. I obviously did OK in this area (according to the score sheet) but at the time I was sweating a bit.

    Anyone wanting hints and tips, I obviously can’t go into detail but I’d definitely brush up on the new features of SSIS 2012!!! 

    Oh, and anyone wanting to know, the pass mark is 700 – none of the Beta exams told you this and I know some have said it was actually 800……

    Now onto 70-464 – Developing Microsoft SQL Server 2012 Databases, to complete the SQL Server 2012 MCSE certification!!!!

    SSIS SCD vs MERGE Statement – Performance Comparison — July 3, 2012

    SSIS SCD vs MERGE Statement – Performance Comparison

I wouldn’t class myself as an expert in SSIS, but I certainly know my way around, and I came across something today which I thought I’d share. As with a lot of things there are “many ways to skin a cat”, none of which I’ll go into at the moment; what I will concentrate on is updating columns in a table where the data has changed in the source.

One of the projects I’m currently working on requires this very process, and when I set about doing so I created a T-SQL MERGE statement to do the business. However, the question was raised: why didn’t I use SSIS’s built-in Slowly Changing Dimension (SCD) component? I didn’t really have an answer other than personal preference, so I decided to delve into it a bit further and compare the performance of each method.

As a test, I created source and target tables, each with an ID (key) and Name column:

    USE TempDB;
    
    IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'dbo.iSource') AND type in (N'U'))
    	DROP TABLE dbo.iSource;
    
    CREATE TABLE dbo.iSource
    (
       ID INT,
       Name varchar(100)
    );
    
    IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'dbo.iTarget') AND type in (N'U'))
    	DROP TABLE dbo.iTarget;
    	
    CREATE TABLE dbo.iTarget
    (
       ID INT,
       Name varchar(100)
    );
    

    and populated it with some dummy data:

INSERT INTO dbo.iSource (ID,Name)
SELECT TOP 10000
ROW_NUMBER() OVER (ORDER BY t.object_id) AS rownumber
,'Name_'+CONVERT(varchar(5),ROW_NUMBER() OVER (ORDER BY t.object_id)) --varchar(5): row 10000 has five digits, varchar(4) would overflow
FROM sys.tables t
CROSS JOIN sys.stats s;

INSERT INTO dbo.iTarget (ID,Name)
SELECT TOP 10000 
ROW_NUMBER() OVER (ORDER BY t.object_id DESC) AS rownumber --Done in descending order
,'Name_'+CONVERT(varchar(5),ROW_NUMBER() OVER (ORDER BY t.object_id))
FROM sys.tables t
CROSS JOIN sys.stats s;
    
    SELECT ID, Name FROM iSource;
    SELECT ID, Name FROM iTarget;
    

    So we now have a source and target table with different Names and we’ll look to update the iTarget table with the information coming from iSource.

    Method 1 – MERGE Statement

    MERGE dbo.iTarget AS target
    	USING (
    	SELECT ID, Name
    	FROM dbo.iSource
    	 ) AS  source (ID, Name)
    		ON (target.ID = source.ID)
    		WHEN MATCHED AND target.Name <> source.Name 
    		THEN 
    			UPDATE SET Name = source.Name
    	 WHEN NOT MATCHED THEN 
    		 INSERT (ID, Name)
    		 VALUES (source.ID, source.Name); 
    

Running this method in SSMS for simplicity, Profiler output 2 rows (Batch Starting and Batch Completed), with a CPU time of 125ms and a duration of 125ms, and it updated 6678 records. Top stuff, as expected.
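
If you want to sanity-check that updated-row figure yourself (this query is my addition, not part of the original test), a quick mismatch count between the two tables before running the MERGE does the job:

```sql
-- My addition: count the rows whose Name differs between source and target.
-- Run before the MERGE, this is the number of rows the WHEN MATCHED branch will update.
SELECT COUNT(*) AS RowsToUpdate
FROM dbo.iSource AS s
INNER JOIN dbo.iTarget AS t ON t.ID = s.ID
WHERE t.Name <> s.Name;
```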

    Method 2 – SSIS SCD Component
I rebuilt the tables to put them back to where we started and set about creating the same thing with the SCD component, setting ID as the business key and Name as a changing attribute, and not setting inferred members. Below is a screen dump of the outcome of this:

    BEFORE:

I cleared down Profiler, ran the SSIS package, and the outcome is quite astounding.

    DURING/AFTER:

    The profiler output 13456 rows including 6678 rows of queries like this:

    exec sp_executesql N'SELECT [ID], [Name] FROM [dbo].[iTarget] WHERE ([ID]=@P1)',N'@P1 int',8

    as well as 6678 rows of queries similar to this:

    exec sp_execute 1,'Name_3304',3304

    Total Duration of 37 seconds (yes that’s seconds not ms!!)…….and this is on a table of only ~7k rows!

    Well I’ll be damned, the SCD basically runs a cursor looping each record checking for a match on ID and updating that record if so. I can’t actually believe that MS have built a component which performs in this way.
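
To illustrate the point (this is my own rough sketch, not the actual code SSIS emits), the behaviour seen in Profiler is much the same as this row-by-row loop:

```sql
-- My sketch only: the SCD component behaves roughly like a cursor,
-- issuing one singleton lookup (the sp_executesql seen in Profiler)
-- and one update/insert per source row
DECLARE @ID int, @Name varchar(100);

DECLARE scd_cur CURSOR FAST_FORWARD FOR
    SELECT ID, Name FROM dbo.iSource;

OPEN scd_cur;
FETCH NEXT FROM scd_cur INTO @ID, @Name;

WHILE @@FETCH_STATUS = 0
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.iTarget WHERE ID = @ID)
        UPDATE dbo.iTarget SET Name = @Name WHERE ID = @ID AND Name <> @Name;
    ELSE
        INSERT dbo.iTarget (ID, Name) VALUES (@ID, @Name);

    FETCH NEXT FROM scd_cur INTO @ID, @Name;
END

CLOSE scd_cur;
DEALLOCATE scd_cur;
```

One round trip per row is exactly why the duration balloons from milliseconds to seconds as the table grows.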

So, to answer the question asked, “why didn’t I use SSIS’s built-in Slowly Changing Dimension (SCD) component?”, I now have a definitive answer: it doesn’t perform!

    I’m sure SCD has its place but for me, the requirements and the datasets I’m working on I think I’ll stick with MERGE for now….. 🙂

    NOTE: This was done on SQL Server 2008R2 Developer Edition running on Windows 7 Ultimate, not sure if SQL Server 2012 has improved the SCD performance but I’ll leave that for another day.

    It’s that time of year…..Exceptional DBA Awards 2012 — June 25, 2012

    It’s that time of year…..Exceptional DBA Awards 2012

Being a 2011 finalist, I felt I should try and rally all those who truly are exceptional to get their nominations in quick sharp, as the closing date is getting close.

I was lucky enough to be nominated for this award last year and wasn’t going to follow it through as I felt I didn’t really stand a chance. But when I sat and thought about it: if someone is willing to think of you as being exceptional at what you do, enough so to nominate you, then why not? What’s the worst that can happen!!??!!

The level of talent out there is phenomenal and the 4 guys I was up against last year are up there with the best in the world. Don’t let that put you off though; I feel that this award is very much focused towards those in the USA and not many from the UK actually make it through to the finals (Kevan Riley Blog / Twitter and myself I think are the only two!), so I think we need to give a bigger push this year and try and get more than one finalist from the UK 🙂

    If you haven’t been nominated by one of your peers then nominate yourself, there’s no rule saying you can’t and in fact Redgate encourage it.

    Get entered, the questions answered and cross your fingers!

    Good luck!!!!

    And the results are in…..SQL Server 2012 beta exams —

    And the results are in…..SQL Server 2012 beta exams

    Well after being a bit late in trying to book the beta exams I managed to get three of the five I needed for the MCSE data platform booked, taken and I’m pleased to say passed!

Unfortunately, the fact I did 461, 462 and 465 means I don’t actually come away with any certification, as I need 463 for the MCSA and then 464 to complete the MCSE.

From what others in the field have said about 463, it’s very SSIS-orientated, which as it happens works out well for me as the project I’m currently working on is primarily SSIS, so when things quieten down a bit I’ll look to get it booked. As for 464, I may very well look to batter that one out around the same time so it’s done n dusted.

I’ve never really been a big fan of Microsoft certification and my mindset hasn’t changed much. The questioning is still vague at times and, in my opinion, done in such a way that it’s a test of whether you can read a question and do what Microsoft believe is the best way to do things. I still don’t feel the questions always give enough information for you to give the best solution; in the real world there are a hell of a lot more questions I’d be asking in some of the scenarios before I could make a correct decision.

    Anyway, enough whinging. Time to do some proper work 🙂

    Querying Microsoft SQL Server 2012 Beta exam (70-461 / 71-461) – My Thoughts — April 13, 2012

    Querying Microsoft SQL Server 2012 Beta exam (70-461 / 71-461) – My Thoughts

Well, I’ve now done the final SQL Server 2012 exam I managed to get a slot booked for. The Querying Microsoft SQL Server 2012 exam wasn’t going to be my strongest subject as I’m more of a DBA than a Developer, but I felt it went quite well.
The exam consisted of 55 questions, varying in structure from multiple guess to drag n drop. There were only about 5 or 6 questions I left comments on relating to the content not being clear, typos or, in one instance, an actual mistake in the question, so all in all a better setup than the Administrator exam I took first off (70-462 / 71-462).

The biggest issue I found was my own fault: I didn’t revise the syntax of the new 2012 T-SQL functionality. Don’t get me wrong, I know I’ve got a lot of them right, but with some, although I’d narrowed the answer down to 2 of the 4, I didn’t know it well enough to be 100% certain as there was only 1 word different in the syntax, which I’m a bit disappointed with……..but no-one to blame but myself 🙂

    I’m still not sure whether the pass mark is 70% or 80% and hoping I’ve answered enough of the non-2012 questions correctly to scrape through.

As always, I’d be interested to hear other people’s thoughts on any of the 2012 exams they’ve taken so far…..

    Designing Database Solutions for Microsoft SQL Server 2012 Beta exam (70-465 / 71-465) – My Thoughts — April 6, 2012

    Designing Database Solutions for Microsoft SQL Server 2012 Beta exam (70-465 / 71-465) – My Thoughts

After sitting the Administering Microsoft SQL Server 2012 Databases Beta exam (71-462) on Monday, I was still a little disappointed with Microsoft’s approach to questioning for these exams. So I went into this exam with pretty much the same mindset: that the questions were going to be vague and in some cases completely wrong.

Much to my surprise, I found the questions in this exam far, far better. The exam itself was split into sections: 44 questions in total, 26 standard multiple choice plus 5 further scenario-based sections, each with either 3 or 4 questions. Section one was much the same as the 71-462 exam but I felt the questions were, in the majority, more concise and in my opinion gave enough information to make a valid judgement when answering. I did leave a few comments, as there were a few questions that could do with a bit more work and a couple of typos.

The scenario sections again provided enough information to select the relevant answers. My only criticism: on question 2 of my second scenario there was a major typo in the answers, which didn’t sway me away from the right answer but requires sorting.

Now, the main thing that really got me with this exam was the amount of SQL Azure questions in section one. It was not mentioned as a skill measured, so that needs looking into in my opinion: either add it as a skill measured or remove it from the exam.

    A much more enjoyable exam than the administrator, mainly due to the higher level of quality in the questioning resulting in far fewer comments being left and for me, I love the scenario based questions!!

As always, I’d be interested to hear other people’s thoughts on any of the 2012 exams they’ve taken so far…..

    Administering Microsoft SQL Server 2012 Databases Beta (70-462 / 71-462) – My Thoughts… — April 4, 2012

    Administering Microsoft SQL Server 2012 Databases Beta (70-462 / 71-462) – My Thoughts…

On Monday I did my first Microsoft SQL Server exam since my SQL Server 2000 exams many moons ago and I can’t believe the quality of questioning hasn’t changed one bit! Don’t get me wrong, it wasn’t all plain sailing, but for me what made it difficult was the vagueness of the questions and, in some cases, blatant mistakes! I understand the SQL Server 2012 exams are still in Beta but some of these errors are not simple spelling mistakes – although there were in fact quite a few typos in my exam.

    As an example (without giving the game away), one question refers to SQL Server 2012 and SQL Server 2000, but the answers did not refer to SQL Server 2000 at all. I re-read the question over and over to see if I was missing something but I’m confident that it was a mistake in the question.

The exam itself consisted of 56 questions and I felt the allotted time was sufficient for this. What I didn’t find sufficient was the time allocated for leaving comments, and from speaking to others in the field, they’ve felt the same. During the exam I wrote comments down for around 25 of the questions in prep for leaving comments at the end, but you’re only given 10 (or was it 15?) minutes to actually enter the comments into the system, and with the machine (or could have been the application) being as slow as it was, this meant I had to type quicker than I’ve ever typed in my life :).

I didn’t get through all the comments, so I picked out the ones I felt needed highlighting the most and got those done first, managing about 12 of them before the time ran out. I was a little disappointed with this as, for me, that time should be sufficient for examinees to relay their comments fully so Microsoft can take the points / mistakes on board and rectify them before they launch the live exams in June.

I was also a little surprised that I wasn’t given the test score there and then (I have to wait until the exams actually go live). I’m not quite sure why Microsoft have chosen to do it that way but I’m sure they have their reasons; I’m guessing it’s so people can’t relay answers to others who are taking the exam, but no reason was given so I can only speculate.

So, all in all a good test of your administrative knowledge, but in my opinion there’s a lot of work to be done to rectify the issues before they go live. It would be interesting to hear other people’s thoughts on any of the 2012 exams they’ve taken so far…..

    How to output from invoke-sqlcmd to Powershell variable — February 3, 2012

    How to output from invoke-sqlcmd to Powershell variable

Sorry for another Powershell post, but I’ve been doing a lot of it recently and coming up with (what I think are) a few nifty tricks.

One of the issues I encountered recently was with Kerberos delegation whilst trying to automate Log Shipping. What I was trying to do was use an OPENROWSET query to run against the Primary and Secondary servers in order to obtain the Primary_id and Secondary_id to pass to the script to be run on the monitor server. However, seeing as the environment was not set up for Kerberos, I encountered the “double-hop” issue.

Enabling Kerberos delegation for the service account would be too high a risk without thorough testing, so it wasn’t an option in this instance. Instead, I decided to look into using invoke-sqlcmd against each of the servers to get the IDs required and pass them to the monitor script.

So how did I go about doing this, you ask? Well, it’s actually really simple. After a bit of googling I came across this blog by Allen White which gave me a starting block.

    Firstly, you have to amend your TSQL script to SELECT the parameter you want to output and use within the rest of the script, something like this:

    TSQL snippet to be ran against the Primary Server:

    --Cut down version of the script for readability
    EXEC @SP_Add_RetCode = master.dbo.sp_add_log_shipping_primary_database 
    		@database = N'$(Database)' 
    		...
    		,@primary_id = @LS_PrimaryId OUTPUT --This is what we want
    		,@overwrite = 1 
    		,@ignoreremotemonitor = 1 
    
    --Need to output this in order for powershell to take it and use it in the monitor script
    SELECT @LS_PrimaryId as LS_PrimaryId 
    
    

    Do the same for the script to run on the secondary server but obviously for the secondary_id 🙂

    So, now you’ve setup the TSQL side of things, you need to then call these from Powershell and assign the output parameter to a Powershell variable like so:

    
    $script = "LogShip_Primary.sql"
    $PrimaryID = Invoke-Sqlcmd -InputFile $ScriptLocation$script -Variable Database=$DatabaseName, etc etc etc -ServerInstance $PrimaryServer 
    
    $script = "LogShip_Secondary.sql" 
    $SecondaryID = Invoke-Sqlcmd -InputFile $ScriptLocation$script -Variable Database=$DatabaseName, etc etc etc -ServerInstance $SecondaryServer
    
    

So, relatively simple: basically you’re assigning the output to a Powershell variable. Keeping things tidy, re-assign it to another variable, and something to note is that the output is actually a DataTable object, so make sure you use the name of the alias you used in your last TSQL statement.

    
$PriID = $PrimaryID.LS_PrimaryId   # Don't use $PID here: it's a read-only automatic variable in Powershell
$SecID = $SecondaryID.LS_SecondaryId


Once this is done then you can use these in your script to run against the monitor server


$script = "LogShip_Monitor.sql"
Invoke-Sqlcmd -InputFile $ScriptLocation$script -Variable Database=$DatabaseName, etc etc etc, PrimaryID=$PriID, SecondaryID=$SecID -ServerInstance $MonitorServer
    
    

And there you have it, nice n simple! All you then have to do is wrap it in a foreach loop for the databases you want to set up and you’ve got a nice, simple automated log shipping build script.
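
As a rough sketch (my own illustration: the server names, script location and database list below are placeholders, not from the original scripts), the foreach wrapper could look something like this:

```powershell
# Hypothetical wrapper - all names/paths here are placeholders
$ScriptLocation  = "C:\LogShipScripts\"
$PrimaryServer   = "PRIMARYSQL"
$SecondaryServer = "SECONDARYSQL"
$MonitorServer   = "MONITORSQL"

foreach ($DatabaseName in @("SalesDB", "OrdersDB"))
{
    # Grab the IDs from the primary and secondary servers
    $PrimaryID   = Invoke-Sqlcmd -InputFile ($ScriptLocation + "LogShip_Primary.sql")   -Variable "Database=$DatabaseName" -ServerInstance $PrimaryServer
    $SecondaryID = Invoke-Sqlcmd -InputFile ($ScriptLocation + "LogShip_Secondary.sql") -Variable "Database=$DatabaseName" -ServerInstance $SecondaryServer

    # Pass both IDs through to the monitor script
    Invoke-Sqlcmd -InputFile ($ScriptLocation + "LogShip_Monitor.sql") `
        -Variable "Database=$DatabaseName", "PrimaryID=$($PrimaryID.LS_PrimaryId)", "SecondaryID=$($SecondaryID.LS_SecondaryId)" `
        -ServerInstance $MonitorServer
}
```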

    Obviously I’ve omitted a lot of the setup / checking of scripts etc from this post as I don’t want to be doing all the work for you!

    Enjoy 🙂

    So then, what’s the definition of an object……..? — January 25, 2012

    So then, what’s the definition of an object……..?

Not blogged for a while due to client and project commitments, but something which has surprised me when speaking with colleagues both past and present is that when I mention the built-in function OBJECT_DEFINITION, the majority of DBAs haven’t heard of it, never mind used it. So I felt it necessary to dust off the blog typing fingers and see if I can enlighten 🙂

So, I thought it would be a good idea to show how it can be used by giving real world examples.

    Firstly, a short definition (no pun intended) from BOL (http://msdn.microsoft.com/en-us/library/ms176090.aspx) as to what exactly this function does:

    Returns the Transact-SQL source text of the definition of a specified object.

It’s as simple as that!

Pass in the Object_ID, which it expects to be in the current database context, and it spits out the text. I’ll show you a couple of examples of how it works in comparison to how I’ve seen the same thing done using sp_helptext as well as some of the other system tables.

I’ll not beat around the bush and will get straight into a few examples and old skool alternatives, as there’s not really much more I can say about the function itself:

    Example 1 – OBJECT_DEFINITION

    SELECT OBJECT_DEFINITION(OBJECT_ID('usp_StoredProcedureName'))
    


    Example 2 – sp_helptext

    EXEC sp_helptext 'usp_StoredProcedureName'
    


    Example 3 – Using system tables to search (this is a common way I’ve seen this done)

SELECT  o.[name]
	  , o.type_desc
	  , sc.[text]
FROM  sys.objects o
INNER JOIN syscomments sc ON o.[object_id] = sc.id --NB: syscomments is deprecated and splits long definitions over multiple 4000-character rows
WHERE o.type_desc = 'SQL_STORED_PROCEDURE'
	  AND o.[name]  = 'usp_StoredProcedureName'
    


    Example 4 – OBJECT_DEFINITION for multiple objects

    SELECT [object_ID], [Name], OBJECT_DEFINITION([object_ID]) AS ProcText
    FROM sys.procedures
    


    Example 5 – OBJECT_DEFINITION for multiple with filtering

    SELECT [object_ID], [Name],  OBJECT_DEFINITION([object_ID]) AS ProcText
    FROM sys.procedures
    WHERE OBJECT_DEFINITION([object_ID]) LIKE '%CATCH%'
    


    Example 6 – OBJECT_DEFINITION to Script out Procedures

    SET NOCOUNT ON;
    
    DECLARE @strSQL NVARCHAR(MAX)
    
    SET @strSQL = ''
    SELECT @strSQL += OBJECT_DEFINITION([object_ID])+CHAR(10)+'GO'+CHAR(10)
    FROM sys.procedures
    
    SELECT @strSQL
    


Now, this can be used for all programmability objects within SQL Server, not just procedures, so the same works for views, functions, triggers etc.

Again from BOL, here is the full list:

  • C = Check constraint
  • D = Default (constraint or stand-alone)
  • P = SQL stored procedure
  • FN = SQL scalar function
  • R = Rule
  • RF = Replication filter procedure
  • TR = SQL trigger (schema-scoped DML trigger, or DDL trigger at either the database or server scope)
  • IF = SQL inline table-valued function
  • TF = SQL table-valued function
  • V = View
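
To tie the list back to the function, here’s a sketch (the column choices are mine) that pulls the definitions for several of these object types in one pass:

```sql
-- Pull definitions for several programmability object types at once.
-- Note: OBJECT_DEFINITION returns NULL for objects it can't script (e.g. CLR or encrypted objects)
SELECT  o.[name]
      , o.[type]
      , o.type_desc
      , OBJECT_DEFINITION(o.[object_id]) AS ObjectText
FROM sys.objects o
WHERE o.[type] IN ('P', 'V', 'FN', 'IF', 'TF', 'TR');
```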

    So there you have it, short n snappy blog today and I really hope that it helps give people a new insight into how to get object text.