Archive for the ‘ DBA ’ Category

Technical Debt, Normalization Debt, Requirements Debt, Git-r-done Debt, OH MY!

An individual I respect a great deal recently commented on a blog post of mine in a way I didn’t see coming.  It’s interesting how, once you take a moment to get what’s in your brain onto paper, it can take on new meaning for others.  His comment struck a chord with me and put into words something that I talk about all of the time, but he did it in a manner that was concise, relevant and easy to digest.

The basic premise revolves around “debt”, technical debt to be specific.  In my initial blog post I was attempting to make the case for the critical nature of Finding Functional Dependencies as they relate to a business, and the nature of that business as it is persisted, stored and understood in the form of data.  The original content that spurred this post can be found in chapter three of the book “SQL Server MVP Deep Dives”, authored by Hugo Kornelis.

Cutting to the chase, the point of this post is partially to share my opinion, but it’s also my hope that it will resonate with a larger audience and that we can begin to clearly communicate the danger of technical debt.

As defined in 1992 by Ward Cunningham, technical debt was described as:  “Shipping first-time code is like going into debt.  A little debt speeds development so long as it is paid back promptly with a rewrite.  The danger occurs when the debt is not repaid.  Every minute spent on not-quite-right code counts as interest on that debt.  Entire engineering organizations can be brought to a stand-still under the debt load of an unconsolidated implementation, object-oriented or otherwise.”

Over the years I have had countless conversations about incurring the debt defined above with a boss, a client, a co-worker, the business and even CxO-level executives.  The outcome is rarely any different from one time to another, and it typically boils down to “is it good enough?”.  I like to compare this question to that of a moped in the Rocky Mountains on a beautiful July afternoon…. Is it enough?  Heck yeah, it’s perfect!  I get the sunlight reflecting from the Ray-Bans, the wind through my hair, absolutely nothing inhibiting my view of the beautiful Rocky Mountains.  Then time goes on, and in the middle of December I find myself in -32 degree weather with the wind blowing at 50 mph and my wife telling me that it’s time to head to the emergency room, as I am an expecting father.  Suddenly that moped is just a bit short of being adequate.  What to do?  Well, there’s no time to head to the car dealership.  There’s no time to put studs on the moped and build a little canopy around it.  There’s no time for anything other than what I have immediately available.  In this case, not enough thought had been put into the original decision.

Granted, that story is a bit beyond silly when compared to the amount of thought that most organizations and individuals put into their technical, architectural and design considerations.  In my experience the problem isn’t the amount of respect that a decision is given, but rather the context and understanding of the subject.  Another example that illustrates this well outside of the technical arena begins just like the first story above.  A young man moves from South America.  Let’s say he’s moving from Quito, Ecuador to Vail, Colorado.  This guy brings his lovely new bride to Vail, CO, they buy a little cabin and decide that transportation will be needed.  He thinks about the elevation and decides that since Vail, CO has an altitude of 8,200 feet and Quito, Ecuador is roughly 9,895 feet in altitude, then he shouldn’t have to worry about cold weather… after all, the higher in the atmosphere you go, the colder it gets, and since he is decreasing in altitude, he should be fine.  A logical deduction, but unfortunately he has a missing data element that he’s unaware of that completely changes the equation.  Latitude.  A simple little word that’s easy to understand for those folks who live in an area where it changes their sleeping patterns, eating patterns, etc… but for this guy, having come from Quito, Ecuador, which happens to be on the equator, latitude had no bearing or effect.

I hope that those illustrations helped to get the point across that technical debt is not something to be trifled with.  The moment that you need the capital that should have been invested in your solution, product or service offering, and you find yourself in debt, it will undoubtedly be at a very inopportune time, when the answers might not be acceptable.

Over the years I have had the pleasure of arguing this exact scenario before, but without the language or the concise understanding of what I was trying to communicate.  I would often default to my position on modular design and loosely coupled methodologies, especially in an iterative development process.  While I don’t have empirical proof of this, it appears to me that as technology continues to mature at an overwhelmingly rapid pace, technical debt is being incurred not only in design considerations but at an even more expensive level: the feeling of being “cool”.  I must admit, like every driven and successful IT person out there, I LOVE to be working on the newest, fastest, best stuff I can find.  I often find myself asking the questions: is this sound in light of what this client needs?  Is this sound in light of where they are?  Would it make more sense to S.T.O.P. for a moment and address what isn’t fun in order to do the right thing?  Perhaps moving to the “cloud” isn’t as important as addressing the normalization debt that has already been incurred.  Perhaps that new third-party application really won’t solve all of the problems the sales guy said it would… perhaps it will simply be one more thing added to the VISA.

tick.tock.  Every moment that goes by, every line of code that’s written against the initial short-cut is interest.  That’s a lot of interest.  Stop compounding your debt’s interest and be the unpopular one who stands up for doing it right the first time.  Part of that process, in my experience, is first and foremost to admit that you don’t know it all, and more importantly to admit that you might not have all of the facts necessary to make the correct decision.  Don’t let that be a reason to shy away from this.  Let that be the reason to embrace it!  Think of what can be learned, think of what you can contribute, and more importantly think of how driven and excited you will be knowing that you are paying down your company’s technical debt!

 tick.tock.


Meta Data – Cardinality Explored

Last night I had the privilege of speaking at the Colorado Springs SQL Server Users Group. The subject was Meta Data – Cardinality Explored.

This topic is a very enjoyable one to me. Along with the DMVs that SQL Server now provides, the meta-data about your database implementations is critical to their scalability, stability, accuracy and performance.

Over the years, I have found that cardinality is not a topic that is well understood, and I believe it’s not discussed all that often. It is an absolutely critical data-set for the developers, DBAs, testers and performance testers out there.
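As a quick taste of what the session covers, here is a small sketch of how cardinality can be measured directly, and how SQL Server’s statistics expose the same information. The table, column and statistics names here are hypothetical placeholders, not objects from the presentation script:

```sql
-- Sketch (hypothetical names): measure the cardinality of a column and its
-- density (1 / distinct count), the figure the optimizer leans on when
-- estimating row counts.
SELECT COUNT(*)                                     AS total_rows,
       COUNT(DISTINCT StatusCode)                   AS distinct_values,
       1.0 / NULLIF(COUNT(DISTINCT StatusCode), 0)  AS density
FROM dbo.Orders;

-- The same information, as the optimizer sees it, lives in the statistics:
DBCC SHOW_STATISTICS ('dbo.Orders', 'IX_Orders_StatusCode');
```

Comparing the measured density against what the statistics report is a quick way to spot stale or misleading statistics.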

Attached to this post you will find my SQL script and PowerPoint slide deck. I’d recommend starting with the slide deck… walk through it, and when you see “demo”, open up the SQL file and learn how this result set can make an impact on the databases that you are responsible for or those that you are developing against.

Have fun, and if you have questions about the presentation or the SQL code I put together, feel free to comment!

Thanks again to the terrific folks down in Colorado Springs for the warm welcome and not falling asleep as I droned on and on. =)

CSSQLUG_June09
MetaData – Cardinality Explored – SQL

Schema Management (DDL Triggers)

Over the years I have used a variety of tools for schema management.  The best one that I relied on for quite a while was from Red Gate.  They offer schema-compare and data-compare tools that really make quick work of determining the delta between two environments.

Recently I’ve been working with DB Pro to manage this, and thus far I’ve been very, very impressed.  However, it has some downside, primarily the cost.  Some of my clients just can’t justify / stomach the dollars involved.  So, to fill this gap I have been using DDL triggers: an automated log, stored in a database table, that captures DDL changes (CREATE, DROP, ALTER).   This concept is different from a traditional trigger in that it only fires for database events, not data events, meaning that the scope of the trigger is at the schema (structure) level, not the data (DML) level.

The value that it has provided some of my clients is simply the certainty that they know what happened in a specific environment and it saves a great deal of time when it comes to debugging, troubleshooting or otherwise trying to figure out why something that worked perfectly yesterday in the database does not work correctly today.

There are a few steps that need to be taken to make this work.  The first is to decide whether you want to store this data in its own database or in a table in an existing database.  It can be done either way.  In the code below you will see that I’ve created a standalone database with one table in it.

CREATE DATABASE [DDL_ChangeLog]
GO

USE DDL_ChangeLog;
GO

CREATE TABLE DDLChangeLog (
    DDLChangeLogID int IDENTITY(1,1) NOT NULL,
    EventType nvarchar(100),
    ObjectName nvarchar(100),
    ObjectType nvarchar(100),
    DatabaseName nvarchar(100),
    SchemaName nvarchar(100),
    SystemUser nvarchar(255),
    AppName nvarchar(255),
    HostName nvarchar(255),
    tsql nvarchar(MAX),
    CreatedDate datetime,
    CONSTRAINT [PK_DDL_Changelog] PRIMARY KEY CLUSTERED (DDLChangeLogID ASC)
);
The code above will simply create a database and a table.

Once you have that in place, then choose the database that you would like to track ddl changes on and create this database trigger.

 

CREATE TRIGGER [trgLogDDLEvent] ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
DECLARE @data XML;
SET @data = EVENTDATA();

IF @data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)') <> 'CREATE_STATISTICS'
    INSERT INTO DDL_ChangeLog..DDLChangeLog
        (EventType, ObjectName, ObjectType, DatabaseName, SchemaName,
         SystemUser, AppName, HostName, tsql, CreatedDate)
    VALUES (
        @data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)'),
        @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(100)'),
        @data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'nvarchar(100)'),
        @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'nvarchar(100)'),
        @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(100)'),
        SYSTEM_USER, APP_NAME(), HOST_NAME(),
        @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'nvarchar(max)'),
        GETDATE());
GO

ENABLE TRIGGER [trgLogDDLEvent] ON DATABASE
GO
So, let’s test this out.

With the database that you just created the trigger on, let’s do something simple like this:

SELECT 1 AS test INTO TestTable;

SELECT * FROM TestTable;

DROP TABLE TestTable;

SELECT * FROM DDL_ChangeLog..DDLChangeLog

Wow!  There is data there; not just any data, it should have returned two rows, even though we ran three statements in our test.  That is because the trigger only fired for the CREATE TABLE (the SELECT INTO) and the DROP TABLE statements.  The trigger will not fire for the SELECT statement.
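Once the log has a bit of history in it, simple queries pay off quickly.  As a sketch (the names match the table created above), here is one way to answer the classic “who changed what in the last day?” question:

```sql
-- Sketch: what DDL changed in the last 24 hours, and who made the change?
SELECT CreatedDate, SystemUser, HostName, AppName,
       EventType, DatabaseName, SchemaName, ObjectName, tsql
FROM DDL_ChangeLog..DDLChangeLog
WHERE CreatedDate >= DATEADD(hour, -24, GETDATE())
ORDER BY CreatedDate DESC;
```

The tsql column holds the exact statement that fired the trigger, which is usually all that is needed to explain why something that worked yesterday does not work today.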

I initially started using these for my own reasons / purposes; however, as time has gone on, I’ve found that there are other audiences that LOVE knowing this can be tracked… Auditors love this functionality, QA loves being able to verify that the black box they are testing has some controls / audit processes around it, and management loves it because they have hard data to take back to the business in the event of an outage, mistake or whatever.  I’ve found that it’s typically best to be as transparent as possible with changes / modifications and the state of the databases that I’m responsible for.  In this manner, I’m able to better serve my clients, and in turn they are more likely to believe / trust my opinion when it counts.

Policy Management

In November, I had the opportunity to present at DevConnections.  I spoke on two topics, Policy Management and Plan Guides.  Both of these are new to SQL Server 2008, and I really love both of these additions to the SQL Server platform.

In this blog post I have attached two files; one of them is my PowerPoint slide deck and the other is the script I used for the demos.

This was a very difficult talk to give as the concept is/was relatively new and the amount of information available was scarce when I was developing the content.   

So, what is Policy Management?  Microsoft’s definition is as follows:  “Policy-Based Management is a policy-based system for managing one or more instances of SQL Server 2008.  Use this with SQL Server Management Studio to create policies that manage entities on the server, such as the instance of SQL Server, databases, and other SQL Server objects.”  What this definition leaves out is something of great importance: this new feature CAN be used with earlier versions, provided that the facet you want to throw a policy around is available in that version.  Meaning, if you want to monitor data files or schema changes on a SQL Server 2005 instance, you can!

I found there to be three components to Policy Management.  

  1. Policy Management  — Creation of conditions and policies
  2. Explicit Administration — One to many managed targets
  3. Automated Administration — Automate enforcement of policies

For each of these there are components – Refer to the slide deck and Microsoft’s site for more information.  

One of the biggest helps that I had when learning Policy Management was the following CTE:

USE msdb;
GO
WITH automatedpolicyexecutionmode (modeid, modename)
AS
    (SELECT * FROM (VALUES (0, 'On Demand'),
                           (1, 'Enforce Compliance'),
                           (2, 'Check on Change and Log'),
                           (4, 'Check on Schedule and Log'))
     AS em (modeid, modename))
SELECT dmf.management_facet_id AS facetid, dmf.name AS facetname, apemode.modename
FROM syspolicy_management_facets AS dmf
    INNER JOIN automatedpolicyexecutionmode AS apemode
        ON dmf.execution_mode & apemode.modeid = apemode.modeid
ORDER BY dmf.name, apemode.modename;

The results of this query list every available facet and the modes it can be run in.  This is a huge help, as going through the GUI to see whether something is available for a certain mode can take a while.

So far I have successfully implemented policies against a host of the facets offered.  Microsoft is going to continue to build on this initial release, so look for new facets to be added over time.

Enjoy Policy-Based Management, and may it save you a great deal of time, decrease your daily / weekly checklist of items and enable you to provide further value to your organization.

If you walk through the slide deck and the demo script, you can quickly see how easy it is to create a policy.  Once you have one created (IN DEVELOPMENT), =), evaluate the policy in two ways.  The first time, make sure the evaluation succeeds.  The second time, break it, so that the evaluation of the policy is false.  Once you have done this, run the query below.

 USE MSDB;
GO
SELECT a.execution_date AS 'Date Run' , 
                  c.name AS 'Policy' , 
                  a.target_query_expression AS 'Policy Failure Targets' , 
                  d.name as 'Condition',
                  d.description as 'Condition Description', 
                  d.facet as 'Condition Facet'
FROM syspolicy_policy_execution_history_details_internal a
              INNER JOIN syspolicy_policy_execution_history_internal b ON a.history_id = b.history_id
              INNER JOIN syspolicy_policies_internal c ON b.policy_id = c.policy_id
                           INNER JOIN syspolicy_conditions d ON c.condition_id = d.condition_id
WHERE a.result = 0
ORDER BY a.execution_date DESC, c.name ASC, d.name ASC 

Attachments for this post ::

Script: policymanagementscript2 — the majority of the T-SQL in this document can be found below.

Slide Deck: sql-connections-conference-slides-policy-management2  


USE msdb;
GO
WITH automatedpolicyexecutionmode (modeid, modename)
AS
 (select * from (values  (0, 'On Demand'),
    (1,'Enforce Compliance'),
    (2, 'Check on Change and Log'),
    (4, 'Check on Schedule and Log'))
          as em(modeid, modename))
     SELECT dmf.management_facet_id as facetid, dmf.name as facetname, apemode.modename
     FROM syspolicy_management_facets as dmf
        INNER JOIN automatedpolicyexecutionmode as apemode on dmf.execution_mode & apemode.modeid=apemode.modeid
     ORDER BY dmf.name, apemode.modename

-- Demo 1
-- Greenfield development
-- Standard Naming convention
-- Create Conditions
-- Create Policy


use DemoDB
go

-- Successful Object Creation
-- Table
CREATE TABLE UserTable_test_me3 (test int)
GO

-- Procedure
CREATE Procedure PRC_Test_proc AS SELECT GETDATE()
GO

-- View
CREATE VIEW UserView_Test AS SELECT test from UserTable_test_me3
GO

-- Failed Object Creation due to Policy enforcement
-- Table
CREATE TABLE test_me3 (test int)
GO

Create Procedure Test_proc AS select GETDATE()
GO

Create View VW_Test as select test from UserTable_test_me3
GO

Create View Test as select test from UserTable_test_me3
GO

-- Fails
EXEC sp_rename 'UserTable_test_me3', 'test_me3'
GO 

--  Check MSDB for policy execution Failures

USE msdb;
GO

     SELECT a.execution_date AS 'Date Run' , c.name AS 'Policy', a.target_query_expression AS 'Policy Failure Targets',
                     D.name as 'Condition', D.description as 'Condition Description', D.facet as 'Condition Facet'
      FROM syspolicy_policy_execution_history_details_internal a
        INNER JOIN syspolicy_policy_execution_history_internal b ON a.history_id = b.history_id
        INNER JOIN  syspolicy_policies_internal c  ON b.policy_id = c.policy_id
        INNER JOIN syspolicy_conditions d  ON c.condition_id = d.condition_id
    WHERE a.result = 0 AND a.execution_date >= GETDATE() -1
    ORDER BY a.execution_date DESC, c.name ASC, a.target_query_expression ASC;
GO

-- Existing environment somewhat different ::

select * from msdb.dbo.syspolicy_policies_internal order by date_created desc;
GO
-- Delete the policy
 -- Can use SSMS to do this as well.
Declare @Object_set_ID_Delete INT
set @Object_set_ID_Delete = (select object_set_id from msdb.dbo.syspolicy_policies_internal where name = 'Standard Naming Convention')

EXEC msdb.dbo.sp_syspolicy_delete_policy @name= 'Standard Naming Convention'
EXEC msdb.dbo.sp_syspolicy_delete_object_set @object_set_id = @Object_set_ID_Delete
GO
-- Delete the Condition
 -- Can use SSMS to do this as well.
 EXEC msdb.dbo.sp_syspolicy_delete_condition @name = 'Standard - Naming Convention'

-- Create an Existing environment

Create Database Demo_existing
GO

USE Demo_existing
GO
create table test_me3 (test int)
GO

Create Procedure Test_proc AS select GETDATE() as The_Time_is_Now
GO

Create View VW_Test as select test from test_me3
GO

Create View Test as select test from test_me3
GO

-- Re-Create Condition and Policy
-- Using T-SQL versus SSMS
  

Declare @condition_id int
EXEC msdb.dbo.sp_syspolicy_add_condition @name=N'Standard - Naming Convention', @description=N'Enforcement of Standard Naming Convention

The Object you attempted to create was rolled back.

This Policy enforces the following naming standards ::

Object names MAY NOT begin with "tbl" OR "sp" OR "vw"

AND

Object Names MUST begin with "UserTable" OR "PRC" OR "UserView"

', @facet=N'IMultipartNameFacet', @expression=N'<Operator>
  <TypeClass>Bool</TypeClass>
  <OpType>AND</OpType>
  <Count>2</Count>
  <Group>
    <TypeClass>Bool</TypeClass>
    <Count>1</Count>
    <Operator>
      <TypeClass>Bool</TypeClass>
      <OpType>OR</OpType>
      <Count>2</Count>
      <Operator>
        <TypeClass>Bool</TypeClass>
        <OpType>OR</OpType>
        <Count>2</Count>
        <Operator>
          <TypeClass>Bool</TypeClass>
          <OpType>NOT_LIKE</OpType>
          <Count>2</Count>
          <Attribute>
            <TypeClass>String</TypeClass>
            <Name>Name</Name>
          </Attribute>
          <Constant>
            <TypeClass>String</TypeClass>
            <ObjType>System.String</ObjType>
            <Value>tbl%</Value>
          </Constant>
        </Operator>
        <Operator>
          <TypeClass>Bool</TypeClass>
          <OpType>NOT_LIKE</OpType>
          <Count>2</Count>
          <Attribute>
            <TypeClass>String</TypeClass>
            <Name>Name</Name>
          </Attribute>
          <Constant>
            <TypeClass>String</TypeClass>
            <ObjType>System.String</ObjType>
            <Value>vw%</Value>
          </Constant>
        </Operator>
      </Operator>
      <Operator>
        <TypeClass>Bool</TypeClass>
        <OpType>NOT_LIKE</OpType>
        <Count>2</Count>
        <Attribute>
          <TypeClass>String</TypeClass>
          <Name>Name</Name>
        </Attribute>
        <Constant>
          <TypeClass>String</TypeClass>
          <ObjType>System.String</ObjType>
          <Value>sp%</Value>
        </Constant>
      </Operator>
    </Operator>
  </Group>
  <Group>
    <TypeClass>Bool</TypeClass>
    <Count>1</Count>
    <Operator>
      <TypeClass>Bool</TypeClass>
      <OpType>OR</OpType>
      <Count>2</Count>
      <Operator>
        <TypeClass>Bool</TypeClass>
        <OpType>OR</OpType>
        <Count>2</Count>
        <Operator>
          <TypeClass>Bool</TypeClass>
          <OpType>LIKE</OpType>
          <Count>2</Count>
          <Attribute>
            <TypeClass>String</TypeClass>
            <Name>Name</Name>
          </Attribute>
          <Constant>
            <TypeClass>String</TypeClass>
            <ObjType>System.String</ObjType>
            <Value>UserTable%</Value>
          </Constant>
        </Operator>
        <Operator>
          <TypeClass>Bool</TypeClass>
          <OpType>LIKE</OpType>
          <Count>2</Count>
          <Attribute>
            <TypeClass>String</TypeClass>
            <Name>Name</Name>
          </Attribute>
          <Constant>
            <TypeClass>String</TypeClass>
            <ObjType>System.String</ObjType>
            <Value>UserView%</Value>
          </Constant>
        </Operator>
      </Operator>
      <Operator>
        <TypeClass>Bool</TypeClass>
        <OpType>LIKE</OpType>
        <Count>2</Count>
        <Attribute>
          <TypeClass>String</TypeClass>
          <Name>Name</Name>
        </Attribute>
        <Constant>
          <TypeClass>String</TypeClass>
          <ObjType>System.String</ObjType>
          <Value>PRC%</Value>
        </Constant>
      </Operator>
    </Operator>
  </Group>
</Operator>', @is_name_condition=0, @obj_name=N'', @condition_id=@condition_id OUTPUT
Select @condition_id

GO 

--
-- Condition was created with an ID of XX.
--

Declare @object_set_id int
EXEC msdb.dbo.sp_syspolicy_add_object_set @object_set_name=N'Standard Naming Convention_ObjectSet_1', @facet=N'IMultipartNameFacet', @object_set_id=@object_set_id OUTPUT
Select @object_set_id

Declare @target_set_id int
EXEC msdb.dbo.sp_syspolicy_add_target_set @object_set_name=N'Standard Naming Convention_ObjectSet_1', @type_skeleton=N'Server/Database/StoredProcedure', @type=N'PROCEDURE', @enabled=False, @target_set_id=@target_set_id OUTPUT
Select @target_set_id

EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database/StoredProcedure', @level_name=N'StoredProcedure', @condition_name=N'', @target_set_level_id=0
EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database', @level_name=N'Database', @condition_name=N'', @target_set_level_id=0

EXEC msdb.dbo.sp_syspolicy_add_target_set @object_set_name=N'Standard Naming Convention_ObjectSet_1', @type_skeleton=N'Server/Database/Synonym', @type=N'SYNONYM', @enabled=False, @target_set_id=@target_set_id OUTPUT
Select @target_set_id

EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database/Synonym', @level_name=N'Synonym', @condition_name=N'', @target_set_level_id=0
EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database', @level_name=N'Database', @condition_name=N'', @target_set_level_id=0

EXEC msdb.dbo.sp_syspolicy_add_target_set @object_set_name=N'Standard Naming Convention_ObjectSet_1', @type_skeleton=N'Server/Database/Table', @type=N'TABLE', @enabled=False, @target_set_id=@target_set_id OUTPUT
Select @target_set_id

EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database/Table', @level_name=N'Table', @condition_name=N'', @target_set_level_id=0
EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database', @level_name=N'Database', @condition_name=N'', @target_set_level_id=0

EXEC msdb.dbo.sp_syspolicy_add_target_set @object_set_name=N'Standard Naming Convention_ObjectSet_1', @type_skeleton=N'Server/Database/UserDefinedFunction', @type=N'FUNCTION', @enabled=False, @target_set_id=@target_set_id OUTPUT
Select @target_set_id

EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database/UserDefinedFunction', @level_name=N'UserDefinedFunction', @condition_name=N'', @target_set_level_id=0
EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database', @level_name=N'Database', @condition_name=N'', @target_set_level_id=0

EXEC msdb.dbo.sp_syspolicy_add_target_set @object_set_name=N'Standard Naming Convention_ObjectSet_1', @type_skeleton=N'Server/Database/UserDefinedType', @type=N'TYPE', @enabled=False, @target_set_id=@target_set_id OUTPUT
Select @target_set_id

EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database/UserDefinedType', @level_name=N'UserDefinedType', @condition_name=N'', @target_set_level_id=0
EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database', @level_name=N'Database', @condition_name=N'', @target_set_level_id=0

EXEC msdb.dbo.sp_syspolicy_add_target_set @object_set_name=N'Standard Naming Convention_ObjectSet_1', @type_skeleton=N'Server/Database/View', @type=N'VIEW', @enabled=False, @target_set_id=@target_set_id OUTPUT
Select @target_set_id

EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database/View', @level_name=N'View', @condition_name=N'', @target_set_level_id=0
EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database', @level_name=N'Database', @condition_name=N'', @target_set_level_id=0

EXEC msdb.dbo.sp_syspolicy_add_target_set @object_set_name=N'Standard Naming Convention_ObjectSet_1', @type_skeleton=N'Server/Database/XmlSchemaCollection', @type=N'XMLSCHEMACOLLECTION', @enabled=False, @target_set_id=@target_set_id OUTPUT
Select @target_set_id

EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database/XmlSchemaCollection', @level_name=N'XmlSchemaCollection', @condition_name=N'', @target_set_level_id=0
EXEC msdb.dbo.sp_syspolicy_add_target_set_level @target_set_id=@target_set_id, @type_skeleton=N'Server/Database', @level_name=N'Database', @condition_name=N'', @target_set_level_id=0
GO
Declare @policy_id int
EXEC msdb.dbo.sp_syspolicy_add_policy @name=N'Standard Naming Convention', @condition_name=N'Standard - Naming Convention', @execution_mode=1, @is_enabled=True, @policy_id=@policy_id OUTPUT, @object_set=N'Standard Naming Convention_ObjectSet_1'
Select @policy_id

-- The next piece will Update the Targets for the policy
GO
EXEC msdb.dbo.sp_syspolicy_update_target_set @target_set_id=128, @enabled=TRUE
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=128, @type_skeleton=N'Server/Database/StoredProcedure', @condition_name=N''
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=128, @type_skeleton=N'Server/Database', @condition_name=N''

EXEC msdb.dbo.sp_syspolicy_update_target_set @target_set_id=129, @enabled=TRUE
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=129, @type_skeleton=N'Server/Database/Synonym', @condition_name=N''
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=129, @type_skeleton=N'Server/Database', @condition_name=N''

EXEC msdb.dbo.sp_syspolicy_update_target_set @target_set_id=130, @enabled=TRUE
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=130, @type_skeleton=N'Server/Database/Table', @condition_name=N''
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=130, @type_skeleton=N'Server/Database', @condition_name=N''

EXEC msdb.dbo.sp_syspolicy_update_target_set @target_set_id=131, @enabled=TRUE
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=131, @type_skeleton=N'Server/Database/UserDefinedFunction', @condition_name=N''
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=131, @type_skeleton=N'Server/Database', @condition_name=N''

EXEC msdb.dbo.sp_syspolicy_update_target_set @target_set_id=132, @enabled=TRUE
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=132, @type_skeleton=N'Server/Database/UserDefinedType', @condition_name=N''
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=132, @type_skeleton=N'Server/Database', @condition_name=N''

EXEC msdb.dbo.sp_syspolicy_update_target_set @target_set_id=133, @enabled=TRUE
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=133, @type_skeleton=N'Server/Database/View', @condition_name=N''
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=133, @type_skeleton=N'Server/Database', @condition_name=N''

EXEC msdb.dbo.sp_syspolicy_update_target_set @target_set_id=134, @enabled=TRUE
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=134, @type_skeleton=N'Server/Database/XmlSchemaCollection', @condition_name=N''
EXEC msdb.dbo.sp_syspolicy_update_target_set_level @target_set_id=134, @type_skeleton=N'Server/Database', @condition_name=N''
GO
EXEC msdb.dbo.sp_syspolicy_update_policy @policy_id=62, @execution_mode=1, @schedule_uid=N'00000000-0000-0000-0000-000000000000'
GO

-- Verify that the new policy is working as intended

-- Will FAIL 
create table TEST_Post_NewPolicy (testcol int)
GO

-- Will succeed
create table UserTable_Post_NewPolicy (testcol int)
GO
-- Find an object to alter
Select * from sysobjects where uid = 1 and xtype in ('v','u','p')

-- Alter object to another invalid naming convention

-- Will FAIL
Alter Procedure Test_proc
AS select GETDATE()  as [When]

-- Will succeed
Drop Procedure Test_proc

-- Alter an Invalid object to a valid naming convention

-- Successful
EXEC sp_rename 'Test', 'UserView_Test'
GO 

-- Check the execution history

USE msdb;
GO

     SELECT   a.execution_date AS 'Date Run' , c.name AS 'Policy', a.target_query_expression AS 'Policy Failure Targets',
                  D.name as 'Condition', D.description as 'Condition Description', D.facet as 'Condition Facet'
     FROM syspolicy_policy_execution_history_details_internal a
        INNER JOIN syspolicy_policy_execution_history_internal b
           ON a.history_id = b.history_id
        INNER JOIN  syspolicy_policies_internal c
         ON b.policy_id = c.policy_id
       INNER JOIN syspolicy_conditions d
        ON c.condition_id = d.condition_id
     WHERE a.result = 0 AND a.execution_date >= GETDATE() -1
     ORDER BY a.execution_date DESC, c.name ASC, a.target_query_expression ASC;
GO

With the sql above and the slide deck that is attached to this post, you should be well on your way to creating your own Policies.

vConference – This week!

This is the week that the vConference will air.  SQL Server WorldWide Users Group (SSWUG) puts on the event, and this year I was chosen to speak.

If you have not already registered, you can do so here : https://www.vconferenceonline.com/upcoming.asp 

I have a discount code for you to use as well : SVPJLOSP09

The recording of the event was very different from an in-person conference.  It took me a few minutes to get used to the cameras and to not having feedback from an audience.  That said, I’m happy with my presentations, even though I moved a bit too fast and I felt that I didn’t focus on some of the points and bring them back around the way I was hoping to.

At any rate, enjoy the conference, there is no better value for the money! 

Later this week I’ll have a couple of posts up about my sessions at DevConnections.  They will get a lot more technical and, it’s my hope, be of use to you.

Time

 

Over the past five or six years I’ve put a great deal of thought into starting a blog… What would I write?  Would it be relevant?  Is it a waste of time?  etc… Many of you who read this have probably asked yourselves the same questions. 

As it turns out, I don’t have a great deal of margin or spare time in my daily life.  Even so, I believe that this discipline will help clients, associates, friends and perhaps even my mom better understand what I do for a living.

This blog will be technical in nature and more specifically it will focus on data becoming realized in an organization. 

I’ve been doing independent consulting for the most part over the past 7 years.  My focus over the past 10 years has been with relational database management systems (RDBMS).  I’ve had the opportunity to work with some of the best and brightest in the industry.  My specific focus is SQL Server, from ground-up development to migrations, performance tuning and everything in between.  I’ve thoroughly enjoyed it and learned a great deal.

Ok, so onto some content worth reading. 

Much like this blog, I’ve found that time is often hard to come by.  I don’t mean time to watch the new episode of The Apprentice or time to get in a round of golf after a work day and before the sun sets, but time on your servers.

Time to dig into your systems and provide value above and beyond your job description.

Time to analyze your environment with cutting edge technology / architecture.

Time to upgrade your skill set to take advantage of the new technology.

Time to tune your databases and identify recurring trends.

Time to think … versus reacting.

Time ….

Over the years I have learned that unlike the majority of commodities in this world, time is not a forgiving entity. 

As this blog matures, I will detail on the tools that I use and the methods employed to best manage the time that I have and have it be the most productive time spent.  I will also write about anything going on in my life as a small business owner and consultant.

I look forward to posting all things data related and humbly thank you for taking the time to read.

 
