
Schema Management (DDL Triggers)

Over the years I have used a variety of tools for schema management.  The best one, which I relied on for quite a while, was from Red Gate.  They offer schema compare and data compare tools that make quick work of determining the delta between two environments.

Recently I’ve been working with DB Pro to manage this, and thus far I’ve been very, very impressed.  However, it has some downsides, primarily the cost.  Some of my clients just can’t justify / stomach the dollars involved.  So, to fill this gap I have been using DDL triggers.  A DDL trigger feeds an automated log, stored in a database table, that captures DDL changes (CREATE, DROP, ALTER).   This concept differs from a traditional trigger in that it fires only for database events, not data events.  Meaning that the scope of this trigger is at the schema or structure level, not the data or DML level.
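To make that scope difference concrete, compare the two CREATE TRIGGER forms side by side.  The names and the table below are purely illustrative, and the bodies are stubbed out with a PRINT just to keep the sketch minimal:

-- DML trigger: scoped to data changes on one specific table.
-- (dbo.Orders is a hypothetical table used only for illustration.)
CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
	PRINT 'data changed';
GO

-- DDL trigger: scoped to schema changes across the whole database.
CREATE TRIGGER trg_DDL_Audit ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
	PRINT 'schema changed';
GO

Note that the DDL trigger is created ON DATABASE rather than on a table; that is what moves it from the DML level to the schema level.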

The value it has provided some of my clients is simply the certainty of knowing exactly what happened in a specific environment.  It also saves a great deal of time when debugging, troubleshooting, or otherwise trying to figure out why something that worked perfectly in the database yesterday does not work correctly today.

There are a few steps that need to be taken to make this work.  The first is to decide whether you want to store this data in its own database or in a table in an existing database.  It can be done either way.  In the code below you will see that I’ve created a stand-alone database with one table in it.



CREATE DATABASE ddl_changelog;
GO

USE ddl_changelog;
GO

CREATE TABLE DDLChangeLog (
	DDLChangeLogID int IDENTITY(1,1) NOT NULL,
	EventType nvarchar(100),
	ObjectName nvarchar(100),
	ObjectType nvarchar(100),
	DatabaseName nvarchar(100),
	SchemaName nvarchar(100),
	SystemUser nvarchar(255),
	AppName nvarchar(255),
	HostName nvarchar(255),
	tsql nvarchar(MAX),
	CreatedDate datetime,
	CONSTRAINT PK_DDLChangeLog PRIMARY KEY CLUSTERED
		(DDLChangeLogID ASC));
The code above will simply create a database and a table.

Once you have that in place, choose the database whose DDL changes you would like to track and create this database trigger on it.

CREATE TRIGGER trg_DDLChangeLog  -- trigger name is illustrative
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
	SET NOCOUNT ON;

	DECLARE @data xml;
	SET @data = EVENTDATA();

	IF @data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)') <> 'CREATE_STATISTICS'
		INSERT INTO ddl_changelog..DDLChangeLog
			(EventType, ObjectName, ObjectType, DatabaseName, SchemaName,
			 SystemUser, AppName, HostName, tsql, CreatedDate)
		SELECT
			@data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)'),
			@data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(100)'),
			@data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'nvarchar(100)'),
			@data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'nvarchar(100)'),
			@data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(100)'),
			system_user, app_name(), host_name(),
			@data.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]',
			            'nvarchar(max)'),
			getdate();
END;



So, let’s test this out.

With the database that you just created the trigger on, let’s do something simple like this:

SELECT 1 AS test INTO TestTable;

SELECT * FROM TestTable;

DROP TABLE TestTable;

SELECT * FROM ddl_changelog..DDLChangeLog;

Wow!  There is data there; not just any data, but it should have returned two rows, even though there are three statements in our test above.  This is because the trigger only fired for the CREATE TABLE (SELECT INTO) and the DROP TABLE statements.  The trigger will not fire for the SELECT statement.
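For the troubleshooting scenario described earlier, a filtered query against the log is usually all that is needed.  A sketch, assuming the table created above; the object name and seven-day window are placeholders to adjust for your case:

-- Who changed a specific object recently, and what exactly did they run?
SELECT EventType, ObjectName, SystemUser, HostName, tsql, CreatedDate
FROM ddl_changelog..DDLChangeLog
WHERE ObjectName = 'TestTable'                      -- placeholder object name
  AND CreatedDate >= DATEADD(day, -7, GETDATE())    -- placeholder window
ORDER BY CreatedDate DESC;

The SystemUser, HostName, and AppName columns are what make this useful in practice: they tell you not just what changed, but who changed it and from where.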

I initially started using these for my own reasons / purposes; however, as time has gone on, I’ve found that there are other audiences that LOVE knowing this can be tracked.  Auditors love this functionality.  QA loves being able to verify that the black box they are testing has some controls / audit processes around it.  Management loves it because they have hard data to take back to the business in the event of an outage, a mistake, or whatever.  I’ve found that it’s typically best to be as transparent as possible about changes / modifications and the state of the databases that I’m responsible for.  In this manner, I’m able to better serve my clients, and in turn they are more likely to trust my opinion when it counts.
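As a sketch of the kind of summary those audiences tend to ask for, here is a simple roll-up of changes per user per day against the log table created above:

-- Audit-style summary: how many DDL changes each user made, by day.
SELECT SystemUser,
       CONVERT(date, CreatedDate) AS ChangeDate,
       COUNT(*) AS ChangeCount
FROM ddl_changelog..DDLChangeLog
GROUP BY SystemUser, CONVERT(date, CreatedDate)
ORDER BY ChangeDate DESC, ChangeCount DESC;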
