On Day #1 of CodePaLousa, I attended a session, Tools and Adoption Patterns for Automating Database Deployment, about automating database change. That’s more than I need to do right now; however, it gave me a +1 on how I’m currently managing change, which is my default method when the customer doesn’t provide one.
What I do:
- For every change, I start or append to a script file.
- When a change gets to prod or “moves on” to a branch, I start a different script file.
- Every script file must be re-runnable.
- Every script file must have a precondition check: something that must be true of the database before the script can run against it.
- For SQL Server, if the precondition check fails, I turn noexec on to prevent the rest of the script from executing.
- For SQL Server, statements that reference objects which may not exist yet are wrapped in exec() so that I don’t get compilation errors.
Here’s a sample, with names changed to protect the client:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

-- This is something from the most recent previous round of changes, which should already exist.
PRINT 'Precondition Check'
if (not exists (select 1 from INFORMATION_SCHEMA.COLUMNS
                where TABLE_NAME = 'Quadrant' and COLUMN_NAME = 'ImperialCode'))
begin
    PRINT '  Failed -- skipping rest of script'
    set noexec on
end
else
begin
    PRINT '  Passed'
end
GO

-- In support of TA26 (revision 1683):
PRINT 'Table NoRebelsAllowedRule'
if (not exists (select 1 from INFORMATION_SCHEMA.TABLES
                where TABLE_NAME = 'NoRebelsAllowedRule'))
begin
    CREATE TABLE [dbo].[NoRebelsAllowedRule](
        [NoRebelsAllowedRuleID] [int] IDENTITY(1,1) NOT NULL,
        [CatalogID] [int] NOT NULL,
        [DeletedOn] [datetime] NULL,
        [DeletedByUserID] [int] NULL,
        [CreatedOn] [datetime] NOT NULL,
        [CreatedByUserID] [int] NOT NULL,
        [UpdatedOn] [datetime] NULL,
        [UpdatedByUserID] [int] NULL,
        CONSTRAINT [PK_NoRebelsAllowedRule] PRIMARY KEY CLUSTERED
        (
            [NoRebelsAllowedRuleID] ASC
        ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
                ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    PRINT '  Created'
end
else
begin
    PRINT '  Already Exists'
end
GO

PRINT 'Foreign Key FK_NoRebelsAllowedRule_Catalog'
if (not exists (select 1 from INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS
                where CONSTRAINT_NAME = 'FK_NoRebelsAllowedRule_Catalog'))
begin
    ALTER TABLE [dbo].[NoRebelsAllowedRule] WITH CHECK
        ADD CONSTRAINT [FK_NoRebelsAllowedRule_Catalog] FOREIGN KEY ([CatalogID])
        REFERENCES [dbo].[Catalog] ([CatalogID])
    PRINT '  Created'
end
else
begin
    PRINT '  Already Exists'
end
GO

PRINT '04/23/13 Replacing Catalog.CannotFireBlasters with BlasterBehaviorID'
if (not exists (select 1 from INFORMATION_SCHEMA.COLUMNS
                where TABLE_NAME = 'Catalog' and COLUMN_NAME = 'BlasterBehaviorID'))
begin
    alter table [dbo].[Catalog] add BlasterBehaviorID INT NOT NULL DEFAULT 0;
    PRINT '  Added BlasterBehaviorID'
end
else
begin
    PRINT '  BlasterBehaviorID Already Exists'
end

if (exists (select 1 from INFORMATION_SCHEMA.COLUMNS
            where TABLE_NAME = 'Catalog' and COLUMN_NAME = 'CannotFireBlasters'))
begin
    -- BlasterBehaviorID does not exist when this batch is compiled,
    -- so the updates that reference it are wrapped in exec().
    exec (N'update [dbo].[Catalog] set BlasterBehaviorID = 0 where CannotFireBlasters = 0');
    exec (N'update [dbo].[Catalog] set BlasterBehaviorID = 3 where CannotFireBlasters = 1');
    PRINT '  Migrated Data from CannotFireBlasters'
    alter table [dbo].[Catalog] drop column CannotFireBlasters
    PRINT '  Dropped old column CannotFireBlasters'
end
else
begin
    PRINT '  Old column CannotFireBlasters does not exist'
end
GO

set noexec off
GO
Since writing the above sample, I’ve started adding dates and work item numbers into the script as well, to make things a bit more traceable. I figured as long as I had a sequence going, I’d append to the same script, though in an environment with multiple developers, you may want to partition more.
Why I didn’t use…
- Red Gate SQL Compare tools: I don’t own them.
- SQL Server projects in Visual Studio: when you actually get around to using them against real environments, there are a ton of problems around “Server” projects versus “Schema” projects (or some such), which are worse in 2012 than they used to be.
- EF Migrations: I haven’t spent the time to learn them; I’d rather be productive and stick with what I already know and can do quickly.
Hey Sunny,
Do you use source control for this? And do you use a templating or custom program to generate these scripts?
I agree that implementing ground-breaking patterns for huge, SQL-driven applications could be a bit too much overhead for the average data-driven application, but I do find myself missing source control, and I’d really like to reduce the number of steps to make SQL deployment and source control less tedious. For example, instead of 1. generating scripts in SQL, 2. wrapping them the way you like, and 3. adding them to source control, we’d have just one step: generate a wrapped script and add it to source control.
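That collapsed workflow doesn’t need much code. As a rough sketch of the idea (this is not any tool mentioned here; the template text and the `wrap_script` name are my own invention), a small Python script could take the precondition query plus the raw change script and emit the wrapped, commit-ready version:

```python
# Sketch: wrap a raw T-SQL change script in the precondition/noexec
# pattern from the post. All names here are hypothetical.

TEMPLATE = """PRINT 'Precondition Check'
if (not exists ({check}))
begin
    PRINT '  Failed -- skipping rest of script'
    set noexec on
end
else
begin
    PRINT '  Passed'
end
GO

{body}
GO

set noexec off
GO
"""

def wrap_script(check_query: str, body: str) -> str:
    """Return the change script wrapped so its body only runs when the
    precondition query finds at least one row."""
    return TEMPLATE.format(check=check_query.strip(), body=body.strip())

if __name__ == "__main__":
    check = ("select 1 from INFORMATION_SCHEMA.COLUMNS "
             "where TABLE_NAME='Quadrant' and COLUMN_NAME='ImperialCode'")
    body = "alter table [dbo].[Quadrant] add SectorCode INT NULL"
    print(wrap_script(check, body))
```

From there, writing the result to a file inside the repository makes “generate and commit” a single command.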
When I worked at Atria, a friend of mine (Michael, who also works at igNew) developed a little app that would export scripts to the particular format we used when deploying a database app. There, however, we didn’t even use source control for code (scary, huh? but it still worked). Maybe if we aren’t willing to shell out the big bucks (anything at Red Gate is too expensive), we’re just destined to write our own or deal with the tedium of source-controlling database scripts. I personally use SqlDelta at LeapFrog, so the process of merging/changing is separate from update scripts and source control, but I still keep the scripts in source control just for the sake of having the history.
Yes, I check the change scripts into source control, right next to the DTOs that they fill out, so it’s obvious they exist.
No, I don’t use a templating program to generate them, other than “copy definition to clipboard”, paste, and modify.
Since I wrote the post, I’ve learned to put each “feature” into its own script. I had to promote a performance fix from Dev to Release without taking anything else; it merged up with the rest of the code, I applied it, and it worked like a charm.
One of the factors to think about is “how often will I create a change?” and “how often will I apply it?” If the changes are few and far between, then hand-rolling things is fine. However, if you keep track of hours spent on this sort of stuff, all it takes is 20 hours at $100/hr and you’ve now justified some pretty big tools. Not to mention the reduction in stress.
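The break-even arithmetic behind that last point is worth making explicit; a two-line sketch (the `breakeven_hours` name is mine):

```python
def breakeven_hours(tool_cost: float, hourly_rate: float) -> float:
    """How many billable hours of hand-rolled script work pay for a tool."""
    return tool_cost / hourly_rate

# 20 hours at $100/hr covers a $2000 tool, as in the comment above.
print(breakeven_hours(2000, 100))  # → 20.0
```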
Back in my Perl and MySQL days, I worked at an organization where people would routinely “hack” production to fix something without going through the environments. I took on the role of “differ”: I wrote code to inspect all the code and schemas on the various servers, diff them against each other, and create a grid of what was different, to determine what the changes were. Then I would take the nerf bat, beat somebody on the head with it, and tell them to get their code into CVS. Which sounds horrible, but it’s just “change management”. Usually, source control is the agreed-to authority; in this case, “what was released” was the authority, and source control needed to catch up.
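The “differ” reduces to a small amount of code once each server’s object definitions have been dumped. A hypothetical sketch in Python (the original was Perl against MySQL; the `diff_grid` name and data shapes are mine):

```python
from typing import Dict, Optional

# server name -> {object name -> definition text}
Schemas = Dict[str, Dict[str, str]]

def diff_grid(schemas: Schemas) -> Dict[str, Dict[str, Optional[str]]]:
    """Return {object: {server: definition-or-None}} for every object
    whose definition is not identical across all servers."""
    all_objects = set()
    for objs in schemas.values():
        all_objects.update(objs)
    grid = {}
    for name in sorted(all_objects):
        versions = {srv: objs.get(name) for srv, objs in schemas.items()}
        if len(set(versions.values())) > 1:  # any disagreement (or absence)?
            grid[name] = versions
    return grid

if __name__ == "__main__":
    # Only get_user differs between the servers; orders matches everywhere.
    print(diff_grid({
        "dev":  {"get_user": "select * ...",  "orders": "id int"},
        "prod": {"get_user": "select id ...", "orders": "id int"},
    }))
```

The grid of mismatches is exactly the to-do list for getting source control caught up with what was released.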
SqlDelta? Filing away under “stuff to pull out the next time I need something like that” .. thanks!
Lol, the entire paragraph starting with “Back in my Perl…days…” pretty much explains why Michael wrote that little app we used: we had to keep close watch on production as well, due to hot fixes that were sometimes necessary but would otherwise have been wiped out by a production update. What you wrote seems very similar. He added in some context menu options, I believe, to open up diff/merge programs (I think his only hooked into existing programs that could be started with parameters; it didn’t do any diffing itself).
When two different people write the same thing, I always see an open source project 😀
Good point you make about how often something is done and where in the time/cost continuum it falls. I think instead of just saying, “but…but I want it,” those numbers may prove useful to plead my case next time.