I recently wrote some code with a typical data access layer as an interface to SQL Server. The code has no UI, and its pipeline includes various operations such as authentication, model validation and data aggregation. So now that the code is written, it's time for Unit Testing, right?
Anybody who has had to write Unit Tests has at some point run into trouble testing scenarios that involve some kind of server. The typical server issue in a Unit Test is the need to build and/or reset the state of the server so the data is "just right" before the test can do its job. That server might be IIS, SQL Server, SharePoint or something else, but as soon as any server enters the mix there is an immediate urge to pull back and start mocking things to make the server issues go away.
Unfortunately, when the server in question is a database, that means stored procedures, table/column definitions, unique and check constraints, the SQL data access layer, and transaction/nested-transaction commits and aborts won't get tested.
Since I wanted to test these things along with everything else in the pipeline, I thought LocalDB might be the answer to my issues. This article is about implementing that.
Generating/Updating the LocalDB Database
To start the process I created everything I needed in a local SQL Server instance. This makes it easy to implement and test since there are very good tools for the job.
Once I had all of it working in my local SQL Server, I created a Visual Studio 2013 SSDT 'SQL Server Database Project'. The great thing about this project type is how easy it makes moving changes from one database to another. In my case, every time I changed the database I needed to push the change to the VS database project (source control), to the SQL Azure database AND to my Unit Test LocalDB. Doing all of that takes less than 5 minutes per change.
One quirk I ran into with the 'SQL Server Database Project' is that Schema Compare wouldn't connect to the LocalDB database inside my Unit Test project. I eventually ended up leaving the LocalDB files in the APP_DATA folder of a Web project, applying the updates there, and then file-copying the database into the Unit Test project.
The database files are stored at the root of the Unit Test project. I would prefer to keep them in a folder, but idiosyncrasies with [DeploymentItemAttribute], plus differences in how the Unit Test runs in Release versus Debug, led me to leave them at the project root and configure [DeploymentItemAttribute] to copy the database files into the root of the test deployment location rather than into a folder.
Configure the LocalDB Database
Mark the database files as ‘Content’ and ‘Copy Always’ and attribute each Unit Test with:
[DeploymentItem(@"MyDatabase.mdf")] [DeploymentItem(@"MyDatabase_log.ldf")]
I left the connection string in the Unit Test project's app.config, and that presented the next challenge. When a Unit Test runs, the actual on-disk location of the database file varies. Combine that with the need to use AttachDbFilename in the LocalDB connection string and you could end up writing some interesting code that pulls out the connection string, figures out directories, and uses string.Format to doctor the connection string before use. However, the code that actually reads the connection string from configuration sits deep within the SQL data layer, and I didn't want to modify it to work in both Unit Test and Production. Thankfully I found the answer at http://stackoverflow.com/questions/12244495/how-to-set-up-localdb-for-unit-tests-in-visual-studio-2012-and-entity-framework. Using '|DataDirectory|' in the app.config connection string, combined with the following code in each test class, solved the problem.
    [ClassInitialize]
    public static void ClassSetup(TestContext context)
    {
        // Point |DataDirectory| at the per-run test deployment
        // directory so AttachDbFilename resolves correctly.
        AppDomain.CurrentDomain.SetData(
            "DataDirectory",
            Path.Combine(context.TestDeploymentDir, string.Empty));
    }
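For reference, the matching app.config entry looks something like this (a sketch; the connection name, LocalDB instance name and file name are assumptions):

    <connectionStrings>
      <add name="MyDatabase"
           connectionString="Data Source=(LocalDB)\v11.0;AttachDbFilename=|DataDirectory|\MyDatabase.mdf;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>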
Reset Database State for Each Test
So now the Unit Test can use the LocalDB, but I still need to reset the database state before each Unit Test runs. I could have worked out a scheme like detaching the database (if it's attached), recopying the database files and reattaching before each test, but it seemed easier to keep using the same database and simply empty all of the tables in [TestInitialize].
Unfortunately, all of the tables have identity columns, foreign keys, check constraints and all of the usual things you find in a real database. In particular, TRUNCATE TABLE fails on any table referenced by a foreign key constraint, so I couldn't just run a SQL script in [TestInitialize] that truncates every table.
I then decided I'd DELETE all of the rows from each table and use DBCC CHECKIDENT to reset the identity columns, so I could guarantee the row ids of objects inserted into the SQL tables (see the pattern below). This led me down an interesting path.
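The per-table reset pattern I tried first looks like this, with MyTable standing in for each table:

    DELETE FROM MyTable
    DBCC CHECKIDENT('MyTable', RESEED, 0)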
Look at the documentation on DBCC CHECKIDENT and you'll find, in essence, the following: after DBCC CHECKIDENT('MyTable', RESEED, 0), if no rows have been inserted since the table was created (or all rows were removed with TRUNCATE TABLE), the first row inserted uses the reseed value itself as its identity; otherwise, the next row inserted uses the reseed value plus the current increment.
That text is inconsistent with the behavior of SQL Server 2014 and LocalDB v11.0; the two engines disagree with each other, so at most one of them can be following the documentation. I didn't test any other versions of SQL Server or LocalDB, so I don't know if they have these issues as well, but the actual behavior (you can decide for yourself which engine is obeying the documentation, as it's still not clear to me) when using "DBCC CHECKIDENT('MyTable', RESEED, 0)" after "DELETE FROM MyTable" is:
- SQL Server 2014 – The next row inserted has a row id of 1
- LocalDB v11.0 – The next row inserted has a row id of 0!!!!
What? I didn't even know it was possible to have a row id of 0. After many trials and tribulations, and wondering if I needed to rethink using LocalDB, I came up with the following SQL script, which runs in [TestInitialize] before every test:
    BEGIN TRY
        BEGIN TRAN T1

        DECLARE @ID_TO_CHECK BIGINT

        -- Empty the table and reset its identity seed.
        DELETE FROM MyTable
        DBCC CHECKIDENT('MyTable', RESEED, 0)

        -- Probe: insert a row to see which seed behavior
        -- this engine exhibits.
        SAVE TRANSACTION T2
        INSERT INTO MyTable(Title) VALUES('test')
        SELECT @ID_TO_CHECK = MAX(MyTableId) FROM MyTable

        IF (@ID_TO_CHECK > 0)
        BEGIN
            -- SQL Server 2014 behavior: the probe row got id 1.
            -- Undo the probe and reseed so the next insert gets 1.
            ROLLBACK TRANSACTION T2
            DBCC CHECKIDENT('MyTable', RESEED, 0)
        END
        ELSE
            -- LocalDB v11.0 behavior: the probe row got id 0.
            -- Deleting it leaves the current identity at 0,
            -- so the next insert gets 1.
            DELETE FROM MyTable

        -- Do more tables

        COMMIT TRAN T1
    END TRY
    BEGIN CATCH
        DECLARE @ErrorMessage NVARCHAR(4000)
        DECLARE @ErrorSeverity INT
        DECLARE @ErrorState INT

        SELECT @ErrorMessage = ERROR_MESSAGE(),
               @ErrorSeverity = ERROR_SEVERITY(),
               @ErrorState = ERROR_STATE()

        ROLLBACK TRAN T1

        RAISERROR(@ErrorMessage, @ErrorSeverity, @ErrorState)
    END CATCH
This makes sure that when the test runs, the next row inserted into the table will have a row id of 1 (NOT 0!).
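For completeness, the [TestInitialize] hook that runs the reset script might look something like this (a sketch under assumptions: the script is deployed alongside the tests under the hypothetical name ResetDatabase.sql, and the connection string name matches the app.config entry above):

    // Continues the test class sketched earlier. Requires a
    // reference to the System.Configuration assembly.
    using System.Configuration;
    using System.Data.SqlClient;
    using System.IO;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    public partial class MyRepositoryTests
    {
        [TestInitialize]
        public void TestSetup()
        {
            // |DataDirectory| was already set in ClassSetup, so the
            // connection string resolves to the deployed .mdf file.
            var connectionString = ConfigurationManager
                .ConnectionStrings["MyDatabase"].ConnectionString;

            // "ResetDatabase.sql" is a hypothetical file name for the
            // reset script above, deployed via another [DeploymentItem].
            var script = File.ReadAllText("ResetDatabase.sql");

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(script, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }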
Comments

Nice article on LocalDB. I remember struggling with that a while ago. But why don't you use NDbUnit to set up the test data?
“So now that the code is written it’s time for Unit Testing right?”
No, unit tests should be written while you code.
“Anybody that has had to write Unit Tests has at some point encountered issues with testing scenarios that include some kind of server.”
No, those are INTEGRATION tests. There is a massive difference.
“No, those are INTEGRATION tests. There is a massive difference.”
Depends on the intent of the code to begin with. If it's a library atop another db library, then you want unit tests asserting that your code works as intended when it is used on a db engine. Thus, unit tests.