This is such a simple thing that even asking here makes me feel stupid, but since I have been stuck on it for a long time, I will ask anyway. I am working on a data-tier application in Visual Studio. I have the usual things like tables, stored procs and some post-deployment data. By default, a data-tier application comes with a Scripts/Post-Deployment folder, and inside this folder there is a file called Script.PostDeployment.sql. Just to be a little more organised, I am creating folders inside Post-Deployment called StaticData and TestData; my insert statements for data creation are located inside these folders. Based on this structure, I am adding the following code to my Script.PostDeployment.sql:
/*
Post-Deployment Script Template
--------------------------------------------------------------------------------------
This file contains SQL statements that will be appended to the build script.
Use SQLCMD syntax to include a file in the post-deployment script.
Example: :r .\myfile.sql
Use SQLCMD syntax to reference a variable in the post-deployment script.
Example: :setvar TableName MyTable
SELECT * FROM [$(TableName)]
--------------------------------------------------------------------------------------
*/
:r .\StaticData\States.sql
:r .\TestData\Logins.sql
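For reference, the layout under Scripts looks like this:
Scripts\
    Post-Deployment\
        Script.PostDeployment.sql
        StaticData\
            States.sql
        TestData\
            Logins.sql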
The problem is that the :r references above do not work. For some strange reason, the deploy command just ignores the paths and looks for States.sql and Logins.sql in Scripts/Post-Deployment rather than in the appropriate subfolders. Has anyone else encountered anything similar? It is a very simple issue, but it is taking me forever to get around. I have tried my best to explain, but please ask questions and I will try to make things clearer. Thanks!
I took a look at your sample code. When I first tried to reproduce this, I was using a SQL 2008 database project in Visual Studio 2010, but your project is a data-tier application, and that is very different; when I switched to a data-tier application, I was able to reproduce what you're seeing.
Data-tier applications produce DAC packages that contain the definitions of objects and also contain user-defined scripts, like the pre- and post-deployment scripts. Now, I'm not 100% certain (I haven't used DAC packages before, so I'm basing this on observation and research), but my guess is that the file structure of the DAC package doesn't support sub-folders under the Scripts\Post-Deployment folder; I am assuming it has a pretty strict folder structure internally. Consequently, the DACCompiler appears to be designed to strip out just the filenames from the file references in your post-deployment script, ignoring the directory path.
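To make that concrete, here is what I think is happening to your references. This is based on the behaviour you described and my own observation, not on anything documented:
-- What you wrote in Script.PostDeployment.sql:
:r .\StaticData\States.sql
:r .\TestData\Logins.sql
-- What the deployment effectively behaves like, with the directory paths stripped:
:r .\States.sql
:r .\Logins.sql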
There is a whitepaper on data-tier applications here. In it is a section on adding a post-deployment script to the package, and in that section are some best practices, including the following:
• When you work in Solution Explorer, it is recommended that you include all post-deployment commands in the Script.PostDeployment.sql script file. This is because only one post-deployment file is included in the DAC package. In other words, you should not create multiple files.
Now, technically that is what the :r command does (it pulls the contents of the referenced files into that one script), but you may find it easier to just embed the commands directly into the file manually.
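So, as a workaround, your Script.PostDeployment.sql could simply contain the data statements themselves rather than the :r references. A rough sketch of what that might look like is below; the table and column names are just placeholders based on your folder names, not your actual schema:
-- Static data (previously in StaticData\States.sql)
INSERT INTO dbo.States (StateCode, StateName)
VALUES ('WA', 'Washington'),
       ('OR', 'Oregon');

-- Test data (previously in TestData\Logins.sql)
INSERT INTO dbo.Logins (LoginName, CreatedDate)
VALUES ('test.user@example.com', GETDATE());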
It's also possible that this is simply a bug in the design of the DACCompiler.
Here's what I recommend that you do:
• For now, work within the limitation and embed the contents of your data scripts directly in Script.PostDeployment.sql, along the lines of the sketch above.
• Post this as feedback on Microsoft Connect at http://connect.microsoft.com/SQLServer and recommend that the product team address it in a future version of the product. This is a great place to make these kinds of recommendations, because the feedback goes to the product team, the user community at large can vote on feedback to increase its weight, and the product team can communicate back to you with information about the feedback.
And, of course, you could hold out and see whether somebody else has a different answer, and if there is one, great! But if nobody else has responded yet, there probably isn't one; I certainly couldn't find anything in my digging.
I hope this information is helpful overall. I wish I could give you a way to make it work right now, but I think your best bet is to work within the limitations of the current design and post feedback to Connect.
Good luck.