I'm doing some work on an SSIS package, and I'm also updating the database schema it uses. The package isn't loading, and VS appears to be hung; the status bar says it is validating the package. Is this what offline mode is for?
It's been like this for 45 minutes. Software today should detect it's in this state and give the user the option to skip the validation. How long can it take to drop a column on a table with 150,208,363 rows? I guess when you have 8 indexes that include the column as part of a covering index, it can take a while.
Especially when running on virtualized machines....darn you EC2!
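Before dropping a column on a table that big, it helps to see up front which indexes reference it. A quick sketch against the catalog views (the table and column names are placeholders):

```sql
-- Find every index that references a given column, whether as a key
-- column or as an INCLUDE column, so you know what the DROP will touch.
SELECT i.name AS index_name, ic.is_included_column
FROM sys.indexes AS i
JOIN sys.index_columns AS ic
  ON ic.object_id = i.object_id AND ic.index_id = i.index_id
JOIN sys.columns AS c
  ON c.object_id = ic.object_id AND c.column_id = ic.column_id
WHERE i.object_id = OBJECT_ID('dbo.MyBigTable')  -- placeholder table
  AND c.name = 'ColumnToDrop';                   -- placeholder column
```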
Showing posts with label sql.
Saturday, December 3, 2011
Thursday, October 6, 2011
Adding a new GUID column in SSIS
I just discovered there is no way in SSIS (at least in SQL 2005) to add a new column of type GUID. I have found a few different workarounds, one of which I will share quickly.
- Add a new Script Component to your data flow and set the type to Transformation.
- Edit the component and go to the Inputs and Outputs tab. Under the output, add a new column and make sure it has a data type of GUID. (FYI, I have seen references saying the type is called DT_GUID; however, my copy of VS is not functioning properly: the data type field is all black, and when I click in it, the edit dialog closes. I worked around that by finding the output column in the package XML and setting its dataType attribute to "guid".)
- Go to the Script tab and, in the row-processing method, add Row.<YourColumnName> = Guid.NewGuid()
A very easy workaround.
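Put together, the script body is one line. A sketch of what the method looks like in a SQL 2005 script component (these use VB.NET; the buffer class name and the column name below depend on how you named your input and output, so treat them as placeholders):

```vb
' ProcessInputRow fires once per row flowing through the transformation.
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Assign a freshly generated GUID to the new output column.
    Row.MyGuidColumn = Guid.NewGuid()
End Sub
```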
Thursday, October 1, 2009
Just trust me!!
Today I was working with my intern. We are using DotNetNuke to run a marketing site so that our product team can go in and tweak some things. We have a custom workflow in there, and a requirement to save the results of the workflow into our own table so we can report on it later.
The workflow component has some facilities to do this, so he creates a table and a stored proc and gives it a shot. He is instantly greeted with a lovely error in the front end: Invalid Syntax Near xyz.
I tell him that's not good, as it indicates the potential for a SQL injection attack. Fast forward three hours, and he is arguing with me that it's safe, that it can't be injected, and demanding I prove to him it can be done.
Not wanting to take the time or effort to figure out the magic sequence, as this isn't something I do every day (contrary to popular belief, I don't sit around trying to hack things), I tell him to trust me: parsing errors are easy to inject into. He doesn't buy it, and I can see he is going to be stubborn until I prove him wrong.
Fast forward ten minutes, and I find that injecting into this forms component is as simple as entering a string like the following for the last field:
injectComing' select * from aspnet_users--
Watching with Profiler, we quickly saw three key events. First was a Batch Starting, which looked something like this:
exec myProc '','','','injectComing' select * from aspnet_users--';
Second was a SQL Stmt Starting event:
exec myProc '','','','injectComing'
Third was another SQL Stmt Starting event:
select * from aspnet_users--';
Needless to say, he is now working on a custom module so we can execute the SQL as a parameterized query and avoid all this headache. Moral of the story? While you shouldn't always trust everyone, you should trust your boss, who has 10 more years of experience. And if you don't, and you try to prove him wrong, don't demand your boss prove himself.
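For contrast, a parameterized call closes the hole, because the input travels to the server as data rather than being spliced into the batch text. A minimal sketch (it assumes myProc takes four varchar parameters; the parameter names, sizes, variables, and connection string here are all made up for illustration):

```csharp
// Needs references to System.Data and System.Data.SqlClient.
// The user's input is sent as a typed parameter value, so a string like
// "injectComing' select * from aspnet_users--" stays a plain string
// and never becomes executable SQL.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("myProc", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@field1", SqlDbType.VarChar, 100).Value = field1;
    cmd.Parameters.Add("@field2", SqlDbType.VarChar, 100).Value = field2;
    cmd.Parameters.Add("@field3", SqlDbType.VarChar, 100).Value = field3;
    cmd.Parameters.Add("@field4", SqlDbType.VarChar, 100).Value = userInput;
    conn.Open();
    cmd.ExecuteNonQuery();
}
```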
Gahh Interns!
Wednesday, August 26, 2009
Duration in RPC:Completed is Misleading
I discovered something today while troubleshooting a long-running query. We have a specific dialog in our application which, in one specific case, takes 60 seconds to open. I fire up SQL Profiler with the standard events just to verify whether this is a DB issue or not.
I find that there is a specific query which does take 60 seconds to run according to the RPC:Completed event (we are using SQL Server 2005). So I pull that query out and run it inside SSMS, and it completes in under a second. These are the issues I hate having to troubleshoot. The first thing was to rule out our web application, so I create a little .NET Windows Forms app and basically run this:
// sqlCmd is a SqlCommand wrapping the captured query text
sqlCmd.Connection.Open();
using (var reader = sqlCmd.ExecuteReader())
{
    int i = 0;
    // Drain the result set as fast as possible, counting rows
    while (reader.Read())
    {
        i++;
    }
    MessageBox.Show(i.ToString());
}
The query runs fine, no issue. Now, we are actually using NHibernate, so I wanted to simulate us processing the result set. I add a small Thread.Sleep(5) inside the read loop (my result set has 1500 rows, so even a small sleep adds up). I run the query and wait about eight seconds.
Then I go and look at the SQL Profiler RPC:Completed (and Stmt:Completed) events. The duration for the events is 8 seconds, even though the server finished its work almost immediately; the Duration apparently keeps running until the client has consumed the result set. Now I am asking myself whether I would have opened a ticket with Microsoft and burned a support incident for an NHibernate issue.
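The modification is tiny. A sketch of the altered read loop (the sleep stands in for the per-row entity hydration work NHibernate would do):

```csharp
using (var reader = sqlCmd.ExecuteReader())
{
    int i = 0;
    while (reader.Read())
    {
        // Simulate per-row client-side processing:
        // 1500 rows at ~5 ms each is roughly the 8 seconds observed.
        System.Threading.Thread.Sleep(5);
        i++;
    }
}
```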
The moral of the story? Don't always trust what you think the tool is measuring. And if you're interested: when I loaded the entities through NHibernate from the Windows Forms app, the query again completed extremely fast, so I am still hunting this one down.
Update
I found and resolved the issue. It wasn't an issue with SQL but a configuration issue that caused the system to try to log to an invalid file thousands of times, which slowed everything down.
This is very eye-opening as one of my fundamental assumptions about SQL Profiler and how to interpret the results has been crushed.
Thursday, April 2, 2009
Setting up SSRS on server with multiple sites
I had to set up SSRS today on a server which hosts about six or seven different sites. I went through and configured SSRS, bound the virtual directories to a free IP address, fired up the browser, and kaboom!

Firing up Google was very frustrating, as all the solutions I found told me to check permissions. The issue here is most likely due to SSRS defaulting to http://localhost to access the web services. (If anyone from Microsoft, or anyone who knows who's responsible for this piece of code, could give a little more detail, that would be great.) So how do you tell SSRS to use a different server?
There is a config file at %SQL Install Folder%\MSSQL.*\Reporting Services\ReportManager\RSWebApplication.config. Inside this file you will see some XML which looks like:
<UI>
    <ReportServerUrl></ReportServerUrl>
    <ReportServerVirtualDirectory>ReportServer</ReportServerVirtualDirectory>
    <ReportBuilderTrustLevel>FullTrust</ReportBuilderTrustLevel>
</UI>
Perfect, right? So I enter the URL for my site, fire up Report Manager and... kaboom! This time the site doesn't even load, and we get the following error in our log files:
w3wp!diagnostic!6!4/2/2009-12:03:16:: Error loading configuration file: The configuration file contains an element that is not valid. The ReportServerUrl element is not a configuration file
This is a horrible error message and doesn't really indicate what the issue is. It turns out that you can have either ReportServerUrl or ReportServerVirtualDirectory, but not both. Enter the fully qualified URL in ReportServerUrl and remove the ReportServerVirtualDirectory node. Now fire up Report Manager in your favorite web browser... and... it works!
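So the working section ends up looking something like this (the host name is a placeholder; substitute the URL your site is actually bound to):

```xml
<UI>
    <!-- Use the full URL, and do NOT also set ReportServerVirtualDirectory -->
    <ReportServerUrl>http://reports.example.com/ReportServer</ReportServerUrl>
    <ReportBuilderTrustLevel>FullTrust</ReportBuilderTrustLevel>
</UI>
```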
I could have saved many hours and much Advil had the developers done proper error handling. Ah well, hopefully you find this article before you've wasted too many hours.

Monday, July 21, 2008
Fixing users in SQL2005 after restoring a DB
After you restore a database in SQL Server 2005 to a different server, you will need to re-link the logins associated with the users in the database. This can be done using the following stored procedure:
exec sp_change_users_login 'AUTO_FIX','UserName'
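If you're not sure which users are orphaned, the same procedure can report them first. A short sketch (the user name in the fix line is a placeholder):

```sql
-- List users whose SIDs no longer match a login on this server
EXEC sp_change_users_login 'Report';

-- Re-link one user to the login of the same name
EXEC sp_change_users_login 'AUTO_FIX', 'UserName';
```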
If you don't do this, you will get an error when trying to log in as the user. This needs to be done for each user in the database.
Monday, April 14, 2008
Rebuilding Indexes in SQL2005
I have been developing on Microsoft's SQL Server since SQL 2000; however, I always had the luxury of having a dedicated IT team who was in charge of managing the production environments. Now that I am working for a startup company and am the sole developer, build engineer, and production team, I have to handle this aspect of my job.
I ran into what I will call a bug in SQL 2005 that I can't believe made it into SP2. And while, as you will see, this bug is very minor, it is a good example of poor design in a management tool and its error handling.
I am setting up a new nightly maintenance plan with the following steps:
- Check Database Integrity (right now our DBs are so small that I am going to do this nightly, though some people say weekly is sufficient)
- Rebuild Indexes
- Full Backup of Databases
- Cleanup Old Backups
- Notify Operator
I go ahead and run the plan, and I notice my databases are backed up to the location I specified, yet the UI tells me the job has failed. I open up the job details and see the following error message:
Date: 4/14/2008 9:30:06 AM
Log: Job History (Nightly Maintenance)
Step ID: 1
Server: XXXXX
Job Name: Nightly Maintenance
Step Name: Subplan
Duration: 00:01:27
Sql Severity: 0
Sql Message ID: 0
Operator Emailed / Net sent / Paged: (none)
Retries Attempted: 0
Message: Executed as user: XXXXXX. The package execution failed. The step failed.
All this tells me is that a step failed. It provides no indication of what failed or why. I brought up the report from the maintenance package; it told me to check the SQL Agent job history to find out why it failed, which, as I showed, is not helpful.
So I resorted to the only option I felt I had: I disabled every step and ran them one at a time, building upon each other. This is when I noticed the issue: I had set up the Rebuild Index task with the "Keep index online while rebuilding" option selected.
Online indexing is a new feature in SQL 2005 that helps improve your uptime, since you can do maintenance on your indexes without taking them offline. So I had thought, let's select this option.
This option doesn't work in my environment because we are using SQL Server Standard Edition, and online index operations require Enterprise Edition. Here's my gripe about the error handling: why didn't the error messages indicate that the index rebuild failed? A better question: why is this package even allowed to be created?
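For reference, the equivalent T-SQL makes the edition dependency visible: on Standard Edition the first statement errors out, while the offline form succeeds (the table name is a placeholder):

```sql
-- Requires Enterprise Edition in SQL 2005; fails on Standard:
ALTER INDEX ALL ON dbo.MyTable REBUILD WITH (ONLINE = ON);

-- Works on any edition, but takes locks that block access during the rebuild:
ALTER INDEX ALL ON dbo.MyTable REBUILD;
```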
Editor's Note
Since writing this, I have discovered that better error information is available in the MSSQL\LOG directory. While this is useful, and in this case it does state the exact problem, I feel my post still holds true: SQL Server Management Studio should have provided better error information, or better yet, disabled the option entirely.