SQL Server 2008 no longer supports SQL-DMO. Below is some VB.NET code for creating a database using DMO:
Imports SQLDMO
..
Public Function CREATE_DATABASE(ByVal NewDatabaseName As String, ByVal MDFPath As String, ByVal LDFPath As String, ByRef Message As String) As Boolean
REM Create an SQLDMO application
Dim sqlApp As New SQLDMO.ApplicationClass()
REM Create a Server object, which represents your actual server
Dim srv As New SQLDMO.SQLServerClass()
REM Create Database
Dim nDatabase As New SQLDMO.Database()
REM Create Data Files
Dim nDBFileData As New SQLDMO.DBFile()
REM Create Log Files
Dim nLogFile As New SQLDMO.LogFile()
Try
REM Assign a name to database
nDatabase.Name = NewDatabaseName
REM Assign a name to datafile
nDBFileData.Name = NewDatabaseName
nDBFileData.PhysicalName = MDFPath
nDBFileData.PrimaryFile = True
nDBFileData.Size = 2
nDBFileData.FileGrowthType = SQLDMO.SQLDMO_GROWTH_TYPE.SQLDMOGrowth_MB
nDBFileData.FileGrowth = 1
REM Add the DBFile object
nDatabase.FileGroups.Item("PRIMARY").DBFiles.Add(nDBFileData)
REM Assign name to Log files
nLogFile.Name = NewDatabaseName + "_Log"
nLogFile.PhysicalName = LDFPath
nLogFile.Size = 2
nDatabase.TransactionLog.LogFiles.Add(nLogFile)
srv.LoginSecure = True
srv.Connect(_SourceDatabaseServer)
srv.Databases.Add(nDatabase)
srv.DisConnect()
srv = Nothing
Return True
Catch ex As Exception
Message = ex.Message
Return False
End Try
End Function
To do the same thing in SMO:
Imports Microsoft.SqlServer.Management.Smo
Imports Microsoft.SqlServer.Management.Common
...
Public Function CREATE_DATABASE(ByVal NewDatabaseName As String, ByVal MDFPath As String, ByVal LDFPath As String, ByRef Message As String) As Boolean
Try
'Server server = new Server("localhost");
Dim iServer As New Server(_SourceDatabaseServer)
'Database database = new Database(server, "TESTDATABASE");
Dim iDB As New Database(iServer, NewDatabaseName)
'FileGroup fileGroup = new FileGroup(database, "PRIMARY");
Dim iFileGroup As New FileGroup(iDB, "PRIMARY")
'DataFile dataFile = new DataFile(fileGroup, "TESTDATABASE_DATA", "c:\\TESTDATABASE_DATA.mdf");
Dim iDataFile As New DataFile(iFileGroup, NewDatabaseName + "_Data", MDFPath)
'dataFile.GrowthType = FileGrowthType.Percent;
'dataFile.Growth = 10;
'dataFile.Size = 4000;
'fileGroup.Files.Add(dataFile);
iFileGroup.Files.Add(iDataFile)
'database.FileGroups.Add(fileGroup);
iDB.FileGroups.Add(iFileGroup)
'LogFile logFile = new LogFile(database, "TESTDATABASE_LOG", "c:\\TESTDATABASE_LOG.ldf");
Dim iLogFile As New LogFile(iDB, NewDatabaseName + "_Log", LDFPath)
iLogFile.GrowthType = FileGrowthType.Percent
'logFile.Growth = 10;
iLogFile.Size = 2
'database.LogFiles.Add(logFile);
iDB.LogFiles.Add(iLogFile)
'database.Create(false);
iDB.Create(False)
Return True
Catch ex As Exception
Message = ex.Message
Return False
End Try
End Function
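For reference, the T-SQL that both the DMO and SMO snippets effectively issue looks roughly like this. The database name, file paths and sizes below are illustrative placeholders, not values from the code above:

```sql
-- Roughly the CREATE DATABASE statement generated by the DMO/SMO code;
-- name, paths and sizes are placeholders.
CREATE DATABASE MyNewDatabase
ON PRIMARY
(
    NAME = MyNewDatabase,                     -- logical data file name
    FILENAME = 'C:\Data\MyNewDatabase.mdf',   -- physical .mdf path
    SIZE = 2MB,
    FILEGROWTH = 1MB
)
LOG ON
(
    NAME = MyNewDatabase_Log,                 -- logical log file name
    FILENAME = 'C:\Data\MyNewDatabase.ldf',   -- physical .ldf path
    SIZE = 2MB
);
```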
Tuesday, 23 December 2008
Tuesday, 16 December 2008
Background Worker Threads in WPF
Here's a nice example of how to use background worker threads in WPF:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Shapes;
using System.ComponentModel;
using System.Threading;
namespace MultiCoreProgramming
{
/// <summary>
/// Interaction logic for BackgroundWorkerThreadInLine.xaml
/// </summary>
public partial class BackgroundWorkerThreadInLine : Window
{
public BackgroundWorkerThreadInLine()
{
InitializeComponent();
}
private void button1_Click(object sender, RoutedEventArgs e)
{
ValidateAndPersistLayers();
}
private void ValidateAndPersistLayers()
{
this.button1.IsEnabled = false;
var worker = new BackgroundWorker();
worker.DoWork += (doSender, doEventArgs) =>
{
// DO WORK
// BUT if you access dependency properties here you will get the error:
// "The calling thread cannot access this object because a different thread owns it."
if (true)
{
// Success
doEventArgs.Result = true;
}
//else
//{
// // Failure
// doEventArgs.Result = false;
//}
};
worker.RunWorkerCompleted += (runSender, runEventArgs) =>
{
this.button1.IsEnabled = true;
//this.nameResults.Content = runEventArgs.Result as string;
};
worker.RunWorkerAsync();
}
}
}
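To avoid the cross-thread error mentioned in the comments, a common pattern is to capture any UI values on the UI thread before starting the worker, and to marshal mid-work UI updates through the Dispatcher or ReportProgress. A minimal sketch (the ReportProgress wiring and the names here are my own additions, not part of the original example):

```csharp
// Sketch, inside a WPF Window like the one above.
private void ValidateAndPersistLayersSafely()
{
    this.button1.IsEnabled = false;

    // 1) Read any UI / dependency-property values on the UI thread,
    //    BEFORE the worker starts.
    string input = this.Title;

    var worker = new BackgroundWorker { WorkerReportsProgress = true };

    worker.DoWork += (s, e) =>
    {
        // Runs on a thread-pool thread: never touch UI elements here.
        e.Result = input.Length;             // hand results back via e.Result
        ((BackgroundWorker)s).ReportProgress(100);
    };

    worker.ProgressChanged += (s, e) =>
    {
        // Raised on the UI thread, so UI access is safe here.
    };

    worker.RunWorkerCompleted += (s, e) =>
    {
        // Also raised on the UI thread.
        this.button1.IsEnabled = true;
    };

    worker.RunWorkerAsync();
}
```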
Monday, 15 December 2008
Hello World: Asynchronous WCF calls from WPF
Here is how to call the default service in a non-blocking way from WPF:
Add the reference
using System.ComponentModel;
Then the code looks like:
namespace AsynchronousNonBlocking
{
public partial class Window1 : Window
{
ServiceReference1.Service1Client test2;
public Window1()
{
InitializeComponent();
test2 = new ServiceReference1.Service1Client();
}
private void btnSynchronous_Click(object sender, RoutedEventArgs e)
{
ServiceReference1.Service1Client test = new ServiceReference1.Service1Client();
test.DoWork();
}
private void btnAsynchronous_Click(object sender, RoutedEventArgs e)
{
test2.DoWorkCompleted += new EventHandler<AsyncCompletedEventArgs>(DoWorkCallback);
test2.DoWorkAsync();
}
// The generated client already exposes: public event EventHandler<AsyncCompletedEventArgs> DoWorkCompleted;
static void DoWorkCallback(object sender, AsyncCompletedEventArgs e)
{
Console.WriteLine("Done");
}
}
}
Thursday, 11 December 2008
Simple TDD WCF oversight
I am a Test Driven Development fanatic and I started using WCF in my applications.
Development using the built-in web server in VS2008 worked fine, but as soon as I deployed the WCF service I got the error:
"Could not find default endpoint element that references contract 'zzzServiceReference.Izzz' in the ServiceModel client configuration section. This might be because no configuration file was found for your application, or because no endpoint element matching this contract could be found in the client element."
I spent quite some time checking the config files. The problem was that I was editing the App.config file in a sub project. What I should have done was create an App.config in the unit-test sub project.
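For completeness, the fix amounts to copying the generated client section into an App.config inside the unit-test project. A sketch; the address, binding and endpoint name below are placeholders for whatever "Add Service Reference" generated:

```xml
<!-- App.config in the unit-test project; address, binding and endpoint
     name are placeholders for the generated values. -->
<configuration>
  <system.serviceModel>
    <client>
      <endpoint address="http://localhost/zzzService/Service.svc"
                binding="wsHttpBinding"
                contract="zzzServiceReference.Izzz"
                name="WSHttpBinding_Izzz" />
    </client>
  </system.serviceModel>
</configuration>
```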
Wednesday, 10 December 2008
Single User Mode
I had an interesting experience setting up a copy of a production system. Everything was working fine until I wanted to restore the backup of the database, at which point I got an error that the database backup was incomplete. Since it was 1am I thought now was the time to make sure we had a backup of the production system, so I started the backup. After about 20 minutes the backup failed and left the database in single-user mode, which can be seen in Microsoft SQL Server Management Studio next to the database name. When selecting the properties of this database I got an error message to the effect that the database was opened by another user.
Since it was early in the morning I decided to restart the server, and to look up more details on single-user mode while it rebooted, in the hope that the reboot would solve the issue in the meantime. Well, the server did not restart. Being remote, this posed some problems. Apparently the night before, one of the LEDs on a hard disk of this server had started to shine orange instead of green, and the event log contained entries like "Drive Array Physical Status Change… has a new status of 3 … 3=failed".
When the system administrators got back to the office the faulty disk was changed and the RAID was rebuilt. But we still had the single-user-mode problem. From my search on the internet I found two commands:
ALTER DATABASE [db-name] SET SINGLE_USER
ALTER DATABASE [db-name] SET MULTI_USER
So I tried the second of these commands and got a message to the effect that the database was in single-user mode and a user was already connected to it. I thought this was a little odd since the database server had just restarted. My next attempt was to use a stored procedure within master to kill all connections to this database and detach it. It looked like:
ALTER PROCEDURE [dbo].[sp_Detach_DB_FORCED]
@DATABASE_NAME as VARCHAR(40)
AS
BEGIN
DECLARE @spid int, @str varchar(1000), @DBname varchar (50)
SET @DBname = @DATABASE_NAME
DECLARE curGetSpids
CURSOR FOR
SELECT p.Spid
FROM master..sysprocesses p, master..sysdatabases d (NOLOCK)
where p.dbid = d.dbid
and d.name = @DBname
OPEN curGetSpids
FETCH NEXT FROM curGetSpids INTO @spid
WHILE(@@FETCH_STATUS<>-1)
BEGIN
SELECT @str='KILL '+CONVERT(varchar(10),@spid) -- varchar(10): varchar(3) would truncate SPIDs above 999
EXEC (@str)
FETCH NEXT FROM curGetSpids INTO @spid
END
CLOSE curGetSpids
DEALLOCATE curGetSpids
EXEC sp_detach_db @DBname
END
Unfortunately this came back with a similar set of messages. After digging further on the internet I found:
sp_who
This lists the SPID of every process connected to the databases hosted on the server. It turned out that the database was locked by a SELECT statement from a job that had frozen. To get rid of this connection:
KILL <spid>
Then the ALTER DATABASE statement worked and we all lived happily ever after :)
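With hindsight, the kill-loop procedure can often be avoided: a single ALTER DATABASE with a rollback option drops the competing connections itself. A sketch; the database name is a placeholder:

```sql
-- Force competing sessions off (rolling back their work) and return
-- the database to multi-user mode in one step.
ALTER DATABASE [db-name] SET MULTI_USER WITH ROLLBACK IMMEDIATE;
```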
Thursday, 4 December 2008
Connecting to functions in Oracle
Currently there is no data adapter for LINQ to Oracle, which means that when connecting to Oracle you need to use ADO.NET or OLEDB. I have been trying to connect to a function that looks like:
FUNCTION getContract(i_businessID IN tblContract.BusinessID%TYPE,
i_periodFrom IN VARCHAR2,
i_periodTo IN VARCHAR2)
RETURN NUMBER;
The OLEDB Way
Use the following imports:
Imports System.Data.Common
Imports System.Data
Imports System.Data.OracleClient
Imports System.Data.OleDb
Imports System.Data.OracleClient.OracleType
Setup the connection string:
_DBConnStr = "Provider=MSDAORA;Data Source=" + ServerName + ";User Id=" + UserName + ";Password=" + Password + ""
Then the function looks like:
Public Function GetContract(ByRef SPInput As sp_GetContractInfo.TableStructure, ByRef Message As String) As Integer
Try
Dim cnOra As New OleDbConnection
cnOra.ConnectionString = _DBConnStr
cnOra.Open()
Dim oCmd As New OleDbCommand
oCmd.CommandText = "{? = call Schema.Package.getContract(?, ?, ?)}"
Dim o_retValParam As OleDbParameter
o_retValParam = New OleDbParameter("RETVAL", System.Data.OleDb.OleDbType.VarNumeric)
o_retValParam.Direction = ParameterDirection.ReturnValue
o_retValParam.Size = 32
oCmd.Parameters.Add(o_retValParam)
Dim i_businessIDParam As OleDbParameter
i_businessIDParam = New OleDbParameter("i_businessID", System.Data.OleDb.OleDbType.VarChar)
i_businessIDParam.Direction = ParameterDirection.Input
i_businessIDParam.Size = SPInput.i_BusinessId.Length
oCmd.Parameters.Add(i_businessIDParam)
' i_businessID IN tblContract.BusinessID%TYPE
Dim i_periodFromParam As OleDbParameter
i_periodFromParam = New OleDbParameter("i_periodFrom", System.Data.OleDb.OleDbType.VarChar)
i_periodFromParam.Direction = ParameterDirection.Input
i_periodFromParam.Size = 19
oCmd.Parameters.Add(i_periodFromParam)
' i_periodFrom IN VARCHAR2,
Dim i_periodToParam As OleDbParameter
i_periodToParam = New OleDbParameter("i_periodTo", System.Data.OleDb.OleDbType.VarChar)
i_periodToParam.Direction = ParameterDirection.Input
i_periodToParam.Size = 19
oCmd.Parameters.Add(i_periodToParam)
' i_periodTo IN VARCHAR2,
Dim i_periodFrom As String = SPInput.i_PeriodFrom.ToString("yyyy-MM-dd") + " 00:00:00"
Dim i_periodTo As String = SPInput.i_PeriodTo.ToString("yyyy-MM-dd") + " 23:59:59"
oCmd.Parameters("i_businessID").Value = SPInput.i_BusinessId
oCmd.Parameters("i_periodFrom").Value = i_periodFrom
oCmd.Parameters("i_periodTo").Value = i_periodTo
oCmd.Connection = cnOra
Dim Result As Integer
oCmd.ExecuteNonQuery()
Result = CInt(oCmd.Parameters("RETVAL").Value)
Return Result
Catch ex As Exception
Message= ex.ToString
Return -1
End Try
End Function
The ADO.NET way
Imports System
Imports System.Data.OracleClient
Imports System.Data
_DBConnStr = "Data Source=" + ServerName + ";User ID=" + UserName + ";Password=" + Password + ""
Then the function looks like
Public Function GetContract(ByRef SPInput As sp_GetContractInfo.TableStructure, ByRef Message As String) As Integer
Try
Dim cnOra As New OracleCommand
Dim conOra As New OracleConnection(_DBConnStr)
cnOra.Connection = conOra
cnOra.CommandText = "SchemaName.PackageName.getContract"
cnOra.CommandType = CommandType.StoredProcedure
cnOra.Parameters.Add("RETVAL", OracleType.Number).Direction = ParameterDirection.ReturnValue
cnOra.Parameters.Add("i_businessID", OracleType.VarChar, SPInput.i_BusinessId.Length).Direction = ParameterDirection.Input
cnOra.Parameters.Add("i_periodFrom", OracleType.VarChar, 19).Direction = ParameterDirection.Input
cnOra.Parameters.Add("i_periodTo", OracleType.VarChar, 19).Direction = ParameterDirection.Input
Dim i_periodFrom As String = SPInput.i_PeriodFrom.ToString("yyyy-MM-dd") + " 00:00:00"
Dim i_periodTo As String = SPInput.i_PeriodTo.ToString("yyyy-MM-dd") + " 23:59:59"
cnOra.Parameters("i_businessID").Value = SPInput.i_BusinessId
cnOra.Parameters("i_periodFrom").Value = i_periodFrom
cnOra.Parameters("i_periodTo").Value = i_periodTo
cnOra.Connection.Open()
cnOra.ExecuteNonQuery()
Dim Result As Integer
Result = cnOra.Parameters("RETVAL").Value
Return Result
Catch ex As Exception
Message = ex.Message
Return -1
End Try
End Function
Tuesday, 2 December 2008
A problem with out of date table statistics
We had an interesting database problem. We extracted data from an original database into a new database, then ran a specific query on both databases. So the query ran over exactly the same data, but on the original database it took 14 hours to complete and on the new database 20 seconds.
The root cause was out-of-date statistics on the tables/views. I built a new maintenance plan including the Rebuild Index and Update Statistics tasks, and after the maintenance plan finished, the same query completed within 16 seconds on the original database.
I also compared the execution plans and found that it was the out-of-date statistics that led SQL Server into a wrong cardinality estimation. There was a huge difference between the EstimateRows and Rows columns: the row count for the entire view was estimated at 1 although 84938 rows existed. SQL Server therefore generated a poor nested-loop join operator based on this incorrect cardinality estimate, and had to fall back to a table scan, which is why we saw the "IO_Completion" wait type and an unnecessary disk bottleneck.
From this interesting case, we can learn two things:
(1) Updating statistics frequently is a best practice to keep SQL Server "thinking" properly.
(2) Even with incorrect or out-of-date statistics, a good database design (correct indexing) can help avoid the unnecessary overhead that incorrect statistics cause.
To refresh out-of-date statistics use sp_updatestats. But be careful executing this on a production system, as it can lead to performance issues while the SQL Server has less information on which to base its execution plans.
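The statistics refresh itself can be scripted; a sketch, where the table name is a placeholder:

```sql
-- Refresh statistics across the whole database...
EXEC sp_updatestats;

-- ...or, more surgically, rebuild statistics for one problem table,
-- scanning every row instead of sampling.
UPDATE STATISTICS dbo.MyProblemTable WITH FULLSCAN;
```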
Monday, 24 November 2008
Some notes on F#
About a year ago I was looking at how to convert a second-order polynomial fit written in R to F#. The function looked like:
polyfit <- function(x, y, filename = 'fit_out.csv') {
x2 <- x^2
## make linear model fit
lm.fit <- lm(y ~ x + x2)
write('Intercept,x,x^2', filename, append = FALSE)
cat(lm.fit$coefficients, file = filename, sep = ',', append = TRUE)
}
Unfortunately at that time there were not many mathematical libraries available for F# and I would have had to write my own, so I put F# to one side.
At PDC 2008 there were many sessions on how to make the most of multi-core processors. Doing this in C# is very challenging and the result is hard to maintain. Here are some notes I made during the F# session on the last day of the PDC:
F# is a non-imperative programming language. In imperative languages x = x + 1 makes sense, but from a mathematical standpoint it is nonsense. In F# you first explore the domain by writing code and executing it with Alt+Enter. Once finished you can compile the code and expose it to other .NET languages. Since F# uses immutable values it does not have the problem of updating shared state when running in parallel. Its declarative style lends itself well to parallel processing and is used by some users of HPC Server.
#light
let y = 0
This does not mean "assign 0 to y". It means bind the value 0 to the symbol y; in this way y is read-only and immutable. The same principle holds for functions, e.g.
let sqr x = x * x
Writing a function in F# as if it were imperative:
let sqr x = x * x
let sumOfSquaresI nums =
    let mutable acc = 0.0
    for x in nums do
        acc <- acc + sqr x
    acc
This is not the way to program F#. Notice the generic approach to typing: the return type is determined by the compiler working backwards through the code to 0.0, which is a double.
A mathematician would write:
let rec sumOfSquaresF nums =
    match nums with
    | [] -> 0.0
    | h :: t -> sqr h + sumOfSquaresF t
match is a switch on steroids: a branching mechanism that binds. This has the effect of making the calculation parallelizable.
The F# way would be:
let sumOfSquares nums =
    nums
    |> Seq.map sqr
    |> Seq.sum
The map function applies sqr to each element of nums. The |> is a pipeline operator that works in the same way as a pipe in Unix or DOS.
The flow has been encapsulated. There are fewer places to screw up. It lets the bartender make the coffee.
There is a parallel extensions framework that can be used to make this multi-core and multi-server enabled.
Raising the level of abstraction enables parallelization without restructuring code.
#light
open System.Net
open System.IO
let ticker = "msft"
let url = "http....
let req = WebRequest.Create(url)
let resp = req.GetResponse()
let stream = resp.GetResponseStream()
let reader = new StreamReader(stream)
let csv = reader.ReadToEnd()
let prices =
    csv.Split([|'\n'|])
    |> Seq.skip 1
    |> Seq.map (fun line -> line.Split([|','|]))
    |> Seq.filter (fun values -> values |> Seq.length = 7)
    |> Seq.map (fun values -> System.DateTime.Parse(values.[0]), float values.[6])
The results can be visualized by the FlyingFrog library and the following commands
Grid prices;;
Plot prices ;;
To give up a thread so as not to block other tasks while waiting:
let! resp = req.AsyncGetResponse()
To make a static method asynchronous:
let internal loadPrices ticker = async { ...
To make a pipeline asynchronous:
tickers
|> Seq.map loadPrices
|> Async.Parallel
|> Async.Run
|> Array.map (fun prices -> new StockAnalyser(prices, days))
The flow of logic would otherwise be completely overwhelmed by the code needed to set up delegates and threads.
In 2009 there will be a supported release of F#. For further details see fsharp.net.
Adding icons to WPF applications
I had some problems inserting icons into my WPF application; I was getting errors like "The image format is unrecognised". Here is the solution:
1. Go to the properties of the WPF sub project
2. In the Application tab select the [...] button next to the Icon text box
3. Browse to your icon
4. When you start in debug mode the icon will not appear. It's a bug in VS2005 and VS2008
5. Under the Debug drop-down select "Start Without Debugging" and the icon will appear
Saturday, 22 November 2008
“Cannot generate SSPI context”
This is another interesting error message that recently popped up when connecting to a SQL 2005 server. It reminded me of some problems I had installing SQL 2005 three years ago. Back then, the cause was that the service account the SQL Server was running under had insufficient access rights. This time our friends in IT moved the IP addresses of our servers to DHCP. Unfortunately they made a mistake and mixed up an IP address and a server name, which caused this problem.
Using enterprise storage drives to host database files
It can make sense to use network storage devices to host database files. I believe this works for small administration databases, but if the database needs to handle significant load you can run into performance problems. Here is an interesting story demonstrating the limitations of network storage.
22.10 5:30pm GMT+1
There are serious timeout issues in our Eastern US offices that stop a database application from working. CPU is about 70-90%; PF rate is at a stable 80%.
23.10 7am GMT+1
CPU is about 0-10% after applying artificial load; PF rate remains a stable 80%.
Since it is 3am in the Eastern US, this is a good indication that the cause is the responsiveness of the network storage.
23.10 8am GMT+1
I move the database files off the network storage and put them on D:
23.10 1pm GMT+1
The CPU is around 30%, a significant improvement.
23.10 5:30pm GMT+1
The performance has improved, but there are still lots of timeouts.
24.10 7am GMT+1
I move the TempDB to D:
At about 11am EUS time one of my colleagues increased the paging memory. Since there were no performance issues that day, I believe the TempDB was the main cause of the problem and not the paging memory. Tests carried out at 1pm GMT+1, and the fact that there were no performance issues between 9-11am EUS time, support this.
Lessons learned
Putting database files on the NetApp share can produce a significant drop in performance. The drop in performance is only apparent during the day, while the NetApp share is being used.
Friday, 21 November 2008
"Could not create an instance of type"
This is an interesting error message. Imagine you have been building up a WPF application using lots of user controls and data binding, copying code from one control to another to save time on the typing. Then it is time to use two of the user controls together in the same window, and you get the error "Could not create an instance of type". Looking at this closely, it happens during the initialization of one of the controls. Searching the internet turns up articles about the problems of making WPF controls inherit from one another, but it turned out to be much simpler: I had only forgotten to change the owner type of the copied dependency property to the type of the control it now belongs to.
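A sketch of the mistake (all names are invented; shown in F#, though the original controls were not): the third argument to DependencyProperty.Register is the owner type, and after a copy-paste between controls it can still point at the old control:

```fsharp
open System.Windows
open System.Windows.Controls

type TickerControl() =
    inherit UserControl()

    // The owner type below must be typeof<TickerControl>.
    // If a copy-paste leaves the previous control's type here,
    // the control can fail to initialize when both controls are
    // used in the same window.
    static let titleProperty =
        DependencyProperty.Register(
            "Title", typeof<string>, typeof<TickerControl>)

    member this.Title
        with get () = this.GetValue(titleProperty) :?> string
        and set (value: string) = this.SetValue(titleProperty, value)
```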
Thursday, 20 November 2008
This is my first blog entry
Just coming back from the PDC with a lot of ideas on how to optimize code to run on multi-core CPUs on a server farm. I have made an architectural model for migrating from Windows Forms to WPF applications using the recently released WPF Toolkit.