Showing posts with label package. Show all posts

Thursday, March 29, 2012

Advice on RAID

We sell a software package that runs on dedicated servers at sites that
typically have no db administrator. SQL Server 2000 is the database, and the
application is very write intensive – data for 50 to 200 parameters is
written to the database every minute, 24 hours a day, 7 days a week. Typical
database size is about 4 or 5 GB. Usually there are a few client
workstations. The application is also read intensive. We have configured
the systems with 3 drives such that tempdb is on C, database files are on D
and the log is on E. Because of frequent disk failures and the relatively
high expense of correcting those failures, we want to switch to RAID. Having
little experience, we are looking for advice. RAID 10 is probably not an
option because of the expense. I would highly appreciate any suggestions.
Rethink your aversion to RAID10. It performs better than RAID5 on
write-intensive installations and there is more high-availability associated
with it. Calculate how much revenue your business will lose in one day.
It's likely more than a handful of disks.
Tom
Thomas A. Moreau, BSc, PhD, MCSE, MCDBA
SQL Server MVP
Columnist, SQL Server Professional
Toronto, ON Canada
www.pinpub.com
.
"KMP" <KMP@.discussions.microsoft.com> wrote in message
news:87394BC3-4BD5-4D70-A0B4-C4D4C2DCC7AA@.microsoft.com...
We sell a software package that runs on dedicated servers at sites that
typically have no db administrator. SQL Server 2000 is the database, and the
application is very write intensive – data for 50 to 200 parameters is
written to the database every minute, 24 hours a day, 7 days a week. Typical
database size is about 4 or 5 GB. Usually there are a few client
workstations. The application is also read intensive. We have configured
the systems with 3 drives such that tempdb is on C, database files are on D
and the log is on E. Because of frequent disk failures and the relatively
high expense of correcting those failures, we want to switch to RAID. Having
little experience, we are looking for advice. RAID 10 is probably not an
option because of the expense. I would highly appreciate any suggestions.
|||
With such small amounts of data you might as well go with RAID 1
(mirroring), with three pairs of drives. RAID 10 has advantages but
with just 4 to 5 GB of data it would be difficult justifying enough
drives to make a good RAID 10 set. For a write intensive application
you should avoid RAID 5 altogether.
One question I have to ask. You say you have "frequent disk
failures". What kind of drives are you using? (For server
applications I expect SCSI.) What brand are they?
Good luck!
Roy
On Wed, 22 Feb 2006 12:11:27 -0800, KMP
<KMP@.discussions.microsoft.com> wrote:

>We sell a software package that runs on dedicated servers at sites that
>typically have no db administrator. SQL Server 2000 is the database, and the
>application is very write intensive – data for 50 to 200 parameters is
>written to the database every minute, 24 hours a day, 7 days a week. Typical
>database size is about 4 or 5 GB. Usually there are a few client
>workstations. The application is also read intensive. We have configured
>the systems with 3 drives such that tempdb is on C, database files are on D
>and the log is on E. Because of frequent disk failures and the relatively
>high expense of correcting those failures, we want to switch to RAID. Having
>little experience, we are looking for advice. RAID 10 is probably not an
>option because of the expense. I would highly appreciate any suggestions.
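Tom's and Roy's points about write performance come down to the classic RAID write-penalty arithmetic. The sketch below uses the textbook per-write I/O costs (mirroring costs 2 physical writes, RAID 5 small random writes cost 4 I/Os: read data, read parity, write data, write parity); real controllers with write-back caches will differ, and the workload figure is illustrative, not taken from the thread:

```python
# Back-of-envelope RAID write-penalty comparison (textbook model; real
# controllers with write-back cache will behave differently).
# Assumed per-logical-write costs:
#   RAID 1 / RAID 10: 2 physical writes (one per mirror side)
#   RAID 5:           4 I/Os (read data, read parity, write data, write parity)
def physical_iops(logical_writes_per_sec: float, raid_level: str) -> float:
    penalty = {"raid1": 2, "raid10": 2, "raid5": 4}
    return logical_writes_per_sec * penalty[raid_level]

# Hypothetical workload: the logged parameters plus index and log writes,
# rounded to 50 logical writes per second.
workload = 50.0
for level in ("raid1", "raid10", "raid5"):
    print(f"{level}: {physical_iops(workload, level):.0f} physical IOPS")
# raid5 needs twice the physical write IOPS of raid1/raid10 for this workload
```

This is why the advice in the thread steers a write-heavy system toward RAID 1 or RAID 10: for the same logical write rate, RAID 5 asks the disks for roughly twice as many physical operations.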


Tuesday, March 27, 2012

Advice on DTS Package in Replication

Is it advisable to use a DTS package as a transformation mechanism in
transactional replication, or should transformation be done at the stored
procedure level on the Subscriber?
Under what circumstances would it be a good idea to use a DTS package?
Thank you very much.
Only use transformable subscriptions with heterogeneous subscribers. For SQL
Server subscribers there are other ways (stored procedures being one of
them) of transforming the data en route, which offer far greater performance.
"Mark" <Mark@.discussions.microsoft.com> wrote in message
news:C9B7E873-C6A2-40CC-8EE5-0DAC1D1A7508@.microsoft.com...
> Is it advisable to use a DTS package as a transformation mechanism in
> transactional replication, or should transformation be done at the stored
> procedure level on the Subscriber?
> Under what circumstances would it be a good idea to use a DTS package?
> Thank you very much.

Sunday, March 25, 2012

Advice needed: which task to choose in DTS package

Hi guys, I need two bits of advice.

I get data from several tables in database A and export it into database B. In this instance both databases are SQL Server. I use a DDQ (Data Driven Query) task to do this, with a VB script providing the logic to insert the record if it doesn't exist and update it if it does. What I would like to know is whether there is a more efficient task that can perform this function, something that doesn't require the VB script.

The 2nd piece of advice concerns getting data from an Oracle database and importing it into a SQL database through the use of CSV files. This process is very slow (up to 8 hours). Is there a way of using some kind of DTS package to speed things up? Or is that not possible?

many thanks

SQL2005 Integration Services would be great for this scenario. You can add a Lookup task and handle your control flow all in the one package. If you're using SQL2K, could you perhaps dump all the data into a holding table and then run a simple INSERT statement to put in the new values?

If you're taking data from Oracle, you could just create a DTS/SSIS package to query Oracle directly and pump the data into your SQL Server. I'd have thought this would be the most efficient way.

HTH!

|||

Thanks for that. In the case of using a data pump in SQL2K, do you have an example of some T-SQL I could use to insert new records and update existing ones?

many thanks

|||

I think you'd need a DTS package with the following flow:

SQLConn1.SrcTable -> DDQ -> SQLConn2.HoldingTable -> T-SQL Task

Your T-SQL Task could be an SQL stored procedure which did something like the following:

Code Snippet

CREATE PROC AddRecords
AS
UPDATE Dst
SET Col1 = Hld.Col1, Col2 = Hld.Col2
FROM DstTable Dst
INNER JOIN HoldingTable Hld
    ON Dst.Col4 = Hld.Col4

INSERT INTO DstTable
SELECT *
FROM HoldingTable Hld
WHERE NOT EXISTS (SELECT * FROM DstTable Dst WHERE Dst.Col4 = Hld.Col4)

DELETE FROM HoldingTable

Of course, this is quite simplified. You'll need to change your JOIN and EXISTS clause to include all the columns which define a record as being unique.

You may also want to break this stored procedure up into 3 T-SQL tasks which you could run after the DDQ.

HTH!

|||

Thanks for that.
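The staged UPDATE-then-INSERT pattern discussed in this thread can be demonstrated end to end with a small self-contained example. This is only a sketch: it uses Python's built-in sqlite3 in place of SQL Server, and the table and column names (DstTable, HoldingTable, Col1, Col4) follow the hypothetical names used in the thread.

```python
import sqlite3

# Staged upsert: load everything into a holding table, then
# (1) update matches, (2) insert non-matches, (3) clear the holding table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE DstTable (Col4 INTEGER PRIMARY KEY, Col1 TEXT);
    CREATE TABLE HoldingTable (Col4 INTEGER, Col1 TEXT);
    INSERT INTO DstTable VALUES (1, 'old');
    INSERT INTO HoldingTable VALUES (1, 'updated'), (2, 'new');
""")

# 1. Update rows that already exist in the destination.
con.execute("""
    UPDATE DstTable
    SET Col1 = (SELECT h.Col1 FROM HoldingTable h WHERE h.Col4 = DstTable.Col4)
    WHERE Col4 IN (SELECT Col4 FROM HoldingTable)
""")

# 2. Insert rows that do not exist yet.
con.execute("""
    INSERT INTO DstTable
    SELECT h.Col4, h.Col1 FROM HoldingTable h
    WHERE NOT EXISTS (SELECT 1 FROM DstTable d WHERE d.Col4 = h.Col4)
""")

# 3. Empty the holding table ready for the next load.
con.execute("DELETE FROM HoldingTable")

print(sorted(con.execute("SELECT Col4, Col1 FROM DstTable")))
# -> [(1, 'updated'), (2, 'new')]
```

The same three statements map onto the three T-SQL tasks suggested above; the matching condition (here Col4) would need to cover every column that makes a record unique.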


Sunday, March 11, 2012

Advanced functions in SQL Express

Hi,

I would like to explore Reporting Services with SQL Server Express Edition, so I tried installing "SQLEXPR_ADV_.EXE", the package available on the SQL download site, but after selecting the Reporting Services option I got an error explaining that the component "sqlserver2005_bc.msi" is to be changed.

As this component is part of the "SQLEXPR_ADV_.EXE" package, does anybody know an alternative solution?

Are you worried that the sqlserver2005_bc.msi component will be changed, so that repairing/modifying the changed component will fail? I found a similar issue described in the SQL 2005 SP1 readme, which provides a workaround. You can take a look here:

http://download.microsoft.com/download/b/d/1/bd1e0745-0e65-43a5-ac6a-f6173f58d80e/ReadmeSQL2005SP1.htm

Friday, February 24, 2012

ADO.NET or OLEDB connection/recordset?

My package needs to be a high-performance (target: 150,000 rows in 30 minutes) ETL solution. We are using all MS technologies - SSIS, SQL 2005, BIDS, etc. I need to loop over the recordset returned by a stored proc (run in an Execute SQL Task) in a Source Script Component.

If I use an ADO.NET Connection Manager, here is the code in the Source Script Component's Public Overrides Sub CreateNewOutputRows():

Code 1

Dim sqlAdapter As New SqlDataAdapter

Dim dataRow As Data.DataRow

Dim ds As DataSet = CType(Me.Variables.rsSomeResultset, DataSet)

sqlAdapter.Fill(ds)

I get: Error: System.InvalidCastException: Unable to cast object of type 'System.Object' to type 'System.Data.DataSet'.

Code 2

Dim oledbAdapter As New OleDb.OleDbDataAdapter

Dim dataTable As DataTable

oledbAdapter.Fill(dataTable, Me.Variables.rsSomeResultset)

Error: System.ArgumentException: Object is not an ADODB.RecordSet or an ADODB.Record. Parameter name: adodb

It works all right when I use an OLEDB Connection Manager with the second code sample.

Question: In order to extract maximum performance, wouldn't it be preferable to use ADO.NET with the SqlClient provider in an all SQL Server 2005 environment? Or will an OLEDB connection provide comparable or equal performance?

If so, what code can I use? The recordset returned by the stored proc (in the Execute SQL Task) can only be captured in a System.Object variable, and the only applicable overload of the OleDbDataAdapter's Fill() method accepts an ADO Recordset or Record object.

There was a post recently that compared some of the connection types, but thanks to the lovely search functionality, I can't find it. If anyone else has it, please post it to this thread.

In general, I don't think you are going to see a significant performance difference between ADO.NET and OLEDB against SQL Server.

That being said, if you want to do some further research into the problem, try adding a message box to your script to display the type of the variable.

Code Snippet

System.Windows.Forms.MessageBox.Show(Me.Variables.rsSomeResultset.ToString())

|||Your Code 1 snippet doesn't look right for ADO.NET. You shouldn't be trying to call sqlAdapter.Fill(ds). Try Dim dataTable As DataTable = ds.Tables(0) instead. I'd guess that OLE DB would be faster. Please post your findings.
|||

Do you mean this discussion?

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1985034&SiteID=1


Thursday, February 9, 2012

AdjustTokenPrivileges Error?

Hi,

I have created two packages: a child package and a main package that is responsible for executing all its children in a specific order.

What started happening is that the child package is fine when executed alone, but when the main package executes it, I get a DOS window saying "AdjustTokenPrivileges () Failed (00000514)".

What can be causing this? I haven't changed anything, and the package was executing fine some days ago...

Best Regards,

Luis Simões

I have one package that is causing this error as well. Anyone know why? It only generates the error when I try to insert into a database; otherwise it runs fine.


AdjustTokenPrivileges () failed

I have an SSIS package that parses a text file into 3 smaller text files and then takes the data and puts it into tables. The package runs fine up to the point where it needs to insert the data. I turned logging on but no errors are generated. But I do get a file named SQLDUMPER_ERRORLOG.log that is generated with the info below. Any ideas of where to look?

11/16/06 13:31:38, ERROR , SQLDUMPER_UNKNOWN_APP.EXE, AdjustTokenPrivileges () failed (00000514)
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Input parameters: 4 supplied
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ProcessID = 1844
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ThreadId = 0
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Flags = 0x0
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, MiniDumpFlags = 0x0
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, SqlInfoPtr = 0x0100C5D0
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, DumpDir = <NULL>
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ExceptionRecordPtr = 0x00000000
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ContextPtr = 0x00000000
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ExtraFile = <NULL>
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, InstanceName = <NULL>
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ServiceName = <NULL>
11/16/06 13:31:38, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Callback type 11 not used
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Callback type 7 not used
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, MiniDump completed: C:\Program Files\Microsoft SQL Server\90\Shared\ErrorDumps\SQLDmpr0033.mdmp
11/16/06 13:31:43, ACTION, DtsDebugHost.exe, Watson Invoke: No
11/16/06 13:31:43, ERROR , SQLDUMPER_UNKNOWN_APP.EXE, AdjustTokenPrivileges () failed (00000514)
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Input parameters: 4 supplied
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ProcessID = 1844
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ThreadId = 0
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Flags = 0x0
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, MiniDumpFlags = 0x0
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, SqlInfoPtr = 0x0100C5D0
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, DumpDir = <NULL>
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ExceptionRecordPtr = 0x00000000
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ContextPtr = 0x00000000
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ExtraFile = <NULL>
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, InstanceName = <NULL>
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ServiceName = <NULL>
11/16/06 13:31:43, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Callback type 11 not used
11/16/06 13:31:44, ERROR , SQLDUMPER_UNKNOWN_APP.EXE, MiniDumpWriteDump () Failed 0x80070005 - Access is denied.
11/16/06 13:31:44, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Watson Invoke: No

Do you have anything in your Windows Event log that might indicate a better error?

Also, how are you authenticating to the databases? SQL Server users? Active Directory?

I'm just blurting out stuff to check, I guess.

Phil
|||
Also, you didn't need to start this thread when you replied to the other one. We don't need two threads about the same topic. Having more than one thread only makes it cumbersome to help you.
|||

Nothing in the Application Log. I've tried authenticating using both Windows auth and SQL auth; both result in the same error.

I was using an OLE DB Destination and I changed it to a SQL Server Destination which got rid of the error above but presented another. I still believe OLE DB Destination should work though.

I should add that I can run the package from my machine without a problem, but whenever I try to run it from the server it will be scheduled on, I have the problem.