
Monday, March 19, 2012

Advanced T-SQL Question: Producing Backup Window Timeline Chart

I'm an Oracle DBA who inherited 100 or so SQL Server databases. I know how I
would accomplish this task in Oracle, but I have no clue how to do it in
T-SQL. In short, I want to produce a comma-delimited, Gantt-style timeline
chart of my backup windows for import into Excel.
I'm not sure if this is the best place to get this information, but I found
the msdb sysjobservers table, which contains information about the last run
and duration of a job. It might be nice to get historical information, but
I'm not sure where I'd find it. If there is a better place to get this
information, I'd like to know.
Here is the data I'm looking at...
use msdb
go
select j.name, s.last_run_outcome, s.last_run_date, s.last_run_time,
s.last_run_duration
from sysjobs j, sysjobservers s
where j.job_id = s.job_id
go
name        last_run_outcome  last_run_date  last_run_time  last_run_duration
----------  ----------------  -------------  -------------  -----------------
DB BACKUP1  1                 20060406       200000         1958
DB BACKUP2  1                 20060401       10000          5943
DB BACKUP3  1                 20060402       30000          31934

(3 row(s) affected)
If I understand this correctly, the data shows BACKUP1 last ran at 20:00 and
ran for 1958 seconds, or approximately 33 minutes rounded up. I want to kick
out a comma-delimited timeline showing the duration of the backup. The first
field is, of course, the name. The remaining 24 fields correspond to the
hours of the day, 0 through 23.
So using the above data my output would look like this...
DB BACKUP1,,,,,,,,,,,,,,,,,,,,,X,,,
DB BACKUP2,,X,X,,,,,,,,,,,,,,,,,,,,,
DB BACKUP3,,,,X,X,X,X,X,X,X,X,X,,,,,,,,,,,,
BACKUP1 ran at 20:00 for less than an hour, so there is one X in the field
corresponding to the 20:00 hour, showing the time block where the backup ran.
BACKUP2 ran at 1:00 for less than 2 hours, so there are X's in the fields
corresponding to the 1:00 and 2:00 hours.
BACKUP3 ran at 3:00 for less than 9 hours, so there are X's in the fields
corresponding to the 3:00 through 11:00 hours.
I would greatly appreciate it if someone could help me with the T-SQL
to produce such a report.
Thanks in advance,
Roger Westbrook

This conversation is ongoing in the .programming group.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
<Kael_Sidhe@yahoo.com> wrote in message news:1144417913.196803.81210@t31g2000cwb.googlegroups.com...
> <snip>


Advanced T-SQL Question: Produce Backup Window Gantt Chart

I'm an Oracle DBA who inherited 100 or so SQL Server databases. I know how I
would accomplish this task in Oracle, but I have no clue how to do it in
T-SQL. In short, I want to produce a comma-delimited, Gantt-style timeline
chart of my backup windows for import into Excel.
I'm not sure if this is the best place to get this information, but I found
the msdb sysjobservers table, which contains information about the last run
and duration of a job. It might be nice to get historical information, but
I'm not sure where I'd find it. If there is a better place to get this
information, I'd like to know.
Here is the data I'm looking at...
use msdb
go
select j.name, s.last_run_outcome, s.last_run_date, s.last_run_time,
s.last_run_duration
from sysjobs j, sysjobservers s
where j.job_id = s.job_id
go
name        last_run_outcome  last_run_date  last_run_time  last_run_duration
----------  ----------------  -------------  -------------  -----------------
DB BACKUP1  1                 20060406       200000         1958
DB BACKUP2  1                 20060401       10000          5943
DB BACKUP3  1                 20060402       30000          31934

(3 row(s) affected)
If I understand this correctly, the data shows BACKUP1 last ran at 20:00 and
ran for 1958 seconds, or approximately 33 minutes rounded up. I want to kick
out a comma-delimited timeline showing the duration of the backup. The first
field is, of course, the name. The remaining 24 fields correspond to the
hours of the day, 0 through 23.
So using the above data my output would look like this...
DB BACKUP1,,,,,,,,,,,,,,,,,,,,,X,,,
DB BACKUP2,,X,X,,,,,,,,,,,,,,,,,,,,,
DB BACKUP3,,,,X,X,X,X,X,X,X,X,X,,,,,,,,,,,,
BACKUP1 ran at 20:00 for less than an hour, so there is one X in the field
corresponding to the 20:00 hour, showing the time block where the backup ran.
BACKUP2 ran at 1:00 for less than 2 hours, so there are X's in the fields
corresponding to the 1:00 and 2:00 hours.
BACKUP3 ran at 3:00 for less than 9 hours, so there are X's in the fields
corresponding to the 3:00 through 11:00 hours.
I would greatly appreciate it if someone could help me with the T-SQL
to produce such a report.
Thanks in advance,
Roger Westbrook

I'm not familiar with these tables, but this should give you the output you
want:
SELECT j.name +
CASE WHEN 0 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 1 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 2 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 3 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 4 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 5 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 6 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 7 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 8 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 9 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 10 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 11 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 12 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 13 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 14 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 15 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 16 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 17 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 18 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 19 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 20 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 21 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 22 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 23 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END
FROM sysjobs j
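-- last_run_time is an HHMMSS integer (e.g. 200000 = 20:00:00); the derived
-- table below decodes it into seconds past midnight before taking
-- DATEPART(hour, ...) of the start and end times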
INNER JOIN (SELECT job_id,
DATEPART(hour,DATEADD(second,
last_run_time % 100 +
((last_run_time/100) % 100)*60 +
(last_run_time / 10000)*3600,
'19000101')) AS JobStart,
DATEPART(hour,DATEADD(second,
last_run_duration +
last_run_time % 100 +
((last_run_time/100) % 100)*60 +
(last_run_time / 10000)*3600,
'19000101')) AS JobEnd,
last_run_duration
FROM sysjobservers) s ON j.job_id = s.job_id

> It might be nice to get historical
> information but I'm not sure where I'd find it.
You have 5 tables in msdb with historical information about backups. The
names of the tables all start with backup. For instance, check out
msdb.dbo.backupset.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
<Kael_Sidhe@yahoo.com> wrote in message news:1144417911.795292.287140@z34g2000cwc.googlegroups.com...
> <snip>
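
For what it's worth, a minimal sketch of the kind of query Tibor is pointing
at (backupset's type column uses 'D' for full database backups):

use msdb
go
select database_name,
       backup_start_date,
       backup_finish_date,
       datediff(second, backup_start_date, backup_finish_date) as duration_sec
from dbo.backupset
where type = 'D' -- full database backups only
order by backup_start_date desc
go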

Corrected (and simplified) version that handles jobs spanning midnight:
SELECT j.name +
CASE WHEN 0 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 1 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 2 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 3 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 4 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 5 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 6 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 7 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 8 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 9 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 10 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 11 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 12 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 13 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 14 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 15 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 16 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 17 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 18 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 19 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 20 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 21 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 22 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END +
CASE WHEN 23 BETWEEN JobStart AND JobEnd THEN ',X' ELSE ',' END
FROM sysjobs j
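-- JobStart: last_run_time / 10000 extracts the starting hour straight from
-- the HHMMSS integer; JobEnd converts the start time to seconds, adds the
-- duration, and divides by 3600, so it can exceed 23 for a job that runs
-- past midnight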
INNER JOIN (SELECT job_id,
last_run_time / 10000 AS JobStart,
(last_run_duration +
last_run_time % 100 +
((last_run_time/100) % 100)*60 +
(last_run_time / 10000)*3600) / 3600 AS JobEnd
FROM sysjobservers) s ON j.job_id = s.job_id

Roger,
yet another approach:
create function backup_mask(@from_hour int, @to_hour int)
returns varchar(26)
as
begin
    declare @idle_mask varchar(26)
    declare @busy_mask varchar(24)
    set @idle_mask = '(' + replicate('_',24) + ')'
    set @busy_mask = replicate('X',24)
    return(
        substring(@idle_mask, 1, @from_hour) +
        substring(@busy_mask, 1, @to_hour - @from_hour + 1) +
        substring(@idle_mask, 2 + @to_hour, 25 - @to_hour))
end
go
-- unit test --
select dbo.backup_mask(1,5)
select dbo.backup_mask(3,15)
select dbo.backup_mask(1,24)
select dbo.backup_mask(19,24)
(XXXXX___________________)
(1 row(s) affected)
(__XXXXXXXXXXXXX_________)
(1 row(s) affected)
(XXXXXXXXXXXXXXXXXXXXXXXX)
(1 row(s) affected)
(__________________XXXXXX)
(1 row(s) affected)
go
drop function backup_mask
go

And another, using Alexander's function as a start,
just because we have QUOTENAME and STUFF.
(Note that the behavior of QUOTENAME when the
second parameter is '(' is not documented.)
create function backup_mask(@from_hour int, @to_hour int)
returns varchar(26)
as
begin
    declare @idle_mask varchar(26)
    declare @busy_mask varchar(24)
    set @idle_mask = replicate('_',24)
    set @busy_mask = replicate('X',24)
    declare @duration int
    set @duration = @to_hour - @from_hour + 1
    if @duration <= 0
        return quotename(@idle_mask,'(')
    return
        quotename(stuff(@idle_mask, @from_hour, @duration, left(@busy_mask, @duration)),
                  '(')
end
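
As a quick check, this variant should reproduce Alexander's unit-test output;
the expected masks are shown as comments:

select dbo.backup_mask(3,15)
-- (__XXXXXXXXXXXXX_________)
select dbo.backup_mask(1,24)
-- (XXXXXXXXXXXXXXXXXXXXXXXX)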
Alexander Kuznetsov wrote:

> <snip>

Thanks folks, I may have some follow-up questions, but for now I'm going to
digest the information you've sent me and see what I can come up with.
Thanks again for the assist!

Advanced Query Question

What I am trying to accomplish is to return a resultset that would look
like this:
Company.Name   Table1.Column   Table2.Column   Table3.Column
Company A      y               y               y
Company B      n               y               n
Company C      n               n               y
Company D      n               n               n
I am using 4 tables to do this (Company, Table1, Table2, Table3). What makes
this somewhat complex is that I need to filter on a specific date range
(i.e. 5/1/2006 - 5/31/2006) and display a company even if nothing is
available for the date range entered (Company D). Each table has its own
date column, so I would need to apply my date criteria to Table1.Date,
Table2.Date, and Table3.Date.
In my first attempt I was using an outer join to display all company names
even if nothing was available in the three columns. That didn't seem to
work, because then Company D wouldn't be included in the output.
The primary/foreign key relationship on the tables is the CompanyId: the
primary CompanyId is in the Company table and the foreign CompanyId is in
Tables 1, 2, and 3.
How can I write this SQL statement in order to get the above results?
Thanks
"JP" <jp@nospam.com> wrote in message news:OJoiMuahGHA.4368@TK2MSFTNGP03.phx.gbl...
> What I am trying to accomplish is to return a resultset that would look
> like this:
> Company.Name Table1.Column Table2.Column Table3.Column
> Company A y y y
> Company B n y n
> Company C n n y
> Company D n n n
<snip>
CREATE TABLE #Company (CompanyID INT NOT NULL PRIMARY KEY,
[Name] VARCHAR(100) NOT NULL)
CREATE TABLE #Table1 (CompanyID INT NOT NULL,
EntryNum INT NOT NULL,
[Date] DATETIME,
PRIMARY KEY (CompanyID, EntryNum))
ALTER TABLE #Table1 ADD
CONSTRAINT FK_Table1_Company FOREIGN KEY
(
[CompanyID]
) REFERENCES #Company
(
[CompanyID]
)
CREATE TABLE #Table2 (CompanyID INT NOT NULL,
EntryNum INT NOT NULL,
[Date] DATETIME,
PRIMARY KEY (CompanyID, EntryNum))
ALTER TABLE #Table2 ADD
CONSTRAINT FK_Table2_Company FOREIGN KEY
(
[CompanyID]
) REFERENCES #Company
(
[CompanyID]
)
CREATE TABLE #Table3 (CompanyID INT NOT NULL,
EntryNum INT NOT NULL,
[Date] DATETIME,
PRIMARY KEY (CompanyID, EntryNum))
ALTER TABLE #Table3 ADD
CONSTRAINT FK_Table3_Company FOREIGN KEY
(
[CompanyID]
) REFERENCES #Company
(
[CompanyID]
)
INSERT INTO #Company (CompanyID, [Name])
SELECT 1, 'Microsoft'
UNION SELECT 2, 'Hooters'
UNION SELECT 3, 'Ca$h Munny Records'
UNION SELECT 4, 'Widgets LLC.'
UNION SELECT 5, 'Five Inc'
INSERT INTO #Table1 (CompanyID, EntryNum, [Date])
SELECT 1, 1, '2006-01-01'
UNION SELECT 1, 2, '2006-05-01'
UNION SELECT 3, 1, '2005-01-01'
UNION SELECT 5, 1, '2006-05-15'
INSERT INTO #Table2 (CompanyID, EntryNum, [Date])
SELECT 1, 1, '2006-05-31'
UNION SELECT 1, 2, '2006-01-01'
UNION SELECT 2, 1, '2005-04-01'
UNION SELECT 4, 1, '2006-05-15'
INSERT INTO #Table3 (CompanyID, EntryNum, [Date])
SELECT 1, 1, '2006-05-02'
UNION SELECT 2, 1, '2006-05-04'
UNION SELECT 3, 1, '2006-05-09'
UNION SELECT 4, 1, '2006-05-15'
UNION SELECT 5, 1, '2006-05-14'
UNION SELECT 2, 2, '2001-01-01'
SELECT DISTINCT c.CompanyID, c.[Name],
CASE WHEN t1.EntryNum IS NULL
THEN 'n'
ELSE 'y'
END AS [Table1_Column],
CASE WHEN t2.EntryNum IS NULL
THEN 'n'
ELSE 'y'
END AS [Table2_Column],
CASE WHEN t3.EntryNum IS NULL
THEN 'n'
ELSE 'y'
END AS [Table3_Column]
FROM #Company c
LEFT OUTER JOIN #Table1 t1
ON c.CompanyID = t1.CompanyID
AND (t1.[Date] >= '2006-05-01' AND t1.[Date] < '2006-06-01')
LEFT OUTER JOIN #Table2 t2
ON c.CompanyID = t2.CompanyID
AND (t2.[Date] >= '2006-05-01' AND t2.[Date] < '2006-06-01')
LEFT OUTER JOIN #Table3 t3
ON c.CompanyID = t3.CompanyID
AND (t3.[Date] >= '2006-05-01' AND t3.[Date] < '2006-06-01')
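-- With the sample data above and the May 2006 range, the result should come
-- out roughly like this (my own trace of the joins, so worth double-checking):
--
-- CompanyID  Name                Table1_Column  Table2_Column  Table3_Column
-- 1          Microsoft           y              y              y
-- 2          Hooters             n              n              y
-- 3          Ca$h Munny Records  n              n              y
-- 4          Widgets LLC.        n              y              y
-- 5          Five Inc            y              n              y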
DROP TABLE #Table3
DROP TABLE #Table2
DROP TABLE #Table1
DROP TABLE #Company

Thank you very much Mike!

Sunday, March 11, 2012

Advanced Log Shipping

We are researching whether we can accomplish the following goal with
log shipping. We have a reporting database that is mostly read-only
that we would like to migrate to log shipping.
Our environment consists of two database servers, one is our
operational database server and the other is a reporting database
server. All the data writes go to our operational database.
Currently, we replicate the operational database to the reporting database
every evening. We intentionally do this once a day so that we have a 24-hour
window to correct any data entry issues if they should occur.
Log shipping sounds easy enough to set up, but here is where it gets
complicated for us. Our reporting database is used via our web portal
application. Our users have the ability to make changes during the day via
the portal. Some of these changes cause writes to both the reporting
database and the operational database. The writes to the reporting database
let the users see immediate changes, and the writes to the operational
database ensure the data will be updated upon the next nightly update.
1) Will these intermittent writes to the reporting database server
prevent a log shipping transaction log restore from completing
successfully?
2) If we do transaction log backups once an hour, I assume we can save
them all and then replay all the tlog backups at one time in the middle
of the night.
We tried to set up replication at one point, but our database schema would
not replicate to a second server, and we have not had the resources to try
to resolve the replication issues.
Thank You,
Kevin

Kevin,
log shipping restores the logs using NORECOVERY or STANDBY, both of which
will prevent the editing of the reporting data.
Cheers,
Paul Ibison SQL Server MVP, www.replicationanswers.com
(recommended sql server 2000 replication book:
http://www.nwsu.com/0974973602p.html)
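
To make the restore side concrete, each shipped log is applied roughly like
this (the database name, backup file, and undo file below are hypothetical):

RESTORE LOG Reporting
FROM DISK = N'D:\logship\Reporting_tlog_0100.trn'
WITH STANDBY = N'D:\logship\Reporting_undo.dat'
-- WITH NORECOVERY also keeps the log chain restorable, but the database is
-- then not readable between restores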

Log shipping target databases are locked read-only. You cannot update a
log-shipped target database. If you bring it live, then you cannot apply any
further transaction logs.
Yes, you can "save up" transaction logs and apply them in sequence.
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
"kghammond" <kghammond@.nrscorp.com> wrote in message
news:1133799275.397007.188420@.g43g2000cwa.googlegr oups.com...
> <snip>

The business case that we are trying to resolve is that we are moving our
reporting database server to a co-location facility. In the past we had
high-speed LAN access between our database servers, so nightly backups and
restores of our database were not an issue. Our database is currently about
7GB and is growing at about 25% per year. Our WAN connection is a standard
point-to-point T1. 7GB is too much data to move across the WAN on a nightly
basis. LiteSpeed will shrink the database to 1.5GB, but that is still very
large compared to 35MB transaction logs.
Is it possible to make a copy of the log shipping target database on
the remote site? In essence could we use the log shipping target
database to stage a production database across the WAN.
Operational Database --> Log Shipping --> Staging Database (RO) -->
Backup/detach/copy/etc --> Reporting Database (RW)
Thank You for the quick feedback,
Kevin

Upgrade to SQL Server 2005 and use database snapshots - exactly what they
are designed for.
You can't detach the database and copy, nor can you stop SQL Server and copy
the files and use those as inputs to sp_attach_db.
Tony.
Tony Rogerson
SQL Server MVP
http://sqlserverfaq.com - free video tutorials
"kghammond" <kghammond@.nrscorp.com> wrote in message
news:1133800712.975757.55690@.f14g2000cwb.googlegro ups.com...
> <snip>
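
For reference, a minimal sketch of the snapshot approach Tony suggests (SQL
Server 2005 syntax; the logical file name and paths here are hypothetical):

CREATE DATABASE Reporting_Snapshot
ON (NAME = Reporting_Data, -- logical name of the source database's data file
    FILENAME = 'D:\snapshots\Reporting_Snapshot.ss')
AS SNAPSHOT OF Reporting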

Our final solution is to utilize Double Take. We plan to replicate our .mdf
and .ldf across the WAN using Double Take. Then once a night we will pause
Double Take and copy the .mdf and .ldf out of the replica. We will then
mount that as our reporting database.
So far all seems well. Scripting out an automated way to attach the
database, verify that it is not corrupt, and then back out to the previous
database if necessary is proving to be a little tricky. If only DTS allowed
better flow control...
SQL 2005 will be on its way soon, but not soon enough
Thank you all for your input,
Kevin
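
A rough sketch of the attach-and-verify step Kevin describes (the database
name and file paths are hypothetical):

EXEC sp_attach_db @dbname = N'Reporting',
    @filename1 = N'D:\data\Reporting.mdf',
    @filename2 = N'D:\data\Reporting_log.ldf'
-- back out to the previous database if corruption is found
DBCC CHECKDB (N'Reporting') WITH NO_INFOMSGS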

Hello,
Thanks for letting me know the current status. If you have any questions or
concerns in the future, feel free to post back!
Sophie Guo
Microsoft Online Partner Support
Get Secure! - www.microsoft.com/security
=====================================================
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
