Showing posts with label reporting. Show all posts

Thursday, March 29, 2012

Advice on SQL Server 2005 vs. Express Edition w/ Advanced Services

I've been using SQL Server 2005 Std. for my development work on a laptop with a 2 GHz Celeron and 768 MB RAM. Now that Express Edition has Reporting Services and Full-Text Search, would I be wise to just use it instead? I actually have both on my machine, since Express Edition installed with VS 2005 Standard. Also, if I uninstall SQL Server 2005 Std., what will I lose? Thanks in advance.

-Mike

It depends what you want to do. SQL Express with Advanced Services is not the same as Std. Std has more features (OLAP, Integration Services) and richer features (better SSMS, better reporting including Report Builder), and it also has fewer restrictions on memory, processors, etc.

If you are only using this for development, you should also be able to use Developer Edition, which is under 50 USD but is effectively Enterprise Edition with a no-production-use clause.
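If you're ever unsure which edition and build an instance is actually running (useful when both Express and Std are installed side by side), a quick check is the standard SERVERPROPERTY function:

```sql
-- Report the edition, version, and service pack level of the connected instance
SELECT SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ServicePack;
```

Run it in each connection (the Express instance is usually `.\SQLEXPRESS`) to confirm which server you are actually talking to.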

|||

I've decided to uninstall SQL Server 2005 Std. and try to work with Express w/ Advanced Services. I believe it will do all I need. I'm more concerned with performance on my laptop than features like OLAP, etc. If I gain any insight after using it for a while, I'll post back here. Thanks.

-Mike


Tuesday, March 27, 2012

Advice on Data Warehouse & Web Reporting

I'm about to embark on re-writing a database & bespoke web reporting application for our call centre & would like a little advice please.
Currently the database has 10 tables containing summarised (<=1 record per staff member per day) data from different legacy systems, populated by DTS. There is an 11th table that holds staff data, which is used to link the others together as many have different primary keys. After the data has been linked together, an aggregated table (1 record per person per day) is created once a day.
Currently our intranet site is configured to run a number of stored procedures that return KPI data from the aggregated table into datasets, which are then rendered in the form of datagrids. Users are either allowed to specify the parameters for these stored procedures, or the parameters are pre-determined for them depending on who they are (eg agents in the call centre all see a MTD report for themselves only).
The aim of the re-write is to:
(a) cut down on admin when KPI definitions change
(b) make the setup much more generic so that it could be transported to other areas of the business or even to different companies with minimum rework
(c) upgrade from SQL 2000 to SQL 2005
(d) tidy the webpages a little & maybe add some gauge-type controls
I'm unsure about 2 things:
(1) Should I totally re-design things & use Analysis Services instead, or would I find no benefit as everyone is only given one view of the truth (ie no slicing & dicing depending upon preference)? I know very little about this service so it would be a challenge, & from what I've read I'm not so sure whether it would be appropriate for all of the staff querying the database constantly anyway (there are over 500 of them, & currently the stored procedures use nested temp tables to calculate everything that needs to be shown on the webpages). I guess that I couldn't fill a datagrid with their data using this method either, but I'm sure that someone will be able to keep me right.
(2) Should I dump the datagrids in favour of Reporting Services? This was originally not used as our IT department couldn't get it installed properly on the SQL 2000 server, & the datagrid solution was found to be both adequate & easy to set up. We have Crystal Reports in the company also, but licence costs are likely to be a problem.
Hope I haven't upset anyone by crossposting the question - I'm just after a balanced view before I start work & the queries fit with a few different newsgroups.
TIA
Steve
I think that AS is more important, more critical, than RS. There are other tools like RS on the market, but AS leads the market by a wide margin.
Does that mean it's EASY? No. Does it mean it's SIMPLE? No.
I would recommend taking a month off work, immersing yourself in SSAS, and coming back to work to scrap all your existing DB work. 10 million relational developers CAN be wrong, and they are.
It's better to build a solution for non-technical people. SSAS is best utilized using OWC (Office Web Components) and non-technical people...
All of your relational mess just sounds overly complicated.
-Aaron
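On goal (a), cutting admin when KPI definitions change, one relational alternative to a full SSAS rebuild is to make the KPI definitions data-driven rather than hard-coded inside each stored procedure. A minimal sketch, with table and column names that are hypothetical (they are not from the original system):

```sql
-- Hypothetical data-driven KPI definitions: changing a KPI then becomes an
-- UPDATE to this table instead of an edit to every stored procedure.
CREATE TABLE dbo.KpiDefinition (
    KpiID          int IDENTITY PRIMARY KEY,
    KpiName        nvarchar(50) NOT NULL,
    NumeratorCol   sysname      NOT NULL,  -- column in the daily aggregate table
    DenominatorCol sysname      NULL,      -- NULL for simple totals
    TargetValue    decimal(9,2) NULL       -- used by gauge-type controls
);
```

The generic reporting procedures would then read the definition rows and build the KPI list dynamically, which also helps goal (b): moving to another business area means loading different rows, not rewriting procedures.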


Advice on a good strategy

I have a general question and a specific question.

1.) I'm doing some reporting using the Crystal Reports software my company has installed; however, I see threads of people using all kinds of more powerful software (VB, C, etc) to work with Crystal (web development, etc). Crystal Reports is on a terminal server here, and my audience will use that software to view the reports. Am I then limited to the Crystal (or Basic) syntax found in the software, or can I somehow use something like Perl or C to do my programming? I'm not sure how all of this goes together.

2.) More specifically, I'm trying to build a parts list tree (parent/child). The database is organized in a way that makes this correlation difficult. One table (Part Master) is a list of all unique parts. Another table (Jobs) is a list of all unique jobs (times we've made the parts), each showing the one part being made (linked to Part Master). In the third table (Materials), the jobs are listed with all of the sub-components used for the main part of that job. Here, I'd want to capture the sub-components, but then loop back through the original Part Master to capture THEIR sub-components, and so on. There could be up to 5 levels of this.

Basically, I want a user to enter a top-level assembly number, and then I want to show every part used for that assembly, all the way down. My problem is that I really need to loop through the original list of returned records (I think). If I find the first part and the first level of sub-parts, I may have passed those records already, so I need to re-read the list. I attempted to build an array of the entire list of parts and do the logic at the end, but there are 82,000 of them (the limit is 1,000, and I don't plan to do a Case with 82 levels). I'm inclined to believe that Crystal Reports (v10) does not support multi-dimensional arrays, making this even trickier.

Can someone help me on a strategy to attack this one?

Well, first you probably need to realize that even if they are using VB or C-whatever or anything else in conjunction with Crystal, Crystal is still a data-analysis/reporting engine. All it does is spit out the data in whatever format you design the report to produce. You can connect directly to your database (I am guessing it is a relatively standard engine, although you don't mention it), and then use Crystal to select certain sets of data, group that data, and summarize that data, no matter what the general GUI interface to your data is. So just use Crystal, point it to your database through ODBC or OLE DB (or whichever method you want), and go. Now, how you allow your users to view those reports is another matter: you might put shortcuts on the desktop (if the full version of CR is accessible through the terminal server), or use or develop a CR viewer for them to open the reports in. For that, you would need another tool, such as VB or C-whatever.

It does sound as if you have several layers of one-to-many relationships to account for in the description of your project. My personal experience leads me to usually start from the smallest portion and work up. If you want a jobs report that will show the detail down to the smallest sub-part, you would probably know that you need to start from an AssemblyID, display each individual part, and below each part a list of sub-parts. So you would add each of those tables to your report and link them by their primary keys (I usually use a left join if there is any possibility of missing links, i.e., a sub-part entry that is not in the table; otherwise an inner join would be more efficient). Then group by JobID, PartID, and sub-part ID. Add all the fields you want for each job (be sure to keep the field boxes in the correct group).

Obviously, this is hugely simplified and requires a lot more, but that is a basic approach. More detail could be given, if you give me a list of tables and their relationships.
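Another angle worth considering: since the data already lives in a SQL database, the multi-level explosion can be done server-side before Crystal ever sees it, which avoids the array limits entirely. On SQL Server 2005 a recursive CTE walks an arbitrary-depth parts tree; the table and column names below are guesses at the schema described above, not the real ones, and @TopAssemblyID stands in for the user's entered assembly:

```sql
-- Hypothetical schema: Jobs(JobID, PartID), Materials(JobID, SubPartID).
-- Walk the assembly tree down from one top-level part, capped at 5 levels.
WITH PartsTree (PartID, SubPartID, Lvl) AS (
    -- anchor: direct components of the top-level assembly
    SELECT j.PartID, m.SubPartID, 1
    FROM Jobs j
    JOIN Materials m ON m.JobID = j.JobID
    WHERE j.PartID = @TopAssemblyID
    UNION ALL
    -- recursion: components of each sub-part found so far
    SELECT j.PartID, m.SubPartID, pt.Lvl + 1
    FROM PartsTree pt
    JOIN Jobs j      ON j.PartID = pt.SubPartID
    JOIN Materials m ON m.JobID  = j.JobID
    WHERE pt.Lvl < 5
)
SELECT PartID, SubPartID, Lvl
FROM PartsTree;
```

The report then just groups the flattened result by level, with no looping or multi-dimensional arrays needed on the Crystal side.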

ScottJ

Sunday, March 25, 2012

AdventureWorksDW

I am using a tutorial to learn SQL Reporting services and it calls for a
table vResellers in AdventureWorksDW database. I don't have this database,
instead I have AdventureWorks2000. Can anyone tell me where I can get
AdventureWorksDW?
Thanks,
|||
Hi,
The SQL Server 2005 sample database can be downloaded from Microsoft. Try the following link:
http://www.microsoft.com/downloads/details.aspx?FamilyId=E719ECF7-9F46-4312-AF89-6AD8702E4E6E&displaylang=en
Patrick
"Moses" <Moses@.discussions.microsoft.com> wrote in message
news:755330A4-335B-4A31-A6ED-CFF217525F03@.microsoft.com...
>I am using a tutorial to learn SQL Reporting services and it calls for a
> table vResellers in AdventureWorksDW database. I don't have this database,
> instead I have AdventureWorks2000. Can anyone tell me where I can get
> AdventureWorksDW?
> Thanks,
|||
Patrick,
I tried installing this but I cannot attach the database. I believe it is because this is for SQL 2005. Is there a SQL 2000 version?
"Patrick" wrote:
> Hi,
> The SQL Server 2005 Samlple database can be downloaded from Microsoft. Try
> following link
> http://www.microsoft.com/downloads/details.aspx?FamilyId=E719ECF7-9F46-4312-AF89-6AD8702E4E6E&displaylang=en
> Patrick
> "Moses" <Moses@.discussions.microsoft.com> wrote in message
> news:755330A4-335B-4A31-A6ED-CFF217525F03@.microsoft.com...
> >I am using a tutorial to learn SQL Reporting services and it calls for a
> > table vResellers in AdventureWorksDW database. I don't have this database,
> > instead I have AdventureWorks2000. Can anyone tell me where I can get
> > AdventureWorksDW?
> >
> > Thanks,
>
>
|||
When you download the course 2030a (Creating Reporting Solutions Using Microsoft SQL Server 2000 Reporting Services), you will find it in c:\Program Files\Microsoft Learning\2030\Setup\Database.
But my problem is that I can't find the "AdventureWorks2000" database in course 2030a.
If you have the "AdventureWorks2000" database, send it to me and I'll send you the AdventureWorksDW database.

Tuesday, March 20, 2012

Advantages/Disadvantages of SQL 2005 Reporting Vs Crystal Report 1

Hi:
Can someone show me a link where I can get the Advantages/Disadvantages of
SQL 2005 Reporting Vs Crystal Report 10 ?
Thanks in advance.
--
Thanks,
SDRoy
Try this:
http://www.crystalreportsbook.com/SSRSandCR_ExecSummary.asp
"SDRoy" wrote:
> Hi:
> Can someone show me a link where I can get the Advantages/Disadvantages of
> SQL 2005 Reporting Vs Crystal Report 10 ?
> Thanks in advance.
> --
> Thanks,
> SDRoy
|||
I just perused it, and two things. One is that it was written prior to SQL Server 2005 coming out. And two, the licensing is incomplete. Apples to apples, I have never heard that Crystal Reports is cheaper. The licensing costs mentioned are per CPU for SQL Server, which you might or might not have to do. From what I have seen when we looked at Crystal Reports licensing, there is no way that you will get away with the $7,500 license fee that the author compares to the Enterprise per-processor license for SQL Server.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"John G." <John G.@.discussions.microsoft.com> wrote in message
news:9AF11023-0A1E-4CE8-87DC-81B7F583A2D0@.microsoft.com...
> Try this:
> http://www.crystalreportsbook.com/SSRSandCR_ExecSummary.asp
> "SDRoy" wrote:
>> Hi:
>> Can someone show me a link where I can get the Advantages/Disadvantages
>> of
>> SQL 2005 Reporting Vs Crystal Report 10 ?
>> Thanks in advance.
>> --
>> Thanks,
>> SDRoy

Advantages of Reporting Services 2005


Hello,

I am working on Reporting Services 2005; I don't know anything about Business Objects.

Please let me know what the advantages of Reporting Services 2005 are over Business Objects.

Any specific link will be useful for me.

Thanks in advance.

Bye.

If you already have a SQL Server 2005 box...SSRS is free. That is usually a very compelling argument. Check with MS on your specific licensing scenario though.

The next is Report Builder, which is a user-friendly ad-hoc tool for analysts to build their own reports without the learning curve of Visual Studio (although the learning curve is really not that high on RS).

|||

Hello Davind,

I agree with you on the advantage you have mentioned above, but I have a question.

Reporting Services has a direct connection to the database; there isn't any middle tier, and I thought that because of this the processing of reports might be slow. In contrast, Business Objects has some middle tier; I don't know exactly.

Any help on this will be appreciated.

|||

Check this blog for all the resource links on Reporting Services:

http://blogs.sqlxml.org/vinodkumar/archive/2007/09/14/sql-rs-resources-links.aspx

Madhu

|||

Yes, when you look at BO, Cognos, SAS BI, etc., it is understood that you will build some middle tier, like an OLAP cube, to report against.

The difference with SSRS is that it is purely a reporting tool, abstracted from the data source. So think of it this way: you CAN report directly against a live OLTP database. To your point, you may have some performance considerations to make. But you CAN also build reports against a data warehouse (SQL) or against cubes that have been defined.

It goes beyond that: you can also build reports directly against other data sources like Oracle or Access. I've developed real-time SSRS reports against Lawson (Oracle hosted on Unix), for example.

Lastly, you can build reports based upon models or even SSIS. SSIS can be thought of as middleware, a very robust ETL tool on steroids. So think of being able to mix DW data with, say, live RSS feeds, data from a web service, or some real-time ETL integration.

A report model is entirely different in that you can build a report from a user-friendly definition without having to know the table structures, field names, business rules, and joins (of course, that model must first be defined by someone who does). That leaves report building in the hands of analysts, not programmers.

So the biggest thing that I see is choice: you have deep choices to make in the datasets you consume, and you are not limited solely to OLAP data sources or MDX code. Plus, the technology is largely standard across the Microsoft framework. The models can be used in SSRS or even Report Builder; the OLAP cubes or data sources can of course be consumed in SSRS, but also in Excel PivotTables, ProClarity... basically any cube-viewer tool of choice. Those should be additional measuring sticks when looking at other vendors' BI solutions.

Sunday, March 11, 2012

advanced parameter tutorial, lesson 5, multipart identifier error

Hello,

Hope I'm asking this question in the correct forum.

I'm a newbie in Reporting Services and currently working my way through the tutorials with AdventureWorks. Came across this error while doing the MSDN tutorial for Advanced Features, lesson 5 - user defined functions.

http://msdn2.microsoft.com/en-us/library/aa337435.aspx

Created a new report, copied the following to the query screen:

SELECT udf.ContactID, udf.FirstName + N' ' + udf.LastName AS Name,
c.Phone, c.EmailAddress, udf.JobTitle, udf.ContactType
FROM ufnGetContactInformation(@ContactID) udf
JOIN Person.Contact c ON ufn.ContactID = c.ContactID

I'm following the directions to the letter, and consistently get the following error:

"The multi-part identifier "ufn.ContactID" could not be bound."

"The multip-part identifier "ufn.ContactID" could not be bound. (Microsoft SQL Server, Error: 4104)"

I'm running SQL 2005 Enterprise on Windows XP.

Any help you can give will be much appreciated! Thank you.

Looks like a typo in the sample query:

try udf.ContactID instead of ufn.ContactID
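With the alias corrected everywhere, the tutorial query reads:

```sql
-- The table-valued function is aliased "udf", so the join must reference
-- "udf", not "ufn" (the function-name prefix the tutorial mistyped)
SELECT udf.ContactID, udf.FirstName + N' ' + udf.LastName AS Name,
       c.Phone, c.EmailAddress, udf.JobTitle, udf.ContactType
FROM ufnGetContactInformation(@ContactID) udf
JOIN Person.Contact c ON udf.ContactID = c.ContactID
```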

|||
Thank you very much! Now it works.

Advanced Log Shipping

We are researching whether we can accomplish the following goal with
log shipping. We have a reporting database that is mostly read-only
that we would like to migrate to log shipping.
Our environment consists of two database servers, one is our
operational database server and the other is a reporting database
server. All the data writes go to our operational database.
Currently, we replicate the operational database to the reporting database every evening. We intentionally do this once a day so that we have a 24-hour window to correct any data entry issues if they should occur.
Log shipping sounds easy enough to setup, but here is where it gets
complicated for us. Our reporting database is used via our web portal
application. Our users have the ability to make changes during the day
via the portal. Some of these changes cause writes to both the
reporting database as well as the operational database. The writes to
the reporting database let the users get immediate changes, and the
writes to the operational database ensure the data will be updated upon
the next nightly update.
1) Will these intermittent writes to the reporting database server
prevent a log shipping transaction log restore from completing
successfully?
2) If we do transaction log backups once an hour, I assume we can save
them all and then replay all the tlog backups at one time in the middle
of the night.
We tried to setup replication at one point, but our database schema
would not replicate to a second server and we have not had the
resources to try to resolve the replication issues.
Thank You,
Kevin
Kevin,
log shipping restores the logs using NORECOVERY or STANDBY, both of which
will prevent the editing of the reporting data.
Cheers,
Paul Ibison SQL Server MVP, www.replicationanswers.com
(recommended sql server 2000 replication book:
http://www.nwsu.com/0974973602p.html)
|||Log shipping target databases are locked read-only. You cannot update a
log-shipped target database. If you bring it live then you cannot apply any
further transaction logs.
Yes, you can "save up" transaction logs and apply them in sequence.
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
"kghammond" <kghammond@.nrscorp.com> wrote in message
news:1133799275.397007.188420@.g43g2000cwa.googlegroups.com...
> We are researching whether we can accomplish the following goal with
> log shipping. We have a reporting database that is mostly read-only
> that we would like to migrate to log shipping.
> Our environment consists of two database servers, one is our
> operational database server and the other is a reporting database
> server. All the data writes go to our operational database.
> Currently, we replicate the operational database to the reporting
> database every evening. We intentional do this once a day so that we
> have a 24 hour window to correct any data entry issues if they should
> occur.
> Log shipping sounds easy enough to setup, but here is where it gets
> complicated for us. Our reporting database is used via our web portal
> application. Our users have the ability to make changes during the day
> via the portal. Some of these changes cause writes to both the
> reporting database as well as the operational database. The writes to
> the reporting database let the users get immediate changes, and the
> writes to the operational database ensure the data will be updated upon
> the next nightly update.
> 1) Will these intermittent writes to the reporting database server
> prevent a log shipping transaction log restore from completing
> successfully?
> 2) If we do transaction log backups once an hour, I assume we can save
> them all and then replay all the tlog backups at one time in the middle
> of the night.
> We tried to setup replication at one point, but our database schema
> would not replicate to a second server and we have not had the
> resources to try to resolve the replication issues.
> Thank You,
> Kevin
>
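A minimal T-SQL sketch of the "save up and replay" approach described above; the database name, file names, and paths are hypothetical:

```sql
-- Replay the saved-up hourly log backups in sequence. WITH NORECOVERY
-- leaves the database unrecovered so further logs can still be applied.
RESTORE LOG ReportingDB
    FROM DISK = N'D:\LogShip\ReportingDB_0100.trn' WITH NORECOVERY;
RESTORE LOG ReportingDB
    FROM DISK = N'D:\LogShip\ReportingDB_0200.trn' WITH NORECOVERY;
-- ...one RESTORE LOG per saved backup, in order...

-- Apply the last log WITH STANDBY to leave the database readable
-- (read-only) until the next batch of restores.
RESTORE LOG ReportingDB
    FROM DISK = N'D:\LogShip\ReportingDB_2400.trn'
    WITH STANDBY = N'D:\LogShip\ReportingDB_undo.dat';
```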
|||The business case that we are trying to resolve is that we are moving
our reporting database server to a co-location facility. In the past
we had high-speed LAN access between our database servers, so nightly
backups and restores of our database were not an issue. Our database
is currently about 7GB and it is growing at about 25% per year. Our
WAN connection is a standard point to point T1. 7GB is too much data
to move across the WAN on a nightly basis. LiteSpeed will shrink the
database to 1.5GB, but still that is very large compared to 35MB
transaction logs.
Is it possible to make a copy of the log shipping target database on
the remote site? In essence, could we use the log shipping target
database to stage a production database across the WAN?
Operational Database --> Log Shipping --> Staging Database (RO) -->
Backup/detach/copy/etc --> Reporting Database (RW)
Thank You for the quick feedback,
Kevin
|||Upgrade to SQL Server 2005 and use database snapshots - exactly what they
are designed for.
You can't detach the database and copy, nor can you stop SQL Server and copy
the files and use those as inputs to sp_attach_db.
Tony.
Tony Rogerson
SQL Server MVP
http://sqlserverfaq.com - free video tutorials
"kghammond" <kghammond@.nrscorp.com> wrote in message
news:1133800712.975757.55690@.f14g2000cwb.googlegroups.com...
> The business case that we are trying to resolve is that we are moving
> our reporting database server to a co-location facility. In the past
> we had high speed LAN access between our database servers so nightly
> backup and restores of our database where not an issue. Our database
> is currently about 7GB and it is growing at about 25% per year. Our
> WAN connection is a standard point to point T1. 7GB is too much data
> to move across the WAN on a nightly basis. LiteSpeed will shrink the
> database to 1.5GB, but still that is very large compared to 35MB
> transaction logs.
> Is it possible to make a copy of the log shipping target database on
> the remote site? In essence could we use the log shipping target
> database to stage a production database across the WAN.
> Operational Database --> Log Shipping --> Staging Database (RO) -->
> Backup/detach/copy/etc --> Reporting Database (RW)
> Thank You for the quick feedback,
> Kevin
>
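For reference, the SQL Server 2005 database snapshot syntax Tony mentions looks roughly like this; the logical file name and snapshot path are assumptions:

```sql
-- Create a point-in-time, read-only snapshot of the reporting database
-- (database snapshots require SQL Server 2005 Enterprise Edition).
CREATE DATABASE ReportingDB_Snapshot ON
    ( NAME = ReportingDB_Data,  -- logical name of the source data file
      FILENAME = N'D:\Snapshots\ReportingDB_Snapshot.ss' )
AS SNAPSHOT OF ReportingDB;

-- Reports can then query ReportingDB_Snapshot while restores continue
-- against ReportingDB.
```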
|||Our final solution is to utilize Double Take. We plan to replicate our
.mdf and .ldf across the WAN using Double Take. Then once a night we
will pause Double Take and copy the .mdf and .ldf out of the replica.
We will then mount that as our reporting database.
So far all seems well. Scripting out an automated way to attach the
database, verify that it is not corrupt, and then back out to the
previous database if necessary is proving to be a little tricky. If
only DTS allowed better flow control...
SQL 2005 will be on its way soon, but not soon enough
Thank you all for your input,
Kevin
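The attach-and-verify step Kevin describes might be sketched like this; the paths and database name are hypothetical (sp_attach_db is the SQL 2000-era procedure):

```sql
-- Attach the copied files as the reporting database...
EXEC sp_attach_db @dbname = N'ReportingDB',
    @filename1 = N'E:\Replica\ReportingDB.mdf',
    @filename2 = N'E:\Replica\ReportingDB.ldf';

-- ...then verify the copy is not corrupt before pointing reports at it.
DBCC CHECKDB (N'ReportingDB') WITH NO_INFOMSGS;
```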
|||Hello,
Thanks for letting me know the current status. If you have any questions or
concerns in the future, feel free to post back!
Sophie Guo
Microsoft Online Partner Support
Get Secure! - www.microsoft.com/security
=====================================================
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.


Advanced functions in SQL Express

Hi,

I would like to explore Reporting Services with SQL Express Edition, so I tried installing "SQLEXPR_ADV_.EXE", the package available on the SQL download site, but after selecting the Reporting Services option I got an error explaining that the component "sqlserver2005_bc.msi" is to be changed.

As this component is part of the "SQLEXPR_ADV_.EXE" package, does anybody know an alternative solution?

Are you worried that the sqlserver2005_bc.msi component will be changed, so that repairing/modifying the changed component will fail? I found a similar issue in the instructions for SQL 2005 SP1, which provide a workaround. You can take a look here:

http://download.microsoft.com/download/b/d/1/bd1e0745-0e65-43a5-ac6a-f6173f58d80e/ReadmeSQL2005SP1.htm

Thursday, March 8, 2012

ADSI and Reporting Services Permission issue.

I have a stored proc that calls on a view in sql server. The view is
actually my Active Directory LDAP query. I have already created a
linked server and account for my user name jkim.
I have a report that polls data from this stored proc. After I set the
data source, which uses an account "sqluser", I try to run the report
and I get this:
An error has occurred during report processing. (rsProcessingAborted)
Get Online Help
Query execution failed for data set 'DispatchActivity'.
(rsErrorExecutingCommand) Get Online Help
OLE DB provider 'ADSDSOObject' reported an error. The provider
indicates that the user did not have the permission to perform the
operation.
The data source uses the account "sqluser" which is an account that's
stored in the report server. I'm pretty sure this is what's causing
the error. Does 'sqluser' have to be an actual Active Directory user
for it to have access to the ADSDSOObject?
I don't know if I'm making any sense.|||Nevermind, I figured it out.
I didn't check the "use as windows authentication" checkbox in the
data source.
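For context, a query through an ADSI linked server typically looks like the sketch below; the linked server name, LDAP path, and attributes here are assumptions, not taken from the post. The ADSDSOObject provider authenticates against Active Directory, which is why the data source account needs to be a Windows account:

```sql
-- Query Active Directory through an ADSI linked server using the
-- ADSDSOObject OLE DB provider.
SELECT sAMAccountName, displayName
FROM OPENQUERY(ADSI,
    'SELECT sAMAccountName, displayName
     FROM ''LDAP://DC=example,DC=com''
     WHERE objectCategory = ''person'' AND objectClass = ''user''');
```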

Saturday, February 25, 2012

Adobe Reader could not open ...

On a weekly basis I have over 100 subscription reports in pdf format emailed
to clients.
We are using SQL server 2000 and SQL Reporting Services 2000.
Several of the clients are reporting that they are unable to open their
report.
They get "Adobe reader could not open ReportName because it is either not a
supported file type or the file is corrupted... etc"
No custom code involved, all reports are generated in the same manner and
deployed in the same manner.
In some organizations several clients receive the same report and only one
or two of the clients reports a problem.
All these clients are using the same version of Adobe reader.
Is this a reader problem or a Report Server issue?
I am having a problem narrowing it down to the reader or generator!|||We
are having the same issue. It was working for a while and now, user by
user, we are running into the same problem. Does anyone have any suggestions?
"Doug Gifford" wrote:
> On a weekly basis I have over 100 subscription reports in pdf format emailed
> to clients.
> We are using SQL server 2000 and SQL Reporting Services 2000.
> Several of the clients are reporting that they are unable to open their
> report.
> They get "Adobe reader could not open ReportName because it is either not a
> supported file type or the file is corrupted... etc"
> No custom code involved, all reports are generated in the same manner and
> deployed in the same manner.
> In some organizations several clients receive the same report and only one
> or two of the clients reports a problem.
> All these clients are using the same version of Adobe reader.
> Is this a reader problem or a Report Server issue?
> I am having a problem narrowing it down to the reader or generator!
>

ADO.NET problem?

I'm experiencing a strange problem that I believe is related to ADO.NET
but I can't say for sure.

I have a simple ASP.NET reporting interface to a SQL Server 2000
database. One report that we run returns a listing of community
members and their contact information using a stored procedure.
Depending on the selected community, this can return from a hundred to
over 1000 rows. Occasionally, the report stops running when a
community with large membership is run -- the report hangs for a while
and then comes back empty (no dataset). If I try to run the stored
procedure directly with the same parameters, everything seems perfectly
fine. I can temporarily fix the problem by simply running an ALTER
PROCEDURE statement without making a single change to the procedure.
The report will now run fine for several days until it eventually stops
again.

I can't reproduce the problem in my development environment.

Does anyone have any ideas as to what this could be?

Bill E.
Hollywood, FL|||(billmiami2@.netscape.net) writes:

Quote:

Originally Posted by

I'm experiencing a strange problem that I believe is related to ADO.NET
but I can't say for sure.
>
I have a simple ASP.NET reporting interface to a SQL Server 2000
database. One report that we run returns a listing of community
members and their contact information using a stored procedure.
Depending on the selected community, this can return from a hundred to
over 1000 rows. Occasionally, the report stops running when a
community with large membership is run -- the report hangs for a while
and then comes back empty (no dataset). If I try to run the stored
procedure directly with the same parameters, everything seems perfectly
fine. I can temporarily fix the problem by simply running an ALTER
PROCEDURE statement without making a single change to the procedure.
The report will now run fine for several days until it eventually stops
again.
>
I can't reproduce the problem in my development environment.


It sounds like you are running into a command timeout, and the error
message is then thrown away.

I assume that when you run the procedure directly afterwards, that you
are running it from Query Analyzer.

Next time this happens, before you run the procedure in Query Analyzer,
issue this command:

SET ARITHABORT OFF

My prediction is that it will now run as slowly as it did from ASP.NET.
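The experiment can be sketched like this in Query Analyzer; the procedure and parameter names here are hypothetical stand-ins for whatever the report actually executes:

```sql
-- Match the SET options that the ADO.NET client uses by default, so
-- Query Analyzer compiles/reuses the same cached plan as the web page.
SET ARITHABORT OFF

-- Hypothetical call: substitute the real procedure and parameters.
EXEC dbo.GetCommunityMembers @CommunityID = 42
```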

As you may know SQL Server creates a query plan for a stored procedure
when you run it the first time, and this plan is put into cache.

Now, there can be more than one plan for the same procedure, because
of the different set options. For a discussion on this see
http://www.karaszi.com/SQLServer/in...compile_set.asp.

All modern client APIs use the same SET options by default. Query
Analyzer uses a different default on one point: it runs with SET
ARITHABORT ON, which a client API does not.

Therefore when you run the procedure from QA, you will get a different
plan than you did for the ASP .Net client.

The next phenomenon is something called "parameter sniffing". When SQL Server
creates the query plan for a stored procedure, it looks at the input
parameters of the first invocation. My guess is that the web page runs with
a plan that is good for a small selection. Typically there will be no
plan in cache which matches the settings for QA, so you will get a plan
which is a better fit for the larger selection.

Note here that this does not mean that things will be faster if ARITHABORT
is ON. Had the defaults been reversed, you would have seen the
same behaviour. (With one qualification: ARITHABORT must be ON for
indexes on computed columns and views to be used, so if such are involved
it can make a lot of difference.)

Exactly what the best resolution is for your situation is difficult to
say with the amount of information given. If you can live with the slow
response time on large selections, set the CommandTimeout on the Command
object to 0, to prevent timeouts from happening. (There was a bug in
earlier versions of SqlClient where 0 was not interpreted as "no timeout".
If you have an old version of .Net Fx 1.x, you may have to set the command
timeout to 32767 instead.)

If that is not feasible you may have to review indexing and also examine
the query plan in more detail.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||Erland,

Thanks for the response as always.

I'll try the SET ARITHABORT OFF in query analyzer the next time it
occurs. I would be very interested to see what happens in this case.

When it's working, the query runs very quickly, even when 1000 rows are
returned. This is true for both query analyzer and the web page. Of
course when it breaks down, the web page stops but query analyzer still
runs it in a split second.

I certainly know that SQL Server creates an execution plan on the first
run, but I never thought that there would be two execution plans and
certainly not a different one for a large resultset vs. a small
resultset.

The query contains a large number of subqueries which I think might be
confusing the optimizer. Perhaps a solution using some temp tables
will help--I'll have to see.

No, setting the timeout longer won't suffice, especially when I know
that the query has the potential to run quickly.

Bill

Erland Sommarskog wrote:

Quote:

Originally Posted by

(billmiami2@.netscape.net) writes:

Quote:

Originally Posted by

I'm experiencing a strange problem that I believe is related to ADO.NET
but I can't say for sure.

I have a simple ASP.NET reporting interface to a SQL Server 2000
database. One report that we run returns a listing of community
members and their contact information using a stored procedure.
Depending on the selected community, this can return from a hundred to
over 1000 rows. Occasionally, the report stops running when a
community with large membership is run -- the report hangs for a while
and then comes back empty (no dataset). If I try to run the stored
procedure directly with the same parameters, everything seems perfectly
fine. I can temporarily fix the problem by simply running an ALTER
PROCEDURE statement without making a single change to the procedure.
The report will now run fine for several days until it eventually stops
again.

I can't reproduce the problem in my development environment.


>
It sounds that you run into a command timeout, and the error message
is then thrown away.
>
I assume that when you run the procedure directly afterwards, that you
are running it from Query Analyzer.
>
Next time this happens, before you run the procedure in Query Analyzer,
issue this command:
>
SET ARITHABORT OFF
>
My prediction is that it will now run as slow as it did ASP .Net.
>
As you may know SQL Server creates a query plan for a stored procedure
when you run it the first time, and this plan is put into cache.
>
Now, there can be more than one plan for the same procedure, because
of the different set options. For a discussion on this see
http://www.karaszi.com/SQLServer/in...compile_set.asp.
>
All modern client APIs uses the same SET options by default. Query
Analyzer uses a different default on one point: it runs with SET
ARITHABORT ON, which a client API does not.
>
Therefore when you run the procedure from QA, you will get a different
plan than you did for the ASP .Net client.
>
Next phenomenon is something called "parameter sniffing". When SQL Server
creates the query plan for a stored procedures, it looks at the input
parameter for the first invocation. My guess is that the weh page runs with
a plan that is good for a small selection. Typically there will be no
plan in cache which matches the settings for QA, so you will get a plan
which is better fit for the larger selection.
>
Note here that this does not say that things will faster if ARITHABORT
is ON. Had the defaults been in the reverse, you would have seen the
same behaviour. (With one qualification: ARITHABORT must be ON for
indexes on computed columns and views to be used, so if such are involved
it can make a lot of difference.)
>
Exactly what is the best resolution for your situation is difficult to
say with the amount of information given. If you can live with the slow
response time on large selections, set the CommandTimeout on the Connection
object to 0, to prevent timeouts from happenning. (There was a bug in
earlier versions of SqlClient where 0 was interpreted as 0. If you have an
old version of .Net Fx 1.x, you may have to set the command timeout to 32767
instead.)
>
If that is not feasible you may have to review indexing and also examine
the query plan in more detail.
>
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
>
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

|||(billmiami2@.netscape.net) writes:

Quote:

Originally Posted by

When it's working, the query runs very quickly, even when 1000 rows are
returned. This is true for both query analyzer and the web page. Of
course when it breaks down, the web page stops but query analyzer still
runs it in a split second.
>
I certainly know that SQL Server creates an execution plan on the first
run, but I never thought that there would be two execution plans and
certainly not a different one for a large resultset vs. a small
resultset.


When I said that there could be different plans for large and small result
sets, that was a simplification. Consider this simple procedure:

CREATE PROCEDURE get_count @val int, @count int OUTPUT AS
SELECT @count = count(DISTINCT col1) FROM tbl WHERE col2 = @val

Assume that there is a non-clustered index on col2 and that this index
does not include col1, nor is col1 in the clustered index. Assume further
that the distribution of col2 is uneven: 30% of the rows have 0 in this
column, the remaining rows have scattered values.

If the first invocation is for @val = 10, the optimizer will use the
index to compute the query. But if the first invocation is for @val = 0,
the optimizer will scan the table, because that is faster in this case.
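Continuing the hypothetical get_count procedure above, the sniffing effect can be made concrete like this:

```sql
DECLARE @c int

-- First call after a (re)compile: the optimizer sniffs @val = 10,
-- a selective value, and caches a plan that seeks the index on col2.
EXEC get_count @val = 10, @count = @c OUTPUT

-- Later call: reuses the cached seek plan even though @val = 0
-- matches 30% of the table, where a scan would have been cheaper.
EXEC get_count @val = 0, @count = @c OUTPUT
```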

Now, exactly what is going on in your application I don't know. But it
sounds as if the procedure is recompiled at some point, and the input
values at that point are very atypical, leading to a poor execution plan
for regular values. That poor plan could affect smaller selections too, but
you may be lucky in that the cost is less noticeable in that case.

But why would the procedure be recompiled? There are several possible
reasons. One is a change in statistics. By default SQL Server maintains
statistics on the tables, and when they change for a table, referring
procedures will be recompiled. It could also be that the plan falls out
of cache if there is memory pressure, and the procedure has not been
used for a while.

One thing you could consider is to add WITH RECOMPILE to the procedure
definition. In this case the procedure is recompiled each time it is
invoked, and nothing is put into cache. The recompile has a cost, but
at least you prevent a bad plan from sticking.
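Applied to the poster's report procedure, the option goes on the definition itself; the names below are placeholders:

```sql
-- WITH RECOMPILE: no plan is cached, so every call is optimized for
-- its actual parameter values, at the cost of a compile per call.
CREATE PROCEDURE dbo.GetCommunityMembers
    @CommunityID int
WITH RECOMPILE
AS
SELECT MemberID, LastName, Phone
FROM dbo.Members
WHERE CommunityID = @CommunityID
```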

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||Erland,

This occurred again so I immediately tried the SET ARITHABORT OFF
option and you were correct--the query ran slowly in query analyzer as
well.

By the way, we discovered that this problem is occurring not only with
the member listing procedure but also a member search procedure when an
address search option is selected. Other search options do not cause
problems. Therefore, I am beginning to suspect that something is wrong
with our address table. Perhaps we have some fragmentation in an index
or something related. In the interim, I've added the WITH RECOMPILE
clause to these two procedures to ensure that they do not run slowly or
time out. Fortunately, this doesn't appear to be slowing them down
much.

Bill|||(billmiami2@.netscape.net) writes:

Quote:

Originally Posted by

By the way, we discovered that this problem is occurring not only with
the member listing procedure but also a member search procedure when an
address search option is selected. Other search options do not cause
problems. Therefore, I am beginning to suspect that something is wrong
with our address table. Perhaps we have some fragmentation in an index
or something related.


You can examine fragmentation with DBCC SHOWCONTIG. But I would not expect
that to be the problem, since you apparently get different plans.
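On SQL Server 2000 the check might look like this (the table name is an assumption):

```sql
-- High scan density and low logical fragmentation mean the indexes
-- are healthy; TABLERESULTS returns rows instead of message text.
DBCC SHOWCONTIG ('dbo.Address') WITH TABLERESULTS, ALL_INDEXES
```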

But statistics may not be current. By default, SQL Server updates statistics
automatically, but for large tables it may not be often enough.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||I'll take a look.

The table isn't large. It only has about 15,000 rows.

Bill

Friday, February 24, 2012

ADO.NET DataSets as a DataSource

Can I use an ADO.NET dataset as a data source for a report using Reporting Services
2000? Or will I have to implement a custom data extension? Does anyone know?|||You have to implement an extension until version 2; then there will be new
web and WinForms controls.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Aparna" <Aparna@.discussions.microsoft.com> wrote in message
news:819219C0-B02D-470D-BA74-D15A72DD014C@.microsoft.com...
> Can I use a ADO dataset as a datasource for a report using Reporting
Services
> 2000? Or will I have to implement a custom data extension. Does anyone
know?

Sunday, February 12, 2012

Administration Of Reporting Services

Books Online talks about a Report Manager page to monitor and administer running jobs, subscriptions etc...

But I use Reporting Services in SharePoint integrated mode. Where do I go to see something similar?

Thanks

Try putting the URL below into an Internet Explorer address bar and clicking Go. It should pull up Report Manager for you.

For default SQL Server instances use this:

http://localhost/reports/Pages/Folder.aspx <-- substitute your web server name for localhost

For named SQL Server instances use this:

http://localhost/reports$InstanceName/Pages/Folder.aspx <-- substitute your web server name for localhost and your SQL Server instance name for InstanceName

|||

This does not work because as soon as you run in SharePoint integrated mode, you lose the Report Manager functionality. Actually, an error is returned if you try to navigate to that page.

Frank

Administering via SQL Mgmt Studio

Are BUILTIN\Administrators (local admins on the server) the only persons
allowed to use the SQL Server Mgmt Studio (SSMS) to administer Reporting
Services (SSRS)? Can others who may have the Content Manager Role on a
folder or that have the System Administrator Role use SSMS? So far my
experience is only BUILTIN\Admins can use SSMS. Anything special that must be
done to open up SSMS to Reporting Services for non BUILTIN\Admins?|||Why would you want someone who is not a DB admin to use SSMS to "manage"
Reporting Services' metadata database? Even a DB administrator has little need to manage
Reporting Services' database directly. Reporting Services is a web
application and it is managed through a web interface (Report Manager -
http://serverName/reports).
"LTC" <LTC@.discussions.microsoft.com> wrote in message
news:3DD2B3EE-6637-476A-BC60-AAF6A9E31D75@.microsoft.com...
> Are BUILTIN\Administrators (local admins on the server) the only persons
> allowed to use the SQL Server Mgmt Studio (SSMS) to administer Reporting
> Services (SSRS)? Can others who may have the Content Manager Role on a
> folder or that have the System Administrator Role use SSMS? So far my
> experience is only BUILTIN\Admins can use SSMS. Anyting special that must
> be
> done to open up the SSMS to Reporting Services for non BUILTIN\Admins?|||I have taken 2 Microsoft Reporting Services (webcast) courses, which have
shown all the administration of SSRS taking place in SQL Server Management
Studio. I use SSMS to conduct other database and A.S. work and consider it
convenient to use SSMS to work SSRS issues, as well. At our location, DBAs
are not allowed to be local admins (except for temporary circumstances) as a
Sarbanes-Oxley design result. I think it is interesting that no one answers
my question, but everyone is very willing to question the circumstances.
"Norman Yuan" wrote:
> Why do you want someone, who is not DB admin, to use SSMS to "manage"
> reporting services' meta database? Even a db manager has few need to manage
> reporting services' database directly. Reporting Services is a web
> application and it is managed through a web interfaccce (report manager -
> http://serverName/reports).
>
> "LTC" <LTC@.discussions.microsoft.com> wrote in message
> news:3DD2B3EE-6637-476A-BC60-AAF6A9E31D75@.microsoft.com...
> > Are BUILTIN\Administrators (local admins on the server) the only persons
> > allowed to use the SQL Server Mgmt Studio (SSMS) to administer Reporting
> > Services (SSRS)? Can others who may have the Content Manager Role on a
> > folder or that have the System Administrator Role use SSMS? So far my
> > experience is only BUILTIN\Admins can use SSMS. Anyting special that must
> > be
> > done to open up the SSMS to Reporting Services for non BUILTIN\Admins?
>
>|||>>I have taken 2 Microsoft Reporting Services (webcast) courses, which have
>> shown all the administration of SSRS taking place in SQL Server
>> Management
>> Studio. [..]
>>I think it is interesting that no one answers my question, but are very
>>willing to question the circumstances.
It *is* interesting, but it's also interesting that this is how the courses
recommended that you work, IMHO, since it may not be a viable strategy,
long-term.
Please read below, from Brian Welcker's Weblog
(http://blogs.msdn.com/bwelcker/ ) -- you may want to give him some
feedback.
Which parts of administering RS were you particularly interested in doing
via SSMS versus Report Manager (just curious)?
(snip) ---
Watusi (SSRS Management Tools Changes for Katmai)
For Katmai we are considering the removal of namespace management (folders,
reports, data sources, models) from the Reporting Services Add-in for SQL
Server Management Studio (SSMS). In other words, we are considering removing
the 'Home' folder under the Reporting Server node in SSMS.
Why the change?
Customer feedback and usage data indicates that Report Manager and/or
SharePoint are the tools of choice for managing the Report Server namespace,
rather than the SSMS add-in. The design constraints of SSMS mean that any
new namespace functionality is significantly expensive to implement,
specifically adding support for the namespace in SharePoint integrated mode.
For Katmai we want to invest in SharePoint and Report Manager for namespace
management and focus on SSMS as a server-level management tool.
This means that the namespace management functions that are not available in
Report Manager (Model ClickThrough and Model Item Security) will be added to
Report Manager (they are already in SharePoint). Job Management, configuring
System properties, and administering Roles will be moved to SSMS. In
addition, SSMS will be updated to work in SharePoint mode.
If you have feedback about these changes, please feel free to comment.
"LTC" <LTC@.discussions.microsoft.com> wrote in message
news:14C6CEB9-CA50-4BED-AADF-E3990A6C5B19@.microsoft.com...
>I have taken 2 Microsoft Reporting Services (webcast) courses, which have
> shown all the administration of SSRS taking place in SQL Server Management
> Studio. I use SSMS to conduct other database and A.S. work and consider
> it
> convenient to use SSMS to work SSRS issues, as well. At our location,
> DBAs
> are not allowed to be local admins (except for temporary circumstances) as
> a
> Sarbanes-Oxley design result. I think it is interesting that no one
> answers
> my question, but are very willing to question the circumstances.
> "Norman Yuan" wrote:
>> Why do you want someone, who is not DB admin, to use SSMS to "manage"
>> reporting services' meta database? Even a db manager has few need to
>> manage
>> reporting services' database directly. Reporting Services is a web
>> application and it is managed through a web interfaccce (report manager -
>> http://serverName/reports).
>>
>> "LTC" <LTC@.discussions.microsoft.com> wrote in message
>> news:3DD2B3EE-6637-476A-BC60-AAF6A9E31D75@.microsoft.com...
>> > Are BUILTIN\Administrators (local admins on the server) the only
>> > persons
>> > allowed to use the SQL Server Mgmt Studio (SSMS) to administer
>> > Reporting
>> > Services (SSRS)? Can others who may have the Content Manager Role on a
>> > folder or that have the System Administrator Role use SSMS? So far my
>> > experience is only BUILTIN\Admins can use SSMS. Anyting special that
>> > must
>> > be
>> > done to open up the SSMS to Reporting Services for non BUILTIN\Admins?
>>|||Appreciate your feedback and the insight into Katmai. Looks like it is best
to focus on the Report Manager tool. I am responsible for creating new
folders and assigning new accounts / roles. Most other work is done by the
report developers / folder content managers. I am a former DBA whose work
has been outsourced. Currently, my role is moving the corporation onto
new (DBMS, etc.) software, then working out the details of handing
support over to the service provider. I am not yet at the point of moving
support of SSRS to the service providers, due to budgets and other
administrative hurdles, so I am the current administrator, beyond the
installation of the product.
"Lisa Slater Nicholls" wrote:
> >>I have taken 2 Microsoft Reporting Services (webcast) courses, which have
> >> shown all the administration of SSRS taking place in SQL Server
> >> Management
> >> Studio. [..]
> >>I think it is interesting that no one answers my question, but are very
> >>willing to question the circumstances.
> It *is* interesting, but it's also interesting that this is how the courses
> recommended that you work, IMHO, since it may not be a viable strategy,
> long-term.
> Please read below, from Brian Welcker's Weblog
> (http://blogs.msdn.com/bwelcker/ ) -- you may want to give him some
> feedback.
> Which parts of administering RS were you particularly interested in doing
> via SSMS versus Report Manager (just curious)?
> (snip) ---
> Watusi (SSRS Management Tools Changes for Katmai)
> For Katmai we are considering the removal of namespace management (folders,
> reports, data sources, models) from the Reporting Services Add-in for SQL
> Server Management Studio (SSMS). In other words, we are considering removing
> the 'Home' folder under the Reporting Server node in SSMS.
> Why the change?
> Customer feedback and usage data indicates that Report Manager and/or
> SharePoint are the tools of choice for managing the Report Server namespace,
> rather than the SSMS add-in. The design constraints of SSMS mean that any
> new namespace functionality is significantly expensive to implement,
> specifically adding support for the namespace in SharePoint integrated mode.
> For Katmai we want to invest in SharePoint and Report Manager for namespace
> management and focus on SSMS as a server-level management tool.
> This means that the namespace management functions that are not available in
> Report Manager (Model ClickThrough and Model Item Security) will be added to
> Report Manager (they are already in SharePoint). Job Management, configuring
> System properties, and administering Roles will be moved to SSMS. In
> addition, SSMS will be updated to work in SharePoint mode.
> If you have feedback about these changes, please feel free to comment.
> "LTC" <LTC@.discussions.microsoft.com> wrote in message
> news:14C6CEB9-CA50-4BED-AADF-E3990A6C5B19@.microsoft.com...
> >I have taken 2 Microsoft Reporting Services (webcast) courses, which have
> > shown all the administration of SSRS taking place in SQL Server Management
> > Studio. I use SSMS to conduct other database and A.S. work and consider
> > it
> > convenient to use SSMS to work SSRS issues, as well. At our location,
> > DBAs
> > are not allowed to be local admins (except for temporary circumstances) as
> > a
> > Sarbanes-Oxley design result. I think it is interesting that no one
> > answers
> > my question, but are very willing to question the circumstances.
> >
> > "Norman Yuan" wrote:
> >
> >> Why do you want someone, who is not DB admin, to use SSMS to "manage"
> >> reporting services' meta database? Even a db manager has few need to
> >> manage
> >> reporting services' database directly. Reporting Services is a web
> >> application and it is managed through a web interfaccce (report manager -
> >> http://serverName/reports).
> >>
> >>
> >> "LTC" <LTC@.discussions.microsoft.com> wrote in message
> >> news:3DD2B3EE-6637-476A-BC60-AAF6A9E31D75@.microsoft.com...
> >> > Are BUILTIN\Administrators (local admins on the server) the only
> >> > persons
> >> > allowed to use the SQL Server Mgmt Studio (SSMS) to administer
> >> > Reporting
> >> > Services (SSRS)? Can others who may have the Content Manager Role on a
> >> > folder or that have the System Administrator Role use SSMS? So far my
> >> > experience is only BUILTIN\Admins can use SSMS. Anyting special that
> >> > must
> >> > be
> >> > done to open up the SSMS to Reporting Services for non BUILTIN\Admins?
> >>
> >>
> >>
>|||>>Looks like it is best to focus on the Report Manager tool.
I didn't actually mean to say that! I meant to say: if you have cogent
reasons why the work you need to do is better done in Management Studio
rather than Report Manager... then MS deserves to hear your reasons <s>.
OTOH... reading that post closely, it seemed clear to me that the RS team
found the required hierarchical arrangement of functionality in Studio
limiting and not really suited to their purposes. They were probably tired
of shoe-horning features into it. In Report Manager, they have a much
freer hand, and it was probably no longer necessary for them to split their
effort between implementing upcoming features in both UIs.
Going forward, I guess we should be happy if they can focus their energy on
one management UI and we get more new features as a result <s>.
Cheers,
>L<
"LTC" <LTC@.discussions.microsoft.com> wrote in message
news:D6726064-25F3-45C6-A9DB-C54CD3778868@.microsoft.com...
> Appreciate your feedback and the insight into Katmai. Looks like it is
> best
> to focus on the Report Manager tool. I am responsible for creating new
> folders and assigning new accounts / roles. Most other work is done by
> the
> report developers / folder content managers. I am a previous DBA whose
> work
> has been outsourced. I, currently, provide a role of 'moving' the corp.
> into
> new (DBMS, etc. ) software, then working out the details of handing the
> support over to the service provider. I am not yet to the point of moving
> the support of SSRS to the service providers, due to budgets, and other
> administrative hurdles, so I am the current administrator, beyond the
> installation of the product.
> "Lisa Slater Nicholls" wrote:
>> >>I have taken 2 Microsoft Reporting Services (webcast) courses, which
>> >>have
>> >> shown all the administration of SSRS taking place in SQL Server
>> >> Management
>> >> Studio. [..]
>> >>I think it is interesting that no one answers my question, but are
>> >>very
>> >>willing to question the circumstances.
>> It *is* interesting, but it's also interesting that this is how the
>> courses
>> recommended that you work, IMHO, since it may not be a viable strategy,
>> long-term.
>> Please read below, from Brian Welcker's Weblog
>> (http://blogs.msdn.com/bwelcker/ ) -- you may want to give him some
>> feedback.
>> Which parts of administering RS were you particularly interested in doing
>> via SSMS versus Report Manager (just curious)?
>> (snip) ---
>> Watusi (SSRS Management Tools Changes for Katmai)
>> For Katmai we are considering the removal of namespace management
>> (folders,
>> reports, data sources, models) from the Reporting Services Add-in for SQL
>> Server Management Studio (SSMS). In other words, we are considering
>> removing
>> the 'Home' folder under the Reporting Server node in SSMS.
>> Why the change?
>> Customer feedback and usage data indicates that Report Manager and/or
>> SharePoint are the tools of choice for managing the Report Server
>> namespace,
>> rather than the SSMS add-in. The design constraints of SSMS mean that any
>> new namespace functionality is significantly expensive to implement,
>> specifically adding support for the namespace in SharePoint integrated
>> mode.
>> For Katmai we want to invest in SharePoint and Report Manager for
>> namespace
>> management and focus on SSMS as a server-level management tool.
>> This means that the namespace management functions that are not available
>> in
>> Report Manager (Model ClickThrough and Model Item Security) will be added
>> to
>> Report Manager (they are already in SharePoint). Job Management,
>> configuring
>> System properties, and administering Roles will be moved to SSMS. In
>> addition, SSMS will be updated to work in SharePoint mode.
>> If you have feedback about these changes, please feel free to comment.
>> "LTC" <LTC@.discussions.microsoft.com> wrote in message
>> news:14C6CEB9-CA50-4BED-AADF-E3990A6C5B19@.microsoft.com...
>> >I have taken 2 Microsoft Reporting Services (webcast) courses, which
>> >have
>> > shown all the administration of SSRS taking place in SQL Server
>> > Management
>> > Studio. I use SSMS to conduct other database and A.S. work and
>> > consider
>> > it
>> > convenient to use SSMS to work SSRS issues, as well. At our location,
>> > DBAs
>> > are not allowed to be local admins (except for temporary circumstances)
>> > as
>> > a
>> > Sarbanes-Oxley design result. I think it is interesting that no one
>> > answers
>> > my question, but are very willing to question the circumstances.
>> >
>> > "Norman Yuan" wrote:
>> >
>> >> Why do you want someone, who is not DB admin, to use SSMS to "manage"
>> >> reporting services' meta database? Even a db manager has few need to
>> >> manage
>> >> reporting services' database directly. Reporting Services is a web
>> >> application and it is managed through a web interfaccce (report
>> >> manager -
>> >> http://serverName/reports).
>> >>
>> >>
>> >> "LTC" <LTC@.discussions.microsoft.com> wrote in message
>> >> news:3DD2B3EE-6637-476A-BC60-AAF6A9E31D75@.microsoft.com...
>> >> > Are BUILTIN\Administrators (local admins on the server) the only
>> >> > persons
>> >> > allowed to use the SQL Server Mgmt Studio (SSMS) to administer
>> >> > Reporting
>> >> > Services (SSRS)? Can others who may have the Content Manager Role
>> >> > on a
>> >> > folder or that have the System Administrator Role use SSMS? So far
>> >> > my
>> >> > experience is only BUILTIN\Admins can use SSMS. Anyting special
>> >> > that
>> >> > must
>> >> > be
>> >> > done to open up the SSMS to Reporting Services for non
>> >> > BUILTIN\Admins?
>> >>
>> >>
>> >>
>>