
Monday, March 19, 2012

is SSIS wonderful?

Hi guys, I'm using SQL 2000 now but will eventually upgrade the existing server to SQL 2005. Currently we only have OLTP DBs. I would like to set up data warehouse & OLAP functionality to drive reporting from the system. Is SSIS the product I need? Can you please give me some suggestions?

regards

SSIS will certainly help you get your data warehouse going. It is light years ahead of DTS in SQL Server 2000 in terms of functionality and operability.

|||

I know SQL 2005 can do PIVOT() in a query - isn't that enough for reporting? Do I really need to use SSIS?

|||

NewWorld wrote:

I know SQL 2005 can do PIVOT() in a query - isn't that enough for reporting? Do I really need to use SSIS?

Reporting really has nothing to do with SSIS, though you can perform upfront calculations in SSIS and load them into a table to be used later for reports.

Pivot? PIVOT is only a small piece of SSIS.

|||

I've been using SSIS for a couple of months now and have grown to like it. I'm using it for exactly what you mention -- importing data from (multiple) OLTP databases into a relational data warehouse.

The visual designer is very useful since you can easily see how the data is flowing while running the package. And data viewers are a terrific debugging tool since you can see the data at different points in the data pipeline (much easier than debugging raw SQL scripts).

As far as OLAP goes, you would use SQL Server 2005 Analysis Services to define the structure of your OLAP database, and then use SSIS to get your data from the relational warehouse into the OLAP database (that's usually the last step of my SSIS package -- process the cube).

For reporting, you could use SQL Server 2005 Reporting Services to report on the data in the relational warehouse and/or the OLAP cubes.

Hope this helps, and good luck.

-Larry

|||

SSIS - what's that?

You would be much better off creating your own app... lol

Anyway - after using SSIS for 6 months - it is an improvement over DTS, BUT MS's second attempt at an ETL tool isn't quite big-time yet --> some built-in tasks/tools are mediocre or missing (FTP, ZIP, email, etc.), and the Data Flow, although a good concept, doesn't match a SQL-based solution --> for example, to update data you would drop dead waiting on the OLE DB Command.

Although I might sound cynical, this is not just based on my own experience banging my head against the wall trying to figure out SSIS; our consultant with 20 years of DB & programming experience has a similar opinion - it was his idea to use SQL-based ETLs, which we decided to follow.

Wonderful - a joke, right?
Improvement - YES
Easy to get started - YES
Satisfies all your requirements --> maybe, maybe not - BUT be ready for all kinds of errors and mishaps along the way.

As usual, the SSIS forum is the best place to find help in case you get into trouble.

|||

TheViewMaster wrote:


Anyway - after using SSIS for 6 months - it is an improvement over DTS, BUT MS's second attempt at an ETL tool isn't quite big-time yet --> some built-in tasks/tools are mediocre or missing (FTP, ZIP, email, etc.), and the Data Flow, although a good concept, doesn't match a SQL-based solution --> for example, to update data you would drop dead waiting on the OLE DB Command.

I'm going to have to disagree with you here. The data flow (where the OLE DB Command transformation is located) is designed to operate on a row-by-row basis. In a perfect world, how would you propose to implement an "update" feature? Are you looking for more of a GUI-based batch update in the CONTROL flow (as opposed to the Execute SQL task)?

|||

Lawrence Parker wrote:

The visual designer is very useful since you can easily see how the data is flowing while running the package. And data viewers are a terrific debugging tool since you can see the data at different points in the data pipeline (much easier than debugging raw SQL scripts).

Sorry mate - the Data Flow data viewer is no match for a SQL solution using a TEMP table to load data and then views, SPs, and queries to process it, which you can use at any time to monitor data movement. The data viewer is a design-time-only feature and doesn't help you much if a package breaks down in the middle of a job.

Also - the SQL solution, although a bit more code, will make packages easier to upgrade.

My personal opinion - SSIS is not quite ready for ETL yet - do Extract, Load, Transform instead and you are much better off.

|||

I like being disagreed with on technical stuff - by defending my ideas and weighing others', we all improve.

OLE DB Command - row by row is way too slow - in a perfect world you would have something like UPDATE myTable FROM ... translated to SSIS.

Also - I forgot in my previous post -
I found a nugget today in the Microsoft SQL language -
DELETE FROM x FROM x x ...

|||
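The "nugget" above is T-SQL's two-FROM (joined) DELETE syntax: delete rows from one table based on a join to another. A portable equivalent uses WHERE EXISTS - here is a minimal sketch using Python's sqlite3 standard library, with made-up `orders`/`archived` tables purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders   (id INTEGER PRIMARY KEY, status TEXT);
    CREATE TABLE archived (id INTEGER PRIMARY KEY);
    INSERT INTO orders   VALUES (1, 'open'), (2, 'done'), (3, 'done');
    INSERT INTO archived VALUES (2), (3);
""")

# T-SQL form:  DELETE o FROM orders o JOIN archived a ON a.id = o.id
# Portable form: delete every row that has a match in the other table.
conn.execute("""
    DELETE FROM orders
    WHERE EXISTS (SELECT 1 FROM archived a WHERE a.id = orders.id)
""")

print([row[0] for row in conn.execute("SELECT id FROM orders ORDER BY id")])
# → [1]
```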

TheViewMaster wrote:

I like being disagreed with on technical stuff - by defending my ideas and weighing others', we all improve.

OLE DB Command - row by row is way too slow - in a perfect world you would have something like UPDATE myTable FROM ... translated to SSIS.

Also - I forgot in my previous post -
I found a nugget today in the SQL language -
DELETE FROM x FROM x x ...

Right, but you cannot perform an "UPDATE mytable FROM anotherTable" in the data flow. (It shouldn't be expected that you could.) You can do that today - and it is the preferred method - in the control flow via an Execute SQL task. Even if they did allow an "UPDATE mytable FROM anotherTable" in the data flow, you certainly wouldn't want it executing for EVERY row, would you? The control flow is the place for that operation. So again, are you asking for a GUI-based task in the control flow to perform batch updates?

|||

Perhaps I am asking for the wrong thing - but in a SQL stored procedure all of that could be done in one task -
Update existing
Insert new
Delete old
which is the basis of most of our ETL imports to our DB tables.

At the moment we are using the Data Flow to insert text files into DB temp tables (in my case I sneak in an occasional Derived Column) and then do the processing in SQL.
I do like having the application (SSIS) do the work; however, the benefits of the SQL solution have outweighed my initial preference:
SQL has better maintainability - changes can be made to SPs and VIEWs without having to go through the hassle of recompiling & deploying a package
SQL - simpler processing model - TXT -> TEMP table -> VIEW -> SP -> IMPORT table -> VIEW -> SP -> Production table
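The Update existing / Insert new / Delete old pattern described in this post can be expressed as three set-based statements. A minimal sketch with Python's sqlite3 standard library - the `staging` and `target` tables and their contents are hypothetical, standing in for a loaded temp table and a production table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE target  (id INTEGER PRIMARY KEY, val TEXT);
    CREATE TABLE staging (id INTEGER PRIMARY KEY, val TEXT);
    INSERT INTO target  VALUES (1, 'old'), (2, 'keep'), (3, 'stale');
    INSERT INTO staging VALUES (1, 'new'), (2, 'keep'), (4, 'added');
""")

# 1. Update existing: overwrite matched rows from staging.
conn.execute("""
    UPDATE target
    SET val = (SELECT s.val FROM staging s WHERE s.id = target.id)
    WHERE id IN (SELECT id FROM staging)
""")
# 2. Insert new: rows that exist only in staging.
conn.execute("""
    INSERT INTO target
    SELECT s.id, s.val FROM staging s
    WHERE s.id NOT IN (SELECT id FROM target)
""")
# 3. Delete old: rows no longer present in staging.
conn.execute("DELETE FROM target WHERE id NOT IN (SELECT id FROM staging)")

print(sorted(conn.execute("SELECT id, val FROM target")))
# → [(1, 'new'), (2, 'keep'), (4, 'added')]
```

In T-SQL the update step would typically be the joined UPDATE ... FROM form discussed elsewhere in the thread.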

However - SSIS benefits:
fewer TEMP tables and less SQL code
ability to add info in the Data Flow - Derived Column
a customized filter-duplicates Data Flow transform & the ability to do "INITCAP" more efficiently than in SQL

50/50

|||

TheViewMaster wrote:

Perhaps I am asking for the wrong thing - but in a SQL stored procedure all of that could be done in one task -
Update existing
Insert new
Delete old
which is the basis of most of our ETL imports to our DB tables.

You're going to get me to 1,000 posts yet tonight, aren't you?! What you are talking about can be done in SSIS, though not inside one data flow. You need to think of the control flow as the foundation for all data work. Using your example, and assuming that order, here's how you'd do it.

Execute SQL task to update existing (update myTable FROM anotherTable) -> Data Flow (to insert new records) -> Execute SQL task (delete old)

Pretty easy, and perhaps faster than your stored procedure... who knows. The data flow is a highly optimized engine for transporting data from point A (which could be one or many sources - try that in your stored proc) to point B (which could be one or many destinations). Unfortunately updates aren't really part of data transportation, and are really at the mercy of the database engine.

|||

1,000 posts here we go - oh wait, it's only my 135th post

Anyway - it's more complicated than

Execute SQL task to update existing (update myTable FROM anotherTable) -> Data Flow (to insert new records) -> Execute SQL task (delete old)

You need to determine which rows are new (Lookup), then which are changed (Split) [referencing Jaime's article about SSIS - find rows in table A which are not in table B], then the SQL update and SQL delete (and trust me on this one - the SSIS books and docs don't mention this at all - trial and error, my friend, is how you find the best way).
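The Lookup ("which rows are new") and Split ("which rows are changed") steps described here can also be expressed as set-based queries. A rough sketch with Python's sqlite3 standard library - `incoming` and `current` are invented table names standing in for the loaded file and the existing data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE current  (id INTEGER PRIMARY KEY, val TEXT);
    CREATE TABLE incoming (id INTEGER PRIMARY KEY, val TEXT);
    INSERT INTO current  VALUES (1, 'a'), (2, 'b');
    INSERT INTO incoming VALUES (1, 'a'), (2, 'B'), (3, 'c');
""")

# "Lookup": incoming rows with no match in current are new.
new_rows = conn.execute("""
    SELECT i.id FROM incoming i
    WHERE NOT EXISTS (SELECT 1 FROM current c WHERE c.id = i.id)
    ORDER BY i.id
""").fetchall()

# "Split": matched rows whose values differ are changed.
changed_rows = conn.execute("""
    SELECT i.id FROM incoming i
    JOIN current c ON c.id = i.id
    WHERE c.val <> i.val
    ORDER BY i.id
""").fetchall()

print(new_rows, changed_rows)  # → [(3,)] [(2,)]
```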

Somehow I think an INSERT in a stored proc is not that much slower than a data flow import (and I would rather be at the mercy of the DB engine than the SSIS engine, as the probability is SSIS will be upgraded sooner).
|||

TheViewMaster wrote:


SQL has better maintainability - changes can be made to SPs and VIEWs without having to go through the hassle of recompiling & deploying a package

Now I *really* disagree with this statement. Using package configurations, you can create a package such that you'll never have to touch it again. I frequently use configurations to store SQL so that I can make changes without touching the package. I can change databases, filenames, SQL, connection strings, etc. - all with package configurations and without ever having to touch the package.

|||

TheViewMaster wrote:

1,000 posts here we go - oh wait, it's only my 135th post

You need to determine which rows are new (Lookup), then which are changed (Split) [referencing Jaime's article about SSIS - find rows in table A which are not in table B], then the SQL update and SQL delete (and trust me on this one - the SSIS books and docs don't mention this at all - trial and error, my friend, is how you find the best way).

Somehow I think an INSERT in a stored proc is not that much slower than a data flow import (and I would rather be at the mercy of the DB engine than the SSIS engine, as the probability is SSIS will be upgraded sooner).

You did it! In response to the SQL delete, there are a TON of Transact-SQL resources out there. Most anything you can do in a database, you can do in SSIS one way or another... If you understand Transact-SQL, you can make SSIS do some pretty wonderful things!

I'll have to take this up another time, as my night has ended...

Phil


Wednesday, March 7, 2012

Is replication the best solution?

We have a high-volume OLTP database of about 300 GB, growing at anywhere
from 5-20 GB a week. We are running out of space and want to archive data
older than 45 days off of the database to an archive database that could be
used for reporting.
The question is whether replication would be the best method for moving data to the
Reporting/Archive database, in that data older than 45 days would be deleted from
the OLTP but needs to remain in the Archive. We also do upgrades to our
internal applications about once a month, and sometimes the upgrades involve
database changes.
The suggestion has been made to simply run the OLTP and Reporting from the
same database and move it to a larger SAN (and a server that isn't
running BizTalk and housing its databases), since we don't seem to have any
processing or memory issues, unless backups are being done.
I think a separate Reporting and OLTP solution would be better in that we
will continue to grow in transactions, possibly doubling this year.
The reports would be for internal purposes and also for web reporting to the
payors and payees we handle. Actually a lot of the database bloat is in XML
overflow from the forms we receive that need to be reassembled for
some of the web clients.
Just to note, we are using a SQL Server 2000 EE active/passive cluster on
Windows 2003 servers.
Any insight or advice welcome.
Nancy
Sounds more like a job for DTS/SSIS. If you delete a replicated record on
the Publisher (OLTP) side, it deletes on the other side (Subscriber) as
well.
Write scripts/packages that move the data that is X days old or has been
changed since Y days to the other database. You will of course need some
way to determine when a record changed or was deleted/inserted...
Other opinions may vary
Kevin Hill
3NF Consulting
http://www.3nf-inc.com/NewsGroups.htm
Real-world stuff I run across with SQL Server:
http://kevin3nf.blogspot.com
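Kevin's move-then-delete archiving script can be sketched as follows - a toy version using Python's sqlite3 standard library, with made-up table names and a hard-coded cutoff standing in for "45 days ago". Wrapping both statements in one transaction keeps the OLTP and archive sides consistent:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE oltp    (id INTEGER PRIMARY KEY, created TEXT);
    CREATE TABLE archive (id INTEGER PRIMARY KEY, created TEXT);
    INSERT INTO oltp VALUES
        (1, '2012-01-01'), (2, '2012-03-01'), (3, '2012-03-06');
""")

cutoff = "2012-01-21"  # stands in for "45 days before today"
with conn:  # one transaction: both statements commit or neither does
    conn.execute("INSERT INTO archive SELECT * FROM oltp WHERE created < ?",
                 (cutoff,))
    conn.execute("DELETE FROM oltp WHERE created < ?", (cutoff,))

print(conn.execute("SELECT count(*) FROM oltp").fetchone()[0],
      conn.execute("SELECT count(*) FROM archive").fetchone()[0])
# → 2 1
```

In a real DTS/SSIS job the two statements would typically run as Execute SQL tasks inside an explicit transaction.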
"Nancy Lytle" <nancy_lytle@.payformance.com> wrote in message
news:emsbntFTHHA.5016@.TK2MSFTNGP05.phx.gbl...
|||The problem is that both databases need to be in sync,
with the Archive database holding all the data and the OLTP holding all the
"current" data. The only difference will be that whatever data is deleted
by the "archiving process" will be deleted from OLTP but should not be
deleted from the Archive/Rpt DB. That is the hard part to work around - unless
you push all data to Archive and run any "non-archive deletes" on Archive
separately, but that might be a maintenance nightmare.
But you've given me more to think about.
Thanks,
Nancy
"Kevin3NF" <kevin@.SPAMTRAP.3nf-inc.com> wrote in message
news:%23F3F$%23FTHHA.3948@.TK2MSFTNGP05.phx.gbl...
|||You can use replication for this, and just have deletes take effect only if
they are on data more recent than your archive interval. You will need a filter for
this.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
"Nancy Lytle" <nancy_lytle@.payformance.com> wrote in message
news:emsbntFTHHA.5016@.TK2MSFTNGP05.phx.gbl...
|||So in essence we would be filtering out any deletes of data older than 45
days (or however they decide to do the archiving)? How would you prevent
the deletes from being pushed over to the subscriber? I'm having a hard time
picturing a filter that would prevent deletes from being pushed.
Anyway, thank you for your time and answer.
Nancy Lytle
"Hilary Cotter" <hilary.cotter@.gmail.com> wrote in message
news:eylT5YHTHHA.5016@.TK2MSFTNGP05.phx.gbl...
|||There are several approaches to this. If there is a date column on each
table you are replicating, you could include this logic in the replication
delete proc. Here is an example:
create procedure "sp_MSdel_dbotest"
@pkc1 int
as
/*
-- original auto-generated body:
begin
delete "dbo"."test"
where "pk" = @pkc1
if @@rowcount = 0
if @@microsoftversion > 0x07320000
exec sp_MSreplraiserror 20598
end
*/
-- modified body: only delete rows newer than the archive window
delete "dbo"."test"
where "pk" = @pkc1 and DateColumn > getdate() - 25
You can also have the delete statement that does your archiving replicated as
the execution of a stored procedure, and have that stored procedure do
nothing on the subscriber.
Another option is to put a dummy filter on the tables you are replicating,
and toggle it so it returns false while you are doing the archiving deletes.
In that situation no DML will be replicated during the window, so it is not
optimal, but it can work for you.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
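The effect of Hilary's first option - filtering the replicated delete proc so that archiving deletes of old rows are ignored on the subscriber - can be simulated like this (Python's sqlite3 standard library; the table, column, and date window are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE subscriber (pk INTEGER PRIMARY KEY, created TEXT);
    INSERT INTO subscriber VALUES (1, '2012-01-01'), (2, '2012-03-05');
""")

def replicated_delete(pk, window_start):
    """Mimic the modified sp_MSdel_* proc: delete only rows newer than
    the archive window, so old rows survive on the subscriber."""
    conn.execute(
        "DELETE FROM subscriber WHERE pk = ? AND created > ?",
        (pk, window_start))

# An archiving delete of an old row (pk=1) is filtered out...
replicated_delete(1, "2012-02-10")
# ...while a genuine delete of a recent row (pk=2) goes through.
replicated_delete(2, "2012-02-10")

print([r[0] for r in conn.execute("SELECT pk FROM subscriber")])  # → [1]
```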
"Nancy Lytle" <nancy_lytle@.payformance.com> wrote in message
news:eev7s4HTHHA.3592@.TK2MSFTNGP03.phx.gbl...
|||Thank you very much for the explanation and the options to work with.
I am sure replication can work in this situation. I think the developers
have just been wary of it because of problems we have had when they have run
update scripts on OLTP and Reporting without mentioning schema changes -
adding/deleting columns - which we find out about when the replication breaks.
I think that is one reason they are afraid to start archiving with
replication.
But again, thank you so much for your advice and assistance.
Nancy
"Hilary Cotter" <hilary.cotter@.gmail.com> wrote in message
news:uKRpy0ITHHA.1600@.TK2MSFTNGP05.phx.gbl...