
Wednesday, March 28, 2012

Is there a more efficient way to perform this process?

My e-commerce site is currently running the following process when items are shipped from the manufacturer:

1. Manufacturer sends a 2-column CSV to the retailer containing an Order Number and its Shipping Tracking Number.

2. Through the web admin panel I've built, a retail staff member uploads the CSV to the server.

3. In code, the CSV is parsed. The tracking number is saved to the database attached to the Order Number.

4. After a tracking # is saved, each item in that order has its status updated to Shipped.

5. The customer is sent an email using ASPEmail from Persits which contains their tracking #.

The process works without a problem so long as the CSV contains roughly 50 tracking #'s. The retailer has gotten insanely busy and wants to upload 3 or 4 thousand tracking #'s in a single CSV, but the process times out, even with large server timeout values set in code. Is there a way to streamline the process so it works more efficiently? I can provide the code if that helps.

It really depends on how your database backend works. SQL Server? Transactions?

|||

SQL Server 2000. I am not using Transactions. Any suggestions you have as to how I can improve the performance here would be appreciated. Here's what the code looks like. Basically I am building a giant SQL string (multiple EXEC calls of the same procedure) with a StringBuilder and executing it all at once at the bottom:

objOleDb.Open()
objCmd = New OleDbCommand("SELECT * FROM TrackingImport.csv", objOleDb)
objRdr = objCmd.ExecuteReader()

While objRdr.Read()
    If Len(Trim(objRdr(1).ToString)) > 0 And Len(Trim(objRdr(0).ToString)) > 0 Then
        If IsNumeric(objRdr(0)) Then
            If CLng(objRdr(0)) > 0 Then
                strSQL.Append("EXEC spImportTracking '" & objRdr(1) & "'," & objRdr(0) & ";")
                EmailTracking(objRdr(0), objRdr(1))
                i = i + 1
            End If
        End If
    End If
End While

'Update orders with tracking numbers
dbobj.Open()
objSQLCmd = New SqlCommand(strSQL.ToString(), dbobj)
objSQLCmd.ExecuteNonQuery()

The stored procedure called above looks like this:

ALTER PROCEDURE spImportTracking
    @TrackingNumber NVARCHAR(50) = NULL,
    @OrderNumber INT = NULL
AS

SET NOCOUNT ON

DECLARE @OrderAddressID INT,
        @OrderID INT,
        @OrderItemID INT,
        @Country VARCHAR(2)

/* 1. Get OrderID */
SELECT @OrderID = (SELECT uid
                   FROM Orders
                   WHERE OrderNumber = @OrderNumber)

/* 2. Check country (a '1Z'-prefixed tracking number is treated as invalid for Canadian shipments) */
SELECT @Country = (SELECT TOP 1 Country
                   FROM OrderAddresses
                   WHERE OrderID = @OrderID
                     AND Type = 3)

IF @Country = 'CA' AND LEFT(@TrackingNumber, 2) = '1Z'
BEGIN
    SELECT dbo.Orders.uid, dbo.Orders.OrderNumber, dbo.Customers.FirstName,
           dbo.Customers.LastName, dbo.Customers.EMail, 'Invalid' AS 'TrackingNo'
    FROM dbo.Customers INNER JOIN
         dbo.Orders ON dbo.Customers.uid = dbo.Orders.CustomerID
    WHERE dbo.Orders.OrderNumber = @OrderNumber
END
ELSE
BEGIN
    /* 3. Insert to tracking */
    INSERT INTO OrderTracking (BackOrderFlag, TrackingNumber, TrackingMessage)
    VALUES (0, @TrackingNumber, @OrderNumber)

    /* 4. Mark all items in this order as Shipped */
    DECLARE Order_Cursor CURSOR FOR
    SELECT OrderItems.uid
    FROM OrderItems
    WHERE OrderItems.OrderID = @OrderID

    OPEN Order_Cursor
    FETCH NEXT FROM Order_Cursor INTO @OrderItemID

    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC spUpdateProductionStatus @OrderItemID, 100
        FETCH NEXT FROM Order_Cursor INTO @OrderItemID
    END

    CLOSE Order_Cursor
    DEALLOCATE Order_Cursor

    /* 5. Return customer information for the confirmation email */
    SELECT dbo.Orders.uid, dbo.Orders.OrderNumber, dbo.Customers.FirstName,
           dbo.Customers.LastName, dbo.Customers.EMail, @TrackingNumber AS 'TrackingNo'
    FROM dbo.Customers INNER JOIN
         dbo.Orders ON dbo.Customers.uid = dbo.Orders.CustomerID
    WHERE dbo.Orders.OrderNumber = @OrderNumber
END

The EmailTracking method that gets called for each record looks like this:

Private Sub EmailTracking(ByVal intOrderNumber As Long, ByVal strTrackingNo As String)

    Dim objEmail As New CEmail()
    Dim strBody As New StringBuilder()
    Dim dbobj As New SqlConnection(ConfigurationSettings.AppSettings("strConn"))
    Dim objCmd As SqlCommand
    Dim objRdr As SqlDataReader
    Dim strBillCountry As String = ""

    Try
        dbobj.Open()
        objCmd = New SqlCommand("spGetShippingAddresses", dbobj)
        objCmd.CommandType = CommandType.StoredProcedure
        objCmd.Parameters.Add("@OrderNumber", intOrderNumber)
        objRdr = objCmd.ExecuteReader()

        objEmail.strFrom = "shipping@mydomain.com"
        objEmail.strFromName = "Shipping Department"
        objEmail.strSubject = "Ship Notification, Tracking Number " & strTrackingNo
        strBody.Append(strIntro)

        While objRdr.Read
            strBillCountry = objRdr("BillCountry").ToString
            objEmail.strTo = objRdr("Email").ToString
            strBody.Append("<b>Ship To:</b><br>" & objRdr("Recipient").ToString & "<br>" & _
                objRdr("Address").ToString & "<br>" & objRdr("City").ToString & "<br>" & objRdr("State").ToString & _
                "<br>" & objRdr("Zip").ToString & "<br>" & objRdr("Country").ToString & "<hr>")

            If objRdr("WSClient").ToString <> "" Then
                UpdatePhotoMgr(objRdr("WSClient").ToString, objRdr("PO#").ToString, strTrackingNo)
            End If
        End While

        strBody.Append("<p><b>Tracking Number:</b> " & strTrackingNo & "</p>" & _
            "<p><b>Reference Order #:</b> " & intOrderNumber & "</p><p>" & _
            SetTrackingLink(strTrackingNo))
        strBody.Append(strEnd)
        objEmail.strMessage = strBody.ToString

        If strBillCountry = "US" Or strBillCountry = "" Or (strBillCountry = "CA" And Left(strTrackingNo, 1) = "D") Then
            If Len(objEmail.strTo) > 0 Then
                objEmail.SendMail()
            Else
                Response.Write("<script language='javascript'>alert('Address Missing for Order " & _
                    intOrderNumber.ToString & "')</script>")
            End If
        End If

    Catch objError As Exception
        lblError.Text = "Error occurred when sending to " & objEmail.strTo & ": " & objError.Message & _
            ". Please try re-loading the page, or contact technical support with a description of this problem."
        'Notify tech support
        Dim objErr As New CEmail()
        objErr.SendErrorMessage(objError, "Tracking Email Error", objEmail.strTo, Session("CompanyID"), Session("StoreID"))
        objErr = Nothing
        Context.ClearError()
        Exit Try

    Finally
        objRdr.Close()
        objCmd.Dispose()
        dbobj.Close()

    End Try

End Sub

One thought I had: rather than going to the database each time to retrieve the customer info for email purposes, I could grab all the customer info once in a DataSet and then filter with a DataView one record at a time to send the emails. Not sure if that would help or not.

|||

1. Using the OleDbConnection to read the file works, but you could just as well use a plain StreamReader, read each row, and split the string on the comma.

2. Don't build plain SQL strings. An evil CSV file could wipe out the entire database (consider the row "abc,1;DROP TABLE Orders").

3. Execute everything within a transaction. Not only could it help performance, it's also vital since this operation must be atomic: all the orders should be updated, and if something fails, nothing should happen.

4. Use several calls to spImportTracking (properly parameterized) and execute each with its own ExecuteNonQuery, as in the sketch after this list.

5. Treat the email sending as a separate process. It could fail, and the orders need to know this. Add a status column to the Orders table to indicate that an email should be sent, and process it afterwards. Don't return anything from spImportTracking.

6. What does spUpdateProductionStatus do? If it's just setting a status, the entire cursor block could be replaced with something like:

UPDATE OrderItems SET ProductionStatus = 100 WHERE OrderID = @OrderID
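
To make points 1, 2 and 4 concrete, here is a minimal sketch of what the upload loop could look like. It reuses the connection string and procedure name from the code above; the file path, and the assumption that the CSV has no quoted fields, are illustrative only (assumes Imports System.IO, System.Data and System.Data.SqlClient):

Dim dbobj As New SqlConnection(ConfigurationSettings.AppSettings("strConn"))
Dim objCmd As New SqlCommand("spImportTracking", dbobj)
objCmd.CommandType = CommandType.StoredProcedure
objCmd.Parameters.Add("@TrackingNumber", SqlDbType.NVarChar, 50)
objCmd.Parameters.Add("@OrderNumber", SqlDbType.Int)

'Read the CSV line by line instead of going through OLE DB
Dim objFile As New StreamReader("C:\Imports\TrackingImport.csv") 'path is illustrative
Dim strLine As String = objFile.ReadLine()

dbobj.Open()
While Not strLine Is Nothing
    Dim arrFields() As String = strLine.Split(","c)
    'Same sanity checks as the original loop: both fields present, order number numeric
    If arrFields.Length >= 2 AndAlso IsNumeric(arrFields(0)) AndAlso Trim(arrFields(1)) <> "" Then
        objCmd.Parameters("@OrderNumber").Value = CInt(arrFields(0))
        objCmd.Parameters("@TrackingNumber").Value = Trim(arrFields(1))
        objCmd.ExecuteNonQuery() 'one parameterized call per row; no injection risk
    End If
    strLine = objFile.ReadLine()
End While
objFile.Close()
dbobj.Close()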

Extra:

Log each file upload (who, what, and when) in a table with some kind of identifier. Store that identifier with each order (e.g. add a column called "batchFileId" referring to the log table). It's good for tracking down errors or mistakes, and you can also use it when sending the emails (e.g. send all emails relating to batchFileId = 345).
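
For illustration only (the table and column names here are invented, not part of the original schema):

--One row per uploaded CSV file
CREATE TABLE ImportBatch (
    BatchFileID INT IDENTITY(1,1) PRIMARY KEY,
    FileName NVARCHAR(255) NOT NULL,
    UploadedBy NVARCHAR(50) NOT NULL,
    UploadedAt DATETIME NOT NULL DEFAULT GETDATE()
)

--Stamp each order with the batch that shipped it
ALTER TABLE Orders ADD BatchFileID INT NULL REFERENCES ImportBatch (BatchFileID)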

Don't prefix your stored procedures with "sp_". That prefix is reserved for system stored procedures.

An alternative would be to import the actual file contents into a table, and then do one single massive update by joining to that table (the import could be done using a DataAdapter, the SqlBulkCopy class, or just plain inserts, which is what the adapter would do anyway).
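
A rough sketch of that approach, reusing the OrderTracking and OrderItems names from the procedure above (the staging table itself is invented):

--Staging table matching the two CSV columns
CREATE TABLE TrackingStage (OrderNumber INT, TrackingNumber NVARCHAR(50))

--(bulk-load TrackingStage here, via a DataAdapter or plain INSERTs)

--One set-based insert instead of thousands of EXEC calls
INSERT INTO OrderTracking (BackOrderFlag, TrackingNumber, TrackingMessage)
SELECT 0, s.TrackingNumber, s.OrderNumber
FROM TrackingStage s

--Mark every item in every affected order as shipped in a single statement
UPDATE oi
SET ProductionStatus = 100
FROM OrderItems oi
     INNER JOIN Orders o ON o.uid = oi.OrderID
     INNER JOIN TrackingStage s ON s.OrderNumber = o.OrderNumber

|||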

Thanks for such a thoughtful reply. I don't usually use SQL strings, but figured appending the EXEC statements together and calling ExecuteNonQuery once was more efficient than doing it in a loop. I will try it as you've suggested. I will test it out with the StreamReader, and yes, #6 makes a lot more sense. The UpdateProductionStatus procedure has a lot of logic in it to determine the order status, which I guess I don't need in this case since I already know what the status should be (and the cursor isn't needed either).

|||

The StreamReader thing was really a minor issue, so don't waste too much energy on fixing that. Remember to use transactions, i.e.:

BeginTransaction

'...a lot of ExecuteNonQuery calls...

If success Then
    Commit
Else
    Rollback
End If
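
A concrete shape for that, reusing the dbobj connection and objCmd command from the sketch earlier in the thread:

Dim objTran As SqlTransaction = dbobj.BeginTransaction()
objCmd.Transaction = objTran
Try
    '...the per-row ExecuteNonQuery loop goes here...
    objTran.Commit()   'all orders updated together
Catch objError As Exception
    objTran.Rollback() 'nothing is half-applied
    Throw
End Try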


Friday, March 23, 2012

Is there a better way to skin this cat?

I have a table with about 50k rows. It's taking about an hour to process 10k rows of it. The outer cursor has about 30k rows, the inner between 1 and 7 rows.

I know cursors are slow, especially the inner update cursor. I could do the same thing with a single cursor and just keep track of when @carInit or @carNumb change and reset @tripID.

I have no clue how this could all be accomplished using a set-based approach. If anybody has a bright idea, I'd love to hear it. Any indexes I should be using that I'm not?

Thanks,
Carl

ALTER PROCEDURE dbo.sp_FPS_CarSupplied AS
DECLARE @carInit CHAR(4)
DECLARE @carNumb CHAR(10)
DECLARE @tripID INTEGER
DECLARE @currStat CHAR(3)
DECLARE @isSupStatus BIT

DECLARE curCar CURSOR FOR
SELECT CAR_INIT, CAR_NUMB FROM T_FPS_CCO_CAR_HIST
GROUP BY CAR_INIT, CAR_NUMB
ORDER BY COUNT(*) DESC

OPEN curCar
FETCH NEXT FROM curCar
INTO @carInit, @carNumb

WHILE @@FETCH_STATUS = 0
BEGIN
    DECLARE curSetTrip CURSOR FORWARD_ONLY FOR
    SELECT CAR_STAT_CD FROM T_FPS_CCO_CAR_HIST
    WHERE CAR_INIT = @carInit AND CAR_NUMB = @carNumb
    ORDER BY CAR_STAT_DT DESC, CAR_STAT_TM DESC
    FOR UPDATE OF CAR_TRIP_ID, SUP_CNT_IND

    OPEN curSetTrip
    FETCH NEXT FROM curSetTrip
    INTO @currStat

    SET @tripID = 1
    SET @isSupStatus = 0
    WHILE @@FETCH_STATUS = 0
    BEGIN
        IF @currStat IN ('SPT','DEL','CP') AND @isSupStatus = 0
        BEGIN
            SET @isSupStatus = 1
            UPDATE T_FPS_CCO_CAR_HIST
            SET CAR_TRIP_ID = @tripID, SUP_CNT_IND = 1
            WHERE CURRENT OF curSetTrip
        END
        IF @currStat IN ('REC','REL') AND @isSupStatus = 1
        BEGIN
            SET @isSupStatus = 0
            SET @tripID = @tripID + 1
            UPDATE T_FPS_CCO_CAR_HIST
            SET CAR_TRIP_ID = @tripID
            WHERE CURRENT OF curSetTrip
        END
        IF @currStat = 'PER' AND @isSupStatus = 1
        BEGIN
            UPDATE T_FPS_CCO_CAR_HIST SET IS_SUP_ERR = 1
            WHERE CAR_INIT = @carInit AND CAR_NUMB = @carNumb
              AND CAR_STAT_CD IN ('SPT','DEL','CP')
              AND CAR_TRIP_ID = @tripID
        END

        FETCH NEXT FROM curSetTrip
        INTO @currStat
    END
    CLOSE curSetTrip
    DEALLOCATE curSetTrip

    FETCH NEXT FROM curCar
    INTO @carInit, @carNumb
END
CLOSE curCar
DEALLOCATE curCar

Here's the table:

CREATE TABLE [T_FPS_CCO_CAR_HIST] (
[SUP_CNT_IND] [bit] NULL ,
[IS_SUP_ERR] [bit] NULL ,
[CAR_TRIP_ID] [int] NULL ,
[CAR_INIT] [nvarchar] (4) COLLATE Latin1_General_CI_AI NULL ,
[CAR_NUMB] [nvarchar] (10) COLLATE Latin1_General_CI_AI NULL ,
[UPD_DTTM] [smalldatetime] NULL ,
[CAR_STAT_CD] [nvarchar] (3) COLLATE Latin1_General_CI_AI NULL ,
[CAR_STAT_DT] [smalldatetime] NULL ,
[CAR_STAT_TM] [nvarchar] (8) COLLATE Latin1_General_CI_AI NULL ,
[LOAD_EMPTY] [nvarchar] (1) COLLATE Latin1_General_CI_AI NULL ,
[CCO_NBR] [nvarchar] (6) COLLATE Latin1_General_CI_AI NULL ,
[CCO_TYP] [nvarchar] (1) COLLATE Latin1_General_CI_AI NULL ,
[REJ_CD] [nvarchar] (2) COLLATE Latin1_General_CI_AI NULL
) ON [PRIMARY]
GO

CREATE INDEX [ix_carid] ON T_FPS_CCO_CAR_HIST ([CAR_INIT], [CAR_NUMB])
WITH DROP_EXISTING ON [PRIMARY]

CREATE INDEX [ix_dtm] ON T_FPS_CCO_CAR_HIST ([CAR_STAT_DT], [CAR_STAT_TM])
WITH DROP_EXISTING ON [PRIMARY]

CREATE INDEX [IX_T_FPS_CCO_CAR_HIST] ON T_FPS_CCO_CAR_HIST ([CAR_TRIP_ID])
WITH DROP_EXISTING ON [PRIMARY]

You are correct that you do not need a cursor to do this.

I would expect a performance boost of between 2 and 3 orders of magnitude (100 to 1,000 times as fast) by converting this to an UPDATE statement using CASE expressions to implement your logic.

Give it your best shot, and post what you come up with.
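
As a very rough, untested starting point, the trip boundaries can be derived by counting the 'REC'/'REL' rows that occur later in time for the same car. This sketch ignores the @isSupStatus gating in the cursor version, so validate it against a sample of cars before trusting it:

--Sketch only: approximate each row's trip by counting later boundary events
UPDATE h
SET CAR_TRIP_ID = 1 + (SELECT COUNT(*)
        FROM T_FPS_CCO_CAR_HIST b
        WHERE b.CAR_INIT = h.CAR_INIT
          AND b.CAR_NUMB = h.CAR_NUMB
          AND b.CAR_STAT_CD IN ('REC', 'REL')
          AND (b.CAR_STAT_DT > h.CAR_STAT_DT
               OR (b.CAR_STAT_DT = h.CAR_STAT_DT AND b.CAR_STAT_TM > h.CAR_STAT_TM)))
FROM T_FPS_CCO_CAR_HIST h

--Sketch only: flag supplied statuses in one pass (flags every SPT/DEL/CP row,
--not just the first per trip as the cursor does)
UPDATE T_FPS_CCO_CAR_HIST
SET SUP_CNT_IND = CASE WHEN CAR_STAT_CD IN ('SPT', 'DEL', 'CP') THEN 1 ELSE 0 END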

Is the SQL Enterprise Manager tool secure?

Or more exactly, is the authentication process when
registering a server encrypted in some fashion while
travelling over the network?
TIA,
Eric

"Eric" <anonymous@discussions.microsoft.com> wrote in message news:13f6501c3f7e3$b11b5060$a401280a@phx.gbl...
> Or more exactly, is the authentication process when
> registering a server encrypted in some fashion while
> travelling over the network?

If you register as 'Use Windows Authentication', confidential user/password information is not transmitted. While I have not traced it, I suspect that SQL Server Authentication would transmit the user name and password in clear text.
Steve

|||

The password is "obscured" for SQL logins, but hardly encrypted, which is why Windows Authentication is preferred.
HTH
Jasper Smith (SQL Server MVP)
I support PASS - the definitive, global
community for SQL Server professionals -
http://www.sqlpass.org

|||

You can encrypt your traffic with SSL.
314636 FIX: Cannot Use Non-Administrator Account to Start SQL Server and
Force... http://support.microsoft.com/?id=314636
276553 HOW TO: Enable SSL Encryption for SQL Server 2000 with Certificate
Server http://support.microsoft.com/?id=276553
322144 SECDoClientHandShake Cannot Connect to SQL Server
http://support.microsoft.com/?id=322144
257591 Description of the Secure Sockets Layer (SSL) Handshake
http://support.microsoft.com/?id=257591
316898 HOW TO: Enable SSL Encryption for SQL Server 2000 with Microsoft...
http://support.microsoft.com/?id=316898
324777 WebCast: Microsoft SQL Server 2000: How to Configure SSL Encryption
http://support.microsoft.com/?id=324777
318605 INF: How SQL Server Uses a Certificate When the Force Protocol
http://support.microsoft.com/?id=318605
325757 INF: Using SQL Server 2000 with FIPS 140-1 Ciphers
http://support.microsoft.com/?id=325757
302409 FIX: Unable to Connect to SQL Server 2000 When Certificate
Authority... http://support.microsoft.com/?id=302409
311111 FIX: RPC Clients Unable to Login to SQL Server with Windows...
http://support.microsoft.com/?id=311111
Cindy Gross, MCDBA, MCSE
http://cindygross.tripod.com
This posting is provided "AS IS" with no warranties, and confers no rights.

Wednesday, March 7, 2012

Is RS/SQL 2000 compatible with VS 2005?

My experience has been that version 2000 reports are converted to 2005 format when loaded into VS 2005. I want to work with the reports without converting them. Basically, which service pack of SQL/RS 2000 do I need to edit, deploy, and run reports with the full benefit of VS 2005? SP3? SP4? Or do I have to upgrade to SQL/RS 2005? How do I justify the expense to my client if so? Thanks.

I could be wrong, but I don't think you can use VS.NET 2005 with RS 2000. I think the Business Intelligence Development Studio report designer only works with RS 2005.

Again, I have not tried it, and am not sure.

|||

I agree with Bruce.

You can't use VS 2005 to produce RS 2000 reports or run them on an RS 2000 server. You can run RS 2000 reports on an SRS 2005 server without converting, but that'll give you the same result as running on RS 2000. Obviously, you can still connect to a SQL 2000 database irrespective of which RS you use.

If your client needs something that RS 2000 cannot offer, then they'll have to front up the cash for SRS 2005.

Is a replication session a transaction?

I am curious whether a "session" for merge replication is a transaction. If I see 100 actions in the Agent History and the last one is "the process could not enumerate changes at the subscriber" due to a General Network Error (dropped connection or whatever), did those 10,000 data changes get committed, or were they all rolled back so that the next agent run will start all over again?

Assuming the negative possibility, what profile parameter would I change to reduce the number of rows before a commit? Our agents for low-bandwidth subscribers are running for hours and hours and never seem to catch up, because they always eventually lose the connection.

Thanks for any insights.
They are committed as singletons. So a batch of 100 (by default, or whatever is in Upload|DownloadWriteChangesPerBatch) is tried, and if errors occur while applying a batch, the error rows go into a retry loop which is tried after the batch completes.

Should a network error occur during a synchronization, the merge agent will look at the last batch successfully applied and then start to apply the next one again. There is a possibility that, should 99 of the rows in the last batch make it through, all 99 will be tried again to push through the last one which was not successful.

You will notice that the slow link profile drops these values from 50 to 1 (or 5 for UploadGenerationsPerBatch).
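
If you would rather pin the batch sizes yourself than switch to the predefined slow link profile, the same knobs are exposed as merge agent command-line switches. A hedged example, with placeholder server, database, and publication names:

replmerg.exe -Publisher PUBSRV -PublisherDB PubDB -Publication MyPub -Subscriber SUBSRV -SubscriberDB SubDB -Distributor PUBSRV -UploadGenerationsPerBatch 5 -DownloadGenerationsPerBatch 5 -DownloadWriteChangesPerBatch 1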
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com