|
Thanks very much Hiren, for the help. I will try that.
Success is the good fortune that comes from aspiration, desperation, perspiration and inspiration.
|
|
|
|
|
As an alternative, you could create a function that returns a table (with those ints). You could even pass these as parameters.
I are Troll
|
|
|
|
|
|
Thanks
|
|
|
|
|
Thank you very much Eddy. This is too good.
Success is the good fortune that comes from aspiration, desperation, perspiration and inspiration.
|
|
|
|
|
This is one of the times that I'm sad that I work in Oracle.
No such thing in Oracle that I know of.
"When did ignorance become a point of view" - Dilbert
|
|
|
|
|
Sample join table query in MySQL.
Please help.
|
|
|
|
|
Good subject lines give good Google results.
|
|
|
|
|
5
Had it been written like Yoda speaks, I would have given it a 6.
"When did ignorance become a point of view" - Dilbert
|
|
|
|
|
Whatever it takes to get people to try Google as a first step when in need.
|
|
|
|
|
Well said.
With Regards,
Kasson
|
|
|
|
|
Overview: the client has an inventory DB in SQL Server with over 2 million records. The inventory DB is constantly updated by a .NET 3.5 WinForms app that queries another third-party DB via web services; the inventory is essentially in constant update. The client is looking for faster performance, as it's taking days to cycle through the whole table, and the number of inventoried items will continue to grow.
I have no control over the structure of the SQL table, as it's tied to other applications, and no control over the third-party web service that I'm getting the updated info from. I only control the WinForms app that communicates with the two. It simply queries the web service for inventory data, does some work with the data, and updates the results into the SQL DB. But it has to run down through the entire inventory table.
I've inherited this app, so most of the work I've done has been functional tweaks and code cleanup, but now the client is looking for more speed. Throwing five figures' worth of hardware horsepower at it, plus my "tweaks", isn't giving him the results he was expecting. While his old hardware was inadequate for what he was doing, I explained that throwing muscle at it doesn't really fix slow software.
Sending UPDATE queries one record at a time, whether via the .NET app or even by calling a stored procedure, just seems inefficient when you're dealing with a dataset of this size. And since it takes days to update, it runs 24/7, which causes other issues.
My thought is to do the work with the data in a DataTable and then do one big update to SQL. I'm not talking about loading all the records; currently the app loops through chunks of 500-1000 records at a time. There has to be something better than a one-by-one record update wrapped in a big for-loop.
Advice, Comments, Opinions?
I've done similar work with DataTables, but it was for manipulating CSV files on a much smaller scale (< 1000 rows). It seems logical, though, that the principles would be the same, and faster than calling SQL directly to update one record at a time.
Thanks In Advance!
|
|
|
|
|
Couple of things come to mind ...
Assuming the communication via the web service is the slowest component in the process ...
Can you make your application multi-threaded, or at least run more than one instance of the application?
We had a process that used to work through transactions from a single table, each of which would cause many records to be updated; the problem was that new transactions were being added to the table faster than we could process them. The quick and dirty fix for this customer was to create multiple "processor" applications; each instance of the "processor" application would read from the transaction table and handle only those records assigned to it. The workload assignment was broken down by department: Processor-A would handle transactions from Dept-A, etc. This actually worked out for us, and the total throughput increased dramatically.
The advantage of this approach was that we didn't have to modify the original "processor" application drastically in order to get it to work.
Might be something you want to consider.
Good luck.
David
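The department-based split described above can be sketched generically. Here is a minimal Python illustration of the idea; the modulo split on record ID is a made-up stand-in for the department column, and all the IDs and counts are invented for the example:

```python
# Each "processor" instance claims only the records whose ID maps to its
# own slot, so N instances can run against the same table without
# stepping on each other's work.

def records_for_instance(record_ids, instance_index, instance_count):
    """Return the subset of record IDs this instance should process."""
    return [rid for rid in record_ids if rid % instance_count == instance_index]

all_ids = list(range(1, 21))      # pretend these came from the transaction table
for i in range(4):                # four processor instances
    batch = records_for_instance(all_ids, i, 4)
    print(f"instance {i} handles {batch}")
```

In practice the assignment key would be a real column (department, warehouse, ID range) so each instance can filter its workload directly in its SELECT.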
|
|
|
|
|
The web service is by far the slowest and also the most restrictive component. Right now the client is running between 4 and 8 instances of the app at once, but when the app is only knocking out a combined average of 30,000 records an hour (on a good day), we're still talking days to cycle through the whole table running 24/7. Each instance of the app is set up to queue 500-1000 records at a time. Queue as in build an array of IDs that will hit the web services. But then it updates record by record in a for-loop. Grabbing 500 IDs and adding them to an ArrayList alone is silly slow, especially when a DataTable.Fill can do thousands of records in a couple of seconds or less.
Among the "tweaks" I've done for the client is the ability to break down and categorize the workload for each instance of the app. This has helped the client some for specific business needs, but in the end the entire inventory catalog needs to be updated as frequently as possible.
Given the web service constraints, this project has a utopian theoretical max of 150,000 records an hour, whether it's with 1 instance or with 20; at a certain threshold the web service will start denying requests (a security measure). I don't ever expect to see 150,000/hr with the current environment, but I'd like to think I could achieve a stable 80,000-100,000/hr if I could keep the app code from looping and hitting SQL for each and every record it has queued up.
Too bad .NET doesn't have a BulkUpdate class similar to BulkCopy..
|
|
|
|
|
OK, it seems like you've got a handle on the situation.
Regarding the updates and looping ...
If you have a table, #StagingTable, and it has the updates, then you should be able to do a join on your update statement. Something like this ...
UPDATE Table1
SET Table1.Field1 = #StagingTable.Field1
FROM Table1 INNER JOIN #StagingTable ON Table1.Field2 = #StagingTable.Field2
WHERE <whatever you need>
By doing this, you will certainly reduce the amount of network traffic between the client computer and db server.
Let the db server do all the heavy work ...
Just an idea.
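A runnable stand-in for the pattern above, using Python's sqlite3 purely for illustration (the thread's environment is SQL Server/.NET; table names and values here are invented). T-SQL's UPDATE ... FROM join is replaced with a portable correlated subquery, but the effect is the same: bulk-load the changed rows into a staging table, then apply them all in one set-based statement instead of one round trip per record.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventory (id INTEGER PRIMARY KEY, qty INTEGER)")
conn.execute("CREATE TABLE staging   (id INTEGER PRIMARY KEY, qty INTEGER)")
conn.executemany("INSERT INTO inventory VALUES (?, ?)",
                 [(1, 10), (2, 20), (3, 30)])

# bulk-load the fresh values (e.g. from the web service) into the staging table ...
conn.executemany("INSERT INTO staging VALUES (?, ?)", [(1, 11), (3, 33)])

# ... then apply them with a single set-based UPDATE
conn.execute("""
    UPDATE inventory
    SET qty = (SELECT s.qty FROM staging s WHERE s.id = inventory.id)
    WHERE id IN (SELECT id FROM staging)
""")
print(conn.execute("SELECT id, qty FROM inventory ORDER BY id").fetchall())
# → [(1, 11), (2, 20), (3, 33)]
```

On the .NET side, SqlBulkCopy can do the staging-table load in one shot, which is exactly the "BulkCopy then set-based update" combination this sub-thread is circling around.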
|
|
|
|
|
I would suggest that you first start by profiling the application so you can determine exactly what is taking the most time.
|
|
|
|
|
Seems to me there is a fundamental flaw in the process: the web service should be your bottleneck, but for a database to be updating only 30k records an hour seems ludicrous.
If the cursor/loop is the problem then look a little further afield.
Can you construct an XML dataset on the UI end and pass in a bulk set of updates?
How about doing a bunch of bulk inserts into another table and then using a proc to do the updates to your transaction table? Target tables could be created for each instance and truncated after the update. Or cycle through two target tables per instance.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
Hi,
why does this statement return an error on MySQL?
SELECT schema_name, CAST(SUM(data_length) / 1024 AS UNSIGNED) AS db_size
FROM information_schema.schemata
INNER JOIN information_schema.tables
  ON information_schema.schemata.schema_name = information_schema.tables.table_schema
GROUP BY information_schema.schemata.schema_name
|
|
|
|
|
It worked perfectly for me - MySQL version 5.1.45.
|
|
|
|
|
jrahma wrote: why this statement returns error on mysql?!
It'd help if you'd post the error message.
I are Troll
|
|
|
|
|
My SP reads XML data with the help of OPENXML in SQL Server.
There is a slight problem with it.
<Name_Address>
<name>JCB SALES PVT</name>
<address>24, SALAROURIA ARENA ADUGODI</address>
<address>HOSUR MAIN ROAD, Honolulu</address>
<country>N</country>
</Name_Address>
and my SQL query is
SELECT @address = CONVERT(VARCHAR(150), [text]) FROM OPENXML(@idoc, '/Name_Address/address', 0) WHERE [text] IS NOT NULL
In @address I am getting only the last address tag value, i.e.
HOSUR MAIN ROAD, Honolulu
But it should be
24, SALAROURIA ARENA ADUGODI, HOSUR MAIN ROAD, Honolulu
How can I achieve this?
Help me, guide me to do this.
regards
|
|
|
|
|
Hum Dum wrote: SELECT @address = CONVERT(VARCHAR(150),[TEXT]) FROM OPENXML(@idoc,'/Name_Address/address', 0) where [text] is not null
AFAIK the XPath query returns two rows with one column, and when you assign a multi-row result to a scalar variable, SQL Server keeps only the value from the last row.
But I can provide another solution: first retrieve the result set, then use COALESCE to join the rows together and produce the expected result.
Here's a working example:
SELECT @address = COALESCE(@address + ',', '') + [address]
FROM
(
    SELECT * FROM
    OPENXML(@idoc, '/Name_Address/address', 3)
    WITH ([address] VARCHAR(1000) '.')
) P

SELECT @address
It's working all right.
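The variable-concatenation trick can be traced step by step. This small Python sketch (not from the thread, just an illustration) mimics what `SELECT @address = COALESCE(@address + ',', '') + [address]` does as it runs across the two rows:

```python
# On the first row @address is NULL, so @address + ',' is NULL and
# COALESCE(..., '') yields '' — no leading comma.  On later rows the
# accumulated string plus a comma is prepended.  Row values come from
# the <address> elements in the question.
rows = ["24, SALAROURIA ARENA ADUGODI", "HOSUR MAIN ROAD, Honolulu"]

address = None                      # @address starts out NULL
for value in rows:
    prefix = "" if address is None else address + ","
    address = prefix + value        # COALESCE(@address + ',', '') + [address]

print(address)
# → 24, SALAROURIA ARENA ADUGODI,HOSUR MAIN ROAD, Honolulu
```

Note the separator is a bare `','`; use `', '` in the T-SQL if you want a space after each comma.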
|
|
|
|
|
I'm looking into creating a custom database application that will access our company's database, which is located on a SQL Server. Previously we were using Access as the front end; however, we are going to be implementing new third-party software that our servers will need to communicate with, and programming against it from Access was going to be difficult. So now on to my question.
When my program loads, it runs a query against the database to load the records into a dataset. I have no problem with loading records, updating records, or creating records. However, I do have a problem if Person #1 is using my software and Person #2 updates/changes a record in the database after Person #1 has loaded my software: Person #1 will not see the changes Person #2 made unless they restart the program or I force the software to reload the entire database. I have no way of knowing whether someone has changed a record, so I can't update the dataset. Is there a way to have my software automatically detect changes to a record and automatically update the dataset? Or will I have to have the program refresh the dataset every few minutes? The problem is that the database contains >10,000 records, and refreshing every few minutes could severely slow down the program, especially if someone is working over VPN. Any ideas would be greatly appreciated.
BTW I am working in VB.NET 3.5.
|
|
|
|
|
DisIsHoody wrote: load the records into a dataset
Yeah, don't do that.
Redesign your application; do it the right way.
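One common pattern (an assumption on my part, not something from this thread) is incremental refresh: instead of reloading everything, poll only for rows changed since the highest version you've already seen. SQL Server's rowversion column gives you a monotonically increasing change marker for free, and SqlDependency/query notifications are a push-based alternative. Here a sqlite3 sketch with an integer version column stands in for rowversion; all names and values are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, data TEXT, ver INTEGER)")
conn.executemany("INSERT INTO records VALUES (?, ?, ?)",
                 [(1, "a", 1), (2, "b", 2), (3, "c", 3)])

last_seen = 3                       # Person #1 loaded everything up to ver 3

# Person #2 updates a record; the version counter advances
conn.execute("UPDATE records SET data = 'b2', ver = 4 WHERE id = 2")

# Person #1's periodic refresh pulls only what changed since last_seen
changed = conn.execute(
    "SELECT id, data, ver FROM records WHERE ver > ?", (last_seen,)).fetchall()
print(changed)
# → [(2, 'b2', 4)]
last_seen = max(ver for _, _, ver in changed)   # advance the watermark to 4
```

With a query like this, a refresh every few minutes moves a handful of rows over the VPN rather than all 10,000+.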
|
|
|
|
|
DisIsHoody wrote: database contains >10,000 records
DisIsHoody wrote: Previously we were using Access as the front end
I suspect your database design is a complete disaster, probably evolved over a number of years by a power user (possibly yourself), and now you want to use it seriously. I make the STRONG suggestion to have a professional look at it.
Your database probably needs a complete redesign, with your existing data migrated to the new structure. PLEASE do not do this yourself; get a professional, otherwise you will be back here in a year or two needing to do it all over again.
Never underestimate the power of human stupidity
RAH
|
|
|
|