Thank you so much, PIEBALDconsult.
---
By using this query you can achieve your required output:
select * from rk_process where name like '%lee0%'
--raj
---
I'm not designing a purchase order system per se, but it is the closest thing to what I'm actually doing, so that's why my question/concern involves it. In a "minimal" PO system, you have the following tables:
Customers
Orders
Order Details
Products
When an order is first created (e.g., #123), product descriptions and prices are pulled from the Products table. That order is marked as paid. A few months go by and some of the products in the Products table get updated with new prices. Now what happens when I look at order #123 and its details? The recorded total of that order will no longer match the sum of its products at their current prices. Is that common? Am I looking at this from the wrong angle?
Thanks.
- DC
"One man's wage rise is another man's price increase." - Harold Wilson
"Fireproof doesn't mean the fire will never come. It means when the fire comes that you will be able to withstand it." - Michael Simmons
"Show me a community that obeys the Ten Commandments and I'll show you a less crowded prison system." - Anonymous
---
Instead of having one price record per item, have many records with history so you can see what the price was when the order was placed.
Something like:
Item: ItemId,Name,Description,Supplier,etc.
1,Widget,A standard widget,Widget Co,...
ItemPrice: PriceId,ItemId,FromDate,ToDate,Price
1,1,2012-01-01,2012-07-01,1.00
2,1,2012-07-01,null,1.50
When a new price is added, the FromDate of the new record and the ToDate of the previous record are both set to the current date/time.
You find the current price with WHERE ToDate IS NULL.
To find the price for an order you can use WHERE OrderDate BETWEEN FromDate AND ISNULL(ToDate, Now)
You may add NextPrice and PreviousPrice fields if you need to do a lot of lookups.
(Specific syntax depends on the particular database in use.)
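A minimal end-to-end sketch of this lookup, using Python's built-in sqlite3 purely for illustration (IFNULL plays the role of ISNULL here), with a half-open date range so that the boundary date 2012-07-01 matches only the newer price row:

```python
import sqlite3

# In-memory table matching the ItemPrice layout above (illustration only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ItemPrice (PriceId INTEGER, ItemId INTEGER,
                        FromDate TEXT, ToDate TEXT, Price REAL);
INSERT INTO ItemPrice VALUES (1, 1, '2012-01-01', '2012-07-01', 1.00);
INSERT INTO ItemPrice VALUES (2, 1, '2012-07-01', NULL, 1.50);
""")

def price_on(item_id, order_date):
    # The open-ended current price has ToDate IS NULL, hence the IFNULL fallback.
    row = conn.execute(
        """SELECT Price FROM ItemPrice
           WHERE ItemId = ?
             AND ? >= FromDate
             AND ? < IFNULL(ToDate, '9999-12-31')""",
        (item_id, order_date, order_date)).fetchone()
    return row[0] if row else None

print(price_on(1, '2012-03-15'))  # 1.0 - the price in effect before the change
print(price_on(1, '2012-08-01'))  # 1.5 - the current price
```

String comparison works here only because ISO dates sort lexicographically; a real SQL Server version would use DATETIME columns and GETDATE().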
---
Another alternative is to have the details table also record the price.
That is basically a requirement if one allows a manager to do a price override.
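A minimal sketch of that approach (table and column names are illustrative; sqlite3 stands in for the real database): copying the unit price into the detail row at order time means later catalog changes leave old orders untouched.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Products (ProductId INTEGER PRIMARY KEY, Name TEXT, Price REAL);
CREATE TABLE OrderDetails (OrderId INTEGER, ProductId INTEGER,
                           Quantity INTEGER, UnitPrice REAL);
INSERT INTO Products VALUES (1, 'Widget', 1.00);
""")

# At order time, copy the current catalog price into the detail row.
price = conn.execute(
    "SELECT Price FROM Products WHERE ProductId = 1").fetchone()[0]
conn.execute("INSERT INTO OrderDetails VALUES (123, 1, 2, ?)", (price,))

# Months later the catalog price changes...
conn.execute("UPDATE Products SET Price = 1.50 WHERE ProductId = 1")

# ...but the order total is still computed from the price as sold.
total = conn.execute(
    "SELECT SUM(Quantity * UnitPrice) FROM OrderDetails WHERE OrderId = 123"
).fetchone()[0]
print(total)  # 2.0 - two widgets at the original 1.00
```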
---
Good point. Yet shouldn't there be some sort of an adjustment record that can store that rather than leave an auditor wondering why the price was different?
Still, you need both the price history and a way to override the price (with a paper trail).
---
PIEBALDconsult wrote: Yet shouldn't there be some sort of an adjustment record that can store that
An Audit record.
The audit records would also keep track of things like voided transactions, reversed items, etc.
---
jschell has the right of it: put all the details required to calculate the value on the detail record. This may also include any discount/adjustment values. Your invoice report should then only pull the descriptors from the product and customer tables.
Never underestimate the power of human stupidity
RAH
---
I am learning ASP.NET with C# and SQL Server. I came across a huge fixed-width text file of 1,780 columns. I have to import this file into a SQL Server table. I would appreciate any help with this.
Thanks
Ibrahim Sharief
isharief291@gmail.com
---
Use the SQL Server Import and Export Wizard.
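If the wizard doesn't fit (for instance, when the layout file has to drive the parse), the load can also be done in code. A rough sketch follows; the three-column layout, the sample record, and the use of sqlite3 are all illustrative assumptions - in reality the 1,780-column spec would be read from the description file:

```python
import sqlite3

# Hypothetical layout: (name, width) pairs, normally read from the
# description file that accompanies the data file.
layout = [("PatientId", 6), ("Name", 20), ("Amount", 8)]

def parse_fixed_width(line, layout):
    """Slice one fixed-width record into trimmed column values."""
    fields, pos = [], 0
    for _name, width in layout:
        fields.append(line[pos:pos + width].strip())
        pos += width
    return fields

conn = sqlite3.connect(":memory:")
cols = ", ".join(f"{name} TEXT" for name, _ in layout)
conn.execute(f"CREATE TABLE staging ({cols})")

# One made-up record; a real loader would stream the file line by line.
sample = "000123" + "John Smith".ljust(20) + "00042.50"
placeholders = ", ".join("?" for _ in layout)
conn.execute(f"INSERT INTO staging VALUES ({placeholders})",
             parse_fixed_width(sample, layout))
print(conn.execute("SELECT * FROM staging").fetchone())
```

With 1,780 columns, batching the inserts (executemany here, or SqlBulkCopy on the C# side) matters far more than the parsing itself.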
---
Besides just importing it, you might want to consider whether you really want to have a table with that many columns.
---
Hi,
I want to show results something like this:
The first row contains the opening balance, calculated as Sum(Balance) where datefield <= DateByUser,
and the next rows show the records between the dates given by the user, with the balance evaluated on each row.
The output should look like this:
VchrNo___DrAmnt___CrAmnt___ Balance
00 --------00---------00--------1570
113--------10---------00--------1580
115--------15---------00--------1585
123--------10---------00--------1595
123--------12---------00--------1607
. . . .
. . . .
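One way to produce that shape of output - a synthetic opening-balance row followed by a running balance per voucher - is a UNION ALL plus a window function. The table and column names below are guesses from the post, and sqlite3 is used for illustration (window functions need SQLite 3.25+; SUM() OVER works the same way in SQL Server 2012+):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE vouchers (VchrNo INTEGER, VchrDate TEXT, DrAmnt REAL, CrAmnt REAL);
INSERT INTO vouchers VALUES (100, '2012-01-05', 1570, 0);  -- before the range
INSERT INTO vouchers VALUES (113, '2012-02-01', 10, 0);
INSERT INTO vouchers VALUES (115, '2012-02-02', 15, 10);
""")

from_date, to_date = '2012-02-01', '2012-02-28'
rows = conn.execute("""
    SELECT 0 AS VchrNo, 0 AS DrAmnt, 0 AS CrAmnt,
           (SELECT IFNULL(SUM(DrAmnt - CrAmnt), 0)
              FROM vouchers WHERE VchrDate < ?) AS Balance
    UNION ALL
    SELECT VchrNo, DrAmnt, CrAmnt,
           (SELECT IFNULL(SUM(DrAmnt - CrAmnt), 0)
              FROM vouchers WHERE VchrDate < ?)
           + SUM(DrAmnt - CrAmnt) OVER (ORDER BY VchrDate, VchrNo) AS Balance
      FROM vouchers
     WHERE VchrDate BETWEEN ? AND ?
     ORDER BY VchrNo
""", (from_date, from_date, from_date, to_date)).fetchall()

for row in rows:
    print(row)  # opening row (balance 1570.0), then running 1580.0, 1585.0
```

The opening-balance subquery sums everything before the user's start date, and the window SUM adds each in-range voucher's net amount on top of it.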
---
Thanks for the reply.
The solution you provided helped me a lot, and I am about to solve my problem.
---
I'm looking for a method/example/pseudo-code of importing a CSV file into a SQL Server table with several requirements:
1) I do not know the quantity of columns, the column headers, nor the data type of each column prior to the program running. Along with the header fields, I have another file which describes the type and size of each column (integer, decimal or string), and I can already create the destination table based on this input.
2) Because I don't know the format of the input table before running the program (i.e. the input table changes every time), I can't create a class specifically designed to describe the contents of each record from the CSV file.
3) I already have a method working where a DataTable gets fully populated and the SQLBulkCopy variable uses the .WriteToServer() method to insert all of the records. HOWEVER, there's a problem when the source file gets too large (over 500mb or so) and the result is an OutOfMemory exception.
4) The AdHocGeek has a partial solution which works with strings only but doesn't address integers, decimals, dates, et al.
The problem is that I can't see a way to use an IDataReader into a complex (i.e. many field types) table on the fly. I've added the .ColumnMappings() method to determine the name of the columns with which to map the input columns into the destination SQL Server table. When the application is run however, it gets an error indicating "The given value of type String from the data source cannot be converted to type decimal of the specified target column".
I can't determine why the DataTable works and the DataReader doesn't and can't find any examples to prove otherwise.
Any ideas or examples would be appreciated.
---
Well, I would say you are screwed, but only because you are trying to transform the data during your load. IMHO transforms are the biggest time waster ever inflicted on the load process. I would split the operation into a load process and then a transform. Caveat: I am not addressing the size issue, as I have never had the problem (and 500 MB is very small beer in our environment).
I would do the following.
1) Read in the data file and get the header record.
2) Create a staging table in SQL Server exactly matching the column headers - every column varchar (or nvarchar if needed).
3) Convert the CSV file to a DataTable (this article may help).
4) Use BulkCopy to load the data (everything is varchar, so it WILL load).
5) Use a stored proc to do the transforms from the staging table to the target table. If you have to use a dictionary file then you will have to take that into account, and it will be a challenge (probably a crap load of dynamic SQL).
6) Drop the staging table.
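A rough sketch of the staging-then-transform steps above, with sqlite3 standing in for SQL Server and a hypothetical three-column file; in the real case the header record supplies the column names and the dictionary file supplies the types:

```python
import csv
import io
import sqlite3

# Hypothetical inputs: a tiny CSV and a type dictionary for the transform step.
csv_text = "Id,Amount,Note\n1,10.50,first\n2,3.25,second\n"
types = {"Id": "INTEGER", "Amount": "REAL", "Note": "TEXT"}

reader = csv.reader(io.StringIO(csv_text))
header = next(reader)

conn = sqlite3.connect(":memory:")

# Staging table: every column is plain text, so the bulk load cannot
# fail on type conversions.
conn.execute("CREATE TABLE staging (%s)" %
             ", ".join(f"{c} TEXT" for c in header))
conn.executemany("INSERT INTO staging VALUES (%s)" %
                 ", ".join("?" for _ in header), reader)

# Transform: CAST each column to its real type into the target table.
# On SQL Server this would be the stored proc, probably built with dynamic SQL.
conn.execute("CREATE TABLE target (%s)" %
             ", ".join(f"{c} {types[c]}" for c in header))
conn.execute("INSERT INTO target SELECT %s FROM staging" %
             ", ".join(f"CAST({c} AS {types[c]})" for c in header))
print(conn.execute("SELECT * FROM target").fetchall())
```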
Never underestimate the power of human stupidity
RAH
---
That's a very interesting idea that should work. I would need to create a temporary SP based on the fields and data types for the conversion. It also pushes the field validation to the SP but that shouldn't be a problem.
Thanks for the feedback.
---
Allan Watkins wrote: a DataTable gets fully populated
That sounds like a bad idea; there's no need to have more than one record in memory at a time.
Allan Watkins wrote: cannot be converted
Sounds like you may need to specify a Parse routine. Take a look at the values that are causing the trouble -- it's possible that they are empty.
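A small illustration of that per-column parse idea - empty fields become NULL instead of failing the numeric conversion (Python here for brevity; in the C# reader this would be the conversion applied to each value before handing it to SqlBulkCopy):

```python
def parse_value(raw, target_type):
    """Convert one CSV field; empty strings become None (database NULL)."""
    raw = raw.strip()
    if raw == "":
        return None          # avoids "cannot convert '' to decimal" errors
    return target_type(raw)

print(parse_value("3.14", float))  # 3.14
print(parse_value("", float))      # None
print(parse_value(" 42 ", int))    # 42
```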
---
Hi,
Why am I getting the above error for this stored procedure?
CREATE DEFINER=`root`@`192.168.1.%` PROCEDURE `sp_open_file`(IN param_file_no int, IN param_clinic_id int, IN param_doctor_id int, IN param_account_category int, IN param_account_id int)
BEGIN
DECLARE param_patient_account_number BIGINT;
DECLARE param_price_list_id int;
DECLARE param_account_number BIGINT;
SET param_patient_account_number = (SELECT account_number FROM patients WHERE file_no = param_file_no);
IF (param_account_category = 1) THEN
SET param_account_number = param_patient_account_number;
ELSE
SET param_account_number = (SELECT account_number FROM accounts WHERE account_id = param_account_id);
END IF;
SET param_price_list_id = (SELECT price_list_id FROM accounts WHERE account_number = param_account_number);
INSERT INTO visits (file_no, clinic_id, doctor_id, account_category, account_number, price_list_id) VALUES (param_file_no, param_clinic_id, param_doctor_id, param_account_category , param_account_number, param_price_list_id);
UPDATE patients SET file_is_open = TRUE WHERE file_no = param_file_no;
END
Technology News @ www.JassimRahma.com
---
...because one of the three subqueries returns more than one row:
(SELECT account_number FROM patients WHERE file_no = param_file_no);
(SELECT account_number FROM accounts WHERE account_id = param_account_id);
(SELECT price_list_id FROM accounts WHERE account_number = param_account_number);
Meaning that the table holds more than a single record with the requested Id.
---
Quote: Meaning that the table holds more than a single record with the requested Id.
True, this is the main reason for such errors. Several resolutions are available for this error.
1) Limit the subquery to a single row (LIMIT 1 in MySQL; TOP 1 would be the SQL Server equivalent), i.e.:
(SELECT account_number FROM patients WHERE file_no = param_file_no LIMIT 1);
(SELECT account_number FROM accounts WHERE account_id = param_account_id LIMIT 1);
(SELECT price_list_id FROM accounts WHERE account_number = param_account_number LIMIT 1);
2) Some of your queries may require one or more additional conditions (filters).
3) If any duplicate entries are found, remove them.
Maulik Dusara
Sr. Software Engineer
---
MaulikDusara wrote: Several resolutions are available for this error.
1) Limit the subquery to a single row
That's assuming that more than one of those records is allowed in the table, and being an "Id", this COULD lead to an inconsistent database - exacerbating the problem.
---
I agree with Eddy: while limiting the subquery to one row makes the error go away, it ignores the underlying data problem - the user was expecting one record per ID and is not getting the correct result.
Never underestimate the power of human stupidity
RAH
---
Quote: not getting the correct result.
I also agree, and I also mentioned that Quote: Several resolutions are available for this error.
Please also check my other resolutions.
Maulik Dusara
Software Engineer/Team Leader