|
Thanks a lot!!!
My problem is solved now.
Allah bless youuu!
tc
regards,
Sana
$@N@
|
|
|
|
|
You're welcome
|
|
|
|
|
I have populated a DataSet with a DataAdapter and now I want to change the values, but nothing happens.
All rows are selected:
Dim tblADR_BASE As DataTable
tblADR_BASE = dsADR_BASE.Tables("ADR_BASE")
Dim rows() As DataRow = tblADR_BASE.Select()
Dim intCRNO As Integer
Dim i As Integer
For i = 0 To rows.GetUpperBound(0)
    intCRNO = checkForCRNO(rows(i))
    rows(i)("dsp_CRNO") = intCRNO
    rows(i)("CITY_ID") = "test"
    Console.WriteLine(rows(i)("dsp_CRNO"))
    dsADR_BASE.AcceptChanges()
Next i
What's weird is I can write the supposed changes to the console, but if I actually open the table in the dataset, nothing has been changed.
Am I missing some sort of save?
Thanks
|
|
|
|
|
At a quick glance the code looks fine (although you don't have to call AcceptChanges on every iteration). What does the code look like where you check whether the data has changed?
|
|
|
|
|
Thanks Mike,
Dim test = dsADR_BASE.HasChanges
produces False.
I swear I had this working a week ago on a different table. I have since added fields to this table and think maybe that screwed things up. I have refreshed everything.
|
|
|
|
|
It's because you call AcceptChanges in the loop. That method clears the Modified flag from all rows. The idea behind AcceptChanges is that you let the rows stay modified (or inserted or deleted) until you have put the data into a safe place (for example a database). After that the modification flags can be removed.
So in short, remove the AcceptChanges.
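To illustrate, here is a minimal standalone sketch of why the in-loop call hides the changes; the table and column names are borrowed from the thread, but this is an illustration, not the poster's actual code:

```vbnet
Imports System
Imports System.Data

Module RowStateDemo
    Sub Main()
        Dim table As New DataTable("ADR_BASE")
        table.Columns.Add("dsp_CRNO", GetType(Integer))

        Dim row As DataRow = table.NewRow()
        row("dsp_CRNO") = 0
        table.Rows.Add(row)
        table.AcceptChanges()            ' baseline: RowState = Unchanged

        row("dsp_CRNO") = 42
        Console.WriteLine(row.RowState)  ' Modified

        table.AcceptChanges()
        Console.WriteLine(row.RowState)  ' Unchanged again - the change flag is gone
    End Sub
End Module
```

Once AcceptChanges has run, HasChanges returns False and a later adapter Update has nothing to send, which matches the symptom described above.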
|
|
|
|
|
Thanks, I'll give that a try.
But I added the AcceptChanges after I ran into the issue. I will sure give it a go, though.
Thing is, the tables are being restored at the moment, and I have to stay out of them. I'll post results when I can. Maybe Monday.
Thanks
|
|
|
|
|
enipla wrote: But, I added the AcceptChanges after I ran into the issue
If that's the case, check that you don't have AcceptChanges (or RejectChanges) anywhere else. When you can, do the same HasChanges test using the debugger after the loop. That way you can confirm that the dataset has been modified by your loop.
|
|
|
|
|
New DB, same result. I got rid of the only AcceptChanges; there are no RejectChanges.
It still writes to the console.
|
|
|
|
|
After the loop, did you test with the debugger what the value of dsADR_BASE.HasChanges is?
modified on Friday, January 23, 2009 4:38 PM
|
|
|
|
|
We are getting somewhere. HasChanges = True.
When I open the table with the Preview Data option, no data has been changed.
|
|
|
|
|
If you take a look at the row after you have modified its values, do you see the change in the data, and is the RowState Modified?
|
|
|
|
|
row.RowState = Modified
And I can print the values out to the console AFTER I leave the loop.
I just don't see the changes when I open the table under the Data Sources tab.
Thanks again Mike. I'll be leaving work in about a 1/2 hour if I don't get back to you today.
|
|
|
|
|
Okay. When you have the opportunity, check that you update the data somewhere in your code using SqlDataAdapter.Update, and that you don't call Fill before that. That should push the modifications to the original data source.
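For reference, a sketch of the usual Fill, modify, Update cycle; the connection string, adapter, and query here are assumptions for illustration, not code from the thread:

```vbnet
Imports System.Data
Imports System.Data.SqlClient

Module AdapterDemo
    Sub Main()
        Using conn As New SqlConnection("...your connection string...")
            Dim da As New SqlDataAdapter("SELECT * FROM ADR_BASE", conn)
            Dim cb As New SqlCommandBuilder(da) ' generates INSERT/UPDATE/DELETE commands

            Dim ds As New DataSet()
            da.Fill(ds, "ADR_BASE")             ' 1. load from the database

            For Each row As DataRow In ds.Tables("ADR_BASE").Rows
                row("CITY_ID") = "test"         ' 2. modify; RowState becomes Modified
            Next

            da.Update(ds, "ADR_BASE")           ' 3. write pending changes back;
                                                '    Update calls AcceptChanges itself
        End Using
    End Sub
End Module
```

The point is that step 3 is what moves the in-memory changes to the data source; until then they exist only in the DataSet.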
|
|
|
|
|
That did it.
I was sure that I was looking at a DataSet, and that you don't have to update the source to see the changes in the set.
I must not have been looking in the correct place. I'm in the Data Sources tab, and it only shows the tables that I want in the dataset. I assumed I was looking at the DataSet, but I guess that's the DataSource.
Thanks Mike. You did it again for me.
|
|
|
|
|
No problem at all
|
|
|
|
|
I wrote a program with VB.NET. Its databases are created in Access. Can I add a facility like Undo and Redo to my program?
For example, when my user deletes some records from my program, then by clicking on a button the deleted records are added to my database again.
Or when the user edits some records, by clicking on a button the records come back to their original state.
|
|
|
|
|
I'm not certain what you're after, but if you want to undo modifications, that's done using transactions. However, a transaction should never wait for user input. Another mechanism could be to log changes so that you can revert them afterwards.
This could be done by adding more tables and logic to your application. But the downside is that a mechanism like this easily becomes very complex unless it's well restricted. For example, what if a record insertion is reverted after a few days and other records which depend on it have been added in the meantime?
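A hedged sketch of the logging idea for an Access database via OleDb; the table names (CUSTOMER, CUSTOMER_DELETED) and the identical-schema shadow table are invented for illustration:

```vbnet
Imports System.Data.OleDb

Module UndoDemo
    ' Before a delete, copy the row into a shadow table; both statements run
    ' in one transaction so the log and the data can't get out of step.
    Sub DeleteWithUndo(conn As OleDbConnection, id As Integer)
        Using tx As OleDbTransaction = conn.BeginTransaction()
            Using cmd As OleDbCommand = conn.CreateCommand()
                cmd.Transaction = tx
                cmd.Parameters.AddWithValue("?", id)
                cmd.CommandText = "INSERT INTO CUSTOMER_DELETED SELECT * FROM CUSTOMER WHERE ID = ?"
                cmd.ExecuteNonQuery()
                cmd.CommandText = "DELETE FROM CUSTOMER WHERE ID = ?"
                cmd.ExecuteNonQuery()
            End Using
            tx.Commit()
        End Using
    End Sub

    ' Undo reverses the copy: restore the row and clear it from the log.
    Sub UndoDelete(conn As OleDbConnection, id As Integer)
        Using tx As OleDbTransaction = conn.BeginTransaction()
            Using cmd As OleDbCommand = conn.CreateCommand()
                cmd.Transaction = tx
                cmd.Parameters.AddWithValue("?", id)
                cmd.CommandText = "INSERT INTO CUSTOMER SELECT * FROM CUSTOMER_DELETED WHERE ID = ?"
                cmd.ExecuteNonQuery()
                cmd.CommandText = "DELETE FROM CUSTOMER_DELETED WHERE ID = ?"
                cmd.ExecuteNonQuery()
            End Using
            tx.Commit()
        End Using
    End Sub
End Module
```

This only covers deletes; undoing edits would need the old column values logged as well, which is where the complexity mentioned above starts to grow.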
|
|
|
|
|
I got a requirement for a business web application that will handle 100 concurrent user requests. We designed our data tier. We will use SQL Server 2005 and we will use OLAP services. The question is: how can we determine the best physical hardware requirements for the database server? I mean, we need to know the minimum physical memory size and processing power for the server, which will run Windows Server 2003. Note that the client wants a max page-loading duration of 8 seconds at max load (100 concurrent users).
Note:
1. We did try to set up a server with 2 GB RAM and a dual-core processor. For the first 30 concurrent users it was OK; after user 50, processor and memory utilization hit 100% and things got worse: over 60 seconds per page.
2. We use a tool to test performance that simulates 100 users requesting the application every second.
marcoryos
|
|
|
|
|
My suggestions are:
1. Intel Xeon quad-core x 2 (two processors, i.e. 8 cores)
2. 4 GB RAM
3. SATA II HDD
4. Windows Server 2003
We are able to handle nearly 500 concurrent users and CPU utilization is 10% (max).
|
|
|
|
|
Move OLAP services onto another database that is not your OLTP DB. Reporting cripples OLTP databases.
I can squeeze 100 users onto an ASP.NET/SQL Express business application using a shared VM with good design techniques, so if your dedicated server is not fast enough, consider looking at the system architecture. Good design scales a lot faster than hardware upgrades.
|
|
|
|
|
Most likely the problem is the overhead that OLAP services cause. The main target for OLAP services is reporting, not transaction handling. Instead of OLAP, use the OLTP engine of SQL Server (in other words, the normal SQL Server engine) and design the relational model so that it supports continuous data manipulation and retrieval.
|
|
|
|
|
Thanks a lot for the replies. So far I've got that we should take a look at our design again, and try to separate the DB engine and OLAP onto two servers instead of one. Am I correct?
marcoryos
|
|
|
|
|
I think that's a good way to go.
|
|
|
|
|
Dear all, I have a linked server on my database. My senior at my company made it like that because, in the future, we will have several servers to handle several applications in the company. But after we set it up and ran it, the transfer is really slow. Is it because of the linked server? Are there any better ways to access a database from multiple servers in one application, using mirroring or something? If you have any references for this, please help me. Thanks.
|
|
|
|