Dear All,

I am developing a project that receives packets from a UDP multicast channel, decodes them, and saves them into a database.

There are several different packet types, each with its own format.

The broadcast is very fast, perhaps 600-800 packets per second. At the moment I create a thread that joins the multicast group, receives the packets, and adds them to a packet-processor object running on another thread. The packet processor decompresses and decodes the data in each packet and raises an event that refreshes the data shown to the user; at the same time I add the decoded data to a data object that saves it into the database on yet another thread.

I do all of this because I don't want to miss any packets.

The process runs OK, but the CPU usage is very high.

Can you tell me what the best practice would be for this problem? Remember that almost 1,000 INSERT statements run against SQL Server every second.
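To give an idea of the structure, here is a minimal sketch of the pipeline I described (the class names, the Decode placeholder, and the BlockingCollection hand-off are illustrative assumptions, not my exact code):

using System;
using System.Collections.Concurrent;
using System.Net;
using System.Net.Sockets;

class PacketPipeline
{
    private readonly BlockingCollection<byte[]> _rawPackets = new BlockingCollection<byte[]>();

    // Receiver thread: join the multicast group and enqueue raw datagrams as fast as possible.
    public void ReceiveLoop(IPAddress group, int port)
    {
        using (var udp = new UdpClient(port))
        {
            udp.JoinMulticastGroup(group);
            var remote = new IPEndPoint(IPAddress.Any, 0);
            while (true)
            {
                byte[] datagram = udp.Receive(ref remote); // blocks until a datagram arrives
                _rawPackets.Add(datagram);                 // hand off; no decoding on this thread
            }
        }
    }

    // Processor thread: decompress/decode each packet and pass the result on
    // for display and for the database-writer thread.
    public void ProcessLoop(Action<DecodedPacket> onDecoded)
    {
        foreach (byte[] datagram in _rawPackets.GetConsumingEnumerable())
        {
            onDecoded(Decode(datagram));
        }
    }

    private DecodedPacket Decode(byte[] datagram)
    {
        // placeholder for the real decompression/decoding logic
        return new DecodedPacket();
    }
}

class DecodedPacket { }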

Regards,
Goutam

Just to clarify: you're refreshing the UI as each decompressed packet becomes available from the processor thread? From what I understand, that means 600-800 refreshes of the UI every second. Is my understanding correct?

If so, it is understandable why the CPU usage is immensely high. If you refreshed once per second with the most up-to-date values, while still storing every value in the database, you would greatly improve the design.

You could go one step further and store items in the database in batches, say 500 or so at a time.

This all assumes that the UI is currently being refreshed 600-800 times a second, once for every decompressed packet.
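As a rough sketch of the first suggestion (the WinForms Timer, the decimal value type, and the ConcurrentDictionary snapshot are assumptions for illustration, not your actual code): the processor thread only records the latest value per item, and a one-second timer pushes that snapshot to the UI.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Windows.Forms;

class ThrottledView
{
    // Latest decoded value per item, overwritten on every packet (cheap, no UI work).
    private readonly ConcurrentDictionary<string, decimal> _latest =
        new ConcurrentDictionary<string, decimal>();

    private readonly Timer _refreshTimer = new Timer { Interval = 1000 }; // one refresh per second

    public ThrottledView(Action<IDictionary<string, decimal>> render)
    {
        _refreshTimer.Tick += (s, e) => render(_latest); // single UI update per tick
        _refreshTimer.Start();
    }

    // Called from the packet-processor thread for every decoded item.
    public void OnItemDecoded(string itemId, decimal value)
    {
        _latest[itemId] = value; // just remember the newest value; the timer will paint it
    }
}

The same idea extends to the database side: instead of one INSERT per packet, accumulate rows and write them in batches.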
 
 
Thanks for the reply


Yes, you are right, but I should refine the question a little. The packets carry details of different items, but a single packet contains only two items and their details. On average about 400-500 packets per second arrive from the broadcast, and there are also some packets I can simply ignore.

The UI refreshes only for the selected items; the other items are discarded once they reach the UI, but I need to save all items and their details in order to keep the log. I cannot refresh the UI in a batched fashion; even a millisecond of delay makes a big difference, because time is a critical factor here. I should also mention that hundreds of client terminals consume the updated data and run analytical queries against the data saved on the server.

Since today I am having another problem: in the morning the data object runs fine, but in the afternoon the data object (the save/insert procedure) runs slower than the broadcast, so the data queue grows larger than expected.

I am still trying to cut the save time by reducing the data length, but I am not in a good position yet.
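One thing I am considering, along the lines of the batching you suggested, is to drain the pending-save queue in batches and push each batch with SqlBulkCopy, so one round trip covers up to a few hundred rows. This is only a sketch; the dbo.PacketLog table, its columns, and the PacketRow class are hypothetical placeholders for my real schema.

using System;
using System.Collections.Concurrent;
using System.Data;
using System.Data.SqlClient;

class PacketRow
{
    public string ItemId;
    public decimal Value;
    public DateTime Ts;
}

class BatchSaver
{
    private readonly BlockingCollection<PacketRow> _pending = new BlockingCollection<PacketRow>();

    public void Enqueue(string itemId, decimal value)
    {
        _pending.Add(new PacketRow { ItemId = itemId, Value = value, Ts = DateTime.UtcNow });
    }

    // Runs on its own thread; writes up to 500 rows per database round trip.
    public void SaveLoop(string connectionString)
    {
        while (true)
        {
            var table = new DataTable();
            table.Columns.Add("ItemId", typeof(string));
            table.Columns.Add("Value", typeof(decimal));
            table.Columns.Add("Ts", typeof(DateTime));

            // Block for the first row, then take whatever else is already queued (up to 500).
            PacketRow row = _pending.Take();
            do
            {
                table.Rows.Add(row.ItemId, row.Value, row.Ts);
            }
            while (table.Rows.Count < 500 && _pending.TryTake(out row));

            using (var conn = new SqlConnection(connectionString))
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.PacketLog" })
            {
                conn.Open();
                bulk.WriteToServer(table); // one round trip for the whole batch
            }
        }
    }
}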


Regards
Goutam
 



