I am sending data serially from a COM port to a USB device, and I need to implement inter-byte timing. Could anyone explain how this can be done?

1 solution

That's complicated - because the SerialPort class is buffered by Windows, the serial port itself is (normally) buffered by its driver, and the hardware generally has a couple of bytes' worth of buffer in it as well. And to make life even more complicated, the serial output stream is effectively buffered because the data (and its associated start, stop, and parity bits) is output one bit at a time, providing another layer of buffering.

You can do it on a macro scale - using a timer of some form so that you only load the SerialPort instance with "n bytes per second" - but it's clumsy.
And you can do it on a micro scale by altering the number of stop bits (1, 1.5, or 2), which adds a small amount of "per character" delay - but you can't add much that way.
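To see why the stop-bit trick only buys a little, the time each character spends on the wire can be computed directly. A quick Python sketch (the `char_time` helper is illustrative, not from any library; figures assume 9600 baud, 8 data bits, no parity):

```python
def char_time(baud, data_bits=8, parity_bits=0, stop_bits=1.0):
    """Seconds on the wire per character: 1 start bit + data + parity + stop."""
    bits = 1 + data_bits + parity_bits + stop_bits
    return bits / baud

t1 = char_time(9600, stop_bits=1)  # 10 bit times -> ~1.042 ms per byte
t2 = char_time(9600, stop_bits=2)  # 11 bit times -> ~1.146 ms per byte
extra = t2 - t1                    # one extra bit time: ~104 us of gap per byte
```

So going from 1 to 2 stop bits at 9600 baud stretches the inter-byte gap by only about a tenth of a millisecond - hence "you can't add much that way".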

What exactly are you trying to get?
 
Comments
vinod b v 9-Apr-15 7:12am    
@OriginalGriff Could you give me an example of how I do it on a macro scale?
OriginalGriff 9-Apr-15 7:26am    
Create a timer, set to (say) 1000 ms - one second.
Set up a generic Queue of bytes, and feed your data into it.
In the timer Tick event, check the queue - if it has any data in it, extract the first 100 bytes (if there are that many) and feed them to your serial port.

Your comms will be limited to a maximum of 100 bytes per second.
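The steps above can be sketched in Python (a hedged illustration only - the original advice is for .NET's SerialPort and a timer Tick event; `ThrottledSender` and `FakePort` below are hypothetical names, and `FakePort` stands in for the real port object so the sketch is self-contained):

```python
from collections import deque

class ThrottledSender:
    """Queue bytes, then drain at most `chunk` bytes each time a timer fires."""

    def __init__(self, port, chunk=100):
        self.port = port      # anything with a write(bytes) method
        self.chunk = chunk    # max bytes sent per tick
        self.queue = deque()

    def enqueue(self, data):
        """Feed outgoing data into the queue instead of writing it directly."""
        self.queue.extend(data)

    def on_tick(self):
        """Call from a 1-second timer: sends up to `chunk` queued bytes."""
        if not self.queue:
            return 0
        n = min(self.chunk, len(self.queue))
        burst = bytes(self.queue.popleft() for _ in range(n))
        self.port.write(burst)
        return n

class FakePort:
    """In-memory stand-in for a serial port, for demonstration only."""
    def __init__(self):
        self.written = b""
    def write(self, data):
        self.written += data

sender = ThrottledSender(FakePort(), chunk=100)
sender.enqueue(bytes(range(250)))
# Three ticks drain 100, 100, then 50 bytes - at most 100 bytes per second.
```

Wire `on_tick` to whatever periodic timer your framework provides; with a 1000 ms interval and `chunk=100`, throughput is capped at 100 bytes per second, as described above.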

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
