While the answer in solution 1 is entirely correct, I have serious doubts that your goals are anywhere close to reality. To perform any meaningful computation, you'd be hard pressed to find a processor that can do it in less than 1 nanosecond: even at 3 GHz, a single clock cycle lasts about 0.33 ns, and real work takes many cycles. Considering real-time concerns at a timescale below that is simply unrealistic.
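To make the software side concrete, here is a minimal C sketch (my own illustration, assuming a POSIX system with clock_gettime(); the loop count is arbitrary) that measures how long merely reading the high-resolution clock takes. On typical desktop hardware the answer is on the order of tens of nanoseconds per call, i.e. tens of thousands of picoseconds, before your program has done anything useful:

```c
/* Measure the average cost of a single clock_gettime() call.
 * Build with: gcc -O2 -o timerfloor timerfloor.c   (add -lrt on older glibc) */
#include <stdio.h>
#include <time.h>

int main(void)
{
    enum { N = 1000000 };
    struct timespec start, end, scratch;

    clock_gettime(CLOCK_MONOTONIC, &start);
    for (int i = 0; i < N; i++)
        clock_gettime(CLOCK_MONOTONIC, &scratch);   /* the call being timed */
    clock_gettime(CLOCK_MONOTONIC, &end);

    long long ns = (end.tv_sec - start.tv_sec) * 1000000000LL
                 + (end.tv_nsec - start.tv_nsec);
    printf("average cost of one clock_gettime() call: %.1f ns\n",
           (double)ns / N);
    return 0;
}
```

If even the timestamp primitive costs tens of nanoseconds, scheduling a reaction in picoseconds is off the table from the software side alone.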
More importantly, you're interfacing with hardware, which itself introduces delays in signal processing. Most common hardware devices have reaction times in the microsecond range. If you need nanoseconds, it gets expensive. Faster than that? I don't even know if such hardware exists!
In other words, if you want to get into the picosecond range, your concern should be the hardware sensor and actuator interfaces: these will be the bottleneck, not the programming!
P.S.: just to give you an idea of what is possible, I looked up ADCs (analog-to-digital converters), one of which you'll need to interpret whatever external signal is responsible for triggering your program:
http://en.wikipedia.org/wiki/Analog-to-digital_converter
Quote:
Flash ADCs are certainly the fastest type of the three. The conversion is basically performed in a single parallel step. For an 8-bit unit, conversion takes place in a few tens of nanoseconds.
Since the conversion is performed in parallel, it won't matter (much) how many bits you require, but the conversion alone already costs tens of nanoseconds before your program can even be invoked. So even with a supercomputer, if you had one, you couldn't realistically program for a real-time response in the picosecond range, let alone the femtosecond range!
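To put the P.S. in numbers, here's a back-of-the-envelope sketch (the 3 GHz clock is an illustrative assumption of mine; the 30 ns figure is taken from the quote above):

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative assumptions, not measured values. */
    const double cpu_clock_hz  = 3.0e9;    /* a typical 3 GHz desktop CPU */
    const double adc_latency_s = 30e-9;    /* "a few tens of nanoseconds" */

    printf("one CPU clock cycle: %6.0f ps\n", 1e12 / cpu_clock_hz);   /* ~333 ps */
    printf("one ADC conversion:  %6.0f ps\n", adc_latency_s * 1e12);  /* 30000 ps */
    return 0;
}
```

Before a single instruction of yours executes, the ADC alone eats the equivalent of roughly 90 clock cycles, and a one-picosecond deadline sits more than two orders of magnitude below even a single cycle.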