I used both and had the same accuracy.

Both have the same size in VS: 8 bytes.
So it seems that both have the same accuracy.

Using high_resolution_clock, it is possible to read the microseconds part (a minimal sketch follows the code below).

So perhaps it is better to try high_resolution_clock first and, if errors appear, fall back to the old method, which is easier.

What I have tried:

cout<<"Size of std::chrono::time_point<std::chrono::system_clock> "<<sizeof(std::chrono::time_point<std::chrono::system_clock>)<<endl;
cout<<"Size of time_t                                             "<<sizeof(time_t)<<endl;

1 solution

It is not the same: std::system_clock is a real-time clock, while the clock_t value returned by clock() is based on CPU time and behaves like what is called a monotonic or steady clock.

Monotonic clocks only ever increment, at a fixed rate, while real-time clocks may also be adjusted (set to a different value) at intervals by comparison with another clock.

The bit size of a type tells you nothing about its resolution or accuracy. This applies especially to time types, where even types of the same bit size and unit may have different accuracies due to the underlying clock implementation.

If you want to measure and compare time spans, you should use a monotonic clock. For short time spans, use the high resolution clock, but note that on some systems it might have the same resolution as clock_t. The available resolution depends on the operating system and the hardware (CPU) and usually has to be measured: functions like clock_getres() just return the resolution of the time value, which is only the theoretical maximum achievable resolution.
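
For illustration, a minimal sketch of such a measurement: it spins until the clock value changes and reports the smallest observed step. The result is only an estimate and will vary between runs and machines.

#include <chrono>
#include <iostream>

// Estimate the effective resolution of a chrono clock by polling until the
// reported time changes; the observed step can be much coarser than the
// tick period advertised by Clock::period.
template <typename Clock>
long long measured_step_ns()
{
    auto t0 = Clock::now();
    auto t1 = t0;
    while (t1 == t0)          // spin until the clock actually ticks
        t1 = Clock::now();
    return std::chrono::duration_cast<std::chrono::nanoseconds>(t1 - t0).count();
}

int main()
{
    using namespace std::chrono;

    // Theoretical tick from the type (comparable to what clock_getres() reports)...
    std::cout << "high_resolution_clock period: "
              << high_resolution_clock::period::num << "/"
              << high_resolution_clock::period::den << " s\n";

    // ...versus the step actually observed at run time.
    std::cout << "steady_clock measured step:          "
              << measured_step_ns<steady_clock>() << " ns\n";
    std::cout << "high_resolution_clock measured step: "
              << measured_step_ns<high_resolution_clock>() << " ns\n";
    return 0;
}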
 
Comments
Javier Luis Lopez 10-Oct-17 7:06am    
On my computer, microseconds = 0 when using system_clock.

Is high_resolution_clock usually available on x64 systems?
And how can I check whether high_resolution_clock is available, so that a #define can select one or the other without generating an error?
Jochen Arndt 10-Oct-17 7:35am    
"In my computer microseconds=0 when using system_clock."
?
On Windows, the default clock resolution is 15 ms; the lowest (best) is 0.5 ms.

"Is usually available the high_resolution_clock in x64 systems?
How to check if the high_resolution_clock is available in order to use a #define to use one or another without generate an error?"

http://en.cppreference.com/w/cpp/chrono/high_resolution_clock:
"Class std::chrono::high_resolution_clock represents the clock with the smallest tick period provided by the implementation. It may be an alias of std::chrono::system_clock or std::chrono::steady_clock, or a third, independent clock."

So it is always present, but it might be the same as another clock. You can't test for that with a #define. If you need the real resolution, you have to measure it.
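
You can, however, check at compile time which clock high_resolution_clock actually is. A small sketch using std::is_same:

#include <chrono>
#include <iostream>
#include <type_traits>

int main()
{
    using namespace std::chrono;

    // high_resolution_clock always exists in C++11, but the standard allows
    // it to be a mere alias of one of the other two clocks.
    std::cout << std::boolalpha
              << "alias of system_clock: "
              << std::is_same<high_resolution_clock, system_clock>::value << "\n"
              << "alias of steady_clock: "
              << std::is_same<high_resolution_clock, steady_clock>::value << "\n";
    return 0;
}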
Javier Luis Lopez 11-Oct-17 2:33am    
The problem is not only the resolution but also avoiding errors in multi-platform software, using something like:
#ifndef _HIGH_RESOLUTION_CLOCK
#define CLOCK system_clock
#else
#define CLOCK high_resolution_clock
#endif
Jochen Arndt 11-Oct-17 2:45am    
It is not a problem when using a C++11 compiler: high_resolution_clock is then always present (though it might be an alias).
When not using a C++11 compiler, neither system_clock nor high_resolution_clock is present, so there is nothing to select between anyway.
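
For illustration, a minimal sketch of such a selection (the AppClock alias name is just an example). It keys off the standard __cplusplus macro rather than a nonstandard clock macro; note that MSVC reports __cplusplus as 199711L unless compiled with /Zc:__cplusplus.

#if __cplusplus >= 201103L
    #include <chrono>
    #include <iostream>

    // C++11 or later: high_resolution_clock is guaranteed to exist
    // (possibly as an alias of system_clock or steady_clock).
    using AppClock = std::chrono::high_resolution_clock;

    int main()
    {
        AppClock::time_point start = AppClock::now();
        // ... code to be timed ...
        AppClock::time_point stop = AppClock::now();
        std::cout << std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count()
                  << " us" << std::endl;
        return 0;
    }
#else
    #include <cstdio>
    #include <ctime>

    int main()
    {
        std::clock_t start = std::clock();
        /* ... code to be timed ... */
        std::clock_t stop = std::clock();
        std::printf("%f s\n", (double)(stop - start) / CLOCKS_PER_SEC);
        return 0;
    }
#endif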



