Good day all,
I've been working on a project at work that transfers a user's files from their old laptop or PC to their new one. WCF is used to transfer the files over a crossover cable or other network connection. We've been running calibration tests to predict the estimated copy time from the number of files and the size of the data (also accounting for hardware).

Something we've noticed is that transfer speeds are much slower the first time the tool is run. After that, it's almost as though some caching has taken place in the background, because the tool gets through the same data (or different data) much faster. Even rebooting the machine doesn't remove this discrepancy; only rebuilding it from scratch does.

Does anyone have any good thoughts on what might be occurring here, and how to avoid it? 99% of users will only ever run the tool once, so they will always see the slower speed. What I'd like to know is whether there is some cache I can flush to reset the machine to a "first-run" state, or whether there is something else I can do.
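For what it's worth, here's a minimal timing sketch I've been using to reproduce the effect outside WCF (Python just for illustration; the behaviour is the OS file cache, not anything tool-specific). The file path and sizes are arbitrary, and note the caveat in the comments: a pass is only truly "cold" if the data isn't already sitting in the cache from an earlier write or read.

```python
import os
import tempfile
import time

# Create a scratch file large enough that reading it takes measurable time.
path = os.path.join(tempfile.gettempdir(), "cache_demo.bin")
with open(path, "wb") as f:
    f.write(os.urandom(16 * 1024 * 1024))  # 16 MB of random data

def timed_read(p):
    """Read the whole file in 1 MB chunks and return elapsed seconds."""
    start = time.perf_counter()
    with open(p, "rb") as f:
        while f.read(1024 * 1024):
            pass
    return time.perf_counter() - start

# Caveat: because we just wrote the file, the OS may already have it cached,
# so the "cold" pass here is only genuinely cold for a file created in an
# earlier session (or after the cache has been flushed).
cold = timed_read(path)
warm = timed_read(path)  # usually served straight from the OS cache
print(f"cold: {cold:.3f}s  warm: {warm:.3f}s")
os.remove(path)
```

On a genuinely cold file the second pass is typically much faster, which matches what we're seeing with the tool.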
Thanks for your time, people,
Jib