IE 11's developer tools report a total of 125 ms for a simple AJAX JSON GET request. For the same request, Fiddler shows:
    ACTUAL PERFORMANCE
    ------------------
    ClientConnected:     19:59:32.433
    ClientBeginRequest:  19:59:32.480
    GotRequestHeaders:   19:59:32.480
    ClientDoneRequest:   19:59:32.480
    Determine Gateway:   0ms
    DNS Lookup:          0ms
    TCP/IP Connect:      0ms
    HTTPS Handshake:     0ms
    ServerConnected:     19:59:32.448
    FiddlerBeginRequest: 19:59:32.480
    ServerGotRequest:    19:59:32.480
    ServerBeginResponse: 19:59:32.573
    GotResponseHeaders:  19:59:32.573
    ServerDoneResponse:  19:59:32.573
    ClientBeginResponse: 19:59:32.573
    ClientDoneResponse:  19:59:32.573

    Overall Elapsed:     0:00:00.093
What could be the reason for this difference (93 ms vs. 125 ms)? Are the developer tools simply inaccurate, or do they include some additional time (such as looking up the resource in the local cache)? Sometimes the difference is much larger (e.g., 3 ms vs. 57 ms). For comparison, I looked at google.com, where the AJAX request timings are almost identical in Fiddler and IE, so I assume there is something I could improve on my site.
The first thing to understand is that, in Windows, the default clock resolution is 15.6 ms, so any measurement may be off by that much. In Fiddler, click Tools > Fiddler Options > Enable High Resolution Timers to change the system clock resolution while Fiddler is running.
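One way to see the clock-granularity effect for yourself, and to get a timing that doesn't depend on either tool, is to measure the request in script. This is a minimal sketch (TypeScript, compiled for the browser), assuming a same-origin endpoint at the hypothetical URL /api/data.json:

    // Date.now() ticks with the system clock (~15.6 ms granularity by
    // default on Windows); performance.now() is a monotonic,
    // sub-millisecond timer that IE 10+ supports.
    const coarseStart = Date.now();
    const fineStart = performance.now();

    const xhr = new XMLHttpRequest();
    xhr.open("GET", "/api/data.json"); // hypothetical endpoint
    xhr.onload = () => {
        console.log("Date.now():        " + (Date.now() - coarseStart) + " ms");
        console.log("performance.now(): " +
            (performance.now() - fineStart).toFixed(3) + " ms");
    };
    xhr.send();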
Next, understand that Fiddler shows you the actual network time, while the IE developer tools should be including both the network time and any associated overhead, such as checking the cache and internal queueing.
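You can get a rough sense of that overhead from the Resource Timing API, which IE 11 supports. A sketch, again assuming the hypothetical same-origin /api/data.json request: everything before requestStart is time Fiddler never sees, while requestStart through responseEnd is roughly the portion it does measure.

    // Split one request's duration into browser-side overhead and
    // network time, using the Resource Timing API.
    const entries =
        performance.getEntriesByType("resource") as PerformanceResourceTiming[];

    entries
        .filter((e) => e.name.indexOf("/api/data.json") !== -1) // hypothetical URL
        .forEach((e) => {
            // startTime .. requestStart: cache lookup and queueing, plus
            // DNS/TCP setup if a new connection was needed.
            const overhead = e.requestStart - e.startTime;
            // requestStart .. responseEnd: roughly what Fiddler measures.
            const network = e.responseEnd - e.requestStart;
            console.log("browser overhead: " + overhead.toFixed(1) + " ms");
            console.log("network time:     " + network.toFixed(1) + " ms");
            console.log("total duration:   " + e.duration.toFixed(1) + " ms");
        });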
Generally speaking, millisecond-level timing analysis isn't a great use of a web developer's time. Instead, look for duplicated resources, uncompressed data, redundant data, unnecessary redirects, and so on. Only bother getting down to the millisecond when you have complete control over your network infrastructure and are in a position to improve things at that low a level.
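As one quick check from that list, sketched under the same same-origin assumption, you can ask whether the JSON response is actually compressed:

    // Read the Content-Encoding header of the reply. No value on a
    // sizeable JSON response suggests enabling gzip/deflate server-side.
    const probe = new XMLHttpRequest();
    probe.open("GET", "/api/data.json"); // hypothetical endpoint
    probe.onload = () => {
        console.log("Content-Encoding: " +
            (probe.getResponseHeader("Content-Encoding") || "(none)"));
    };
    probe.send();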