There are several ways to calculate Packet Delay Variation (PDV, or jitter): RFC 3393 defines IPDV, RFC 3550 defines an interarrival jitter estimator for RTP, and various tools use their own definitions. Several different delays can be measured as well (TCP ACK round-trips, specific UDP responses, other protocol-dependent timings...).
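To make the question concrete, here is a sketch of the RFC 3550 interarrival jitter estimator, which is an exponentially weighted moving average with gain 1/16. The timestamps below are illustrative values I made up; in real RTP they would come from the packet's RTP timestamp and the receiver's arrival clock, expressed in the same units.

```python
def rfc3550_jitter(send_times, arrival_times):
    """Return the running RFC 3550 interarrival jitter after each packet pair."""
    jitter = 0.0
    history = []
    for i in range(1, len(send_times)):
        # D(i-1, i): change in relative transit time between consecutive packets
        d = (arrival_times[i] - arrival_times[i - 1]) - (send_times[i] - send_times[i - 1])
        # EWMA update with gain 1/16, as specified in RFC 3550
        jitter += (abs(d) - jitter) / 16.0
        history.append(jitter)
    return history

# Illustrative example: packets sent every 20 ms; the third arrives 5 ms late.
send = [0, 20, 40, 60]
arrive = [10, 30, 55, 70]
print(rfc3550_jitter(send, arrive))  # [0.0, 0.3125, 0.60546875]
```

Note how a single 5 ms deviation only moves the estimate by 5/16 ms: the estimator deliberately smooths out short-term variation.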
Apart from protocols that obviously need constant throughput (such as VoIP), do you use jitter, and how important is it? Do you have concrete examples where the information provided by jitter was crucial?
For specific examples, could you explain which type of jitter is involved, and whether it is calculated as "instantaneous" jitter or over a specified time frame?
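By "instantaneous" versus "time frame" I mean something like the following sketch: the per-packet delay variation between consecutive packets, versus a statistic aggregated over a sliding window. The delay values and the window size are illustrative assumptions, not taken from any particular tool.

```python
def instant_pdv(delays):
    """'Instantaneous' jitter: difference between consecutive one-way delays."""
    return [delays[i] - delays[i - 1] for i in range(1, len(delays))]

def windowed_jitter(delays, window=4):
    """Windowed jitter: mean absolute PDV over a sliding window of samples."""
    pdv = [abs(v) for v in instant_pdv(delays)]
    return [sum(pdv[i:i + window]) / window for i in range(len(pdv) - window + 1)]

delays = [10, 12, 11, 15, 10, 13]  # one-way delays in ms (illustrative)
print(instant_pdv(delays))      # [2, -1, 4, -5, 3]
print(windowed_jitter(delays))  # [3.0, 3.25]
```

The two views can tell very different stories: a single large spike dominates the instantaneous series but is diluted in the windowed one.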
(As an optional question: do you use spectral packet-delay-variation graphs that gather jitter across all the different time frames? Is that of any interest?)