
There are several ways to calculate Packet Delay Variation (PDV, or jitter): there is an RFC on the subject (RFC 3393), and various tools define the notion differently. Several kinds of delay can also be involved (the ACK round trip in TCP, a specific response in UDP, other protocol-dependent delays...).
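To make the ambiguity concrete, here is a sketch of one widely used definition: the interarrival jitter estimator from RFC 3550 (RTP), which smooths the absolute difference in transit time between consecutive packets with a 1/16 gain. The function name and the timestamp-list interface are mine, for illustration only.

```python
def rtp_jitter(timestamps_sent, timestamps_recv):
    """Estimate interarrival jitter as in RFC 3550 (RTP):
        J(i) = J(i-1) + (|D(i-1, i)| - J(i-1)) / 16
    where D(i-1, i) is the difference between the one-way
    transit times of packets i and i-1.
    Hypothetical interface: two parallel lists of send and
    receive timestamps (seconds), one entry per packet."""
    jitter = 0.0
    prev_transit = None
    for sent, recv in zip(timestamps_sent, timestamps_recv):
        transit = recv - sent  # one-way transit time of this packet
        if prev_transit is not None:
            d = abs(transit - prev_transit)
            jitter += (d - jitter) / 16.0  # running smoothed estimate
        prev_transit = transit
    return jitter
```

Note that this is a smoothed, "running" value; a tool that instead reports the raw per-pair difference, or a max-min spread over a window, will show different numbers for the same trace.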

Apart from protocols that obviously need a constant throughput (such as VoIP), do you use jitter, and how important is it? Do you have concrete examples where the information provided by jitter is crucial?

In those specific examples, could you explain which type of jitter is involved, and whether it is calculated as "instant" jitter (per packet pair) or over a specified time frame?
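The "instant versus time frame" distinction can be illustrated with a minimal sketch: instead of a per-packet value, bucket one-way delays into fixed windows and report the delay spread (max minus min) per window. The function name, the window metric, and the list-based interface are assumptions for illustration, not a standard definition.

```python
from collections import defaultdict

def windowed_pdv(arrival_times, delays, window=1.0):
    """Hypothetical 'time frame' jitter metric: group one-way
    delays (seconds) by fixed windows of `window` seconds on the
    arrival clock, and report max - min delay per window."""
    buckets = defaultdict(list)
    for t, d in zip(arrival_times, delays):
        buckets[int(t // window)].append(d)
    # spread between largest and smallest one-way delay per window
    return {w: max(ds) - min(ds) for w, ds in sorted(buckets.items())}
```

A per-packet ("instant") variation reacts to every spike, while a windowed spread like this hides short bursts inside the window but is easier to graph and alert on.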

(As an optional question: do you use spectral packet delay variation graphs, which gather jitter across all the different time frames? Is that of any interest?)

1 Answer


Jitter can be part of calculating quality of service. This is not limited to VoIP; it also applies to streaming video and audio, for instance.

You can use jitter (along with other parameters, of course) to get an early warning of worsening service quality. If you correlate the measured jitter with the current server load, it may even be possible to detect DDoS attacks or other developing problems.
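A minimal sketch of that early-warning idea, under assumptions of my own: take the first few jitter samples as the "normal" baseline and flag the latest sample when it exceeds a multiple of that baseline. The function name, the baseline length, and the factor are all hypothetical tuning choices.

```python
def jitter_alert(samples, baseline_n=10, factor=3.0):
    """Hypothetical early-warning check: flag when the latest
    jitter sample exceeds `factor` times the mean of the first
    `baseline_n` samples (assumed to represent normal load)."""
    baseline = sum(samples[:baseline_n]) / baseline_n
    return samples[-1] > factor * baseline
```

In practice you would combine this with the load metric (e.g. only alert when jitter rises while load stays flat, which points at the network rather than the server), but the thresholding skeleton is the same.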

HS.