Jitter is technically referred to as packet delay variation: the variance, measured in milliseconds (ms), of the delay between data packets travelling over a network. It is typically a disruption of the normal sequence in which data packets are sent, meaning that the delay fluctuates as packets cross the network; in some cases this fluctuation can add 50 ms or more to packet transfers. As a result, the network becomes congested as devices compete for the same bandwidth, and the more congested it gets, the greater the likelihood of packet loss.
When a user visits a website, the site's data travels as a collection of packets sent from a server over the network to the user's computer or connected device, where the web browser assembles and loads them. With high jitter, some packets (say, three of them) are not delivered when requested; once the delay elapses, all three arrive at once, overloading the requesting device and contributing to congestion and packet loss across the network. Jitter can be likened to a traffic jam: data cannot move at a reasonable speed because all the packets reach the same junction at the same time, nothing can be loaded, and the receiving device cannot process the information, so some of it goes missing.

When packets do not arrive consistently, the receiving endpoint has to compensate and attempt to correct the loss; in some instances an exact correction cannot be made, and the lost data becomes irretrievable. Network congestion arises when a network cannot forward as much traffic as it receives, so the packet buffers fill up and packets start being dropped. Although jitter is considered an obstacle that can delay, disrupt, or even break communication over a network, some fluctuations are anomalous and short-lived. In those situations jitter is not much of a problem, because certain levels of jitter can be tolerated, such as the following:
The figures above show conditions under which jitter is acceptable. Acceptable jitter simply refers to the willingness to tolerate irregular fluctuations in data transfer.[1:1]
For best performance, jitter should be kept below 20 milliseconds. Above 30 milliseconds it has a noticeable impact on the quality of any real-time conversation: the user begins to experience distortion that disrupts the call and makes messages difficult for other participants to understand. How severe the effect is depends on the service being used. In some services jitter is barely noticeable, while in others, such as voice and video calls, it is significant. Voice calling is the service most often cited as being seriously harmed by jitter, primarily because of the way VoIP data transfer occurs: the user's voice is broken down into separate packets that are transmitted to the caller on the other side.[1:2] Measuring jitter requires calculating the average packet-to-packet delay, and this can be done in several ways depending on the type of traffic.
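One simple way to sketch such a calculation, assuming jitter is taken as the mean absolute difference between consecutive one-way packet delays (the function name and sample values below are illustrative, not a standard implementation):

```python
def mean_jitter(delays_ms):
    """Average packet-to-packet delay variation, in milliseconds.

    delays_ms: one-way delays of consecutive packets, in arrival order.
    Jitter is computed here as the mean absolute difference between
    the delays of successive packets.
    """
    if len(delays_ms) < 2:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(delays_ms, delays_ms[1:])]
    return sum(diffs) / len(diffs)

# Five packets whose delays fluctuate between 40 ms and 90 ms
print(mean_jitter([40, 90, 45, 85, 50]))  # 42.5 ms of average jitter
```

A steady connection with identical delays would report 0 ms of jitter even if the absolute delay were high, which is exactly the delay-versus-jitter distinction discussed below.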
Delay refers to the amount of time it takes for a bit of data to move from one endpoint to another. It strongly affects the user experience and depends on several factors, and it is made up of four components: processing delay, queueing delay, transmission delay, and propagation delay. Jitter, by contrast, refers specifically to inconsistencies in delay: the discrepancy between the delays of two packets, which can result in packet loss and network congestion. Jitter and delay are therefore closely tied to each other, but they are not the same.
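As a rough illustration of how the four components add up for a single hop (all values are hypothetical; transmission delay is packet size over link rate, and propagation delay is distance over signal speed):

```python
# Illustrative one-hop delay budget (all values hypothetical)
packet_bits = 1500 * 8          # a 1500-byte packet
link_rate_bps = 10_000_000      # a 10 Mbps link

processing_ms = 0.5                                   # routing lookup, header checks
queueing_ms = 2.0                                     # waiting in the output queue
transmission_ms = packet_bits / link_rate_bps * 1000  # pushing the bits onto the link
propagation_ms = 1_000_000 / 200_000_000 * 1000       # 1000 km at ~2e8 m/s in fiber

total_ms = processing_ms + queueing_ms + transmission_ms + propagation_ms
print(round(total_ms, 1))  # 8.7
```

Queueing delay is the component that varies most from packet to packet, which is why it is the main contributor to jitter.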
Latency is the delay through the network: the time a data packet needs to reach its destination from its source. It arises from propagation delay, serialization, and the buffering of packets. Latency is the period from the transmission of a packet at the sender to its reception at the receiver, whereas jitter is the difference between the forwarding delays of two consecutive packets in the same stream.
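The distinction can be made concrete with hypothetical send and receive timestamps: per-packet latency is receive time minus send time, while jitter is the difference between the latencies of consecutive packets.

```python
# Hypothetical timestamps in ms for four packets in the same stream
send_times = [0, 20, 40, 60]
recv_times = [50, 75, 88, 115]

# Latency: how long each packet took end to end
latencies = [r - s for r, s in zip(recv_times, send_times)]

# Jitter: how much that latency varies between consecutive packets
jitters = [abs(b - a) for a, b in zip(latencies, latencies[1:])]

print(latencies)  # [50, 55, 48, 55]
print(jitters)    # [5, 7, 7]
```

Note that a network could have a constant 200 ms latency with zero jitter, or a modest 30 ms average latency with jitter large enough to disrupt a call.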
Jitter is now a common user concern, and its underlying cause is often variation in the average latency of packets. It can be mitigated by purchasing a more capable router, using an Ethernet cable, using a high-speed internet connection, and addressing jitter promptly when it appears. The main tool a system uses to compensate for the effects of jitter is a buffer, or buffer memory: temporary data storage that helps the device smooth out irregular fluctuations in data transfer. Network jitter is difficult to eliminate entirely because it is unpredictable, which is why a quality network connection, good and adequate bandwidth, and predictable latency are important in reducing it.
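A minimal sketch of the buffering idea (the function name, fixed buffer size, and timings below are all illustrative, not a production jitter-buffer design): packets are held for an initial buffer delay and played out at fixed intervals, and a packet that misses its playout slot is treated as lost.

```python
def playout_schedule(arrival_ms, buffer_ms, interval_ms):
    """Schedule packets behind a fixed-size jitter buffer.

    Packet i plays out at first_arrival + buffer_ms + i * interval_ms.
    A packet arriving after its slot has passed is counted as lost.
    Returns (playout_times, lost_packet_indices), all times in ms.
    """
    base = arrival_ms[0] + buffer_ms
    playout, lost = [], []
    for i, t in enumerate(arrival_ms):
        slot = base + i * interval_ms
        if t <= slot:
            playout.append(slot)
        else:
            lost.append(i)
    return playout, lost

# Packets sent every 20 ms but arriving with jitter; packet 3 is too late
arrivals = [100, 118, 145, 195, 180]
print(playout_schedule(arrivals, buffer_ms=30, interval_ms=20))
# → ([130, 150, 170, 210], [3])
```

This shows the trade-off a buffer makes: a larger `buffer_ms` absorbs more jitter but adds latency to every packet, which is why real-time applications keep buffers as small as the network allows.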