The Inevitability of Network Congestion

Reading Time: 4 minutes

Slow page load speeds are one thing, but the frustration people feel with them is only a small part of what grows out of digital network congestion. In the same way motor vehicle traffic becomes more of a problem as cities grow more populated, a network only has so much capacity. When that capacity is exceeded, performance – namely the speed at which requests are handled – starts to suffer. Slower interpersonal communications are well down the list of issues treated with urgency here, and that doesn’t need explanation.

Excess network latency resulting from congestion can be a big problem when major operations rely on those networks. It is an even more looming issue given how healthcare is increasingly relying on 5G connectivity, and that is one area that especially can’t afford lapses or downtime because of network congestion. What is interesting about this congestion issue is that some key algorithms designed to control these delays on computer networks are actually allowing some users to claim most of the bandwidth while others get essentially nothing.

Network speeds are of operational interest to a lot of service providers, and here at 4GoodHosting that applies to us as a Canadian web hosting provider like any other. This is definitely a worthy topic of discussion, because every one of us with a smartphone relies on some network functioning as it should every day. So what’s to be made of increasing network congestion?

Average Algorithms / Jitters

A better understanding of how networks work may be the place to start. Computers and other devices that send data over the internet first break it into smaller packets, and special algorithms then decide how fast those packets should be sent. These congestion-control algorithms aim to discover and exploit all the available network capacity while sharing it with other users on the same network.

As mentioned above, though, these congestion-control algorithms don’t always work well. A user’s computer does not know how fast to send data packets because it lacks knowledge about the network. Sending packets too slowly makes poor use of the available bandwidth, but sending them too quickly can overwhelm the network and cause packets to get dropped.
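As a rough illustration, here is a minimal Python sketch of the additive-increase/multiplicative-decrease (AIMD) probing idea behind classic TCP congestion control. The function name and the numbers are illustrative assumptions, not a real network stack:

```python
# Minimal sketch of additive-increase/multiplicative-decrease (AIMD),
# the classic probing strategy behind TCP congestion control.
# All names and constants here are illustrative, not a real stack.

def aimd_step(cwnd: float, packet_lost: bool) -> float:
    """Return the next congestion window given the last round's outcome."""
    if packet_lost:
        # A loss is read as a congestion signal: back off sharply.
        return max(1.0, cwnd / 2)
    # No loss: probe for spare capacity by growing slowly.
    return cwnd + 1.0

# Toy run against a link that holds roughly 20 packets per round trip.
cwnd = 1.0
for rtt in range(30):
    lost = cwnd > 20  # pretend packets past capacity are dropped
    cwnd = aimd_step(cwnd, lost)
    print(f"RTT {rtt:2d}: window = {cwnd:.1f}")
```

The sawtooth this produces – grow until a drop, then halve – is the sender feeling for capacity it cannot directly observe.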

Congestion-control algorithms use packet losses and delays as signals to infer congestion and to decide how quickly data packets should be sent. But packets can get lost or delayed for reasons other than network congestion, and one cause that is more common now than ever is what the industry calls ‘jitter’. This is where data is held up and then released in a burst alongside other packets, and inevitably some of them get delayed because the whole bulk can’t go at once.
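To see why that confuses a sender, here is a toy Python sketch of a jittery burst; the timing constants are invented purely for illustration:

```python
# Toy illustration of jitter: packets are held and then released in a
# burst, so later packets in the burst see inflated delays even though
# the network itself is not congested. Constants are made up.

SERVICE_TIME = 1.0  # ms to transmit one packet once the link is free

def burst_delays(n_packets: int, hold_ms: float) -> list[float]:
    """Delays seen by packets held for hold_ms, then sent back to back."""
    return [hold_ms + i * SERVICE_TIME for i in range(n_packets)]

print(burst_delays(5, hold_ms=10.0))
# [10.0, 11.0, 12.0, 13.0, 14.0] -> looks like steadily rising queueing
# delay, which is exactly the signal an algorithm reads as congestion
```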

More Predictable Performance

Congestion-control algorithms are not able to distinguish between delays caused by congestion and delays caused by jitter. This is problematic because jitter delays are unpredictable, and the resulting ambiguity confuses senders: they estimate delays differently and end up sending packets at unequal rates. The researchers found this eventually leads to what they call ‘starvation’ – the term for the situation described above, where some users get most of the bandwidth and many get next to nothing.
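A rough way to see how unequal delay estimates turn into starvation is to simulate two delay-based senders sharing one link. Everything here – the link model, the thresholds, the update rules – is an invented toy, not any published algorithm:

```python
# Toy model of 'starvation': two delay-based senders share one link,
# but jitter has left each with a different estimate of the network's
# baseline delay. All constants are invented for illustration.

CAPACITY = 100.0   # packets/ms the link can carry
BASE_DELAY = 5.0   # ms of fixed propagation delay
QUEUE_K = 0.2      # ms of queueing delay per unit of excess load

def measured_delay(total_rate: float) -> float:
    return BASE_DELAY + QUEUE_K * max(0.0, total_rate - CAPACITY)

# Sender A believes the baseline is 8 ms; sender B believes it is 6 ms.
targets = {"A": 8.0, "B": 6.0}
rates = {"A": 50.0, "B": 50.0}

for _ in range(200):
    delay = measured_delay(sum(rates.values()))
    for name, target in targets.items():
        if delay < target:
            rates[name] += 1.0                        # looks uncongested: speed up
        else:
            rates[name] = max(0.1, rates[name] * 0.9)  # looks congested: back off

print({name: round(rate, 1) for name, rate in rates.items()})
# Sender A, with the looser delay estimate, ends up with nearly all of
# the bandwidth while sender B is squeezed down to almost nothing.
```

Neither sender is misbehaving; each follows the same rule with a different reading of the same ambiguous signal, and that alone is enough to starve one of them.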

Even when testing new and better packet-control and sending algorithms, there were always scenarios with each algorithm where some people got nearly all the bandwidth and at least one person got basically nothing. Researchers found that all existing congestion-control algorithms designed to curb delays are ‘delay-convergent’, and this means that starvation continues to be a possibility.

Finding a fix for this is going to be essential if huge growth in network users is to be the reality – and of course it will be, given the way the world is going and continued population growth. What is needed are better algorithms that can enable predictable performance at a reduced cost and, in the bigger picture, systems with predictable performance, which is important since we rely on computers for increasingly critical things.
