Glossary

Network Latency

Easy

Network latency refers to the amount of time it takes for data sent by a computer on one network to reach a computer on another network.

What Is Network Latency?

Network latency is the time it takes for a packet of data to travel from one point to another. It can be measured one way (from sender to receiver) or as a round trip (from the moment a request is sent to the moment its response arrives), and it includes both the time the data spends covering the physical distance and the time it spends being processed along the way. For example, say you're streaming a movie on Netflix. The movie is actually stored on a server in California, but when you click "play," your request passes through several other servers before reaching it. Once your request arrives at the server that stores the movie, it has to be processed before the video can be sent back to you. All of this takes time, and the higher the latency, the longer the delay between the moment you press the play button and the moment the video starts playing.
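To make this measurable, here is a minimal sketch in Python that times a single TCP handshake, which is roughly one network round trip. The hostname is just a placeholder, and real tools such as ping use ICMP instead, but the idea is the same: record when the request leaves and when the answer comes back.

```python
import socket
import time

def measure_latency(host: str, port: int = 443, timeout: float = 5.0) -> float:
    """Return the round-trip time of a single TCP handshake, in milliseconds.

    This approximates network latency: opening the connection involves one
    trip to the remote host and back, including any processing delay along
    the way, but no actual data transfer.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the handshake completing is all we need to time
    return (time.perf_counter() - start) * 1000  # seconds -> milliseconds

# Probe a few times; 'example.com' is an illustrative placeholder host.
for _ in range(3):
    print(f"{measure_latency('example.com'):.1f} ms")
```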

The amount of latency that is acceptable depends on what you're doing online. Real-time activities such as gaming and video chatting are far more sensitive to latency than email and web browsing, so they demand lower latency to feel responsive.

Information technology experts say that about 50 milliseconds (one-twentieth of a second) is as much lag as the human brain can handle. While that's enough for most people, gamers may want even lower latencies: professional gamers consider anything over 100 milliseconds (one-tenth of a second) too much lag for an optimal experience.
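As a rough illustration of those thresholds, the sketch below sorts a measured latency into categories. The 50 ms and 100 ms cut-offs are the figures quoted above; the category labels themselves are illustrative.

```python
def rate_latency(latency_ms: float) -> str:
    """Judge a measured latency against the rough thresholds quoted above."""
    if latency_ms <= 50:   # at or below what the human brain registers as lag
        return "good for any activity, including professional gaming"
    if latency_ms <= 100:  # fine for most uses, marginal for competitive play
        return "acceptable for most activities"
    return "noticeable lag for real-time applications"

print(rate_latency(42))   # good for any activity, including professional gaming
print(rate_latency(180))  # noticeable lag for real-time applications
```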

The lower the measured latency, the better. High latency can be caused by a low-bandwidth connection, overloaded servers, or full router queues.

Network latency is expressed in milliseconds (ms). For example, 200 ms is 0.2 seconds, and 300 ms is 0.3 seconds.

Network latency is crucial when it comes to blockchain and cryptocurrencies, because it directly affects the time it takes for a transaction to be confirmed.

Blockchains are based on a consensus mechanism in which miners rely on speed to earn rewards. Every second that a newly found block isn't propagated and added to the blockchain, potential revenue is lost.
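To see why every second matters, block discovery is commonly modeled as a Poisson process: the longer a freshly mined block takes to propagate, the greater the chance that a competitor finds a rival block in the meantime and the original one goes stale. The sketch below assumes Bitcoin's roughly 600-second average block interval; the model and numbers are illustrative, not taken from the text above.

```python
import math

def stale_risk(delay_s: float, block_interval_s: float = 600.0) -> float:
    """Probability that a competing block appears somewhere on the network
    while our freshly mined block is still propagating.

    Assumes block discovery is a Poisson process producing one block per
    `block_interval_s` seconds on average (600 s is Bitcoin's target).
    """
    return 1.0 - math.exp(-delay_s / block_interval_s)

# Each extra second of latency raises the odds that a rival block
# orphans ours, turning the block reward into lost revenue.
for delay in (1, 5, 15):
    print(f"{delay:>2} s propagation delay -> {stale_risk(delay):.2%} stale risk")
```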

Correlation Between Latency and Throughput

Latency is a measurement of time: how long it takes for a single transaction to be completed. Throughput is a measurement of work: how many transactions are completed per second.
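The two are linked by Little's law: a system that keeps a fixed number of transactions in flight can complete at most concurrency ÷ latency transactions per second. A quick sketch of the arithmetic:

```python
def max_throughput(concurrency: int, latency_s: float) -> float:
    """Little's law rearranged: throughput (tx/s) = concurrency / latency."""
    return concurrency / latency_s

# 10 transactions in flight at 100 ms each -> at most 100 tx/s.
print(max_throughput(10, 0.100))  # 100.0
# Doubling the latency halves the achievable throughput.
print(max_throughput(10, 0.200))  # 50.0
```

Doubling latency halves the throughput you can reach at the same concurrency, which is why the two metrics have to be managed together.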

Trying to increase the throughput of HTTP requests without considering latency will only cause performance issues. The goal should be to increase throughput while keeping latency low.
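One common approach is to overlap requests instead of issuing them one at a time: each request still experiences the same latency, but the waiting happens concurrently, so overall throughput rises. A minimal sketch using Python's asyncio, with asyncio.sleep standing in for network latency:

```python
import asyncio
import time

LATENCY_S = 0.1  # pretend each request spends 100 ms waiting on the network

async def fake_request(i: int) -> int:
    await asyncio.sleep(LATENCY_S)  # stand-in for the network round trip
    return i

async def main() -> None:
    start = time.perf_counter()
    # 50 requests issued concurrently overlap their waiting time, so the
    # total wall-clock time stays close to a single request's latency.
    await asyncio.gather(*(fake_request(i) for i in range(50)))
    elapsed = time.perf_counter() - start
    print(f"50 requests in {elapsed:.2f} s -> {50 / elapsed:.0f} req/s, "
          f"each still ~{LATENCY_S * 1000:.0f} ms of latency")

asyncio.run(main())
```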

The best way to optimize is to decrease the number of HTTP requests or to decrease the response time of your servers. It is also a good idea to consolidate database calls where possible and to cache as much data as possible.
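Caching in particular can eliminate repeated round trips entirely. Below is a minimal sketch using Python's functools.lru_cache; the fetch_user function and its 50 ms cost are hypothetical stand-ins for a real database call.

```python
import functools
import time

@functools.lru_cache(maxsize=1024)
def fetch_user(user_id: int) -> dict:
    """Hypothetical database lookup; the sleep stands in for query latency."""
    time.sleep(0.05)  # pretend the round trip to the database costs 50 ms
    return {"id": user_id, "name": f"user-{user_id}"}

fetch_user(1)  # the first call pays the full 50 ms
fetch_user(1)  # repeat calls are answered from the cache almost instantly
print(fetch_user.cache_info())  # hits=1, misses=1, ...
```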