
Latency vs Throughput

Let us first understand an example, and then we will proceed to the actual definitions:

 

Suppose you are travelling from City A to City B and have to cross a bridge that can carry 1000 vehicles per hour, but due to heavy traffic it is currently carrying only 100 vehicles per hour.

 

Summarizing the situation:

- The maximum capacity of the bridge is 1000 vehicles/hour.

  We can call this Bandwidth.

- Due to the heavy traffic, the actual rate drops to 100 vehicles/hour.

  We can call this Throughput.

- And the time taken to travel from City A to City B is 20 minutes (let us assume).

  We can call this Latency.
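To make the analogy concrete, here is a minimal Python sketch that simply restates these numbers as arithmetic. The convoy of 500 vehicles is a made-up figure used purely for illustration.

```python
# Bridge analogy from the example above, expressed as simple arithmetic.
bandwidth_vph  = 1000   # maximum capacity of the bridge (vehicles per hour)
throughput_vph = 100    # actual rate under heavy traffic (vehicles per hour)
latency_min    = 20     # time for one vehicle to travel from City A to City B (minutes)

convoy = 500            # hypothetical number of vehicles waiting to cross
hours_to_clear = convoy / throughput_vph

print(f"Each vehicle experiences a latency of {latency_min} minutes,")
print(f"but at {throughput_vph} vehicles/hour it takes {hours_to_clear:.0f} hours "
      f"to move all {convoy} vehicles across "
      f"(vs. {convoy / bandwidth_vph:.1f} hours at full bandwidth).")
```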

Now let us learn the actual definitions:

 

1) Latency (delay):

The time taken by a data packet (or request) to travel from the source to the destination.

 

  • We can measure latency in units of time: microseconds, milliseconds, or seconds.
  • When latency is low, the system performs well.
  • When latency is high, the system performs poorly.
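As a rough, hands-on illustration, the Python sketch below times a single HTTP request. The URL https://example.com is just a placeholder, and what is measured here is really the round-trip time, which is a common practical proxy for latency.

```python
import time
import urllib.request

URL = "https://example.com"   # placeholder endpoint, replace with your own service

start = time.perf_counter()
urllib.request.urlopen(URL).read()        # one request from source to destination (and back)
elapsed = time.perf_counter() - start

# Report in milliseconds, one of the usual units for latency
print(f"Round-trip latency: {elapsed * 1000:.2f} ms")
```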

 

2) Throughput:

The number of data packets (or requests) that are processed during a specific time period.

 

  • We can measure throughput in bits per second (e.g. Mbps, Gbps) or in requests per second.
  • When throughput is low, the system performs poorly.
  • When throughput is high, the system performs well.
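Following the same idea, this sketch issues requests in a loop for a fixed window and reports how many completed and how many bits were transferred per second. Again, the URL and the 10-second window are placeholder choices, not part of any particular system.

```python
import time
import urllib.request

URL = "https://example.com"   # placeholder endpoint
WINDOW_SEC = 10               # measurement window in seconds

completed = 0
bytes_received = 0
deadline = time.perf_counter() + WINDOW_SEC

while time.perf_counter() < deadline:
    body = urllib.request.urlopen(URL).read()   # one processed request
    completed += 1
    bytes_received += len(body)

print(f"Throughput: {completed / WINDOW_SEC:.1f} requests/sec, "
      f"{bytes_received * 8 / WINDOW_SEC / 1e6:.2f} Mbit/s")
```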

 

Generally, we should aim for low latency and high throughput.

 

Take a look at the image below; it provides a simple illustration of both these concepts:

[Image: Latency vs Throughput illustration]

Image Credit: https://www.comparitech.com/net-admin/latency-vs-throughput/ 
