Let us first walk through an example, and then move on to the actual definitions:
Suppose you are travelling from City A to City B and have to cross a bridge that can carry 1000 vehicles per hour, but due to heavy traffic it is currently carrying only 100 vehicles per hour.
Summarizing the situation:
- The maximum capacity of the bridge is 1000 vehicles/hour.
We can call this the Bandwidth.
- Due to the heavy traffic, the actual rate drops to 100 vehicles/hour.
We can call this the Throughput.
- The time taken to travel from City A to City B is 20 minutes (let us assume).
We can call this the Latency.
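The bridge numbers above can be put into a tiny sketch. This is purely illustrative; the variable names and the utilization calculation are my own addition, not part of the original example:

```python
# Bridge analogy numbers from the example above (illustrative only).
bandwidth = 1000    # maximum capacity: vehicles/hour
throughput = 100    # actual rate under heavy traffic: vehicles/hour
latency_min = 20    # time for one vehicle to go from City A to City B (minutes)

# Throughput can never exceed bandwidth; here the bridge is badly underutilized.
utilization = throughput / bandwidth
print(f"Utilization: {utilization:.0%}")
```

Note that latency and throughput are independent here: each vehicle still takes 20 minutes to cross, no matter how many vehicles per hour the bridge is moving.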
Now let us look at the actual definitions:
1) Latency (delay):
The time taken by a data packet (or request) to travel from the source to the destination.
2) Throughput:
The number of data packets (or requests) processed during a specific time period.
Generally, we should aim for low latency and high throughput.
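The two definitions above can be sketched as simple measurements. This is a minimal sketch assuming a hypothetical `handle_request` function that stands in for any real request handler; the timing approach uses Python's standard `time` module:

```python
import time

def handle_request():
    """Stand-in for a real request handler (hypothetical work)."""
    time.sleep(0.01)  # simulate 10 ms of processing

# Latency: how long a single request takes from start to finish.
start = time.perf_counter()
handle_request()
latency = time.perf_counter() - start

# Throughput: how many requests complete per unit of time.
n_requests = 20
start = time.perf_counter()
for _ in range(n_requests):
    handle_request()
elapsed = time.perf_counter() - start
throughput = n_requests / elapsed

print(f"Latency: {latency * 1000:.1f} ms")
print(f"Throughput: {throughput:.1f} requests/sec")
```

In a real system you would measure latency per request (often as a percentile, such as p99) and throughput over a sustained window, rather than with a single loop like this.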
Take a look at the image below; it provides a simple illustration of both concepts:
Image Credit: https://www.comparitech.com/net-admin/latency-vs-throughput/