First, let us walk through an example, and then we will move on to the actual definitions:
Suppose you are travelling from City A to City B and have to cross a bridge that can carry 1000 vehicles per hour. However, due to heavy traffic on the bridge, it currently carries only 100 vehicles per hour.
Summarizing the situation:
- The maximum capacity of the bridge is 1000 vehicles/hour.
We can call this the Bandwidth.
- Due to the heavy traffic, the actual rate drops to 100 vehicles/hour.
We can call this the Throughput.
- The time taken to travel from City A to City B is 20 minutes (let us assume).
We can call this the Latency.
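The bridge numbers above can be put into a small sketch. This is only a toy calculation with the hypothetical values from the analogy, showing that throughput is bounded by bandwidth and that the two are distinct quantities:

```python
# Toy numbers from the bridge analogy (hypothetical values).
bandwidth = 1000    # maximum capacity, vehicles/hour
throughput = 100    # actual rate under heavy traffic, vehicles/hour
latency_min = 20    # time for one vehicle to cross, minutes

# Throughput can never exceed bandwidth; here the bridge
# runs at only 10% of its capacity.
utilization = throughput / bandwidth
print(f"Utilization: {utilization:.0%}")
print(f"Vehicles crossed in 3 hours: {throughput * 3}")
```

Note that the latency (20 minutes per vehicle) is independent of the throughput: even if traffic cleared and the bridge carried its full 1000 vehicles/hour, each individual vehicle would still take 20 minutes to cross.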
Now let us look at the actual definitions:
1) Latency (delay):
The time taken by a data packet (or request) to travel from its source to its destination.
2) Throughput:
The number of data packets (or requests) processed during a specific time period.
Generally, we should aim for low latency and high throughput.
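The two definitions above can be made concrete with a small measurement sketch. This is an illustrative example, not a standard API: `measure`, `request`, and `n` are placeholder names chosen for this sketch, and the 1 ms sleep stands in for real network or processing delay:

```python
import time

def measure(request, n=100):
    """Time a callable `request` over n calls and report
    average latency (seconds/request) and throughput
    (requests/second). Names here are illustrative."""
    start = time.perf_counter()
    for _ in range(n):
        request()
    elapsed = time.perf_counter() - start
    latency = elapsed / n      # average time per request
    throughput = n / elapsed   # requests completed per second
    return latency, throughput

# Example: a "request" that simply sleeps for about 1 ms.
lat, thr = measure(lambda: time.sleep(0.001), n=50)
print(f"latency ~ {lat * 1000:.2f} ms, throughput ~ {thr:.0f} req/s")
```

In this sequential loop, latency and throughput are simple reciprocals of each other; in real systems with concurrency, throughput can rise (many requests in flight at once) even while per-request latency stays the same, just as many vehicles cross the bridge simultaneously.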
Take a look at the image below; it provides a simple illustration of both concepts: