Suppose that we need to distribute a file of size F = 2 Gbits to 5 peers. The server s has an upload rate of uₛ = 50 Mbps. The 5 peers have the following upload rates:
u₁ = 20 Mbps
u₂ = 25 Mbps
u₃ = 14 Mbps
u₄ = 10 Mbps
u₅ = 15 Mbps
The 5 peers have the following download rates:
d₁ = 29 Mbps
d₂ = 33 Mbps
d₃ = 15 Mbps
d₄ = 20 Mbps
d₅ = 27 Mbps

What is the minimum time needed to distribute this file using the client-server model? What is the limiting factor that contributes to the download time under the client-server model: the 'server upload rate', a 'specific client download rate', or the 'combined upload of the clients and the server'?
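As a minimal sketch of the calculation, assuming the standard client-server distribution-time formula D_cs = max(N·F/uₛ, F/d_min), where d_min is the slowest client download rate (all variable names below are illustrative):

# Client-server distribution time, assuming D_cs = max(N*F/u_s, F/d_min).
F = 2000                    # file size in Mbits (2 Gbits)
u_s = 50                    # server upload rate in Mbps
d = [29, 33, 15, 20, 27]    # client download rates in Mbps
N = len(d)                  # number of peers

time_server_limited = N * F / u_s   # server must push N full copies
time_client_limited = F / min(d)    # slowest client must download one copy

D_cs = max(time_server_limited, time_client_limited)
print(f"Server-limited time:   {time_server_limited:.1f} s")
print(f"Slowest-client time:   {time_client_limited:.1f} s")
print(f"Minimum distribution time (client-server): {D_cs:.1f} s")

Under this assumed formula, the server term is 5 × 2000 / 50 = 200 s and the slowest-client term is 2000 / 15 ≈ 133.3 s, so the server upload rate would be the limiting factor.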
