Title: Proxy-Assisted Techniques for Delivering Continuous Multimedia Streams
1. Proxy-Assisted Techniques for Delivering Continuous Multimedia Streams
- Lixin Gao, Zhi-Li Zhang, and Don Towsley
2. Agenda
- Related work
- Proxy-Assisted Video Delivery Architecture
- Proxy-Assisted Catching
- Proxy-Assisted Selective Catching
- Simulation results
- Conclusion
3. Related Work
- Server-push
  -> Typically designed for hot (frequently requested) objects
  -> Fixed number of multicast channels
4. Limitations of Current Technology
- Server and network resources (server I/O bandwidth and network bandwidth) are the major limiting factors in the widespread use of video streaming over the Internet
- Need techniques to efficiently utilize server and network resources
- Service latency and the popularity of each video object should be considered
5. Proxy-Assisted Video Delivery Architecture
6. Advantages of Proxy-Assisted Video Delivery
- Latency reduction without increasing demand on backbone network resources
- Only the initial frames of each video need to be stored at the proxy, so the approach is feasible despite large video data volumes
- The I/O bandwidth requirement on the proxy server is insignificant, since each proxy is responsible for only a limited number of clients
7. Classification
- Proxy-assisted catching: suited for hot video objects
- Proxy-assisted selective catching: suited even for cold (less frequently requested) video objects
8. Advantages of Proposed Architectures
- Reduce the resource requirements at the central server
- Reduce the service latency experienced by clients
Assumptions
- A client can receive data from 2 channels simultaneously
9. Proxy-Assisted Catching
- Reduces service latency by allowing clients to join an ongoing broadcast
- Clients catch up by retrieving the missed initial frames over a unicast channel from the proxy (see the sketch below)
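The following is a minimal sketch of the catching idea under simplifying assumptions (constant-bit-rate video, unicast data delivered at the playback rate); names such as catch_up_plan are illustrative and not taken from the paper.

```python
# Minimal sketch of catching: a client joins the ongoing periodic broadcast
# immediately and fetches only the frames it has already missed from the
# proxy over a unicast channel. Names here are illustrative, not the paper's.

def catch_up_plan(arrival_offset: float, first_segment_len: float):
    """arrival_offset: seconds since the current broadcast of the first
    segment started; first_segment_len: its broadcast period in seconds.
    Returns (proxy_unicast_seconds, seconds_until_next_broadcast)."""
    assert 0.0 <= arrival_offset < first_segment_len
    proxy_unicast_seconds = arrival_offset            # frames already broadcast
    time_to_next_broadcast = first_segment_len - arrival_offset
    return proxy_unicast_seconds, time_to_next_broadcast


def expected_proxy_channels(request_rate: float, first_segment_len: float) -> float:
    """Expected concurrent proxy unicast channels by Little's law: Poisson
    arrival rate times the mean catch-up duration (half the first segment)."""
    return request_rate * first_segment_len / 2.0


# Example: a client arrives 30 s into an 85 s first-segment broadcast.
print(catch_up_plan(30.0, 85.0))                      # (30.0, 55.0)
print(expected_proxy_channels(50 / 3600.0, 85.0))     # ~0.59 channels on average
```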
10. Proxy-Assisted Catching
- A partition function is used to divide each video into segments for periodic broadcast (an illustrative example follows)
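The specific partition function used in the paper is not reproduced in this transcript. Purely as an illustration, the sketch below uses a geometric-style partition in which each segment is twice as long as the previous one, which keeps the first segment (and hence the catch-up load) small while still covering the whole video with few channels.

```python
def geometric_partition(video_len: float, num_channels: int, ratio: float = 2.0):
    """Illustrative partition only (not the paper's actual partition function):
    segment j is `ratio` times as long as segment j-1, scaled so that the
    num_channels segments exactly cover a video of length video_len seconds."""
    weights = [ratio ** j for j in range(num_channels)]   # 1, r, r^2, ...
    scale = video_len / sum(weights)
    return [w * scale for w in weights]


# Example: a 90-minute (5400 s) video broadcast on 6 server channels.
segments = geometric_partition(5400.0, 6)
print([round(s, 1) for s in segments])   # first segment is only ~85.7 s long
```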
11. Optimizing
- Server and network bandwidth are the major bottlenecks, hence the goal is to reduce the total number of channels required
- Trade-off between:
  -> the number of channels dedicated by the server
  -> the storage space required by the proxy
12. Terms Involved
- N: number of video objects on the central server
- L: length of a video object
- λ: request rate (Poisson arrivals)
- K: number of server channels used to broadcast a video
- K*: optimal number of server channels
- i: video object index
- j: broadcast segment (frame) index
13. Calculation
- Number of proxy channels required
- Total number of channels required
- Trade-off between the number of server channels and the expected number of proxy channels required for catch-up
14. Calculation (contd.)
- Optimization problem: choose the number of server channels that minimizes the expected total number of channels (numeric sketch below)
- Optimal number of server channels
- Optimal number of proxy channels
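The closed-form expressions on the two calculation slides did not survive the transcript. The following is a hedged numeric sketch of the trade-off they describe, using the illustrative geometric partition introduced above: adding server channels shortens the first segment, which shrinks the expected proxy unicast load, and the optimal K balances the two.

```python
def first_segment_length(video_len: float, k: int, ratio: float = 2.0) -> float:
    """First-segment length under the illustrative geometric partition
    (segment j is `ratio` times segment j-1; the k segments cover the video)."""
    return video_len / sum(ratio ** j for j in range(k))


def expected_total_channels(request_rate: float, video_len: float, k: int) -> float:
    """k dedicated server channels plus the expected number of proxy unicast
    channels for catch-up (arrival rate times half the first-segment length)."""
    return k + request_rate * first_segment_length(video_len, k) / 2.0


def optimal_server_channels(request_rate: float, video_len: float, max_k: int = 30):
    """Brute-force the k that minimizes the expected total channel count."""
    costs = {k: expected_total_channels(request_rate, video_len, k)
             for k in range(1, max_k + 1)}
    k_star = min(costs, key=costs.get)
    return k_star, round(costs[k_star], 2)


# Example: lambda = 50 requests/hour for a 90-minute (5400 s) video.
print(optimal_server_channels(request_rate=50 / 3600.0, video_len=5400.0))
```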
15. Controlled Multicast
- Client-pull technique
- A client joins the ongoing multicast if it arrives within a certain threshold time Ti of that multicast's start
- Otherwise a new multicast channel is allocated
Proxy-Assisted Controlled Multicast
- The proxy pre-stores the initial Ti frames of the video
- The missing portion of the video is sent separately through a unicast channel (see the sketch below)
- A good technique for cold video objects
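A minimal sketch of the threshold rule described above; the class name and fields are illustrative, not from the paper.

```python
import dataclasses


@dataclasses.dataclass
class MulticastState:
    """Tracks the start time of the latest complete multicast of one video."""
    last_start: float = float("-inf")


def serve_request(now: float, state: MulticastState, threshold: float) -> dict:
    """Controlled-multicast threshold rule (illustrative sketch).

    A request arriving within `threshold` seconds of the latest complete
    multicast joins it, and the proxy unicasts the missed prefix that it
    pre-stores (proxy-assisted controlled multicast). Otherwise a new
    multicast channel is started for this request."""
    age = now - state.last_start
    if age <= threshold:
        return {"action": "join_multicast", "proxy_unicast_seconds": age}
    state.last_start = now
    return {"action": "new_multicast", "proxy_unicast_seconds": 0.0}


# Example with a 120 s threshold: the second request patches 90 s from the proxy.
state = MulticastState()
print(serve_request(0.0, state, threshold=120.0))    # starts a new multicast
print(serve_request(90.0, state, threshold=120.0))   # joins it; proxy sends 90 s
```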
16. Comparison with Proxy-Assisted Controlled Multicast
- Total number of channels required for controlled multicast grows with the request rate (hedged approximation below)
- For large values of λ, the number of channels required by proxy-assisted catching is smaller
- Verified using the following setup:
  - L = 90 min. video object
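The closed-form expression referenced on this slide is not in the transcript. As a hedged placeholder, the approximations below reflect results commonly cited for these two schemes (stated here as assumptions, not reproduced from the slides): controlled multicast with an optimal threshold needs on the order of the square root of λL channels, while catching's requirement grows only roughly logarithmically in λL, which is consistent with the slide's claim that catching wins for large λ.

```latex
% Hedged approximations (assumed, not reproduced from the slides):
\[
  C^{\mathrm{cm}}_i \;\approx\; \sqrt{2\,\lambda_i L_i + 1} - 1
  \qquad\text{(controlled multicast, optimal threshold)}
\]
\[
  C^{\mathrm{catch}}_i \;=\; O\!\bigl(\log(\lambda_i L_i)\bigr)
  \qquad\text{(catching)}
\]
```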
17. Observation
18. Proxy-Assisted Selective Catching
- Combines proxy-assisted catching and controlled multicast
- Broadcasts the most frequently requested videos using proxy-assisted catching and delivers the less frequently requested videos using controlled multicast
19. Classifying Hot and Cold Videos
- Compare the total number of channels required using catching with the total number of channels required using controlled multicast
- A video is treated as hot if catching needs fewer channels; otherwise it is treated as cold (sketch below)
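A sketch of the hot/cold decision implied by this slide. It reuses the hedged cost approximations given earlier; both cost functions are illustrative stand-ins, not the paper's exact formulas.

```python
import math


def channels_controlled_multicast(rate: float, video_len: float) -> float:
    """Hedged approximation of the expected channel count with an optimal
    patching threshold (assumed form, not taken from the slides)."""
    return math.sqrt(2.0 * rate * video_len + 1.0) - 1.0


def channels_catching(rate: float, video_len: float, ratio: float = 2.0) -> float:
    """Hedged approximation for catching: best number of server channels plus
    the expected proxy unicast load, under the illustrative geometric partition."""
    best = float("inf")
    for k in range(1, 40):
        first_seg = video_len / sum(ratio ** j for j in range(k))
        best = min(best, k + rate * first_seg / 2.0)
    return best


def is_hot(rate: float, video_len: float) -> bool:
    """Selective catching's rule: serve the video with catching when that needs
    fewer channels than controlled multicast; otherwise treat it as cold."""
    return channels_catching(rate, video_len) < channels_controlled_multicast(rate, video_len)


# Example: a 90-minute video at 50 requests/hour vs. 0.5 requests/hour.
print(is_hot(50 / 3600.0, 5400.0))    # True  -> hot, use catching
print(is_hot(0.5 / 3600.0, 5400.0))   # False -> cold, use controlled multicast
```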
20. Simulation Results
- Simulation settings:
  - N: number of video objects on the central server
  - λ: request rate (Poisson arrivals)
  - Simulates 150 hours of client requests (see the sketch below)
  - Ki: broadcasting channels for hot video objects
  - Remaining channels are used for controlled multicast
  - Requests are served on a first-come-first-served basis
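A minimal sketch of how the 150-hour Poisson request trace might be generated. The Zipf-like split of requests across the N videos is purely an assumption for illustration; the slides do not state the popularity model.

```python
import random


def poisson_arrivals(rate_per_hour: float, hours: float, seed: int = 0):
    """Exponential inter-arrival times yield a Poisson request process."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_per_hour)   # inter-arrival time, in hours
        if t > hours:
            return arrivals
        arrivals.append(t)


def pick_video(num_videos: int, rng: random.Random, skew: float = 1.0) -> int:
    """Assumed Zipf-like popularity: video i is chosen with probability
    proportional to 1 / (i + 1)^skew (an assumption, not from the slides)."""
    weights = [1.0 / (i + 1) ** skew for i in range(num_videos)]
    return rng.choices(range(num_videos), weights=weights, k=1)[0]


# Example: 150 hours of requests at lambda = 50/hour over N = 100 videos.
rng = random.Random(1)
trace = [(t, pick_video(100, rng)) for t in poisson_arrivals(50.0, 150.0)]
print(len(trace), trace[:3])
```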
21. Assumptions
- Sufficient proxy resources to store prefixes for all videos
- The proxy server has 40 GB of storage space and an I/O bandwidth of 88 Mb/s (back-of-the-envelope check below)
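A quick check of what those resources buy. The 1.5 Mb/s playback rate below is an assumption (roughly MPEG-1 quality); the slides state only the 40 GB and 88 Mb/s figures.

```python
# Back-of-the-envelope check of the stated proxy resources.
# The 1.5 Mb/s playback rate is an assumption; only the 40 GB storage
# and 88 Mb/s I/O bandwidth figures come from the slide.

STORAGE_GB = 40
IO_MBPS = 88
PLAYBACK_MBPS = 1.5                       # assumed video bit rate

storage_mbits = STORAGE_GB * 8_000        # 1 GB taken as 8,000 Mb
prefix_minutes = storage_mbits / PLAYBACK_MBPS / 60
concurrent_streams = IO_MBPS // PLAYBACK_MBPS

print(f"~{prefix_minutes:.0f} minutes of prefix data can be cached")
print(f"~{concurrent_streams:.0f} concurrent unicast streams fit the I/O budget")
```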
22. Waiting Time vs. Total Number of Channels
- λ = 50
23. Waiting Time vs. Arrival Rate
- λ varies from 40 to 80
- Total number of channels = 700
24. Total Number of Channels vs. Arrival Rate
- The performance of selective catching and catching is the same
25. Waiting Time vs. Server Channels
- Roughly 36% saving in the number of channels required at the central server
26. Number of Channels vs. Arrival Rate
- Significant reduction in the central server channel requirement
27. Waiting Time vs. Server Channels
- The advantage of proxy-assisted selective catching does not critically depend on the availability of proxy storage space
28. Conclusion
- The approach is validated using realistic simulations without any major assumptions
- If the arrival rate grows beyond the level the system is provisioned for, service latency will increase