1
ARC: A SELF-TUNING, LOW OVERHEAD REPLACEMENT CACHE
  • Nimrod Megiddo Dharmendra S. Modha
  • IBM Almaden Research Center

2
Introduction (I)
  • Caching is widely used in
  • storage systems
  • databases
  • web servers
  • processors
  • file systems
  • disk drives
  • RAID controllers
  • operating systems

3
Introduction (II)
  • ARC is a new cache replacement policy
  • Scan-resistant
  • Outperforms LRU
  • Self-tuning
  • Avoids the tuning-parameter problem of many recent
    cache replacement policies
  • Tested on numerous workloads

4
Our Model (I)
[Diagram: pages are fetched on demand from Secondary Storage (pages) into the Cache/Main Memory (pages); the replacement policy decides which cached page to expel.]
5
Our Model (II)
  • Cache stores uniformly sized items (pages)
  • Pages are fetched into the cache on demand
  • Cache expulsions are decided by the cache
    replacement policy
  • Performance metrics include
  • Hit rate (= 1 - miss rate)
  • Overhead of the policy

6
Previous Work (I)
  • Offline optimal (MIN) replaces the page with the
    greatest forward distance, i.e., whose next
    reference lies farthest in the future
  • Requires knowledge of the future
  • Provides an upper bound on the achievable hit rate
  • Recency (LRU)
  • Most widely used policy
  • Frequency (LFU)
  • Optimal under the independent reference model
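The offline-optimal (MIN) rule above is easy to state directly as code. A minimal sketch (the function name and signature are mine, for illustration only):

```python
def min_victim(cache, future):
    """Belady's offline-optimal choice: evict the cached page whose
    next reference lies farthest in the future."""
    def forward_distance(page):
        try:
            return future.index(page)          # position of next reference
        except ValueError:
            return float("inf")                # never referenced again
    return max(cache, key=forward_distance)

# C is never referenced again, so MIN evicts it:
min_victim(["A", "B", "C"], ["A", "B", "A"])   # -> "C"
```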

7
Previous Work (II)
  • LRU-2 replaces the page with the least recent
    penultimate reference
  • Better hit ratio than LRU
  • Needs to maintain a priority queue
  • Corrected in the 2Q policy
  • Must still decide how long a page that has been
    accessed only once should be kept in the cache
  • The 2Q policy has the same problem

8
Example
Last two references to pages A and B, oldest first: A, B, B, A

LRU expels B, because A was accessed after the
last reference to B.
LRU-2 expels A, because B was accessed twice
after the next-to-last reference to A.
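The two eviction decisions can be checked by computing each policy's victim directly. A minimal sketch (function names are mine, not from the slides), using a reference sequence A, B, B, A consistent with the slide's claims:

```python
from collections import defaultdict

def lru_victim(history):
    """Evict the page with the least recent (last) reference."""
    last = {}
    for t, page in enumerate(history):
        last[page] = t
    return min(last, key=last.get)

def lru2_victim(history):
    """Evict the page with the least recent *penultimate* reference.
    Pages referenced only once count as infinitely old."""
    times = defaultdict(list)
    for t, page in enumerate(history):
        times[page].append(t)
    def penultimate(page):
        ts = times[page]
        return ts[-2] if len(ts) >= 2 else float("-inf")
    return min(times, key=penultimate)

history = ["A", "B", "B", "A"]   # oldest first
lru_victim(history)    # -> "B": A was accessed after B's last reference
lru2_victim(history)   # -> "A": A's penultimate reference is the oldest
```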
9
Previous Work (III)
  • Low Inter-Reference Recency Set (LIRS)
  • Frequency-Based Replacement (FBR)
  • Least Recently/Frequently Used (LRFU) subsumes
    LRU and LFU
  • All require a tuning parameter
  • Automatic LRFU (ALRFU)
  • Adaptive version of LRFU
  • Still requires a tuning parameter

10
ARC (I)
  • Maintains two LRU lists
  • L1: pages that have been referenced only once
  • L2: pages that have been referenced at least twice
  • Each list has the same length c as the cache
  • The cache contains the tops of both lists, T1 and T2
  • The bottoms, B1 and B2, are not in the cache
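The list structure above can be sketched as a small data type. This is an illustrative sketch (class and attribute layout are mine, not the paper's implementation); the tops hold cached pages, while the ghost bottoms record only page identifiers:

```python
from collections import deque

class ArcLists:
    """ARC's four lists: L1 = T1 + B1 (pages seen once),
    L2 = T2 + B2 (pages seen at least twice).  Only T1 and T2
    are in the cache; B1 and B2 are ghost lists of identifiers."""
    def __init__(self, c):
        self.c = c                            # cache size in pages
        self.T1, self.B1 = deque(), deque()   # referenced once
        self.T2, self.B2 = deque(), deque()   # referenced >= twice

    def cached(self, page):
        """A page occupies cache memory only if it is in a top list."""
        return page in self.T1 or page in self.T2
```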

11
ARC (II)
[Diagram: lists L1 and L2, each split into a top (T1, T2) held in the cache and a bottom (B1, B2) kept as ghost caches (not in memory); T1 and T2 together hold c pages.]
12
ARC (III)
  • ARC attempts to maintain a target size target_T1
    for list T1
  • When the cache is full, ARC expels
  • the LRU page from T1 if |T1| ≥ target_T1
  • the LRU page from T2 otherwise
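The eviction rule above can be sketched as follows. This is a simplified illustration (the function name is mine; the full REPLACE routine in the ARC paper also handles a tie-breaking case when the miss hit ghost list B2):

```python
def choose_victim(T1, T2, target_T1):
    """Pick the eviction victim when the cache is full (sketch).
    T1 and T2 are LRU-ordered lists, least recently used first."""
    if T1 and len(T1) >= target_T1:
        return ("T1", T1.pop(0))   # LRU page of T1; its id moves to ghost B1
    return ("T2", T2.pop(0))       # LRU page of T2; its id moves to ghost B2
```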

13
ARC (IV)
  • If the missing page was in bottom B1 of L1, ARC
    increases target_T1
  • target_T1 ← min(target_T1 + max(|B2|/|B1|, 1), c)
  • If the missing page was in bottom B2 of L2, ARC
    decreases target_T1
  • target_T1 ← max(target_T1 - max(|B1|/|B2|, 1), 0)
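The two adaptation rules above can be written as one small update function. A sketch under my own naming (the slides give only the formulas; the guards against empty ghost lists are my addition):

```python
def adapt_target(target_T1, hit_in_B1, len_B1, len_B2, c):
    """Update target_T1 on a ghost-list hit.  A hit in B1 means T1
    was too small; a hit in B2 means T2 was too small."""
    if hit_in_B1:
        delta = max(len_B2 / len_B1, 1) if len_B1 else 1
        return min(target_T1 + delta, c)   # grow T1's target, capped at c
    delta = max(len_B1 / len_B2, 1) if len_B2 else 1
    return max(target_T1 - delta, 0)       # shrink T1's target, floored at 0
```

The step size max(|B2|/|B1|, 1) makes the adaptation faster when the other ghost list is much larger, so the target moves quickly toward the list that is currently earning hits.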

14
ARC (V)
  • Overall result is
  • Two heuristics compete with each other
  • Each heuristic gets rewarded any time it can
    show that adding more pages to its top list would
    have avoided a cache miss
  • Note that ARC has no tunable parameter
  • Cannot get it wrong!

15
Experimental Results
  • Tested over 23 traces
  • Always outperforms LRU
  • Performs as well as more sophisticated policies
    even when they are specifically tuned for the
    workload
  • Sole exception is 2Q
  • Still outperforms 2Q when 2Q has no advance
    knowledge of the workload characteristics.