GHT: A Geographic Hash Table for Data-Centric Storage

1
GHT: A Geographic Hash Table for Data-Centric
Storage
  • Sylvia Ratnasamy et al.
  • Presented by Jay

2
Concepts
  • Data-Centric Storage
  • Data are named, and communication abstractions
    refer to these names rather than to node
    network addresses.
  • Hash Table
  • Distributes data evenly and provides fast
    lookup.
  • Geographic Hash Table
  • Distributes data over geography (sketch below)
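
A geographic hash maps a name to a point in the deployment area
rather than to a slot in a table. A minimal Python sketch of the
idea, assuming a known rectangular boundary; the boundary constants
and the choice of MD5 are illustrative, not from the paper:

import hashlib

# Deployment boundary, assumed known (illustrative values)
X_MAX, Y_MAX = 1000.0, 1000.0

def geographic_hash(name: str) -> tuple:
    """Hash an event name to a fixed (x, y) point inside the boundary."""
    digest = hashlib.md5(name.encode()).digest()
    # Interpret the two 8-byte halves of the digest as fractions in [0, 1)
    x_frac = int.from_bytes(digest[:8], "big") / 2**64
    y_frac = int.from_bytes(digest[8:], "big") / 2**64
    return (x_frac * X_MAX, y_frac * Y_MAX)

print(geographic_hash("elephant-sighting"))  # same name -> same point

Any node that knows the boundary computes the same point for the same
name, which is what lets queries meet stored data without a directory.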

3
Assumptions
  • Large scale
  • Geographic boundaries are known
  • Nodes know their locations
  • Access Points
  • Minimize Energy Consumption
  • Total Usage
  • Hot Spot Usage

4
Sensor Net Data
  • Observations
  • Low-level readings
  • Events
  • Derived from individual readings
  • Tasks
  • User-specified instructions
  • Actions
  • Taken after an event occurrence
  • Queries
  • Retrieve information from the network

5
Data Dissemination
  • Communication Costs
  • Message transmissions for a flood
  • O(n)
  • Point-to-Point Routing
  • O(√n)

6
Canonical Methods
  • External Storage
  • Relevant data sent to central storage
  • Costs O(√n) per event
  • No cost for user queries
  • Local Storage
  • Data stored locally; queries flood the network.
  • Queries cost O(n)
  • Each response costs O(√n)

7
Canonical Methods
  • Data-Centric Storage
  • Store events by name; query by name
  • O(√n) to store
  • O(√n) to query
  • O(√n) to return the response

8
Approximate Communication Costs
  • n: number of nodes
  • T: number of event types
  • D_total: total number of events detected
  • Q: number of event types for which queries are
    issued
  • D_q: number of events detected for the queried
    event types

9
Cost
  • External
  • Total: D_total·√n
  • Hotspot: D_total
  • Local
  • Total: Q·n + D_q·√n
  • Hotspot: Q + D_q

10
Data-Centric Storage
  • Total
  • List
  • Q·√n + D_total·√n + D_q·√n
  • Summary
  • Q·√n + D_total·√n + Q·√n
  • Hotspot
  • List
  • Q + D_q
  • Summary
  • 2·Q (worked example below)
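
Plugging in numbers makes the trade-offs concrete. A small sketch
that evaluates the total-cost formulas above; all parameter values
are illustrative assumptions, not figures from the paper:

from math import sqrt

# Illustrative parameters (assumptions, not from the paper)
n = 1_000_000     # nodes
Q = 10            # event types queried for
D_total = 5_000   # total events detected
D_q = 500         # detected events of the queried types

root_n = sqrt(n)  # point-to-point routing costs O(sqrt(n)) messages

costs = {
    "external storage": D_total * root_n,
    "local storage":    Q * n + D_q * root_n,
    "DCS (list)":       Q * root_n + D_total * root_n + D_q * root_n,
    "DCS (summary)":    Q * root_n + D_total * root_n + Q * root_n,
}
for scheme, total in costs.items():
    print(f"{scheme:18s} {total:12,.0f} messages")

With these values, local storage's Q·n flooding term dominates, and
external storage edges out DCS in total messages but not by much,
matching the Important Points slide; DCS's advantage shows up in the
hotspot (access path) load, not the total.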

11
Important Points
  • As n gets large, local storage incurs the
    highest packet count
  • External storage incurs a lower total message
    count, but not by much
  • If D_q >> Q and events are summarized, DCS has
    the lowest load on the access path
  • If events are listed and D_total >> D_q, then
    DCS and local storage have lower access loads
    than external storage.

12
Where to Use DCS
  • Sensor networks are large
  • There are many detected events but not all are
    queried.

13
Design Criteria
  • Node Failures
  • Topology Changes
  • Scalability
  • Energy Constraints

14
Goals
  • Persistence: a stored (k, v) pair must remain
    available to queriers despite node failures
  • Consistency: a query for key k must be routed
    to the node storing (k, v)
  • Scalability: concentration of storage at any
    one node should be avoided

15
GHT
  • Put(k, v)
  • Get(k)
  • Key k is hashed to a location; the node nearest
    that location stores the (k, v) pair (sketch
    below).
  • PRP: Perimeter Refresh Protocol
  • SR: Structured Replication
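
In outline, Put() and Get() both hash the key to a location and hand
the packet to GPSR, which delivers it to the home node. A minimal
sketch of that flow, reusing geographic_hash() from the Concepts
slide; route_to() is a hypothetical stand-in for GPSR, simulated
here as delivery on a single node:

store = {}   # the home node's local key-value store

def route_to(location, packet):
    # Stand-in for GPSR: a real network forwards the packet
    # geographically to the node nearest `location`.
    on_deliver(packet)

def on_deliver(packet):
    # Runs at the home node, the node closest to the hashed point.
    if packet[0] == "PUT":
        _, key, value = packet
        store.setdefault(key, []).append(value)

def put(key, value):
    """Store (key, value) at the node nearest the hashed location."""
    route_to(geographic_hash(key), ("PUT", key, value))

def get(key):
    """Query the same hashed location; the home node holds the values."""
    route_to(geographic_hash(key), ("GET", key))
    return store.get(key, [])    # stands in for the reply packet

put("elephant-sighting", "reading-17")
print(get("elephant-sighting"))  # ['reading-17']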

16
GPSR
  • Greedy Perimeter Stateless Routing
  • Greedy forwarding (sketch below)
  • Perimeter forwarding
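
Greedy mode forwards a packet to the neighbor geographically closest
to the destination; when no neighbor is closer than the current node,
GPSR switches to perimeter mode and routes around the void using the
right-hand rule on a planarized graph. A sketch of the greedy step
only, with illustrative names (perimeter mode omitted):

from math import dist  # Euclidean distance, Python 3.8+

def greedy_next_hop(self_pos, neighbors, dest):
    """Return the neighbor strictly closer to dest than this node.

    Returns None at a local maximum, where GPSR falls back to
    perimeter-mode forwarding (not shown).
    """
    best, best_d = None, dist(self_pos, dest)
    for node_id, pos in neighbors.items():
        d = dist(pos, dest)
        if d < best_d:
            best, best_d = node_id, d
    return best

# Node at (50, 50), two neighbors, destination (100, 100) -> "a"
print(greedy_next_hop((50, 50), {"a": (60, 70), "b": (40, 40)}, (100, 100)))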

17
Greedy
18
Greedy Problem
19
Perimeter
20
Home Node and Home Perimeter
  • Home Node
  • Node closest to the geographic location.
  • Home Perimeter
  • The perimeter enclosing the location.
  • What happens in the case of mobility and
    failures?

21
Perimeter Refresh Protocol
  • Stores a copy of a key-value pair at each node
    on the home perimeter.
  • These are the replica nodes.
  • Every Th seconds, the home node generates a
    refresh packet.
  • A takeover timer Tt runs at each replica.
  • Td is the death timer.
  • In GHT, Td = 3·Th and Tt = 2·Th (see the sketch
    below).
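
The timers interact simply: the home node refreshes every Th seconds,
a replica whose takeover timer fires without seeing a refresh
promotes itself, and any copy left unrefreshed for Td seconds is
discarded. A schematic sketch with the paper's settings Td = 3·Th and
Tt = 2·Th; the concrete Th value and function shape are assumptions:

TH = 10.0       # home-node refresh interval in seconds (illustrative)
TT = 2 * TH     # takeover timer, per the paper: Tt = 2*Th
TD = 3 * TH     # death timer, per the paper: Td = 3*Th

def timer_action(now, last_refresh, is_home):
    """Decide what a node holding a (k, v) copy should do at time now."""
    age = now - last_refresh
    if age >= TD:
        return "discard"    # death timer fired: drop the stale pair
    if not is_home and age >= TT:
        return "take_over"  # home node has gone silent: become home
    if is_home and age >= TH:
        return "refresh"    # send a refresh packet around the perimeter
    return "wait"

print(timer_action(now=25.0, last_refresh=0.0, is_home=False))  # take_over

The ordering Th < Tt < Td ensures a replica takes over before anyone
discards the pair, and in the failure-free case a refresh arrives
before any takeover timer fires.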

22
PRP
23
PRP
24
PRP
25
Structured Replication
  • Augment the event name with a hierarchy depth
  • Given root r and hierarchy depth d, r has
    4^d - 1 mirror images (sketch below)
  • Storage costs drop from
  • O(√n) to O(√n / 2^d)
  • Query costs go up from
  • O(√n) to O(2^d·√n)
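
At depth d the plane is divided into 4^d sub-squares, and the root
point is mirrored at the same relative offset within every
sub-square, yielding the 4^d - 1 mirror images. Events are stored at
the nearest mirror; queries fan out to all of them. A minimal sketch
of computing the mirror points, assuming a rectangular boundary:

def mirror_points(root, depth, x_max=1000.0, y_max=1000.0):
    """All 4**depth images of root, one per sub-square (root included)."""
    cells = 2 ** depth                   # sub-squares per axis
    w, h = x_max / cells, y_max / cells  # sub-square dimensions
    ox, oy = root[0] % w, root[1] % h    # root's offset within its cell
    return [(i * w + ox, j * h + oy)
            for i in range(cells) for j in range(cells)]

mirrors = mirror_points((120.0, 340.0), depth=2)
assert len(mirrors) == 4 ** 2   # the root plus 4**d - 1 mirror images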

26
Simulation
  • ns-2
  • 802.11 MAC
  • 40 m < 250 m

27
Sim Results
28
SR
29
SR
30
Comparative Study
  • n: number of nodes
  • T: number of event types
  • Q: number of event types queried for
  • D_i: number of detected events of type i

35
Conclusion and Future Work
  • The approach sounds promising
  • Study the effect of varying node density, since
    GHT distributes storage by location
  • Use boundary knowledge to avoid hashing to
    points outside the deployment boundary