Journal of Electronics & Information Technology (电子与信息学报)
2014, Issue 12, pp. 2795-2801 (7 pages)
Ge Guodong, Guo Yunfei, Liu Caixia, Lan Julong (葛国栋, 郭云飞, 刘彩霞, 兰巨龙)
Keywords: Internet; Named Data Networking (NDN); Collaborative caching; Request correlation; Content-based routing
Efficient use of the finite storage space and effective caching of response content pose challenges to the caching policy in Named Data Networking (NDN). Adopting a differentiated-caching approach, this paper proposes a collaborative caching algorithm based on the correlation of content request sequences. When requesting content, parallel predictive requests for subsequent correlated chunks are issued in advance, increasing the probability that requests are answered near the requester. For the caching decision, a two-dimensional differentiated caching policy is proposed that combines the spatial caching location with the cache-residence time. According to the trend of content activity, the caching location is advanced hop by hop in the spatial dimension, spreading truly popular content toward the network edge in a gradual manner, while the cache-residence time is adjusted dynamically in the time dimension. Simulation results show that the proposed algorithm reduces request latency and cache redundancy and achieves a higher cache hit ratio than other caching strategies.
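The two dimensions described in the abstract can be sketched as follows. This is a minimal illustrative model, not the paper's actual algorithm: all class and function names are hypothetical, the LRU store and the fixed base TTL are simplifying assumptions, and the "activity" value stands in for whatever popularity estimate the paper uses. On a cache hit at hop i along the path from requester to producer, the chunk is copied one hop closer to the requester (spatial dimension), and its residence deadline is scaled by its activity (time dimension), so genuinely popular content migrates edgeward gradually.

```python
from collections import OrderedDict


class ContentStore:
    """One router's cache: chunk -> expiry time, with LRU eviction (sketch)."""

    def __init__(self, capacity, base_ttl=10.0):
        self.capacity = capacity
        self.base_ttl = base_ttl
        self.store = OrderedDict()

    def get(self, chunk, now):
        expiry = self.store.get(chunk)
        if expiry is not None and expiry > now:
            self.store.move_to_end(chunk)  # refresh LRU recency
            return True
        self.store.pop(chunk, None)  # expired or absent
        return False

    def put(self, chunk, now, activity=1.0):
        # Time dimension: residence time grows with observed activity.
        self.store[chunk] = now + self.base_ttl * activity
        self.store.move_to_end(chunk)
        while len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used


def request(path, chunk, activity, now=0.0):
    """Resolve a request along `path` (edge router first, producer side last).

    Spatial dimension: on a hit at hop i, the chunk is also cached at
    hop i-1, so each repeated request pushes it one hop closer to the edge.
    Returns the hit distance in hops (len(path) means a full miss).
    """
    for i, router in enumerate(path):
        if router.get(chunk, now):
            if i > 0:
                path[i - 1].put(chunk, now, activity.get(chunk, 1.0))
            return i
    # Full miss: fetch from the producer and cache at the deepest hop only.
    path[-1].put(chunk, now, activity.get(chunk, 1.0))
    return len(path)
```

Repeated requests for the same chunk then shorten the hit distance one hop at a time (3, 2, 1, 0 on a three-router path), mirroring the gradual edgeward push; a low-activity chunk would instead expire before completing the migration.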