2003 International Conference on Parallel Processing, 2003. Proceedings.

Abstract

We introduce a novel approach that predicts whether a block should be allocated in the cache based on its past reuse behavior during its lifetime in the cache. Our evaluation of the scheme shows that prediction accuracy is between 66% and 94% across the applications and can potentially reduce the cache miss rate by between 1% and 32%, with an average of 12%. We also find that, with a modest hardware cost (a table of around 300 bytes), we can cut the miss rate by up to 14% compared to a cache with an always-allocate strategy.
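To make the idea concrete, below is a minimal sketch of a reuse-based allocation (cache bypass) predictor. The abstract does not specify the table organization, indexing function, or counter width, so this sketch assumes a small array of 2-bit saturating counters indexed by a hash of the block address (1024 entries at 2 bits each is 256 bytes, roughly in line with the stated ~300-byte budget); it is an illustration of the general technique, not the paper's exact mechanism.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical reuse-based allocation predictor (assumed organization):
 * 1024 two-bit saturating counters (256 bytes total), indexed by a hash
 * of the block address. A high counter value means "likely to be reused",
 * so the block should be allocated; a low value means "bypass the cache". */
#define TABLE_ENTRIES 1024

static uint8_t reuse_ctr[TABLE_ENTRIES];   /* 2-bit counters, values 0..3 */

static unsigned table_index(uint64_t block_addr)
{
    /* Simple XOR-folding hash; the real indexing scheme is not given. */
    return (unsigned)((block_addr ^ (block_addr >> 10)) & (TABLE_ENTRIES - 1));
}

/* On a cache miss: decide whether to allocate the incoming block. */
bool predict_allocate(uint64_t block_addr)
{
    return reuse_ctr[table_index(block_addr)] >= 2;
}

/* On eviction: train the predictor with the block's observed behavior
 * during its lifetime in the cache (was it referenced again or not). */
void train_on_eviction(uint64_t block_addr, bool was_reused)
{
    uint8_t *c = &reuse_ctr[table_index(block_addr)];
    if (was_reused) {
        if (*c < 3) (*c)++;     /* strengthen "allocate" prediction */
    } else {
        if (*c > 0) (*c)--;     /* strengthen "bypass" prediction */
    }
}
```

In a cache simulator, `predict_allocate` would be consulted on each miss before installing the block, and `train_on_eviction` would be called when a block leaves the cache, feeding back whether it saw any reuse during that lifetime.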