LRU Cache Memory Management in Rust 🦀

An educational-but-production-ready multi-level LRU cache showcasing idiomatic Rust data structures, ownership, and memory safety. Originally built for a university course and now polished for public consumption.

📦 What’s Inside

  • Generic LRUCache<K, V> with O(1) put/get/peek and deterministic LRU eviction
  • Hierarchical MultiLevelCache (L1/L2/L3) with automatic promotions and per-level stats
  • PageManager simulator for page caching, dirty tracking, and defragmentation
  • Extensive unit tests and Criterion benchmarks
  • Friendly demos in main.rs to visualize behavior

🧠 Core Ideas

LRU in Rust:

  • HashMap for O(1) lookup
  • Doubly linked list for recency tracking
  • Rc<RefCell<...>> to keep pointer updates ergonomic yet safe
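
A minimal sketch of how those three pieces fit together. The type and field names below are illustrative, not the crate's actual internals:

use std::cell::RefCell;
use std::collections::HashMap;
use std::rc::Rc;

// Illustrative layout: HashMap gives O(1) lookup, the doubly linked list
// tracks recency, and Rc<RefCell<...>> allows shared, mutable nodes.
type Link<K, V> = Option<Rc<RefCell<Node<K, V>>>>;

struct Node<K, V> {
    key: K,
    value: V,
    prev: Link<K, V>, // toward the MRU end
    next: Link<K, V>, // toward the LRU end
}

struct LruSketch<K, V> {
    capacity: usize,
    map: HashMap<K, Rc<RefCell<Node<K, V>>>>, // key -> node handle
    head: Link<K, V>, // most recently used
    tail: Link<K, V>, // least recently used; evicted first
}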

Multi-level strategy:

  • Hot data in L1, warm in L2, cold in L3
  • Automatic promotion on hits, configurable placement on writes
  • Rich hit/miss accounting with printable stats
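
One plausible shape of the promotion-on-hit path, sketched with three plain HashMaps standing in for the L1/L2/L3 caches (the function and parameter names are hypothetical):

use std::collections::HashMap;
use std::hash::Hash;

// Sketch: check L1 first; on a hit in a lower level, move the entry up to L1.
fn get_and_promote<K, V>(
    l1: &mut HashMap<K, V>,
    l2: &mut HashMap<K, V>,
    l3: &mut HashMap<K, V>,
    key: &K,
) -> Option<V>
where
    K: Eq + Hash + Clone,
    V: Clone,
{
    if let Some(v) = l1.get(key) {
        return Some(v.clone()); // already hot, nothing to move
    }
    if let Some(v) = l2.remove(key) {
        l1.insert(key.clone(), v.clone()); // promote L2 -> L1
        return Some(v);
    }
    if let Some(v) = l3.remove(key) {
        l1.insert(key.clone(), v.clone()); // promote L3 -> L1
        return Some(v);
    }
    None // miss on every level
}

A real implementation would also demote whatever L1 evicts to make room and bump the per-level hit/miss counters.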

Page management demo:

  • Simulated pages with size limits and dirty flags
  • Smart placement based on access frequency
  • Safe flush and eviction rules to avoid losing dirty data

🚀 Quick Start

use mini_lru_cache::LRUCache;

let mut cache = LRUCache::new(2);
cache.put("a", 10);
cache.put("b", 20);
assert_eq!(cache.get(&"a"), Some(10)); // promotes "a"
cache.put("c", 30); // evicts "b" (LRU)
assert_eq!(cache.get(&"b"), None);

Multi-level usage:

use mini_lru_cache::{CacheLevel, MultiLevelCache};

let mut cache = MultiLevelCache::new(1, 2, 4);
cache.put_with_policy("config", "v1", CacheLevel::L1);
cache.put_with_policy("report", "v2", CacheLevel::L3);

assert_eq!(cache.get(&"config"), Some("v1"));
assert!(cache.get(&"report").is_some()); // promoted upward on hit
println!("Hit ratio: {:.2}%", cache.get_hit_ratio() * 100.0);

Page manager:

use mini_lru_cache::PageManager;

let mut pm = PageManager::new(2, 3, 4, 1024);
pm.create_page(1, vec![42u8; 128]).unwrap();
let page = pm.load_page(1).unwrap();
assert_eq!(page.id, 1);

🔑 API Highlights

LRUCache<K, V>

  • new(capacity: usize) -> Self (panics on zero capacity)
  • put(key: K, value: V) – insert or update, promotes to MRU
  • get(&mut self, key: &K) -> Option<V> – fetch + promote
  • peek(&self, key: &K) -> Option<V> – fetch without promotion
  • remove(&mut self, key: &K) -> Option<V> / pop_lru()
  • len(), is_empty(), contains_key(&K), capacity()
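
A quick tour of the non-promoting and maintenance calls, assuming the signatures listed above (in particular that peek returns a cloned value and pop_lru drops the coldest entry):

use mini_lru_cache::LRUCache;

let mut cache = LRUCache::new(3);
cache.put("x", 1);
cache.put("y", 2);
cache.put("z", 3);

assert_eq!(cache.peek(&"x"), Some(1));   // read without touching recency
assert!(cache.contains_key(&"y"));
assert_eq!(cache.remove(&"z"), Some(3)); // targeted removal
cache.pop_lru();                         // drops "x": peek did not promote it
assert_eq!(cache.len(), 1);              // only "y" is left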

MultiLevelCache<K, V>

  • new(l1, l2, l3) – distinct capacities per level
  • put and put_with_policy for targeted placement
  • get – auto-promotes on hit and records stats
  • evict_from_level and clear_all
  • CacheStats with hit counts per level and hit ratio
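
A small end-to-end sketch of level-targeted placement plus the stats accessors (the plain put placement and the zero-argument clear_all are assumptions):

use mini_lru_cache::{CacheLevel, MultiLevelCache};

let mut cache = MultiLevelCache::new(2, 4, 8);
cache.put("session", 1);                             // default placement
cache.put_with_policy("archive", 2, CacheLevel::L3); // pin cold data in L3

assert_eq!(cache.get(&"archive"), Some(2)); // L3 hit, promoted upward
println!("Hit ratio: {:.2}%", cache.get_hit_ratio() * 100.0);

cache.clear_all(); // wipe every level
assert!(cache.get(&"session").is_none());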

PageManager

  • Size-aware page creation with dirty tracking
  • Smart cache placement based on access frequency
  • flush_dirty_pages, defragment, selective eviction
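
Building on the quick-start snippet, a sketch of the maintenance calls (the zero-argument forms of flush_dirty_pages and defragment are assumptions):

use mini_lru_cache::PageManager;

let mut pm = PageManager::new(2, 3, 4, 1024);
pm.create_page(1, vec![0u8; 256]).unwrap();
pm.create_page(2, vec![1u8; 256]).unwrap();
pm.load_page(1).unwrap(); // repeated loads make page 1 "hotter" than page 2

// Periodic maintenance: write dirty pages back, then compact free space.
pm.flush_dirty_pages();
pm.defragment();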

🧪 Testing & Benchmarks

  • Unit tests: cargo test
  • Benchmarks: cargo bench

If you don’t have Cargo installed locally, grab it via rustup and rerun the commands.
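
For example, on Linux/macOS the standard rustup installer is:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh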

🤓 Why This Is Interesting

  • Demonstrates interior mutability with Rc<RefCell<T>>
  • Clean separation between cache core, multi-level orchestration, and domain logic (pages)
  • Teaches eviction mechanics, promotion policies, and cache-aware data placement

🛣️ Roadmap Ideas

  • Optional TTL/expiry
  • Async + tokio-friendly version
  • Persistent backing store for cold data
  • CLI playground and WebAssembly demo

🤝 Contributing

PRs and suggestions are welcome! Please include:

  • Expected vs. actual behavior
  • Repro steps or a small test
  • Environment details (OS, Rust toolchain)

📄 License

MIT – see LICENSE.
