A better Python cache for slow function calls
William Zeng - March 18th, 2024
We wrote a file cache: it works like Python's lru_cache, but it stores values in files instead of in memory. This has saved us hours of time running our LLM benchmarks, and we'd like to share it as its own Python module. Thanks to Luke Jaggernauth (former Sweep engineer) for building the initial version of this!
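The idea is small enough to sketch. Here is a minimal, hypothetical version of such a decorator (not the actual file_cache.py linked below): it pickles each result to disk, keyed by a hash of the function name and its arguments, so cached values persist across process runs.

```python
import hashlib
import os
import pickle
from functools import wraps

def file_cache(cache_dir: str = ".file_cache"):
    """Cache a function's return values on disk, keyed by its arguments.

    Works like functools.lru_cache, but entries survive across runs
    because they are pickled to files instead of held in memory.
    """
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            os.makedirs(cache_dir, exist_ok=True)
            # Build a stable key from the function name and its arguments.
            key_material = pickle.dumps((func.__name__, args, sorted(kwargs.items())))
            key = hashlib.md5(key_material).hexdigest()
            path = os.path.join(cache_dir, f"{func.__name__}_{key}.pkl")
            # On a hit, load the stored result instead of recomputing it.
            if os.path.exists(path):
                with open(path, "rb") as f:
                    return pickle.load(f)
            result = func(*args, **kwargs)
            with open(path, "wb") as f:
                pickle.dump(result, f)
            return result
        return wrapper
    return decorator

# Hypothetical usage: the second call with the same prompt returns instantly,
# even in a new Python process, because the result is read back from disk.
@file_cache()
def slow_llm_call(prompt: str) -> str:
    import time
    time.sleep(2)  # stand-in for an expensive call, e.g. an LLM request
    return f"response to: {prompt}"
```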
Here's the link: https://github.com/sweepai/sweep/blob/main/docs/public/file_cache.py