Description
Hi,
I noticed that the overhead of managing the cache seems to accumulate over time. There may be other causes hidden in my code, but disabling the cache made a significant difference in runtime.
I am building knowledge graphs from repeated API calls; a single graph build can involve anywhere from 1k to 10k API requests, and I cache each API call. (I realize this may not be the prime scenario for cachier, since the individual calls are not that slow, but the issue might be systemic.) A minimal sketch of my setup is below.
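Roughly, the usage pattern looks like this (function name and endpoint are hypothetical stand-ins, and I'm using the default pickle backend):

```python
# Sketch of the workload: many small, cached API calls per graph build.
import requests
from cachier import cachier


@cachier()  # default (pickle, file-based) backend
def fetch_entity(entity_id: str) -> dict:
    """One of the ~1k-10k API calls made while building a graph."""
    resp = requests.get(f"https://api.example.com/entities/{entity_id}")
    resp.raise_for_status()
    return resp.json()
```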
Here are two plots comparing runtime with the cache enabled and disabled.
Has anybody observed a similar issue?
My first guess is that the lookup time is not constant (as it would be with a hash table) but grows linearly with the number of stored entries. A rough benchmark along the lines of the sketch below could check this.
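This is only a sketch under my assumptions (default backend, `clear_cache()` available on the decorated function); it times cache hits on an already-cached key as the number of stored entries grows. If the per-hit latency climbs with cache size, the lookup is probably not O(1).

```python
# Rough benchmark sketch: measure cache-hit latency at increasing cache sizes.
import time

from cachier import cachier


@cachier()  # same default backend as above; adjust if you use a different one
def cheap_call(i: int) -> int:
    return i  # near-zero work, so timing is dominated by cachier overhead


cheap_call.clear_cache()
for size in (100, 1_000, 5_000, 10_000):
    # Grow the cache up to `size` distinct entries.
    for i in range(size):
        cheap_call(i)
    # Time repeated hits on a key that is already cached.
    start = time.perf_counter()
    for _ in range(100):
        cheap_call(0)
    per_hit = (time.perf_counter() - start) / 100
    print(f"{size:>6} entries: {per_hit * 1e3:.2f} ms per cached call")
```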
What do people familiar with the implementation think about this observation?
P.S. thanks for the handy library (: