pathUtils | fix memory leak #418
base: main
Conversation
@@ -14,6 +14,7 @@
   },
   "dependencies": {
     "graceful-fs": "^4.2.4",
+    "lru-cache": "^10.2.2",
I don't think we need this big package here; you can rewrite it and put it inside the project, because we only use the `max` option.
I'll implement it and update.
@@ -6,6 +6,7 @@
 "use strict";

 const path = require("path");
+const { LRUCache } = require("lru-cache");
Let's use your implementation here and remove this package.
Also, here is a simple and fast example:
class LRU {
  constructor(max = 2048) {
    this.max = max;
    this.cache = new Map();
  }
  get(key) {
    let item = this.cache.get(key);
    if (item !== undefined) {
      // refresh key: re-insert so it becomes the most recently used
      this.cache.delete(key);
      this.cache.set(key, item);
    }
    return item;
  }
  set(key, val) {
    // refresh key
    if (this.cache.has(key)) this.cache.delete(key);
    // evict the oldest entry (first key in Map insertion order)
    else if (this.cache.size === this.max) this.cache.delete(this.first());
    this.cache.set(key, val);
  }
  first() {
    return this.cache.keys().next().value;
  }
}
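A quick sanity check of the class above (a usage sketch, not part of the PR): a `get` should refresh an entry so that eviction removes the least recently used key, not just the least recently inserted one.

```javascript
// Map-backed LRU from the comment above, with a usage check.
class LRU {
  constructor(max = 2048) {
    this.max = max;
    this.cache = new Map();
  }
  get(key) {
    let item = this.cache.get(key);
    if (item !== undefined) {
      // refresh key: re-insert so it becomes the most recently used
      this.cache.delete(key);
      this.cache.set(key, item);
    }
    return item;
  }
  set(key, val) {
    if (this.cache.has(key)) this.cache.delete(key);
    // evict the oldest entry (first key in Map insertion order)
    else if (this.cache.size === this.max) this.cache.delete(this.first());
    this.cache.set(key, val);
  }
  first() {
    return this.cache.keys().next().value;
  }
}

const lru = new LRU(2);
lru.set("a", 1);
lru.set("b", 2);
lru.get("a");          // refreshes "a"; "b" is now the oldest entry
lru.set("c", 3);       // at capacity, so "b" is evicted
console.log(lru.get("b")); // undefined
console.log(lru.get("a")); // 1
```

This works because a JavaScript `Map` iterates keys in insertion order, so the first key is always the least recently refreshed one.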
@vankop What do you think about it? Yes, we have a memory leak here: if you run your application in development for a long time, our cache will keep growing, so I think using an LRU cache is a good idea, but I don't know what the best value for `max` would be.
I personally think we don't need a cache here... maybe at first implementation there was some specific usage for it.
On cold builds it does not make sense either, since the cache hit rate is very low.
@alexander-akait Sorry, I'm a bit busy; I'll address it at the weekend. @vankop so do you recommend just removing it instead?
@vankop Can you test it on some repo and check it?
We have a metrics repo; we just need to set up cache-hit tracking there...
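To decide whether the cache pays off at all, one cheap option is to count hits and misses inside `get` and report the hit rate (a hypothetical instrumentation sketch; the counter and method names are not from the PR or the metrics repo):

```javascript
// Hypothetical sketch: LRU with hit/miss counters, to measure whether
// the cache is worth keeping before tuning `max`.
class InstrumentedLRU {
  constructor(max = 2048) {
    this.max = max;
    this.cache = new Map();
    this.hits = 0;
    this.misses = 0;
  }
  get(key) {
    const item = this.cache.get(key);
    if (item !== undefined) {
      this.hits++;
      // refresh key so it becomes the most recently used
      this.cache.delete(key);
      this.cache.set(key, item);
    } else {
      this.misses++;
    }
    return item;
  }
  set(key, val) {
    if (this.cache.has(key)) this.cache.delete(key);
    // evict the oldest entry (first key in Map insertion order)
    else if (this.cache.size === this.max)
      this.cache.delete(this.cache.keys().next().value);
    this.cache.set(key, val);
  }
  hitRate() {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : this.hits / total;
  }
}

const lru = new InstrumentedLRU(4);
lru.set("a", 1);
lru.get("a"); // hit
lru.get("b"); // miss
console.log(lru.hitRate()); // 0.5
```

A hit rate logged this way on real builds would show directly whether warm builds benefit and confirm the low hit rate expected on cold builds.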
Closes #414