memory leak on MacOS? #1529
Comments
@pjv please do a `keybase log send` and reference this issue in the feedback.
@strib done. In the feedback I referenced this issue.
Thanks! Logs: 3db962a332d3e73b56f64c1c. I will look later today.
@pjv hrm, I don't see any activity in your logs, as you say, not even connection issues (which were related to the latest leak we plugged). I also don't see any restarts (the logs that were sent only cover the last day or so), so I'm not 100% sure about the state it was in before your last restart. If the memory is still currently growing, would you be willing to share some more debugging data with us please? If so, please do:
cp /keybase/.kbfs_profiles/heap /keybase/private/strib,pjv/
cp /keybase/.kbfs_profiles/goroutine /keybase/private/strib,pjv/
tar -czf /keybase/private/strib,pjv/client-1529.tgz ~/Library/Logs/keybase.kbfs.*
Thanks!
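(For anyone following along: the heap and goroutine files under /keybase/.kbfs_profiles appear to be standard Go pprof profiles. As a rough, hedged illustration of what produces files like that in a Go process, not KBFS's actual code, and with made-up function names:

```go
package main

import (
	"log"
	"os"
	"path/filepath"
	"runtime/pprof"
)

// dumpProfiles writes heap and goroutine profiles to the given directory.
// This is a generic sketch of the kind of data being requested above, not
// how KBFS itself exposes its profiles.
func dumpProfiles(dir string) error {
	heapFile, err := os.Create(filepath.Join(dir, "heap"))
	if err != nil {
		return err
	}
	defer heapFile.Close()
	if err := pprof.Lookup("heap").WriteTo(heapFile, 0); err != nil {
		return err
	}

	grFile, err := os.Create(filepath.Join(dir, "goroutine"))
	if err != nil {
		return err
	}
	defer grFile.Close()
	return pprof.Lookup("goroutine").WriteTo(grFile, 0)
}

func main() {
	if err := dumpProfiles("."); err != nil {
		log.Fatal(err)
	}
}
```

Once copied out, a profile like heap can usually be inspected with `go tool pprof heap`, where the `top` command shows the biggest allocators.)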
@strib interesting, I have not done anything Keybase-related since sending that log earlier. The memory footprint for kbfs is half what it was then. Happy to send the additional data if you still want it. Lmk.
Ah ok, cool, then maybe everything is ok. Our code is in Go, and it deallocates memory somewhat lazily, so maybe that is what's going on. If you see it reach 1 GB again without using KBFS, please send the above data; otherwise I don't think there's much to look at yet.
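(To illustrate the lazy-deallocation point: the Go runtime tends to hold on to freed heap memory and return it to the OS only gradually, so a process's resident size can stay high for a while after the program stops needing the memory. A minimal generic sketch, not KBFS code, that makes this visible through runtime memory stats:

```go
package main

import (
	"fmt"
	"runtime"
	"runtime/debug"
)

func printMemStats(label string) {
	var m runtime.MemStats
	runtime.ReadMemStats(&m)
	// HeapIdle minus HeapReleased approximates memory the runtime is
	// holding on to but has not yet returned to the operating system.
	fmt.Printf("%s: HeapAlloc=%d MiB, HeapIdle=%d MiB, HeapReleased=%d MiB\n",
		label, m.HeapAlloc>>20, m.HeapIdle>>20, m.HeapReleased>>20)
}

func main() {
	// Allocate and then drop a large buffer.
	buf := make([]byte, 512<<20)
	_ = buf[0]
	buf = nil

	runtime.GC()
	printMemStats("after GC")

	// Ask the runtime to return as much memory to the OS as possible.
	debug.FreeOSMemory()
	printMemStats("after FreeOSMemory")
}
```

The gap between the two printouts is the "lazy" part: memory the process still appears to occupy even though nothing in the program is using it anymore.)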
k, will do.
I recently deleted a folder with 50 GB worth of data and RAM usage jumped through the roof. The whole system went slow and started to freeze. Edit: Tried to send logs and got:
▶ INFO ignoring error getting keybase status: There were multiple errors: dial unix /Users/mirorauhala/Library/Group Containers/keybase/Library/Caches/Keybase/keybased.sock: connect: no such file or directory; dial unix /Users/mirorauhala/Library/Caches/Keybase/keybased.sock: connect: no such file or directory
ID: fe1945470f3634b941a4461c
@mirorauhala is it still in this state? If so, could you send me some more profiling info? Do something like the commands above: copy /keybase/.kbfs_profiles/heap and /keybase/.kbfs_profiles/goroutine into a folder shared with me, and tar up ~/Library/Logs/keybase.kbfs.*.
Or I guess if KBFS is too overloaded, just copy those files somewhere local and get them to me some other way (maybe after KBFS is restarted).
I had to quit the app. I'll try relaunching it.
Ah bummer. Ok, I'll look at the logs.
It's now happening again. 3 GB and counting.
Can you please send the heap and goroutine files my way, as mentioned above?
Yup!
Thanks! @jzila looks like that fun prefetcher leak again. I still have no clue about this one though. See the attached graph. @mirorauhala sorry about this. I've seen this before, but we've never been able to reproduce it reliably locally in order to debug it. The second time it happened, did you first access anything in KBFS, or did it just start happening on its own after the restart?
A long time ago I uploaded a folder with a lot of data. Now I deleted the folder in question using Finder. I noticed that the laptop got really slow and started to heat up. RAM usage was 5 GB. Then I quit the app from the menu bar and restarted it by just launching the app again.
I uploaded new files, if that helps.
Ok, thanks. It might still be draining data from your local journal as it sends the delete messages to our server, and that somehow is triggering this extra memory usage. It could be that just letting it run overnight if possible, until it drains all those deletes, and then restarting the app, will fix your problem. Deleting large folders is not our strong suit, and I've seen a leak like this before, but it's still very mysterious. Sorry for the bad experience; hopefully we can get to the bottom of this soon.
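(To make the journal idea concrete, here is a rough conceptual sketch, with entirely hypothetical names and not how KBFS is actually implemented: operations like deletes get appended to a local journal and a background worker drains them to the server, so a huge batch of deletes keeps a lot of pending state alive until the queue empties.

```go
package main

import (
	"fmt"
	"time"
)

// op is a placeholder for a journaled filesystem operation (e.g. a delete).
type op struct {
	path string
}

// drainJournal is a toy journal flusher: queued operations are sent to the
// server one at a time in the background. sendToServer is a stub standing in
// for a real network round trip.
func drainJournal(journal <-chan op, done chan<- struct{}) {
	for o := range journal {
		sendToServer(o)
	}
	close(done)
}

func sendToServer(o op) {
	time.Sleep(10 * time.Millisecond) // simulate server latency
	fmt.Println("flushed delete:", o.path)
}

func main() {
	journal := make(chan op, 1000) // queued deletes live in memory until drained
	done := make(chan struct{})
	go drainJournal(journal, done)

	// Deleting a large folder enqueues many operations at once.
	for i := 0; i < 5; i++ {
		journal <- op{path: fmt.Sprintf("/big-folder/file-%d", i)}
	}
	close(journal)
	<-done
}
```

In this toy version the memory cost is just the queued op values; in a real client each pending operation can pin much more state, which is why letting the journal drain before restarting can matter.)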
Yeah, currently there are only two major issues keeping me from using Keybase: large file handling and selective sync.
For selective sync, an okay substitute can be to use teams and subteams and sync only those.
I have not yet tried teams. Will give that a shot.
Without doing anything with it (i.e. no file operations), the kbfs process on my Mac has been steadily eating more memory day by day lately.
I've been killing it and letting it relaunch when it gets to around 1 GB...