jarifibrahim commented:
> I’m not aware of what causes badger to take up the amount of memory that it does. That seems like the first step towards introducing a flag for setting a fixed memory limit. Could someone from the badger team weigh in?
The amount of memory being used depends on your DB options. For instance, each table has a bloom filter, and these bloom filters are kept in memory. Each bloom filter takes up about 5 MB of memory. So if you have 100 GB of data, that means you have (100 * 1000 / 64) = 1562 tables (with the default 64 MB table size), and 1562 * 5 MB comes to roughly 7.8 GB. So your bloom filters alone would take up about 7.8 GB of memory. We have a separate cache in badger v2 to reduce the memory used by bloom filters.
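For concreteness, here is a minimal Go sketch of that back-of-the-envelope calculation. The 64 MB table size and ~5 MB per bloom filter are the approximate figures referenced above, not exact values:

```go
package main

import "fmt"

func main() {
	const (
		dataSizeMB      = 100 * 1000 // 100 GB of data, expressed in MB
		tableSizeMB     = 64         // assumed default SSTable size
		bloomPerTableMB = 5          // approximate bloom filter size per table
	)

	tables := dataSizeMB / tableSizeMB  // ≈ 1562 tables
	bloomMB := tables * bloomPerTableMB // ≈ 7810 MB of bloom filters
	fmt.Printf("%d tables, ~%.1f GB of bloom filters\n", tables, float64(bloomMB)/1000.0)
}
```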
Another thing that affects memory usage is the table loading mode. If you set the table loading mode to FileIO, memory usage should go down, but your reads will be very slow.
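As a rough illustration, both knobs could be set along these lines when opening a DB. This is only a sketch against the badger v2 option surface (WithTableLoadingMode, options.FileIO, WithMaxCacheSize); option names and defaults vary between releases, and the path is just a placeholder, so check the docs for the version you run:

```go
package main

import (
	"log"

	badger "github.com/dgraph-io/badger/v2"
	"github.com/dgraph-io/badger/v2/options"
)

func main() {
	opts := badger.DefaultOptions("/tmp/badger").
		// Read tables via plain file I/O instead of keeping them mmap'd or in RAM:
		// lower memory use, noticeably slower reads.
		WithTableLoadingMode(options.FileIO).
		// In v2, bloom filters and blocks live in a bounded cache; size it explicitly.
		WithMaxCacheSize(256 << 20) // 256 MB

	db, err := badger.Open(opts)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()
}
```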