Problems with types and big datasets

Everything is fine with Ratel. I was using Docker with the port mapped to 8081 because I had multiple instances running; I tried moving to 8080, but I have the same problem. I can access localhost:8080/health?all.

All the objects have the type set. There should be around 2.3 billion of them, but I can’t count them exactly because counting triggers another error (see: [BUG]: Allocator can not allocate more than 64 buffers · Issue #8840 · dgraph-io/dgraph · GitHub).

I think the problem is related to the amount of data I have. I tried importing just a smaller portion of the data (a few million entries instead of the full 2.3 billion) and everything worked fine.
For context, the size of the p folder is 2.4TB, and I also have other types with billions of entries.

Querying data using type() works. For example:

{
  q(func: type(Log), first: 10) {
    count(uid)
  }
}

Gives me:

{
  "data": {
    "q": [
      {
        "count": 10
      }
    ]
  }
}
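Note that because of first: 10, the count above only covers the first 10 UIDs. The full count, which is what hits the allocator error from the linked issue, would presumably be the same query without the first argument:

{
  q(func: type(Log)) {
    count(uid)
  }
}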

It doesn’t work with expand(_all_), and the Alpha panics when queried with:

curl localhost:8080/query -XPOST -H "Content-Type: application/dql" -d 'schema {}'

The logs are the same as the ones in my first message.
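In case it helps narrow things down, fetching the schema for a single predicate instead of the whole thing might avoid the panic (that is just a guess on my part; Log.message below is a placeholder predicate name):

curl localhost:8080/query -XPOST -H "Content-Type: application/dql" -d 'schema(pred: [Log.message]) { type index }'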