Inconsistent @filter behaviour in 1.2.0

I’m running a local instance of Dgraph 1.2.0 with 4 GB of RAM.

I’m currently facing some strange behaviour when filtering data.

The schema I’m working with is:

type Company {
  company.involvement [uid]: CompanyInvolvement
}

type CompanyInvolvement {
  involvement.role string @index(exact)
}

Running the query:

{
  f(func: eq(name@., "Crunchyroll")) {
    company.involvement {
      involvement.role
    }
    uid
    name@.
  }
}

I got all the involvements of the company, as expected:


And as the screenshot shows, the company does have AT LEAST one company.involvement with involvement.role = “internet_streaming”.
But running the same query with a filter applied:

{
  f(func: eq(name@., "Crunchyroll")) {
    company.involvement @filter(eq(involvement.role, "internet_streaming")) {
      involvement.role
    }
    uid
    name@.
  }
}

it does not return any nodes.

And even stranger, if I change the filter to another role (e.g. online_distribution) the query runs properly.

EDIT:
running the filter at the root,

{
  f(func: eq(involvement.role, "internet_streaming")) {
    involvement.role
  }
}

the query returns the expected result

Additional information:
the DB has 360,804 nodes with the attribute involvement.role.
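
That figure can be checked with a count query along these lines:

{
  c(func: has(involvement.role)) {
    count(uid)
  }
}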

Is that right? It should be
company.involvement: [uid]

Feels like the issue is your schema.

The schema was just some kind of pseudocode to give an idea of the relations.
The problem is that the filter is not catching the results it is supposed to.
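
In valid Dgraph schema syntax the relations above correspond to predicate declarations roughly like these (a sketch; the @lang tag and the exact index on name are inferred from the name@. queries above and may differ from the real schema):

company.involvement: [uid] .
involvement.role: string @index(exact) .
name: string @index(exact) @lang .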

Would you mind sharing an example so I can reproduce it on my end? (A fake dataset sample based on your structure.) From reading what you said I can’t see anything abnormal, other than the schema.

I did some more tests.
It seems that for some reason, at some point, the indexing just breaks. It’s as if data inserted before a “checkpoint” can’t be filtered, while data inserted after has no problem.
The database I’m testing against has around half a million nodes. I’m running Dgraph on Windows through Docker Desktop with 2 CPUs (i5 6600K 3.5 GHz) and 5 GB of memory.
If it’s possible to send a private message, I would like to agree on how to provide the full dataset in order to understand the problem.
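
In the meantime, here is what a single company’s triples look like structurally (a fabricated sample that only mirrors the shape of the data; the values, the blank-node labels, and the @en language tag are invented):

_:c <dgraph.type> "Company" .
_:c <name> "Some Company"@en .
_:c <company.involvement> _:inv .
_:inv <dgraph.type> "CompanyInvolvement" .
_:inv <involvement.role> "internet_streaming" .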

Sure.

BTW, can you try to use Best-effort queries?
https://docs.dgraph.io/clients/#running-best-effort-queries

Does the cluster you have run a single Alpha and a single Zero?

Cheers.

Best-effort does not change the result. I tried loading the data in several chunks, shutting down the database after each chunk to be sure everything got flushed, but the behaviour remains.
Yes, the cluster I’m using has a single Alpha and a single Zero.

I’ll provide some more information.
I have 22k files, each containing the RDF statements that build the graph for the “entity” it represents. To load the data into the DB I go sequentially through all the files and perform an upsert with the dgraph-io client; each “entity” gets its own bulk upsert and commit (a sketch of one such upsert is shown below). 10k entities have type “Anime” and 12k have type “Manga”. I tried several times to import the whole DB from scratch to figure out how the behaviour is triggered, observing the status of the DB with the query

{
  f (func:has(name@.)) @filter(eq(dgraph.type, "Anime")) {
    count: count(uid)   
  }
  d (func:has(name@.)) @filter(eq(dgraph.type, "Manga")) {
    count: count(uid)   
  }
}

to check whether the filters work properly. Every time, around the import of the 13,000th entry (which implies that all “Anime” have been imported and there are around 1-2k “Manga”), the first filter stops detecting the nodes, while filtering at the root works normally.
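
For reference, each entity import boils down to an upsert block roughly like the following (a minimal sketch rather than the actual client code; the indexed external-id predicate xid and all values are invented for illustration). uid(v) writes to the node matched by the query, or creates a new one if nothing matched:

upsert {
  query {
    # look up the entity by its external id
    q(func: eq(xid, "anime/1")) {
      v as uid
    }
  }

  mutation {
    set {
      uid(v) <xid> "anime/1" .
      uid(v) <dgraph.type> "Anime" .
      uid(v) <name> "Some Anime"@en .
    }
  }
}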

Filter (broken):

{
  f (func:has(name@.)) @filter(eq(dgraph.type, "Anime")) {
    count: count(uid)   
  }
  d (func:has(name@.)) @filter(eq(dgraph.type, "Manga")) {
    count: count(uid)   
  }
}
{
  "data": {
    "f": [
      {
        "count": 0
      }
    ],
    "d": [
      {
        "count": 3525
      }
    ]
  },
  "extensions": {
    "server_latency": {
      "parsing_ns": 85400,
      "processing_ns": 75245902,
      "encoding_ns": 531600,
      "assign_timestamp_ns": 707200,
      "total_ns": 76635802
    },
    "txn": {
      "start_ts": 140010
    },
    "metrics": {
      "num_uids": {
        "dgraph.type": 3525,
        "name": 0
      }
    }
  }
}

Root (working):

{
  f (func:eq(dgraph.type, "Anime")) {
    count: count(uid)   
  }
  d (func:has(name@.)) @filter(eq(dgraph.type, "Manga")) {
    count: count(uid)   
  }
}
{
  "data": {
    "f": [
      {
        "count": 9716
      }
    ],
    "d": [
      {
        "count": 3525
      }
    ]
  },
  "extensions": {
    "server_latency": {
      "parsing_ns": 72400,
      "processing_ns": 75526400,
      "encoding_ns": 2112600,
      "assign_timestamp_ns": 1017600,
      "total_ns": 78824400
    },
    "txn": {
      "start_ts": 140009
    },
    "metrics": {
      "num_uids": {
        "dgraph.type": 3525,
        "name": 0
      }
    }
  }
}