Execution memory limit exceeded

We currently have problems requesting large datasets (>7000 objects) through the Atlas GraphQL API. When running the query, we get the following error message:

{"message": "execution memory limit exceeded", "locations": [...]}

  • Is there any way to mitigate that?

Hi @Christian_Schulz,
Can you please provide us with the following details in order to analyze your situation better?

  1. Sample documents from the collection you are having issues with
  2. The Atlas cluster tier (e.g. M5, M10, M30) you are using to deploy your database, and also the MongoDB version.
  3. The GraphQL query that is causing this error
  4. If you are using a custom resolver for the concerned query, please share the aggregation pipeline with us.

If you have any doubts, please feel free to reach out to us.

Thanks and Regards.
Sourabh Bagrecha,
MongoDB


Hi @SourabhBagrecha,

  1. Can I share this privately?
  2. We have an M50.
  3. Can I share this privately?
  4. No, we don’t use a custom resolver.

Regards
Christian Schulz

Hi @Christian_Schulz,

I believe it’s best to post the information in the relevant thread you have already opened so that we can keep all the information and discussion in one place. I’m hoping what you experienced would be informative to the community at large, and very helpful to future community users who experience a similar issue.

If you have sensitive information, please redact it before posting publicly :slight_smile:

And I would also like to thank you for contributing to the community!

If you have any doubts, please feel free to reach out to us.

Thanks and Regards.
Sourabh Bagrecha,
MongoDB


Hi @SourabhBagrecha,

Okay, the error also occurs with a query that requests very little information:


    {
        units(limit: 3000000, query: {
            location: { name: "Germany" },
            healthText_in: ["working", "defect"],
            type_in: ["a123123sdaw1289zau908dh", "1238097asdjkh081923"]
        }) {
            _id
        }
    }

And if I try to get more than 8k entries, I get the error.

Regards
Christian Schulz

Hi @Christian_Schulz,
You mentioned that you see this message when you’re trying to request a large dataset.

Do you also see this issue when you’re requesting a smaller result set? Is there any pattern that you can discern regarding this error?

Also, have you brought this up to the attention of Atlas support? It seems like your case is very particular, and Atlas support would have more visibility into the tools and information required to be able to troubleshoot this issue you’re having.

If you have any doubts, please feel free to reach out to us.

Thanks and Regards.
Sourabh Bagrecha,
MongoDB


Hi @SourabhBagrecha ,

As I said, as long as the limit stays below 7000, everything works fine.
As soon as I go above that, I get the error, even if I leave out the limit.

No, I have only asked in this forum so far, as we do not have a support plan.

Regards
Christian Schulz
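
A common workaround for this kind of per-request memory limit is to page through the result set in smaller batches instead of requesting everything at once. The sketch below builds batched versions of the query from this thread and advances a cursor on `_id`. It is a client-side sketch only: the `_id_gt` filter field, the `sortBy: _ID_ASC` argument, and the `execute` callback (which would POST the query to your Atlas GraphQL endpoint and return the `units` list) are assumptions based on the input types Atlas typically generates, so verify them against your own schema.

```python
# Sketch of a client-side pagination workaround for the Atlas GraphQL
# "execution memory limit exceeded" error: fetch fixed-size batches and
# resume each request after the last _id seen, instead of asking for
# millions of documents in one query.
# ASSUMPTIONS: the generated schema exposes an `_id_gt` query filter and
# a `sortBy: _ID_ASC` argument; `execute` is a user-supplied function
# that sends the query to the GraphQL endpoint and returns the list of
# documents from the response.

def build_units_query(batch_size, last_id=None):
    """Build a GraphQL query for one batch of `units`, resuming after last_id."""
    id_filter = f',\n            _id_gt: "{last_id}"' if last_id else ""
    return f"""
    {{
        units(limit: {batch_size}, sortBy: _ID_ASC, query: {{
            location: {{ name: "Germany" }},
            healthText_in: ["working", "defect"],
            type_in: ["a123123sdaw1289zau908dh", "1238097asdjkh081923"]{id_filter}
        }}) {{
            _id
        }}
    }}
    """


def fetch_all_units(execute, batch_size=1000):
    """Page through all matching units in batches of `batch_size`."""
    results, last_id = [], None
    while True:
        batch = execute(build_units_query(batch_size, last_id))
        results.extend(batch)
        if len(batch) < batch_size:
            # A short (or empty) batch means we have reached the end.
            return results
        last_id = batch[-1]["_id"]
```

With a batch size of, say, 1000, each individual request stays well below the ~7000-document threshold reported above, at the cost of more round trips.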