Help with a query for large amounts of data.

Hey there, I'm new to MongoDB and I'm trying to query a huge database (10M+ records) through the mongolite package in RStudio.
I need to fetch the climate data for the last 3 years, but I'm getting this error:
Error: Executor error during getMore :: caused by :: $push used too much memory and cannot spill to disk. Memory limit: 104857600 bytes
This is the code and query I am using:
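(For reference, mongo is the mongolite connection created earlier, and start_date_iso is just an ISO-8601 timestamp for three years back, built along these lines:)

library(mongolite)

# Connection created earlier (collection/db/url details omitted)
# mongo <- mongo(collection = "...", db = "...", url = "mongodb://...")

# ISO-8601 timestamp for roughly three years ago, used in the $match stage below
start_date_iso <- format(Sys.time() - 3 * 365 * 24 * 3600, "%Y-%m-%dT%H:%M:%SZ", tz = "UTC")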

# Initialize a list to store all the pages of data
all_data <- list()
page_size <- 5000
skip <- 0

repeat {

  # Modify the query to paginate and use allowDiskUse
  paginated_query <- paste0('[
    { "$match": { "datetime": { "$gte": { "$date": "', start_date_iso, '" } } } },
    { "$sort": { "datetime": -1 } },
    { "$skip": ', skip, ' },
    { "$limit": ', page_size, ' },
    { "$project": {
        "datetime": 1,
        "tempMean": 1,
        "relativeHumidityMean": 1,
        "dewPointMean": 1,
        "radiation": 1,
        "orchard": 1,
        "client": 1
    } }
  ]')

  # Run the query with allowDiskUse enabled
  data <- mongo$aggregate(pipeline = paginated_query, options = '{"allowDiskUse": true}')

  # If there is no more data, exit the loop
  if (length(data) == 0) break

  # Add this page of data to the list of all data
  all_data <- append(all_data, list(data))

  # Increase skip to move on to the next page
  skip <- skip + page_size
}
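After the loop, the idea is to bind the pages in all_data into one data frame, something like:

# Bind the per-page data frames into a single data frame
# ("climate_df" is just a placeholder name)
climate_df <- do.call(rbind, all_data)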
How could I structure the query so I don't run out of memory?