Database calls causing performance bottleneck

In my real application, I have sensor events that contain the ID of a sensor as well as some other data. My issue is that each event takes longer and longer to start processing. As I cannot post the code for my actual project, I have made a test project that recreates the issue I am facing.

I have 2 collections:
pojo
a collection with 10,000 documents. Below is a document from the collection.

{
  "_id": "d171bf93-ce90-450d-9b10-8e51b9f9e245",
  "lastSeen": {
    "$date": "2025-02-10T09:57:23.610Z"
  },
  "_class": "org.example.Pojo"
}
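
The mapped class for this collection in my test project looks roughly like this (trimmed down, so a sketch rather than the exact source):

// Pojo.java - sketch of the mapped document class
import java.time.LocalDateTime;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document("pojo")
public class Pojo {

    @Id
    private String id;              // stored as "_id"
    private LocalDateTime lastSeen;

    public void setLastSeen(LocalDateTime lastSeen) { this.lastSeen = lastSeen; }
}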

relatedPojo
a collection with 10 documents. Each document holds IDs from the pojo collection as well as a random number.

{
  "_id": "c36edbd2-a25a-479e-b090-c5a10db6328b",
  "pojoIds": [
    "d78e286d-f980-4c4e-bcfc-fd2e7825aac8",
    "84d6a431-6de3-48d8-b2f6-a379c237c10a",
    "b5e89520-0ca5-481f-9cbe-51b741e5d0b4",
    "679a0ff7-efba-45ee-8dda-ea94b6e43b81",
    "203296ea-5f17-4636-a619-2aaec0918395"
  ],
  "randomNumber": 39,
  "_class": "org.example.RelatedPojo"
}
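
And the corresponding class for this collection, again just a trimmed sketch:

// RelatedPojo.java - sketch of the mapped document class
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document("relatedPojo")
public class RelatedPojo {

    @Id
    private String id;
    private List<String> pojoIds;   // references into the pojo collection
    private int randomNumber;

    public void setRandomNumber(int randomNumber) { this.randomNumber = randomNumber; }
}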

I am simulating an event for each ID in the pojo collection, all sent at the same time every 5 seconds.

My requirements are:

  • Every event contains 1 pojo ID.
  • For every matching relatedPojo, set a random number. The number has to be generated per relatedPojo, so I cannot generate one number and apply it to all matching documents (a sketch of what I mean follows this list).
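
To make the second requirement concrete: a single multi-document update like the sketch below is exactly what I cannot do, because every matching relatedPojo would end up with the same number (the method name here is just for illustration, using the same Spring Data MongoTemplate API as the code further down):

// Illustration only: one value applied to ALL matches, which violates my requirement
private static void setSameNumberForAllMatches(String pojoId, MongoTemplate mongoTemplate) {
    int value = new Random().nextInt(50);
    mongoTemplate.updateMulti(
            Query.query(Criteria.where("pojoIds").in(pojoId)),
            Update.update("randomNumber", value),   // same value for every matching document
            RelatedPojo.class);
}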

Main.java

    private static void simulateEventPerId(List<String> pojoIds, MongoTemplate mongoTemplate) {
        // One scheduler fires a batch of events (one per pojo ID) every 5 seconds;
        // a fixed pool of 10 workers processes the individual events.
        ScheduledExecutorService executorService = Executors.newSingleThreadScheduledExecutor();
        ExecutorService sensorProcessingService = Executors.newFixedThreadPool(10);

        executorService.scheduleAtFixedRate(() -> {
            for (String id : pojoIds) {
                LocalDateTime eventAdded = LocalDateTime.now();

                sensorProcessingService.execute(() -> {

                    // How long did the event sit in the queue before a worker picked it up?
                    LocalDateTime startProcessing = LocalDateTime.now();
                    long ms = ChronoUnit.MILLIS.between(eventAdded, startProcessing);
                    System.out.println(ms + "ms");

                    Pojo sensor = mongoTemplate.findById(id, Pojo.class);
                    if (sensor == null) {
                        return;
                    }
                    sensor.setLastSeen(LocalDateTime.now());

                    // Find every relatedPojo referencing this sensor and give each its own random number
                    List<RelatedPojo> relatedPojos = mongoTemplate.find(
                            new Query(Criteria.where("pojoIds").in(id)), RelatedPojo.class);

                    for (RelatedPojo relatedPojo : relatedPojos) {
                        relatedPojo.setRandomNumber(new Random().nextInt(50));
                        mongoTemplate.save(relatedPojo);
                    }
                    mongoTemplate.save(sensor);

                });
            }
        }, 0, 5, TimeUnit.SECONDS);
    }

As this is supposed to simulate an event or request of some sort, you have to imagine that I don't actually have a list of IDs up front. This is why I haven't done something like:
List<Pojo> allPojos = mongoTemplate.find(new Query(), Pojo.class);

The issue
I print out the time between an event being added to the processing service and the moment processing of that event actually begins. My issue is that this delay keeps growing, eventually reaching several minutes before an event even starts processing. This is bad for data that needs to be handled in real time.

I have tried:

  • Increasing the fixed thread pool size
  • Using a cached thread pool - this does remove the delay; however, it eventually fails with the "attempt to protect stack guard pages" warning shown below (the change for that attempt is shown after the error output).
[267.581s][warning][os,thread] Attempt to protect stack guard pages failed (0x00007f594991a000-0x00007f594991e000).
[thread 770497 also had an error]
Disconnected from the target VM, address: '127.0.0.1:37901', transport: 'socket'
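
For that attempt, the processing executor was swapped roughly like this, everything else unchanged:

// Removes the queueing delay, but the number of threads it creates is unbounded
ExecutorService sensorProcessingService = Executors.newCachedThreadPool();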

What can I do to reduce the time it takes for an event to start processing?