$expr not using indices

I have an index on the “last_processed_time” field, which is a regular datetime.
In the first stage of my aggregation query I use this $match query:

{
    "$expr" : {
        "$gt" : [
            "$last_processed_time",
            {
                "$dateSubtract" : {
                    "startDate" : "$$NOW",
                    "unit" : "day",
                    "amount" : 21.0
                }
            }
        ]
    }
}

However, the $match stage never uses my existing index on “last_processed_time”. The explain plan always shows a COLLSCAN. If I use an index hint, the index is not used either; the planner just adds a new filter stage.

{
    "stage" : "FETCH",
    "filter" : {
        "$expr" : {
            "$gt" : [
                "$last_processed_time",
                {
                    "$dateSubtract" : {
                        "startDate" : "$$NOW",
                        "unit" : {
                            "$const" : "day"
                        },
                        "amount" : {
                            "$const" : 21.0
                        }
                    }
                }
            ]
        }
    }
}
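
For reference, an explain like the one above can be produced from mongosh roughly as follows (the collection name below is a placeholder, not from the original post):

// Hypothetical collection name; adjust to your own.
const pipeline = [
    {
        $match: {
            $expr: {
                $gt: [
                    "$last_processed_time",
                    { $dateSubtract: { startDate: "$$NOW", unit: "day", amount: 21 } }
                ]
            }
        }
    }
];

// Explain the aggregation while forcing the index via a hint.
db.my_collection.explain("queryPlanner").aggregate(pipeline, { hint: "ix_last_processed_time" });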

If I manually calculate the date and build a query, it uses the existing index.

{
    "last_processed_time":{ $gte : ISODate("2024-03-29T00:00:00.000+0200") },
}

However, updating my aggregation query 3 times a day is simply not an option. Also, the $expr query did work in the past.

Rather than $dateSubtract and $$NOW, which force you to use $expr and thereby prevent the use of the index, could you do the same using a JS Date:

d = new Date()
2024-04-20T14:51:56.111Z
d2 = new Date(d - 21*24*60*60*1000)
2024-03-30T14:51:56.111Z

With this your query would become

const day = 24 * 60 * 60 * 1000 ;
...
last_processed_time: { $gt : new Date( new Date() - 21*day ) }

Yes, new Date() on the client may differ from the server’s $$NOW, but for a day-granularity use case it should not make a big difference. And without the $expr, the index should be used.
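
As a usage sketch, the full call could look like this (the collection name is a placeholder), keeping a plain range predicate so the planner can use the index:

const day = 24 * 60 * 60 * 1000;
const cutoff = new Date(Date.now() - 21 * day);   // computed on the client

db.my_collection.aggregate([
    { $match: { last_processed_time: { $gt: cutoff } } }
    // ...rest of the pipeline
]);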

Thanks for the suggestion on calculating the date using JS. Unfortunately I need to create a view from the aggregation query, which does not support JS as far as I know.

The view is used by Atlas SQL, which is loaded into Power BI. Since the collection is 1 TB+, loading and filtering in Power BI is not an option.

Like I said, the view has been in use for ~2 years, and in the past the $expr stage did use the index just fine. I’m wondering if MongoDB changed something without us noticing, or if it is a bug.

I completely missed the parts where you mentioned that it was a view and that it has been working for 2 years.

Please share the indexes you have. I ask because if it was working and now it is not, then maybe during the server migration the indexes were not recreated. How many documents do you have in your collection?

As a workaround, it should be easy to write a JS script that updates the view.
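
For example, a small mongosh script along these lines could be scheduled to recompute the cutoff date and rewrite the view definition (the view and collection names below are placeholders):

// Recompute the cutoff on the client.
const day = 24 * 60 * 60 * 1000;
const cutoff = new Date(Date.now() - 21 * day);

// Redefine the view with a plain, index-friendly range predicate.
db.runCommand({
    collMod: "my_view",            // placeholder view name
    viewOn: "my_collection",       // placeholder source collection
    pipeline: [
        { $match: { last_processed_time: { $gt: cutoff } } }
        // ...any further stages of the original view definition
    ]
});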

Hi @Hermann_Baumgartl

What version of MongoDB, and is this self-hosted or Atlas?

We have quite a lot of indexes on our collection. We use the $expr stage on multiple collections, and on all of our views the indexes are no longer used.

We have views that should use the ix_last_processed_time and ci_CPW.TIMESTAMP.ts_CPW.NAME indices.

[
    {
        "v" : 2.0,
        "key" : {
            "CUP_CODE" : 1.0
        },
        "name" : "ix_CUP_CODE"
    },
    {
        "v" : 2.0,
        "key" : {
            "LID_CODE" : 1.0
        },
        "name" : "ix_LID_CODE"
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.TIMESTAMP.ts" : 1.0,
            "CPW.NAME" : 1.0
        },
        "name" : "ci_CPW.TIMESTAMP.ts_CPW.NAME"
    },
    {
        "v" : 2.0,
        "key" : {
            "CPPG.TIMESTAMP.ts" : 1.0,
            "CPPG.NAME" : 1.0
        },
        "name" : "ci_CPPG.TIMESTAMP.ts_CPPG.NAME"
    },
    {
        "v" : 2.0,
        "key" : {
            "CPX.TIMESTAMP.ts" : 1.0,
            "CPX.NAME" : 1.0
        },
        "name" : "ci_CPX.TIMESTAMP.ts_CPX.NAME"
    },
    {
        "v" : 2.0,
        "key" : {
            "CPFORM.TIMESTAMP.ts" : 1.0
        },
        "name" : "ix_CPFORM.TIMESTAMP.ts"
    },
    {
        "v" : 2.0,
        "key" : {
            "CPFORM.OCV1_TIMESTAMP.ts" : 1.0
        },
        "name" : "ix_CPFORM.OCV1_TIMESTAMP.ts"
    },
    {
        "v" : 2.0,
        "key" : {
            "_id" : 1.0
        },
        "name" : "_id_"
    },
    {
        "v" : 2.0,
        "key" : {
            "CPS.TIMESTAMP.ts" : 1.0,
            "CPS.NAME" : 1.0
        },
        "name" : "ci_CPS.TIMESTAMP.ts_CPS.NAME"
    },
    {
        "v" : 2.0,
        "key" : {
            "CPM.TIMESTAMP.ts" : 1.0,
            "CPM.NAME" : 1.0
        },
        "name" : "ci_CPM.TIMESTAMP.ts_CPM.NAME"
    },
    {
        "v" : 2.0,
        "key" : {
            "last_processed_time" : 1.0
        },
        "name" : "ix_last_processed_time",
        "background" : true
    },
    {
        "v" : 2.0,
        "key" : {
            "CPM.ORDER_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPM.ORDER_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPM.ORDER_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPM.WICKEL_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPM.WICKEL_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPM.WICKEL_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPM.BECHER_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPM.BECHER_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPM.BECHER_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.KERN_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.KERN_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.KERN_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.ANODE_LE_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.ANODE_LE_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.ANODE_LE_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.ANODE_RI_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.ANODE_RI_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.ANODE_RI_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.KATHODE_LE_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.KATHODE_LE_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.KATHODE_LE_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.KATHODE_RI_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.KATHODE_RI_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.KATHODE_RI_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.INT_SEPARATOR_LE_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.INT_SEPARATOR_LE_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.INT_SEPARATOR_LE_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.INT_SEPARATOR_RI_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.INT_SEPARATOR_RI_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.INT_SEPARATOR_RI_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.EXT_SEPARATOR_LE_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.EXT_SEPARATOR_LE_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.EXT_SEPARATOR_LE_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.EXT_SEPARATOR_RI_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.EXT_SEPARATOR_RI_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.EXT_SEPARATOR_RI_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.ABLEITERCU_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.ABLEITERCU_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.ABLEITERCU_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.ABLEITERAL_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.ABLEITERAL_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.ABLEITERAL_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.AN_TAPE_UP_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.AN_TAPE_UP_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.AN_TAPE_UP_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.AN_TAPE_LO_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.AN_TAPE_LO_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.AN_TAPE_LO_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.CA_TAPE_UP_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.CA_TAPE_UP_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.CA_TAPE_UP_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.CA_TAPE_LO_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.CA_TAPE_LO_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.CA_TAPE_LO_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.SCHEIBE_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.SCHEIBE_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.SCHEIBE_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.DEDI_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPW.DEDI_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPW.DEDI_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPXSP.SETD_X_IDENTIFIER" : 1.0
        },
        "name" : "pix_CPXSP.SETD_X_IDENTIFIER",
        "background" : true,
        "partialFilterExpression" : {
            "CPXSP.SETD_X_IDENTIFIER" : {
                "$exists" : true
            }
        }
    },
    {
        "v" : 2.0,
        "key" : {
            "CPW.TIMESTAMP.ts" : 1.0
        },
        "name" : "ix_CPW.TIMESTAMP.ts"
    }
]

@chris It’s MongoDB 7.0.8 hosted on Atlas (dedicated cluster, AWS).

Sorry, I forgot to mention that it is happening on two separate clusters as well. The collections range from 2.5 million to 50 million docs, so we experience the issue on 2 clusters and 4 collections in total.

Would the change of behaviour coincide with an upgrade from 6.0 to 7.0?

When I test it with 6.0 vs 7.0 there is a change in plan: 6.0 uses an IXSCAN with the lower bound matching the result of the $dateSubtract, while 7.0 does a COLLSCAN.

Possibly a bug, try opening an issue on jira.mongodb.com

As a workaround, adding last_processed_time: { $gt: MinKey() } to the match will use the index, but it will still scan more keys than what I observed in 6.0.
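
In pipeline form, that workaround would look roughly like this (the collection name is a placeholder):

db.my_collection.aggregate([
    {
        $match: {
            // Extra range predicate the planner can turn into an index bound.
            last_processed_time: { $gt: MinKey() },
            $expr: {
                $gt: [
                    "$last_processed_time",
                    { $dateSubtract: { startDate: "$$NOW", unit: "day", amount: 21 } }
                ]
            }
        }
    }
]);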

A preceding { $sort: { last_processed_time: 1 } } triggers the same plan as the $gt.
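
And the $sort variant, as a sketch (again with a placeholder collection name):

db.my_collection.aggregate([
    { $sort: { last_processed_time: 1 } },   // nudges the planner onto ix_last_processed_time
    {
        $match: {
            $expr: {
                $gt: [
                    "$last_processed_time",
                    { $dateSubtract: { startDate: "$$NOW", unit: "day", amount: 21 } }
                ]
            }
        }
    }
]);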


Thanks, that was out of my league.

I ran into a very similar problem when using $$NOW, which I described here: https://www.mongodb.com/community/forums/t/aggregation-with-indexed-date-field-not-working/281443/5

Should I open a bug?

Here’s the bug if you want to follow the issue for updates: SERVER-83524


Thanks! I opened a similar post a few days ago, and here is the workaround I found: