py4j.protocol.Py4JJavaError: An error occurred while calling o29.load. : org.apache.spark.SparkClassNotFoundException: [DATA_SOURCE_NOT_FOUND] Failed to find the data source: mongodb

I run data pipelines with PySpark in Airflow on GCP. It was running perfectly before, but suddenly this error appears:

py4j.protocol.Py4JJavaError: An error occurred while calling o29.load.
: org.apache.spark.SparkClassNotFoundException: [DATA_SOURCE_NOT_FOUND] Failed to find the data source: mongodb

Are there any recent updates to the MongoDB Spark Connector? What confuses me even more is that when I run the same pipeline on Dataproc, it runs perfectly.

Which version of the MongoDB Spark Connector are you using? This error suggests the Spark connector is missing from the environment. How are you including the connector: is it installed in the environment, or are you passing it when invoking the Spark job?
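If the connector is being picked up implicitly on one cluster (for example via a pre-installed jar) but not the other, pinning it explicitly in the job makes the dependency portable. A minimal sketch, assuming a v10.x connector built for Scala 2.12 (the exact version `10.2.1` is illustrative, match it to your Spark build; v10+ of the connector is what registers the short name `mongodb`, v3.x used `mongo`):

```python
# Sketch: pin the MongoDB Spark Connector explicitly instead of relying on
# jars baked into the cluster image. Suffix and version are assumptions.
SCALA_SUFFIX = "2.12"          # Scala version your Spark distribution is built against
CONNECTOR_VERSION = "10.2.1"   # illustrative v10.x release

mongo_package = (
    f"org.mongodb.spark:mongo-spark-connector_{SCALA_SUFFIX}:{CONNECTOR_VERSION}"
)

# Pass the coordinate either at submit time:
#   spark-submit --packages org.mongodb.spark:mongo-spark-connector_2.12:10.2.1 job.py
# or when building the session:
#   SparkSession.builder.config("spark.jars.packages", mongo_package).getOrCreate()
print(mongo_package)
```

Spark then resolves the coordinate from Maven at session start-up, so both environments end up with the same connector version regardless of what is pre-installed.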

Since you mentioned that the pipeline runs perfectly on Dataproc but not in the other environment, there is likely a difference in cluster configuration. Compare your Dataproc cluster's configuration with the Spark environment where you are encountering the issue, paying particular attention to the Spark version, the installed libraries, and the classpath settings.
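One concrete way to do that comparison is to dump `spark.sparkContext.getConf().getAll()` in both environments and diff the dependency-related keys. A sketch, where the two dicts are illustrative stand-ins (not real cluster output) for what you would capture on each side:

```python
# Sketch: diff dependency-related Spark properties captured from two clusters.
# Both dicts below are illustrative placeholders for the output of
# dict(spark.sparkContext.getConf().getAll()) on each environment.
dataproc_conf = {
    "spark.jars.packages": "org.mongodb.spark:mongo-spark-connector_2.12:10.2.1",
    "spark.submit.deployMode": "client",
}
airflow_conf = {
    "spark.submit.deployMode": "client",
}

# Keys that control which jars end up on the driver/executor classpath.
DEP_KEYS = (
    "spark.jars",
    "spark.jars.packages",
    "spark.driver.extraClassPath",
    "spark.executor.extraClassPath",
)

missing = {
    k: dataproc_conf[k]
    for k in DEP_KEYS
    if k in dataproc_conf and k not in airflow_conf
}
print(missing)  # properties present on Dataproc but absent on the failing cluster
```

If `spark.jars.packages` (or an equivalent classpath entry for the connector jar) only shows up on the Dataproc side, that would explain why only the other environment throws `DATA_SOURCE_NOT_FOUND`.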

I have the same problem. Did you find a resolution for this issue?