Batch Mode
Overview
In batch mode, you can use the Spark Dataset and DataFrame APIs to process data at a specified time interval rather than continuously.
The following sections show you how to use the Spark Connector to read data from MongoDB and write data to MongoDB in batch mode.
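As a minimal sketch of a batch read and write, the PySpark example below loads a collection into a DataFrame and writes it back to another collection. The connection URI, database, and collection names are placeholders, and the connector package coordinates are an assumption; match them to your own deployment and to your Spark and Scala versions.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("batch-example")
    # Assumed connector coordinates; adjust the version as needed.
    .config(
        "spark.jars.packages",
        "org.mongodb.spark:mongo-spark-connector_2.12:10.3.0",
    )
    .getOrCreate()
)

# Batch read: load an entire collection into a DataFrame.
df = (
    spark.read
    .format("mongodb")
    .option("connection.uri", "mongodb://localhost:27017")
    .option("database", "example_db")
    .option("collection", "source_coll")
    .load()
)

# Batch write: save the DataFrame to a different collection.
(
    df.write
    .format("mongodb")
    .option("connection.uri", "mongodb://localhost:27017")
    .option("database", "example_db")
    .option("collection", "target_coll")
    .mode("append")
    .save()
)
```

Because this is a batch job, the read and write each run once when the action is triggered; scheduling the job at an interval is handled outside Spark (for example, by a workflow scheduler).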
Tip
Apache Spark Documentation
To learn more about using Spark to process batches of data, see the Spark Programming Guide.