
Batch Mode


In batch mode, you can use the Spark Dataset and DataFrame APIs to process data at a specified time interval.

The following sections show you how to use the Spark Connector to read data from MongoDB and write data to MongoDB in batch mode:

  • Read from MongoDB in Batch Mode

  • Write to MongoDB in Batch Mode
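Before reading or writing in batch mode, you configure a `SparkSession` with connection URIs for the connector. The following is a minimal sketch, assuming a MongoDB deployment at `mongodb://localhost:27017`, the Spark Connector (v10 or later, which uses the `mongodb` data source format) on the classpath, and a hypothetical `test.people` namespace:

```python
from pyspark.sql import SparkSession

# Hypothetical connection string; the trailing "test.people" sets the
# default database and collection for reads and writes.
MONGO_URI = "mongodb://localhost:27017/test.people"

spark = (
    SparkSession.builder
    .appName("batch-mode-example")
    # The v10+ connector picks up its defaults from these settings.
    .config("spark.mongodb.read.connection.uri", MONGO_URI)
    .config("spark.mongodb.write.connection.uri", MONGO_URI)
    .getOrCreate()
)

# Batch read: load the collection into a DataFrame.
df = spark.read.format("mongodb").load()

# Batch write: persist a DataFrame back to MongoDB.
df.write.format("mongodb").mode("append").save()
```

The read and write pages linked above cover these options, and others such as schema inference and write modes, in detail.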

Tip

Apache Spark Documentation

To learn more about using Spark to process batches of data, see the Spark Programming Guide.
