
Streaming Mode


The Spark Connector supports streaming mode, which uses Spark Structured Streaming to process data as soon as it's available instead of waiting for a time interval to pass. Spark Structured Streaming is a data-stream-processing engine that you can access by using the Dataset or DataFrame API.
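For example, a streaming read with the connector uses the same readStream entry point as any other Structured Streaming source. The following PySpark sketch assumes the "mongodb" format name used by the 10.x connector; the connection URI, database, collection, and schema are placeholders you would replace for your own deployment:

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

spark = SparkSession.builder.appName("streaming-read-example").getOrCreate()

# Placeholder schema: declare the fields you expect, since the connector
# does not infer a schema for streaming reads.
read_schema = StructType([
    StructField("_id", StringType()),
    StructField("status", StringType()),
    StructField("qty", IntegerType()),
])

# Read the collection as a streaming DataFrame.
streaming_df = (
    spark.readStream
    .format("mongodb")
    .option("spark.mongodb.connection.uri", "mongodb://localhost:27017")
    .option("spark.mongodb.database", "example_db")
    .option("spark.mongodb.collection", "example_collection")
    .schema(read_schema)
    .load()
)

# Echo each micro-batch to the console to verify the stream is flowing.
query = (
    streaming_df.writeStream
    .format("console")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```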

Important

Apache Spark contains two different stream-processing engines:

  • Spark Streaming with DStreams, now an unsupported legacy engine

  • Spark Structured Streaming

This guide pertains only to Spark Structured Streaming.

The following sections show you how to use the Spark Connector to read data from MongoDB and write data to MongoDB in streaming mode; a brief write-side sketch follows the list:

  • Read from MongoDB in Streaming Mode

  • Write to MongoDB in Streaming Mode
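As a preview of the write-side workflow, the following PySpark sketch streams rows from Spark's built-in rate source into a MongoDB collection. The connection URI, database, collection, and checkpoint path are placeholders, and the options shown assume the 10.x connector configuration keys:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("streaming-write-example").getOrCreate()

# Spark's built-in rate source emits (timestamp, value) rows,
# standing in here for any real streaming source.
rate_df = (
    spark.readStream
    .format("rate")
    .option("rowsPerSecond", 1)
    .load()
)

# Stream the rows into a MongoDB collection. A checkpoint location
# lets the query recover its progress after a restart.
query = (
    rate_df.writeStream
    .format("mongodb")
    .option("checkpointLocation", "/tmp/example-checkpoint")
    .option("spark.mongodb.connection.uri", "mongodb://localhost:27017")
    .option("spark.mongodb.database", "example_db")
    .option("spark.mongodb.collection", "example_collection")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```

The sections linked above cover the full set of read and write configuration options for streaming mode.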

Tip

Apache Spark Documentation

To learn more about using Spark to process streams of data, see the Spark Programming Guide.
