MongoDB Connector for Apache Kafka
Natively integrate MongoDB data within the Kafka ecosystem.
The official MongoDB Connector for Apache® Kafka® is developed and supported by MongoDB and verified by Confluent. The Connector enables MongoDB to be configured as both a sink and a source for Apache Kafka.
Easily build robust, reactive data pipelines that stream events between applications and services in near real time.
Why MongoDB and Apache Kafka?
MongoDB and Kafka are at the heart of modern data architectures. Kafka is designed for boundless streams of data, writing events sequentially into commit logs to allow low-latency data movement between your services.
Configure as a Sink
Map and persist events from Kafka topics directly to MongoDB collections, exposing the data to your services for efficient querying, enrichment, and analytics.
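As a sketch of what this looks like in practice, a minimal sink configuration for a standalone Kafka Connect worker might resemble the following; the connection URI, topic, database, and collection names are placeholders:

    # Minimal MongoDB sink connector configuration (illustrative values)
    name=mongo-sink
    connector.class=com.mongodb.kafka.connect.MongoSinkConnector
    # Kafka topic(s) to read events from
    topics=orders
    # Target MongoDB deployment, database, and collection
    connection.uri=mongodb://localhost:27017
    database=store
    collection=orders
    # Interpret record keys and values as schemaless JSON
    key.converter=org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable=false
    value.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable=false

In a distributed Connect deployment, the same settings are typically submitted as a JSON payload to the Connect REST API instead.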
Configure as a Source
Publish data changes from MongoDB into Kafka topics for streaming to consuming apps. Data is captured via Change Streams within the MongoDB cluster and published into Kafka topics. This enables consuming apps to react to data changes in real time using an event-driven programming style.
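For illustration, a minimal source configuration for a standalone Kafka Connect worker might look like this; again, the URI and names are placeholders:

    # Minimal MongoDB source connector configuration (illustrative values)
    name=mongo-source
    connector.class=com.mongodb.kafka.connect.MongoSourceConnector
    # MongoDB deployment, database, and collection to watch via a change stream
    connection.uri=mongodb://localhost:27017
    database=store
    collection=orders
    # Publish just the changed document rather than the full change event envelope
    publish.full.document.only=true
    # Change events are published to the topic "<topic.prefix>.<database>.<collection>"
    topic.prefix=mongo

Consuming apps then subscribe to the resulting topic (here, mongo.store.orders) and react to each change event as it arrives.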
Why MongoDB?
MongoDB customers have used the Kafka Connector successfully across a wide range of industries and companies, for a variety of use cases.
eCommerce and Customer Single View
ao.com, a leading online electrical retailer, uses Kafka to push all data changes from its source databases to MongoDB Atlas. This creates a single source of truth for all customer data to drive new and enhanced applications and business processes including customer service, fraud detection, and GDPR compliance. Employees with appropriate permissions can access customer data from one easy-to-consume operational data layer.
IoT
Josh Software, part of a project in India to house more than 100,000 people in affordable smart homes, pushes data from millions of sensors to Kafka, processes it in Apache Spark, and writes the results to MongoDB, which connects the operational and analytical data sets. By streaming sensor data in near real time, the project is creating truly smart homes, and citizens can access data via a mobile app to better manage their homes.
Financial Services
AHL, a subsidiary of The Man Group, one of the world’s largest hedge fund investment firms, used MongoDB to create a single platform for all of its financial data. The system receives up to 150,000 ticks per second from multiple financial sources and writes them to Kafka. Kafka provides both consolidation and buffering of events before they are stored in MongoDB, where the data can be analyzed.
Opinion and Polling
State, an intelligent opinion network connecting people with similar beliefs, writes survey data to MongoDB and leverages MongoDB Change Streams to push database changes into Kafka topics, where they are consumed by its user recommendation engine. The engine suggests potentially interesting users and updates as soon as a user contributes a new opinion.