Atlas Stream Processing lets developers build real-time data pipelines directly within MongoDB, without stitching together separate streaming infrastructure. Today, we're extending that reach with native Google Cloud Pub/Sub sink support.
Pub/Sub is an industry-standard messaging service in the Google Cloud ecosystem, connecting data sources to downstream services like BigQuery, Dataflow, and Cloud Functions. Direct integration with Atlas Stream Processing eliminates intermediate infrastructure layers and reduces operational complexity.
How it works
Authentication and setup
Getting started requires configuring a GCP service account with the appropriate permissions. Within Atlas, navigate to the Project Integrations section and open the GCP integration modal. Atlas provides a service account, to which you grant the Pub/Sub Viewer and Publisher roles in your Google Cloud project.
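As a sketch, the grants can be made with the gcloud CLI; the project ID and service-account email below are placeholders for the values shown in the Atlas integration modal:

```shell
# Grant the Atlas-provided service account the Pub/Sub Viewer and
# Publisher roles at the project level.
# Replace PROJECT_ID and the service-account email with the values
# displayed in the Atlas GCP integration modal.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:atlas-sp@example.iam.gserviceaccount.com" \
  --role="roles/pubsub.viewer"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:atlas-sp@example.iam.gserviceaccount.com" \
  --role="roles/pubsub.publisher"
```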
For more granular control, permissions can be scoped to specific topics. Atlas Stream Processing also supports Google Cloud Private Service Connect for organizations that require private network connectivity.
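For topic-scoped grants, the binding can be applied to an individual topic instead of the whole project; again, the topic name and service-account email are illustrative placeholders:

```shell
# Grant publish rights on a single topic rather than project-wide.
# "order-events" and the service-account email are placeholders.
gcloud pubsub topics add-iam-policy-binding order-events \
  --member="serviceAccount:atlas-sp@example.iam.gserviceaccount.com" \
  --role="roles/pubsub.publisher"
```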
Defining your connection
Add Google Cloud Pub/Sub as a connection in the Connection Registry through the Atlas UI, CLI, Admin API, or Terraform. The connection stores your authentication details and can be validated before use:
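Once the connection is registered, you can confirm it from mongosh connected to your stream processing instance. The connection name below is illustrative, not a required value:

```javascript
// List the connections registered in the Connection Registry and
// confirm the new Pub/Sub entry (e.g., "pubsubConnection") appears.
sp.listConnections()
```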
Streaming data to Pub/Sub
Once configured, use the $emit stage in your stream processor to publish messages to Pub/Sub topics:
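A minimal processor might look like the following sketch: it reads change events from an Atlas collection and publishes them to a Pub/Sub topic. The connection, database, collection, and topic names are placeholders, and the $emit fields mirror the documented syntax for other sinks, so check the Pub/Sub connection docs for the exact shape:

```javascript
// Illustrative stream processor: Atlas change events -> Pub/Sub topic.
// All names here are placeholders.
sp.createStreamProcessor("ordersToPubSub", [
  {
    $source: {
      connectionName: "myAtlasCluster", // Atlas database connection
      db: "sales",
      coll: "orders"
    }
  },
  {
    $emit: {
      connectionName: "pubsubConnection", // Pub/Sub connection from the registry
      topic: "order-events"
    }
  }
]);

// Start the processor.
sp.ordersToPubSub.start();
```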
The $emit stage supports:
Dynamic topic routing: Use expressions to route messages to different topics based on document content
Message ordering: Specify ordering keys so that related messages are delivered in the order they were published
Custom attributes: Attach metadata for downstream filtering and routing
Flexible formatting: Choose from multiple JSON serialization formats
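The capabilities above can be combined in a single $emit stage. In this sketch, the specific option names (orderingKey, attributes, outputFormat) are assumptions based on the listed features and on $emit options for other sinks; consult the documentation for the exact spelling:

```javascript
// Hypothetical $emit stage combining dynamic routing, ordering keys,
// custom attributes, and an output format. Field names under "config"
// are illustrative assumptions, not confirmed syntax.
{
  $emit: {
    connectionName: "pubsubConnection",
    // Dynamic topic routing: high-value orders go to a separate topic.
    topic: {
      $cond: [
        { $gte: ["$fullDocument.total", 1000] },
        "orders-priority",
        "orders-standard"
      ]
    },
    config: {
      orderingKey: "$fullDocument.customerId",      // preserve per-customer order
      attributes: { source: "atlas-stream-processing" }, // downstream filtering
      outputFormat: "relaxedJson"                   // one of the JSON formats
    }
  }
}
```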
The integration handles message delivery, retries, and error management automatically.
Monitoring and observability
Atlas Stream Processing provides built-in monitoring for your Pub/Sub integrations through the metrics dashboard. Track message delivery rates, monitor latency, and troubleshoot issues with detailed performance metrics. The monitoring UI provides real-time visibility into your streaming pipelines.
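Alongside the dashboard, a running processor can be inspected directly from mongosh; the processor name below is the illustrative one used earlier:

```javascript
// Report runtime statistics for a processor (state, input/output
// message counts, and similar metrics), complementing the dashboard.
sp.ordersToPubSub.stats()
```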
Key use cases
Powering agentic AI with real-time context
Agentic AI applications require continuous streams of fresh data to make informed decisions. Stream operational events from MongoDB to Pub/Sub, enabling AI agents to react to inventory changes, customer behavior shifts, or system events as they happen.
Real-time analytics in BigQuery
Stream operational data from MongoDB directly into Pub/Sub, routing to BigQuery for SQL-based analytics. Combine transactional MongoDB data with analytical workloads without complex ETL pipelines.
Event-driven architectures
Propagate database changes to Pub/Sub topics, triggering Cloud Functions, Dataflow jobs, or other GCP services in response to data events. Build responsive systems that act on data changes the moment they occur.
IoT and sensor data
Ingest high-velocity IoT data through MongoDB, enrich and transform it with stream processing, then emit to Pub/Sub for distribution across your GCP ecosystem.
Google Cloud Pub/Sub support is part of MongoDB's continued investment in making Atlas Stream Processing a native part of your cloud data ecosystem, whether you're building AI pipelines, event-driven architectures, or real-time analytics workflows.
Next steps
Google Cloud Pub/Sub support is available now to Atlas Stream Processing users on all tiers. Visit the Atlas Stream Processing documentation to configure your first Pub/Sub connection, or explore our connection registry documentation for detailed setup instructions.