
Install with a Containerized Cluster Using Docker

On this page

  • About this Task
  • Before you Begin
  • Steps
  • Next Steps
  • Learn More

About this Task

You can install Relational Migrator with Kafka in a containerized environment using Docker. This method configures Kafka to store data locally in Docker containers. Use this method if you want to learn how to configure your own multi-server Kafka environment.

Warning

This deployment method is ideal for quick evaluation, but it is not recommended for production workloads because it does not provide a resilient production environment.

This deployment method uses a docker-compose file to set up a Kafka node, a Kafka Connect node, and a Relational Migrator node.
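As a rough illustration of how the three services fit together, a compose file of this shape might look like the following sketch. The service names, image tags, port mappings, and volume paths here are assumptions for illustration only; use the actual reference file from the Download Center.

version: "3"
services:
  kafka:                        # single-node Kafka broker (stores data locally)
    image: apache/kafka:3.7.0   # assumed image; runs a default KRaft broker
    ports:
      - "9092:9092"
  kafka-connect:                # hosts the migration connectors
    image: confluentinc/cp-kafka-connect:7.6.0   # assumed image; the real file
    depends_on:                                  # also sets bootstrap servers,
      - kafka                                    # topics, and converters
  migrator:                     # the Relational Migrator web application
    image: mongodb/mongodb-relational-migrator:latest   # assumed image name
    ports:
      - "8278:8278"             # assumed default UI port
    volumes:
      - ${MIGRATOR_DATA_PATH}:/data   # hypothetical container path
    depends_on:
      - kafka-connect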

Before you Begin

You must have Docker installed on your computer. For more information, see Install Docker Engine.
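To confirm that Docker and Docker Compose are installed and on your PATH, you can check their versions:

docker --version
docker-compose --version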

Steps

1. In the Download Center, select Docker as the platform. Then select the Kafka reference implementation file.

2. In your docker-compose file, configure the following variables:

  1. Configure MIGRATOR_DATA_PATH.

    Set the MIGRATOR_DATA_PATH variable to a path where Relational Migrator saves data for persistence.

  2. (Optional) If your source database is MySQL or Oracle, configure MIGRATOR_PATH_DRIVER.

    Relational Migrator uses the JDBC driver of the source database to read the database schema. The SQL Server and PostgreSQL JDBC drivers are bundled; for MySQL and Oracle, you must add the drivers yourself.

    Set the MIGRATOR_PATH_DRIVER variable to the location of the .jar file for the additional JDBC driver. A sketch of both settings follows this step.
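If the compose file references these variables with ${...} substitution, you can set them in your shell or in a .env file next to the compose file (docker-compose loads a .env file automatically). A hypothetical example, with placeholder paths:

# Hypothetical .env file next to docker-compose-migrator-kafka.yml.
# Host directory where Relational Migrator persists its data.
MIGRATOR_DATA_PATH=/home/user/migrator/data
# Only needed for MySQL or Oracle sources; the .jar name is a placeholder.
MIGRATOR_PATH_DRIVER=/home/user/migrator/drivers/mysql-connector-j-8.4.0.jar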

3. Run the following command to download the Docker images for your setup:

docker-compose -f docker-compose-migrator-kafka.yml pull
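The pull can take a few minutes the first time. You can confirm that the images were downloaded with a standard Docker listing:

docker image ls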
4. Run the following command to start Relational Migrator with Docker:

docker-compose -f docker-compose-migrator-kafka.yml up
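The up command runs in the foreground and streams the logs of all three containers. To run the stack in the background instead, add the -d flag, then follow the logs as needed:

docker-compose -f docker-compose-migrator-kafka.yml up -d
docker-compose -f docker-compose-migrator-kafka.yml logs -f

Once the containers are running, open the Relational Migrator UI in your browser. The Docker deployment typically maps the UI to port 8278 (http://localhost:8278); if that address does not respond, check the port mapping in your compose file.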
Next Steps

  • Projects

  • Create a Sync Job