Install with a Containerized Cluster Using Docker
You can install Relational Migrator with Kafka in a containerized environment using Docker. This method configures Kafka to store data locally in Docker containers. Use this method if you want to learn how to configure your own multi-server Kafka environment.
Warning
This deployment method is ideal for quick evaluation. It is not recommended for production workloads, as it may not provide a resilient environment.
About this Task
This deployment method uses a docker-compose file to set up a Kafka node, a Kafka Connect node, and a Relational Migrator node.
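As a rough illustration, the three services described above might be wired together in a compose file along these lines. This is a sketch only: the service names, image tags, and ports shown here are assumptions, and the reference implementation file from the Download Center is the authoritative version.

```yaml
# Illustrative sketch only -- the downloaded reference implementation file
# defines the actual service names, images, and settings.
services:
  kafka:
    image: apache/kafka:latest        # assumed image; the reference file may differ
    ports:
      - "9092:9092"                   # Kafka broker listener (assumed)
  kafka-connect:
    image: confluentinc/cp-kafka-connect:latest   # assumed image
    depends_on:
      - kafka
  migrator:
    image: mongodb/mongodb-relational-migrator:latest   # assumed image
    depends_on:
      - kafka-connect
    ports:
      - "8278:8278"                   # Relational Migrator web UI (assumed port)
```

The key point is the dependency chain: Kafka Connect waits on the Kafka broker, and Relational Migrator waits on Kafka Connect, so the containers start in a usable order.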
Before you Begin
You must have Docker installed on your computer. For more information, see Install Docker Engine.
Steps
Download the docker-compose file
In the Download Center, select Docker as the platform. Then select the Kafka reference implementation file.
Configure environment variables
Configure MIGRATOR_DATA_PATH

In your docker-compose file, set the MIGRATOR_DATA_PATH variable to a path where Relational Migrator saves data for persistence.

(Optional) If your source database is MySQL or Oracle, configure MIGRATOR_PATH_DRIVER
Relational Migrator uses the source database's JDBC driver to read the database schema. The SQL Server and PostgreSQL JDBC drivers are bundled. For MySQL and Oracle, you must add the drivers yourself.
In your docker-compose file, set the MIGRATOR_PATH_DRIVER variable to the location of the .jar file for the additional JDBC drivers.
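Putting the two variables together, the relevant portion of the compose file might look like the sketch below. The paths are placeholders to replace with locations on your own host, and the service name and exact placement of the variables are assumptions; follow the structure of the downloaded reference file.

```yaml
# Sketch of the environment settings discussed above (values are placeholders).
services:
  migrator:
    environment:
      # Path where Relational Migrator persists its data
      MIGRATOR_DATA_PATH: /data/migrator
      # Only needed for MySQL or Oracle sources: location of the
      # additional JDBC driver .jar file
      MIGRATOR_PATH_DRIVER: /drivers/mysql-connector-j.jar
```

If your source database is SQL Server or PostgreSQL, you can omit MIGRATOR_PATH_DRIVER entirely, since those drivers are bundled.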