A Spring Boot-based tool for migrating data between relational databases (MySQL, PostgreSQL, Oracle) and generating mock datasets. Built with Spring Batch for robust, transactional workflows.
## Table of Contents

- [Features](#features)
- [Tech Stack](#tech-stack)
- [Getting Started](#getting-started)
- [API Documentation](#api-documentation)
- [Workflow Details](#workflow-details)
- [Configuration](#configuration)
- [Roadmap](#roadmap)
- [Contributing](#contributing)
- [License](#license)
## Features

- Cross-Database Migration: Move data between MySQL, PostgreSQL, and Oracle.
- Mock Data Generation: Insert customizable test datasets into any supported database.
- Batch Processing: Chunk-based (100 records/transaction) processing for large datasets.
- Multi-DB Configuration: Isolated connection pools and JPA configurations for each database (see the sketch after this list).
- REST API Integration: Trigger operations via Swagger-documented endpoints.
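To illustrate the multi-DB configuration mentioned above, the sketch below shows the general shape of an isolated JPA configuration for the MySQL side: its own `EntityManagerFactory` and transaction manager wired to a MySQL-specific `DataSource`. The class and package names here are assumptions for illustration; `mySqlTransactionManager` matches the bean name referenced later in this README, and the `mySqlDataSource` qualifier refers to a MySQL-only pool (an example binding appears in the Configuration section).

```java
import jakarta.persistence.EntityManagerFactory;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.PlatformTransactionManager;

// Illustrative sketch: an isolated JPA configuration for the MySQL database.
// Analogous classes would exist for PostgreSQL and Oracle, each with its own
// DataSource, EntityManagerFactory, and transaction manager.
@Configuration
@EnableJpaRepositories(
        basePackages = "com.example.dbmigrator.mysql.repository",    // assumed package
        entityManagerFactoryRef = "mySqlEntityManagerFactory",
        transactionManagerRef = "mySqlTransactionManager")
public class MySqlJpaConfig {

    @Bean
    public LocalContainerEntityManagerFactoryBean mySqlEntityManagerFactory(
            @Qualifier("mySqlDataSource") DataSource mySqlDataSource) {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(mySqlDataSource);                           // MySQL-only connection pool
        emf.setPackagesToScan("com.example.dbmigrator.mysql.entity"); // assumed entity package
        emf.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        return emf;
    }

    @Bean
    public PlatformTransactionManager mySqlTransactionManager(
            @Qualifier("mySqlEntityManagerFactory") EntityManagerFactory emf) {
        return new JpaTransactionManager(emf); // transactions scoped to the MySQL unit only
    }
}
```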
## Tech Stack

- Core: Spring Boot 3.4.2, Java 21
- Persistence: Spring Data JPA, Hibernate
- Batch Processing: Spring Batch
- Databases: MySQL, PostgreSQL, Oracle
- Utilities: Lombok, MapStruct, SpringDoc (OpenAPI 3)
- Build: Maven
## Getting Started

### Prerequisites

- Java 21, Maven, Docker (optional).
- Running instances of MySQL, PostgreSQL, and/or Oracle.

### Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/CodexParas/db-migrator.git
   cd db-migrator
   ```

2. Install dependencies:

   ```bash
   mvn clean install
   ```

3. Configure databases: update `src/main/resources/application.yaml` with your database credentials.

4. Run the application:

   ```bash
   mvn spring-boot:run
   ```

To start the databases with Docker:

```bash
docker-compose up -d   # Launches MySQL, PostgreSQL, and Oracle containers
```
## API Documentation

Interactive Swagger UI: access it at `http://localhost:9121/api/swagger-ui.html` after starting the app.
### Migrate Data

```http
POST /api/migrate/toMySql
Content-Type: application/json

{
  "source": "POSTGRES" # [MYSQL, POSTGRES, ORACLE]
}
```

Response:

```json
{
  "status": "SUCCESS",
  "message": "Data migrated",
  "data": null
}
```
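For orientation, here is a minimal sketch of how such an endpoint could be wired. The controller, DTOs, and service contract below (`MigrationController`, `MigrationRequest`, `ApiResponse`, `MigrationService`) are illustrative assumptions, not the repository's actual classes.

```java
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical controller sketch; class, method, and DTO names are illustrative.
@RestController
@RequestMapping("/migrate") // served under the /api context path (see Configuration)
public class MigrationController {

    private final MigrationService migrationService;

    public MigrationController(MigrationService migrationService) {
        this.migrationService = migrationService;
    }

    @PostMapping("/toMySql")
    public ResponseEntity<ApiResponse> migrateToMySql(@RequestBody MigrationRequest request) {
        // "source" is expected to be one of MYSQL, POSTGRES, ORACLE
        migrationService.migrateToMySql(request.source());
        return ResponseEntity.ok(new ApiResponse("SUCCESS", "Data migrated", null));
    }

    // Minimal request/response types and service contract assumed for this sketch
    public record MigrationRequest(String source) {}
    public record ApiResponse(String status, String message, Object data) {}

    public interface MigrationService {
        void migrateToMySql(String source); // launches the corresponding Spring Batch job
    }
}
```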
### Insert Mock Data

```http
POST /api/data/insert/mysql
Content-Type: application/json

{
  "count": 1000 # Number of records to generate
}
```

Response:

```json
{
  "status": "SUCCESS",
  "message": "Data inserted",
  "data": null
}
```
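Outside of Swagger, the endpoints can be called from any HTTP client. As a small usage example, the snippet below triggers mock-data insertion with Java's built-in `HttpClient`; it assumes the application is running locally on the port and context path shown in the Configuration section.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal client example: asks the running service to insert 1000 mock records into MySQL.
public class InsertMockDataClient {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9121/api/data/insert/mysql"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"count\": 1000}"))
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        // Expected body: {"status":"SUCCESS","message":"Data inserted","data":null}
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```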
## Workflow Details

### Migration Workflow

```mermaid
sequenceDiagram
    User->>API: POST /migrate/toMySql {source: "POSTGRES"}
    API->>MigrationService: Trigger Job
    MigrationService->>Spring Batch: Run migrateToMysqlJob
    Spring Batch->>Source DB: Read data via DbDataSupplier
    Spring Batch->>Target DB: Write data via JPA Repository
    Spring Batch-->>User: Return success
```

Steps:

- Reader: Fetches data from the source DB (e.g., PostgreSQL).
- Processor: Pass-through (no transformation; extend for custom logic).
- Writer: Saves data to the target DB (e.g., MySQL) in chunks (100 records/transaction).
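To make the reader/processor/writer split concrete, here is a simplified sketch of a chunk-oriented step and job using the Spring Batch 5 builders. `MySqlClientEntity`, `mySqlTransactionManager`, and `migrateToMysqlJob` appear elsewhere in this README; the remaining bean names and wiring are assumptions, not the project's actual code.

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.job.builder.JobBuilder;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

// Illustrative sketch of the chunk-oriented migration job (Spring Batch 5 API).
// MySqlClientEntity is the project's JPA entity; reader/writer beans are assumed.
@Configuration
public class MigrateToMysqlJobConfig {

    @Bean
    public Step migrateToMysqlStep(JobRepository jobRepository,
                                   PlatformTransactionManager mySqlTransactionManager,
                                   ItemReader<MySqlClientEntity> sourceDbReader,
                                   ItemWriter<MySqlClientEntity> mySqlJpaWriter) {
        return new StepBuilder("migrateToMysqlStep", jobRepository)
                // 100 records per transaction, matching the chunk size described above
                .<MySqlClientEntity, MySqlClientEntity>chunk(100, mySqlTransactionManager)
                .reader(sourceDbReader)   // reads from the source DB (e.g., PostgreSQL)
                // no processor registered: items pass through unchanged
                .writer(mySqlJpaWriter)   // persists to the target DB via a JPA repository
                .build();
    }

    @Bean
    public Job migrateToMysqlJob(JobRepository jobRepository, Step migrateToMysqlStep) {
        return new JobBuilder("migrateToMysqlJob", jobRepository)
                .start(migrateToMysqlStep)
                .build();
    }
}
```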
### Mock Data Insertion Workflow

```mermaid
sequenceDiagram
    User->>API: POST /data/insert/mysql {count: 500}
    API->>DataService: Trigger Job
    DataService->>Spring Batch: Run MySqlDataInsertJob
    Spring Batch->>MockDataGenerator: Generate 500 records
    Spring Batch->>MySQL: Save data via JPA Repository
    Spring Batch-->>User: Return success
```

Steps:

- Reader: Generates mock data (e.g., names, emails) using `MockDataGenerator`.
- Writer: Inserts records into the target database.
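As an illustration of the reader side, the sketch below shows one way a mock-data reader could produce `count` records and then end the step by returning `null`. The `MockDataReader` and `MockClient` types are assumptions for this sketch, not the project's actual `MockDataGenerator`.

```java
import java.util.concurrent.ThreadLocalRandom;
import java.util.concurrent.atomic.AtomicInteger;
import org.springframework.batch.item.ItemReader;

// Simple record standing in for the project's entity class
record MockClient(String name, String email, int age) {}

// Illustrative reader: produces `count` mock records, then signals end-of-input with null.
public class MockDataReader implements ItemReader<MockClient> {

    private final int count;
    private final AtomicInteger generated = new AtomicInteger();

    public MockDataReader(int count) {
        this.count = count;
    }

    @Override
    public MockClient read() {
        int i = generated.incrementAndGet();
        if (i > count) {
            return null; // null tells Spring Batch the step has no more input
        }
        String name = "user" + i;
        String email = name + "@example.com";
        int age = ThreadLocalRandom.current().nextInt(18, 80);
        return new MockClient(name, email, age);
    }
}
```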
## Configuration

Key settings in `src/main/resources/application.yaml`:

```yaml
server:
  port: 9121
  servlet:
    context-path: /api

spring:
  datasource:
    mysql:
      jdbcUrl: jdbc:mysql://localhost:3306/db_migrator
      username: root
      password: root
  batch:
    job.enabled: false             # Disable auto-startup of jobs
    jdbc.initialize-schema: ALWAYS # Create batch tables on startup
```
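As a sketch of how the `spring.datasource.mysql` block above could be bound to its own connection pool (the bean name and builder-based binding are assumptions; `jdbcUrl` matches HikariCP's property name, so the values bind directly):

```java
import javax.sql.DataSource;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Sketch: binds the spring.datasource.mysql properties above to a dedicated pool.
// Analogous beans (e.g., postgresDataSource, oracleDataSource) would bind the other prefixes.
@Configuration
public class MySqlDataSourceConfig {

    @Bean
    @ConfigurationProperties("spring.datasource.mysql")
    public DataSource mySqlDataSource() {
        // jdbcUrl, username, and password from the bound prefix configure the pool
        return DataSourceBuilder.create().build();
    }
}
```

Each database's pool is then consumed only by that database's JPA configuration (see the earlier `MySqlJpaConfig` sketch).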
- Chunk Size: Adjust in job configurations (e.g., `MigrateToMysql.java`):

  ```java
  .<MySqlClientEntity, MySqlClientEntity>chunk(200, mySqlTransactionManager)
  ```
## Roadmap

- MongoDB Support: Migrate to/from MongoDB collections.
- Rollback Mechanism: API-driven undo for migrations.
- Enhanced Monitoring: Integrate Spring Actuator for job metrics.
- Error Handling: Retry policies, dead-letter queues.
- Authentication: Basic API key/JWT support.
## Contributing

Contributions welcome!

- Fork the repository.
- Create a feature branch (`git checkout -b feature/your-feature`).
- Add tests for new functionality.
- Submit a pull request with a detailed description.