
Addax Logo

Addax is a versatile open-source ETL tool that can seamlessly transfer data between various RDBMS and NoSQL databases, making it an ideal solution for data migration.

The documentation (https://wgzhao.github.io/Addax/) describes in detail how to install and use each plugin, and provides sample configurations for every plugin.


English | 简体中文

The project's initial code originated from Alibaba's DataX and has been substantially improved upon since then; it also provides more reader and writer plugins. For more details, please refer to the difference document.

Supported Data Sources

Addax supports more than 20 SQL and NoSQL data sources, and it can be extended to support additional ones.

Cassandra Clickhouse IBM DB2 dBase
Doris Elasticsearch Excel Greenplum
Apache HBase Hive InfluxDB Kafka
Kudu MinIO MongoDB MySQL
Oracle Phoenix PostgreSQL Presto
Redis Amazon S3 SQLite SQLServer
Starrocks Sybase TDengine Trino
Access SAP HANA

Getting Started

Use docker image

```shell
docker pull wgzhao/addax:latest
docker run -ti --rm --name addax wgzhao/addax:latest /opt/addax/bin/addax.sh /opt/addax/job/job.json
```

If you only need the common reader and writer plugins, you can pull the image whose tag ends with -lite; it is much smaller.

```shell
docker pull wgzhao/addax:4.0.12-lite
docker run -ti --rm --name addax wgzhao/addax:4.0.12-lite /opt/addax/bin/addax.sh /opt/addax/job/job.json
```
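The commands above run the sample job bundled in the image. To run a job of your own, one option is to bind-mount a local directory into the container; the directory and file names below are only placeholders for this sketch, not part of the image.

```shell
# a minimal sketch: mount a local ./jobs directory into the container and
# point addax.sh at a job file inside it (my_job.json is a placeholder name)
docker run -ti --rm --name addax \
    -v "$(pwd)/jobs:/opt/addax/jobs" \
    wgzhao/addax:latest \
    /opt/addax/bin/addax.sh /opt/addax/jobs/my_job.json
```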

Use install script

```shell
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/wgzhao/Addax/master/install.sh)"
```

This script installs Addax to its preferred prefix (/usr/local for macOS Intel, /opt/addax for Apple Silicon, and /opt/addax/ for Linux).
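To verify the installation, you can run the bundled sample job from the install prefix. The path below is a sketch that assumes the Linux prefix /opt/addax mentioned above and that the sample job directory ships with the install; adjust it for your platform.

```shell
# smoke test of a fresh install; adjust the prefix for macOS installs
/opt/addax/bin/addax.sh /opt/addax/job/job.json
```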

Compile and Package

```shell
git clone https://github.com/wgzhao/addax.git addax
cd addax
mvn clean package
mvn package assembly:single
```

After successful compilation and packaging, an addax-<version> folder is created in the project's target/datax directory, where <version> indicates the version number.
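The relative paths used in the next section (bin/addax.sh, job/job.json) are resolved from inside that folder, so switch into it first; a quick sanity check might look like this:

```shell
# switch into the freshly packaged distribution before running any jobs;
# replace <version> with the version string produced by the build
cd target/datax/addax-<version>
ls bin job    # the launcher script and the sample jobs referenced below
```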

Begin your first task

The job subdirectory contains many sample jobs, of which job.json can be used as a smoke test; execute it as follows:

```shell
bin/addax.sh job/job.json
```

The output of the above command is roughly as follows.

```shell
$ bin/addax.sh job/job.json

  ___      _     _
 / _ \    | |   | |
/ /_\ \ __| | __| | __ ___  __
|  _  |/ _` |/ _` |/ _` \ \/ /
| | | | (_| | (_| | (_| |>  <
\_| |_/\__,_|\__,_|\__,_/_/\_\

:: Addax version ::    (v4.0.13-SNAPSHOT)

2023-05-14 11:43:38.040 [        main] INFO  VMInfo - VMInfo# operatingSystem class => sun.management.OperatingSystemImpl
2023-05-14 11:43:38.062 [        main] INFO  Engine -
{
  "setting":{
    "speed":{
      "byte":-1,
      "channel":1,
      "record":-1
    }
  },
  "content":{
    "reader":{
      "name":"streamreader",
      "parameter":{
        "sliceRecordCount":10,
        "column":[
          {
            "value":"addax",
            "type":"string"
          },
          {
            "value":19890604,
            "type":"long"
          },
          {
            "value":"1989-06-04 11:22:33 123456",
            "type":"date",
            "dateFormat":"yyyy-MM-dd HH:mm:ss SSSSSS"
          },
          {
            "value":true,
            "type":"bool"
          },
          {
            "value":"test",
            "type":"bytes"
          }
        ]
      }
    },
    "writer":{
      "name":"streamwriter",
      "parameter":{
        "print":true,
        "encoding":"UTF-8"
      }
    }
  }
}

2023-05-14 11:43:38.092 [        main] INFO  JobContainer - The jobContainer begins to process the job.
2023-05-14 11:43:38.107 [       job-0] INFO  JobContainer - The Reader.Job [streamreader] perform prepare work .
2023-05-14 11:43:38.107 [       job-0] INFO  JobContainer - The Writer.Job [streamwriter] perform prepare work .
2023-05-14 11:43:38.108 [       job-0] INFO  JobContainer - Job set Channel-Number to 1 channel(s).
2023-05-14 11:43:38.108 [       job-0] INFO  JobContainer - The Reader.Job [streamreader] is divided into [1] task(s).
2023-05-14 11:43:38.108 [       job-0] INFO  JobContainer - The Writer.Job [streamwriter] is divided into [1] task(s).
2023-05-14 11:43:38.130 [       job-0] INFO  JobContainer - The Scheduler launches [1] taskGroup(s).
2023-05-14 11:43:38.138 [ taskGroup-0] INFO  TaskGroupContainer - The taskGroupId=[0] started [1] channels for [1] tasks.
2023-05-14 11:43:38.141 [ taskGroup-0] INFO  Channel - The Channel set byte_speed_limit to -1, No bps activated.
2023-05-14 11:43:38.141 [ taskGroup-0] INFO  Channel - The Channel set record_speed_limit to -1, No tps activated.
addax  19890604  1989-06-04 11:24:36  true  test
addax  19890604  1989-06-04 11:24:36  true  test
addax  19890604  1989-06-04 11:24:36  true  test
addax  19890604  1989-06-04 11:24:36  true  test
addax  19890604  1989-06-04 11:24:36  true  test
addax  19890604  1989-06-04 11:24:36  true  test
addax  19890604  1989-06-04 11:24:36  true  test
addax  19890604  1989-06-04 11:24:36  true  test
addax  19890604  1989-06-04 11:24:36  true  test
addax  19890604  1989-06-04 11:24:36  true  test
2023-05-14 11:43:41.157 [       job-0] INFO  AbstractScheduler - The scheduler has completed all tasks.
2023-05-14 11:43:41.158 [       job-0] INFO  JobContainer - The Writer.Job [streamwriter] perform post work.
2023-05-14 11:43:41.159 [       job-0] INFO  JobContainer - The Reader.Job [streamreader] perform post work.
2023-05-14 11:43:41.162 [       job-0] INFO  StandAloneJobContainerCommunicator - Total 10 records, 260 bytes | Speed 86B/s, 3 records/s | Error 0 records, 0 bytes | All Task WaitWriterTime 0.000s | All Task WaitReaderTime 0.000s | Percentage 100.00%
2023-05-14 11:43:41.596 [       job-0] INFO  JobContainer -
Job start at      : 2023-05-14 11:43:38
Job end at        : 2023-05-14 11:43:41
Job took secs     : 3s
Average bps       : 86B/s
Average rps       : 3rec/s
Number of rec     : 10
Failed record     : 0
```

More job configuration examples are provided here and here.
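If you want to write a job of your own, the sketch below mirrors the streamreader-to-streamwriter job shown in the output above, with the column list trimmed. The file name my_first_job.json is arbitrary, and other plugins take different parameters, so consult each plugin's documentation for the exact options; the bundled job/job.json is the authoritative template if in doubt.

```shell
# a minimal hand-written job, modeled on the bundled streamreader -> streamwriter
# example; the job file wraps the "setting" and "content" blocks in a top-level
# "job" element (the engine log above prints only the contents of that element)
cat > job/my_first_job.json <<'EOF'
{
  "job": {
    "setting": {
      "speed": { "byte": -1, "channel": 1, "record": -1 }
    },
    "content": {
      "reader": {
        "name": "streamreader",
        "parameter": {
          "sliceRecordCount": 10,
          "column": [
            { "value": "addax", "type": "string" },
            { "value": 19890604, "type": "long" }
          ]
        }
      },
      "writer": {
        "name": "streamwriter",
        "parameter": { "print": true, "encoding": "UTF-8" }
      }
    }
  }
}
EOF

bin/addax.sh job/my_first_job.json
```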

Runtime Requirements

Documentation

Compile

First, you need to install the following Python 3 module:

```shell
python3 -m pip install mkdocs-material
```

You can use the mkdocs command to build the documentation or preview it locally:

```shell
mkdocs build
mkdocs serve -a 0.0.0.0:8888
```

Use the following commands to publish the release documentation:

```shell
export version=4.1.5
git checkout $version
mike deploy $version
git checkout gh-pages
git push -u origin gh-pages
```

Code Style

We recommend you use IntelliJ IDEA as your IDE. The code style template for the project can be found in the codestyle repository, along with our general programming and Java guidelines. In addition to those, you should also adhere to the following:

Star History

Star History Chart

License

This software is free to use under the Apache License 2.0.

Special Thanks

Special thanks to JetBrains for its support of this project.