---
title: Overview
weight: 1
type: docs
aliases:
  - /connectors/pipeline-connectors/overview
---

# Connectors

Flink CDC provides several source and sink connectors to interact with external systems. You can use these connectors out of the box by adding the released JARs to your Flink CDC environment and specifying the connector in your YAML pipeline definition.
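As a rough illustration, the sketch below shows the general shape of such a YAML pipeline definition, reading from a MySQL source and writing to an Apache Doris sink. The hostnames, credentials, and table pattern are placeholders; the exact options supported by each connector are documented on the connector pages linked in the table below.

```yaml
# Minimal sketch of a pipeline definition; connection details are placeholders.
source:
  type: mysql             # a source connector from the table below
  hostname: localhost
  port: 3306
  username: root
  password: "secret"
  tables: app_db.\.*      # capture every table in the app_db database

sink:
  type: doris             # a sink connector from the table below
  fenodes: 127.0.0.1:8030
  username: root
  password: ""

pipeline:
  name: Sync MySQL to Doris
  parallelism: 2
```

A definition like this is submitted with the `flink-cdc.sh` script shipped in the Flink CDC distribution.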

## Supported Connectors

| Connector | Supported Type | External System |
|-----------|----------------|------------------|
| [Apache Doris]({{< ref "docs/connectors/pipeline-connectors/doris" >}}) | Sink | Apache Doris: 1.2.x, 2.x.x |
| [Elasticsearch]({{< ref "docs/connectors/pipeline-connectors/elasticsearch" >}}) | Sink | Elasticsearch: 6.x, 7.x, 8.x |
| [Kafka]({{< ref "docs/connectors/pipeline-connectors/kafka" >}}) | Sink | Kafka |
| [MySQL]({{< ref "docs/connectors/pipeline-connectors/mysql" >}}) | Source | MySQL: 5.6, 5.7, 8.0.x <br> RDS MySQL: 5.6, 5.7, 8.0.x <br> PolarDB MySQL: 5.6, 5.7, 8.0.x <br> Aurora MySQL: 5.6, 5.7, 8.0.x <br> MariaDB: 10.x <br> PolarDB X: 2.0.1 |
| [Paimon]({{< ref "docs/connectors/pipeline-connectors/paimon" >}}) | Sink | Paimon: 0.6, 0.7, 0.8 |
| [StarRocks]({{< ref "docs/connectors/pipeline-connectors/starrocks" >}}) | Sink | StarRocks: 2.x, 3.x |
## Develop Your Own Connector

If the provided connectors cannot fulfill your requirements, you can always develop your own connector to get your external system involved in Flink CDC pipelines. Check out [Flink CDC APIs]({{< ref "docs/developer-guide/understand-flink-cdc-api" >}}) to learn how to develop your own connectors.

{{< top >}}