1.1. Destinations
A destination stage represents the target for a pipeline. A pipeline can include one or more destinations, and the destinations available depend on the pipeline's execution mode.
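Destinations are attached to a pipeline like any other stage. The sketch below shows one way to wire an origin to two of the destinations listed in this section, assuming the StreamSets SDK for Python (3.x) and a Data Collector running at a placeholder URL; the stage labels are taken from the lists below, and authentication details are omitted:

```python
# A minimal sketch, assuming the StreamSets SDK for Python (3.x) and a
# Data Collector reachable at the placeholder URL below (default credentials).
from streamsets.sdk import DataCollector

sdc = DataCollector(server_url='http://localhost:18630')  # placeholder URL
builder = sdc.get_pipeline_builder()

# A development origin supplies test records; any origin works the same way.
origin = builder.add_stage('Dev Raw Data Source')

# Add one or more destinations; the stage labels match the names listed below.
kafka = builder.add_stage('Kafka Producer')
trash = builder.add_stage('Trash')

# The >> operator connects a stage's output to a downstream stage's input.
# Connecting the origin to both stages sends every record to each destination.
origin >> kafka
origin >> trash

pipeline = builder.build('destination-example')
sdc.add_pipeline(pipeline)
```

Connecting the same origin to both stages illustrates the "one or more destinations" point above: each record from the origin is passed to every connected destination.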
In standalone or cluster pipelines, you can use the following destinations:
- Aerospike - Writes data to Aerospike.
- Amazon S3 - Writes data to Amazon S3.
- Azure Data Lake Store - Writes data to Microsoft Azure Data Lake Store.
- Azure Event Hub Producer - Writes data to Azure Event Hub.
- Azure IoT Hub Producer - Writes data to Microsoft Azure IoT Hub.
- Cassandra - Writes data to a Cassandra cluster.
- CoAP Client - Writes data to a CoAP endpoint.
- Einstein Analytics - Writes data to Salesforce Einstein Analytics.
- Elasticsearch - Writes data to an Elasticsearch cluster.
- Flume - Writes data to a Flume source.
- Google BigQuery - Streams data into Google BigQuery.
- Google Bigtable - Writes data to Google Cloud Bigtable.
- Google Cloud Storage - Writes data to Google Cloud Storage.
- Google Pub/Sub Publisher - Publishes messages to Google Pub/Sub.
- Hadoop FS - Writes data to the Hadoop Distributed File System (HDFS).
- HBase - Writes data to an HBase cluster.
- Hive Metastore - Creates and updates Hive tables as needed.
- Hive Streaming - Writes data to Hive.
- HTTP Client - Writes data to an HTTP endpoint.
- InfluxDB - Writes data to InfluxDB.
- JDBC Producer - Writes data to a database table using a JDBC connection.
- JMS Producer - Writes data to a JMS queue or topic.
- Kafka Producer - Writes data to a Kafka cluster.
- Kinesis Firehose - Writes data to a Kinesis Firehose delivery stream.
- Kinesis Producer - Writes data to Kinesis Streams.
- KineticaDB - Writes data to a table in a Kinetica cluster.
- Kudu - Writes data to Kudu.
- Local FS - Writes data to a local file system.
- MapR DB - Writes data as text, binary data, or JSON strings to MapR DB binary tables.
- MapR DB JSON - Writes data as JSON documents to MapR DB JSON tables.
- MapR FS - Writes data to MapR FS.
- MapR Streams Producer - Writes data to MapR Streams.
- MongoDB - Writes data to MongoDB.
- MQTT Publisher - Publishes messages to a topic on an MQTT broker.
- Named Pipe - Writes data to a named pipe.
- Redis - Writes data to Redis.
- Salesforce - Writes data to Salesforce.
- SDC RPC - Passes data to an SDC RPC origin in an SDC RPC pipeline.
- Solr - Writes data to a Solr node or cluster.
- To Error - Passes records to the pipeline for error handling.
- Trash - Removes records from the pipeline.
- WebSocket Client - Writes data to a WebSocket endpoint.
In standalone pipelines, you can also use the following destination:
- RabbitMQ Producer - Writes data to RabbitMQ.
In edge pipelines, you can use the following destinations:
- CoAP Client - Writes data to a CoAP endpoint.
- HTTP Client - Writes data to an HTTP endpoint.
- Kafka Producer - Writes data to a Kafka cluster.
- MQTT Publisher - Publishes messages to a topic on an MQTT broker.
- WebSocket Client - Writes data to a WebSocket endpoint.
To help create or test pipelines, you can use the following development destination:
- To Event - Generates event records for incoming records, to help test event handling.
For more information, see Development Stages.