Flink Series: Writing Sink Output with the Elasticsearch Connector

Contents

Result

Basics

Version selection

pom

Flink2ESDemo.java

Verification in Kibana


Result

[Figure 1: end result]

 

Basics

Flink Series: Creating a Flink development environment in IDEA (with a complete WordCount example)

Flink Series: Using the Kafka connector as a data source

ES Series: Kibana installation and configuration

ES Series: Elasticsearch installation and configuration

 

Version selection

[Figure 2: version selection]

pom



<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>csdn.xdoctorx</groupId>
    <artifactId>flink-demo01</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <!-- Flink version -->
        <flink.version>1.9.2</flink.version>
        <!-- Scala binary version used by the Flink artifacts -->
        <scala.binary.version>2.12</scala.binary.version>
        <!-- fastjson version -->
        <fastjson.version>1.2.28</fastjson.version>
    </properties>

    <dependencies>

        <!-- lombok -->
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <version>1.18.6</version>
        </dependency>

        <!-- fastjson -->
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>${fastjson.version}</version>
        </dependency>

        <!-- Flink core, streaming, and client APIs -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-wikiedits_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <!-- Kafka connector -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <!-- Elasticsearch 6 connector -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-elasticsearch6_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

    </dependencies>

</project>
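With this pom in place, the job jar can be built with the standard Maven package goal. A minimal sketch (assumes `mvn` is installed and you run it from the project root, where pom.xml lives):

```shell
# Build the job jar; -DskipTests is optional (the demo has no tests).
mvn clean package -DskipTests

# The resulting jar lands in target/, named from artifactId and version:
#   target/flink-demo01-1.0-SNAPSHOT.jar
```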

 

Flink2ESDemo.java

package csdn.xdoctorx;

import org.apache.flink.api.common.functions.RuntimeContext;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction;
import org.apache.flink.streaming.connectors.elasticsearch.RequestIndexer;
import org.apache.flink.streaming.connectors.elasticsearch.util.RetryRejectedExecutionFailureHandler;
import org.apache.flink.streaming.connectors.elasticsearch6.ElasticsearchSink;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * author: xdoctorx.blog.csdn.net
 * date: 2020.05.31 23.02.58
 * description: flink2ES
 */
public class Flink2ESDemo {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Map<String, String> properties = new HashMap<>();
        properties.put("bootstrap.servers", "192.168.40.148:9092");
        properties.put("group.id", "flink-xdoctorx");
        properties.put("enable.auto.commit", "true");
        properties.put("auto.commit.interval.ms", "1000");
        properties.put("auto.offset.reset", "earliest");
        properties.put("session.timeout.ms", "30000");
        properties.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("topic", "xdoctorx01");

        // parse user parameters
        ParameterTool parameterTool = ParameterTool.fromMap(properties);

        // Kafka source: consume plain strings from the configured topic
        FlinkKafkaConsumer<String> consumer = new FlinkKafkaConsumer<>(
                parameterTool.getRequired("topic"), new SimpleStringSchema(), parameterTool.getProperties());

        DataStream<String> messageStream = env.addSource(consumer);

        // Elasticsearch sink: one HTTP host, each message wrapped in a one-field document
        List<HttpHost> esHttphost = new ArrayList<>();
        esHttphost.add(new HttpHost("192.168.40.148", 9200, "http"));

        ElasticsearchSink.Builder<String> esSinkBuilder = new ElasticsearchSink.Builder<>(
                esHttphost,
                new ElasticsearchSinkFunction<String>() {
                    public IndexRequest createIndexRequest(String element) {
                        Map<String, String> json = new HashMap<>();
                        json.put("data", element);

                        return Requests.indexRequest()
                                .index("topic-flink-xdoctorx01")
                                .type("flink-xdoctorx01")
                                .source(json);
                    }

                    @Override
                    public void process(String element, RuntimeContext ctx, RequestIndexer indexer) {
                        indexer.add(createIndexRequest(element));
                    }
                }
        );

        // flush after every element so documents show up immediately (demo setting;
        // use larger bulks in production)
        esSinkBuilder.setBulkFlushMaxActions(1);
        // optionally customize the underlying REST client, e.g.:
        // esSinkBuilder.setRestClientFactory(restClientBuilder -> restClientBuilder.setDefaultHeaders(...));
        esSinkBuilder.setFailureHandler(new RetryRejectedExecutionFailureHandler());

        messageStream.addSink(esSinkBuilder.build());
        env.execute("flink learning connectors kafka");
    }

}
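To exercise the pipeline end to end, feed a few messages into the Kafka topic the job consumes. A minimal sketch with the stock console producer; `$KAFKA_HOME` and the broker address are assumptions matching the properties above:

```shell
# Publish two test messages to the topic read by Flink2ESDemo.
# $KAFKA_HOME and the broker address are assumptions; adjust to your setup.
$KAFKA_HOME/bin/kafka-console-producer.sh \
  --broker-list 192.168.40.148:9092 \
  --topic xdoctorx01 <<'EOF'
hello flink
hello elasticsearch
EOF
```

With setBulkFlushMaxActions(1), each message should appear in Elasticsearch almost immediately.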

 

Verification in Kibana

After configuring an index pattern in Kibana, you can query the data there.
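Besides the Kibana UI, the index can also be checked directly over the Elasticsearch REST API. A sketch, with the host and port assumed to match the HttpHost configured in the sink:

```shell
# Search the index written by the ES sink; each Kafka message appears
# as a document with a single "data" field.
curl -s 'http://192.168.40.148:9200/topic-flink-xdoctorx01/_search?pretty' \
  -H 'Content-Type: application/json' \
  -d '{"query": {"match_all": {}}}'
```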


[Figure 3: query results in Kibana]
