  • Flink Learning Notes: DataStream API

    A DataStream job in Flink implements transformations on data streams. A data stream can come from different sources, such as message queues, sockets, or files.

    Ref 

    https://ci.apache.org/projects/flink/flink-docs-stable/zh/dev/datastream_api.html
    

     Using the DataStream API requires a StreamExecutionEnvironment:

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
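    The environment builds the job lazily: sources, transformations, and sinks only declare a dataflow, and nothing runs until execute() is called. A minimal job skeleton (class and job names are illustrative):

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class StreamJob {
        public static void main(String[] args) throws Exception {
            // local or cluster environment, depending on where the job is launched
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // declare sources, transformations and sinks on env here ...

            // submit the job graph; nothing runs before this call
            env.execute("StreamJob");
        }
    }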
    

    The data sources supported by DataStream are: file-based, socket-based, collection-based, and custom.

    1.File-based

    readTextFile(path) - Reads text files, i.e. files that respect the TextInputFormat specification, line-by-line and returns them as Strings.
    
    readFile(fileInputFormat, path) - Reads (once) files as dictated by the specified file input format.
    
    readFile(fileInputFormat, path, watchType, interval, pathFilter, typeInfo) - This is the method called internally by the two previous ones. It reads files in the path based on the given fileInputFormat. Depending on the provided watchType, this source may periodically monitor (every interval ms) the path for new data (FileProcessingMode.PROCESS_CONTINUOUSLY), or process once the data currently in the path and exit (FileProcessingMode.PROCESS_ONCE). Using the pathFilter, the user can further exclude files from being processed.
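    A small sketch of the file-based sources, using the env created above; the paths and the 10-second monitoring interval are placeholder values:

    import org.apache.flink.api.java.io.TextInputFormat;
    import org.apache.flink.core.fs.Path;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.functions.source.FileProcessingMode;

    // read a text file once, line by line
    DataStream<String> lines = env.readTextFile("file:///tmp/input.txt");

    // monitor a directory and pick up new files every 10 seconds
    TextInputFormat format = new TextInputFormat(new Path("file:///tmp/input-dir"));
    DataStream<String> watched = env.readFile(
            format, "file:///tmp/input-dir",
            FileProcessingMode.PROCESS_CONTINUOUSLY, 10_000L);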
    

    2.Socket-based

    socketTextStream - Reads from a socket. Elements can be separated by a delimiter.
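    For example, with something like nc -lk 9999 serving text on the port (host and port are placeholders):

    // newline-delimited text from a socket
    DataStream<String> socketLines = env.socketTextStream("localhost", 9999);

    // the delimiter can also be passed explicitly
    DataStream<String> delimited = env.socketTextStream("localhost", 9999, "\n");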
    

    3.Collection-based

    fromCollection(Collection) - Creates a data stream from a Java java.util.Collection. All elements in the collection must be of the same type.
    
    fromCollection(Iterator, Class) - Creates a data stream from an iterator. The class specifies the data type of the elements returned by the iterator.
    
    fromElements(T ...) - Creates a data stream from the given sequence of objects. All objects must be of the same type.
    
    fromParallelCollection(SplittableIterator, Class) - Creates a data stream from an iterator, in parallel. The class specifies the data type of the elements returned by the iterator.
    
    generateSequence(from, to) - Generates the sequence of numbers in the given interval, in parallel.
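    Small sketches of the collection-based sources, using arbitrary sample values:

    import java.util.Arrays;

    // from a java.util.Collection
    DataStream<Integer> fromList = env.fromCollection(Arrays.asList(1, 2, 3));

    // from a sequence of elements of the same type
    DataStream<String> fromElems = env.fromElements("a", "b", "c");

    // the numbers 1..100, generated in parallel
    DataStream<Long> sequence = env.generateSequence(1, 100);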
    

    4.Custom

    addSource - Attach a new source function. For example, to read from Apache Kafka you can use addSource(new FlinkKafkaConsumer<>(...)). See connectors for more details.
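    A sketch of attaching the Kafka consumer as a custom source; the topic, bootstrap servers, and group id are placeholders, and the flink-connector-kafka dependency is assumed to be on the classpath:

    import java.util.Properties;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

    Properties props = new Properties();
    props.setProperty("bootstrap.servers", "localhost:9092");
    props.setProperty("group.id", "test-group");

    // each Kafka record is deserialized to a String
    DataStream<String> kafkaStream = env.addSource(
            new FlinkKafkaConsumer<>("my-topic", new SimpleStringSchema(), props));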
    

     Transformation operators supported by DataStream:

    https://ci.apache.org/projects/flink/flink-docs-release-1.12/zh/dev/stream/operators/
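    A short sketch chaining a few common operators (map, filter, keyBy, sum) on arbitrary sample data:

    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.api.java.tuple.Tuple2;

    DataStream<String> words = env.fromElements("flink", "spark", "flink");

    // map: one-to-one transform; filter: keep only matching elements
    words.map(String::toUpperCase)
         .filter(w -> w.startsWith("F"))
         .print();

    // keyBy + rolling aggregation on (word, count) tuples
    words.map(w -> Tuple2.of(w, 1))
         .returns(Types.TUPLE(Types.STRING, Types.INT)) // type hint needed for the lambda
         .keyBy(t -> t.f0)
         .sum(1)
         .print();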
    

      

     The data sinks supported by DataStream are:

    writeAsText() / TextOutputFormat - Writes elements line-wise as Strings. The Strings are obtained by calling the toString() method of each element.
    
    writeAsCsv(...) / CsvOutputFormat - Writes tuples as comma-separated value files. Row and field delimiters are configurable. The value for each field comes from the toString() method of the objects.
    
    print() / printToErr() - Prints the toString() value of each element on the standard out / standard error stream. Optionally, a prefix (msg) can be provided which is prepended to the output. This can help to distinguish between different calls to print. If the parallelism is greater than 1, the output will also be prepended with the identifier of the task which produced the output.
    
    writeUsingOutputFormat() / FileOutputFormat - Method and base class for custom file outputs. Supports custom object-to-bytes conversion.
    
    writeToSocket - Writes elements to a socket according to a SerializationSchema.
    
    addSink - Invokes a custom sink function. Flink comes bundled with connectors to other systems (such as Apache Kafka) that are implemented as sink functions.
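    Sketches of some of the sinks above; the output path and socket address are placeholders:

    import org.apache.flink.api.common.serialization.SimpleStringSchema;

    DataStream<String> result = env.fromElements("a", "b", "c");

    // print each element's toString() to stdout
    result.print();

    // write each element's toString() line-wise to a file
    result.writeAsText("file:///tmp/output.txt");

    // serialize each element and write it to a socket
    result.writeToSocket("localhost", 9999, new SimpleStringSchema());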
    

      
