  • Fundamentals of Stream Computing

    Spark programs are structured around RDDs: they involve reading data from stable storage into the RDD format, performing a number of computations and data transformations on the RDDs, and writing the resulting RDD to stable storage or collecting it to the driver. Thus, most of the power of Spark comes from its transformations: operations that are defined on RDDs and return RDDs.
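
    A minimal Java sketch of that read-transform-write structure, assuming Spark's Java RDD API and hypothetical HDFS paths:

    import java.util.List;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class RddPipelineSketch {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("RddPipelineSketch");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // 1. Read data from stable storage into an RDD (path is hypothetical).
            JavaRDD<String> lines = sc.textFile("hdfs:///input/numbers.txt");

            // 2. Transformations: operations defined on RDDs that return RDDs.
            JavaRDD<Integer> doubled = lines
                    .map(Integer::parseInt)
                    .map(x -> 2 * x);

            // 3. Write the result back to stable storage, or collect it to the driver.
            doubled.saveAsTextFile("hdfs:///output/doubled");
            List<Integer> onDriver = doubled.collect();
            System.out.println(onDriver.size() + " elements collected to the driver");

            sc.stop();
        }
    }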

    1. A core underlying layer is needed as the basic foundation.

    2. That layer provides the API to the higher levels.

    3. Stream computing = core underlying API + distributed RPC + computing templates + a cluster of executors.

    4. What is computed, the order in which it is computed, and the definition of the (K, V) pairs are entirely in the user's hands, expressed through the computing templates they define.

    5. We can say that distributed computing is a platform that provides computing templates for operating on user data that has been split and distributed across the cluster.

    6. Higher-level systems such as ML libraries and big-data SQL engines use these stream APIs to do their jobs.

    7. Remember that stream computing is a platform, or runtime, for operating on distributed data with computing templates (the transformation API); the sketch after this list illustrates the idea.

    8. There is a lot in common between stream computing and an operating system: both provide an API for operating on something, data streams in the former case and hardware in the latter.

    9. A stream-computing runtime exposes an API of computing templates / computing generics; an OS exposes an API for operating on the machine's hardware resources.
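
    A minimal Flink DataStream job skeleton, sketching what such a "computing template" looks like in practice (the socket source host/port and the word-count logic are assumptions for illustration):

    import org.apache.flink.api.common.functions.FlatMapFunction;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.util.Collector;

    public class ComputingTemplateSketch {
        public static void main(String[] args) throws Exception {
            // The runtime: a cluster of executors behind a single entry point.
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // User data, distributed across the cluster as a DataStream.
            DataStream<String> lines = env.socketTextStream("localhost", 9999);

            // The "computing template": the user decides what is computed, in
            // what order, and how the (K, V) pairs are defined.
            DataStream<Tuple2<String, Integer>> counts = lines
                    .flatMap(new FlatMapFunction<String, Tuple2<String, Integer>>() {
                        @Override
                        public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                            for (String word : line.split(" ")) {
                                out.collect(new Tuple2<>(word, 1));
                            }
                        }
                    })
                    .keyBy(0)
                    .sum(1);

            counts.print();
            env.execute("Computing Template Sketch");
        }
    }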

    Operators transform one or more DataStreams into a new DataStream. Programs can combine multiple transformations into sophisticated dataflow topologies.

    Map
    DataStream → DataStream

    Takes one element and produces one element. A map function that doubles the values of the input stream:

    DataStream<Integer> dataStream = //...
    dataStream.map(new MapFunction<Integer, Integer>() {
        @Override
        public Integer map(Integer value) throws Exception {
            return 2 * value;
        }
    });
        
    FlatMap
    DataStream → DataStream

    Takes one element and produces zero, one, or more elements. A flatMap function that splits sentences into words:

    dataStream.flatMap(new FlatMapFunction<String, String>() {
        @Override
        public void flatMap(String value, Collector<String> out)
            throws Exception {
            for(String word: value.split(" ")){
                out.collect(word);
            }
        }
    });
        
    Filter
    DataStream → DataStream

    Evaluates a boolean function for each element and retains those for which the function returns true. A filter that filters out zero values:

    dataStream.filter(new FilterFunction<Integer>() {
        @Override
        public boolean filter(Integer value) throws Exception {
            return value != 0;
        }
    });
        
    KeyBy
    DataStream → KeyedStream

    Logically partitions a stream into disjoint partitions. All records with the same key are assigned to the same partition. Internally, keyBy() is implemented with hash partitioning. There are different ways to specify keys.

    This transformation returns a KeyedStream, which is, among other things, required to use keyed state.

    dataStream.keyBy("someKey") // Key by field "someKey"
    dataStream.keyBy(0) // Key by the first element of a Tuple
        

    Attention: A type cannot be a key if:

    1. it is a POJO type but does not override the hashCode() method and relies on the Object.hashCode() implementation.
    2. it is an array of any type.
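
    For illustration, a minimal sketch of a hypothetical Word POJO that is a valid key type because it overrides hashCode() (and, for consistency, equals()) instead of relying on Object.hashCode():

    // Hypothetical POJO used as a keyBy key.
    public class Word {
        public String text;  // public fields (or getters/setters) and a no-arg
        public int count;    // constructor make this a Flink POJO type

        public Word() {}

        public Word(String text, int count) {
            this.text = text;
            this.count = count;
        }

        @Override
        public int hashCode() {
            return text.hashCode();
        }

        @Override
        public boolean equals(Object other) {
            return other instanceof Word && ((Word) other).text.equals(text);
        }
    }

    // Keying a DataStream<Word> by the whole object is then valid, e.g.:
    // words.keyBy(new KeySelector<Word, Word>() {
    //     @Override
    //     public Word getKey(Word value) { return value; }
    // });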
    Reduce
    KeyedStream → DataStream

    A "rolling" reduce on a keyed data stream. Combines the current element with the last reduced value and emits the new value. 

    A reduce function that creates a stream of partial sums:

    keyedStream.reduce(new ReduceFunction<Integer>() {
        @Override
        public Integer reduce(Integer value1, Integer value2)
        throws Exception {
            return value1 + value2;
        }
    });
                
    Fold
    KeyedStream → DataStream

    A "rolling" fold on a keyed data stream with an initial value. Combines the current element with the last folded value and emits the new value. 

    A fold function that, when applied on the sequence (1,2,3,4,5), emits the sequence "start-1", "start-1-2", "start-1-2-3", ...

    DataStream<String> result =
      keyedStream.fold("start", new FoldFunction<Integer, String>() {
        @Override
        public String fold(String current, Integer value) {
            return current + "-" + value;
        }
      });
              
    Aggregations
    KeyedStream → DataStream

    Rolling aggregations on a keyed data stream. The difference between min and minBy is that min returns the minimum value, whereas minBy returns the element that has the minimum value in this field (same for max and maxBy).

    keyedStream.sum(0);
    keyedStream.sum("key");
    keyedStream.min(0);
    keyedStream.min("key");
    keyedStream.max(0);
    keyedStream.max("key");
    keyedStream.minBy(0);
    keyedStream.minBy("key");
    keyedStream.maxBy(0);
    keyedStream.maxBy("key");
        
    Window
    KeyedStream → WindowedStream

    Windows can be defined on already partitioned KeyedStreams. Windows group the data in each key according to some characteristic (e.g., the data that arrived within the last 5 seconds). See windows for a complete description of windows.

    dataStream.keyBy(0).window(TumblingEventTimeWindows.of(Time.seconds(5))); // Last 5 seconds of data
        
    WindowAll
    DataStream → AllWindowedStream

    Windows can be defined on regular DataStreams. Windows group all the stream events according to some characteristic (e.g., the data that arrived within the last 5 seconds). See windows for a complete description of windows.

    WARNING: This is in many cases a non-parallel transformation. All records will be gathered in one task for the windowAll operator.

    dataStream.windowAll(TumblingEventTimeWindows.of(Time.seconds(5))); // Last 5 seconds of data
      
    Window Apply
    WindowedStream → DataStream
    AllWindowedStream → DataStream

    Applies a general function to the window as a whole. Below is a function that manually sums the elements of a window.

    Note: If you are using a windowAll transformation, you need to use an AllWindowFunction instead.

    windowedStream.apply (new WindowFunction<Tuple2<String,Integer>, Integer, Tuple, Window>() {
        public void apply (Tuple tuple,
                Window window,
                Iterable<Tuple2<String, Integer>> values,
                Collector<Integer> out) throws Exception {
            int sum = 0;
            for (Tuple2<String, Integer> t : values) {
                sum += t.f1;
            }
            out.collect (new Integer(sum));
        }
    });
    
    // applying an AllWindowFunction on non-keyed window stream
    allWindowedStream.apply (new AllWindowFunction<Tuple2<String,Integer>, Integer, Window>() {
        public void apply (Window window,
                Iterable<Tuple2<String, Integer>> values,
                Collector<Integer> out) throws Exception {
            int sum = 0;
            for (Tuple2<String, Integer> t : values) {
                sum += t.f1;
            }
            out.collect (new Integer(sum));
        }
    });
        
    Window Reduce
    WindowedStream → DataStream

    Applies a functional reduce function to the window and returns the reduced value.

    windowedStream.reduce (new ReduceFunction<Tuple2<String,Integer>>() {
        public Tuple2<String, Integer> reduce(Tuple2<String, Integer> value1, Tuple2<String, Integer> value2) throws Exception {
            return new Tuple2<String,Integer>(value1.f0, value1.f1 + value2.f1);
        }
    });
        
    Window Fold
    WindowedStream → DataStream

    Applies a functional fold function to the window and returns the folded value. The example function, when applied on the sequence (1,2,3,4,5), folds the sequence into the string "start-1-2-3-4-5":

    windowedStream.fold("start", new FoldFunction<Integer, String>() {
        public String fold(String current, Integer value) {
            return current + "-" + value;
        }
    });
        
    Aggregations on windows
    WindowedStream → DataStream

    Aggregates the contents of a window. The difference between min and minBy is that min returns the minimum value, whereas minBy returns the element that has the minimum value in this field (same for max and maxBy).

    windowedStream.sum(0);
    windowedStream.sum("key");
    windowedStream.min(0);
    windowedStream.min("key");
    windowedStream.max(0);
    windowedStream.max("key");
    windowedStream.minBy(0);
    windowedStream.minBy("key");
    windowedStream.maxBy(0);
    windowedStream.maxBy("key");
        
    Union
    DataStream* → DataStream

    Union of two or more data streams creating a new stream containing all the elements from all the streams. Note: If you union a data stream with itself you will get each element twice in the resulting stream.

    dataStream.union(otherStream1, otherStream2, ...);
        
    Window Join
    DataStream,DataStream → DataStream

    Join two data streams on a given key and a common window.

    dataStream.join(otherStream)
        .where(<key selector>).equalTo(<key selector>)
        .window(TumblingEventTimeWindows.of(Time.seconds(3)))
        .apply (new JoinFunction () {...});
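
    The JoinFunction body is elided above; a hedged sketch of what it could look like, assuming both streams carry Tuple2<String, Integer> elements joined on the String field:

    DataStream<Tuple2<String, Integer>> joined = dataStream.join(otherStream)
        .where(new KeySelector<Tuple2<String, Integer>, String>() {
            @Override
            public String getKey(Tuple2<String, Integer> value) { return value.f0; }
        })
        .equalTo(new KeySelector<Tuple2<String, Integer>, String>() {
            @Override
            public String getKey(Tuple2<String, Integer> value) { return value.f0; }
        })
        .window(TumblingEventTimeWindows.of(Time.seconds(3)))
        .apply(new JoinFunction<Tuple2<String, Integer>, Tuple2<String, Integer>, Tuple2<String, Integer>>() {
            @Override
            public Tuple2<String, Integer> join(Tuple2<String, Integer> first,
                                                Tuple2<String, Integer> second) {
                // Called once per matching pair within the same key and window.
                return new Tuple2<>(first.f0, first.f1 + second.f1);
            }
        });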
        
    Interval Join
    KeyedStream,KeyedStream → DataStream

    Join two elements e1 and e2 of two keyed streams with a common key over a given time interval, so that e1.timestamp + lowerBound <= e2.timestamp <= e1.timestamp + upperBound

    // this will join the two streams so that
    // key1 == key2 && leftTs - 2 < rightTs < leftTs + 2
    keyedStream.intervalJoin(otherKeyedStream)
        .between(Time.milliseconds(-2), Time.milliseconds(2)) // lower and upper bound
        .upperBoundExclusive(true) // optional
        .lowerBoundExclusive(true) // optional
        .process(new ProcessJoinFunction() {...});
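
    The process function body is elided above; a hedged sketch, assuming both keyed streams carry Tuple2<String, Integer> elements and the join emits a String per matching pair:

    keyedStream.intervalJoin(otherKeyedStream)
        .between(Time.milliseconds(-2), Time.milliseconds(2))
        .process(new ProcessJoinFunction<Tuple2<String, Integer>, Tuple2<String, Integer>, String>() {
            @Override
            public void processElement(Tuple2<String, Integer> left,
                                       Tuple2<String, Integer> right,
                                       Context ctx,
                                       Collector<String> out) throws Exception {
                // Invoked for each pair that satisfies both the key equality
                // and the time-interval condition.
                out.collect(left.f0 + ": " + left.f1 + "," + right.f1);
            }
        });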
        
    Window CoGroup
    DataStream,DataStream → DataStream

    Cogroups two data streams on a given key and a common window.

    dataStream.coGroup(otherStream)
        .where(0).equalTo(1)
        .window(TumblingEventTimeWindows.of(Time.seconds(3)))
        .apply (new CoGroupFunction () {...});
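
    The CoGroupFunction body is elided above; a hedged sketch, assuming Tuple2<String, Integer> elements on both streams keyed by the String field, that counts how many elements each side contributed per key and window:

    DataStream<String> coGrouped = dataStream.coGroup(otherStream)
        .where(new KeySelector<Tuple2<String, Integer>, String>() {
            @Override
            public String getKey(Tuple2<String, Integer> value) { return value.f0; }
        })
        .equalTo(new KeySelector<Tuple2<String, Integer>, String>() {
            @Override
            public String getKey(Tuple2<String, Integer> value) { return value.f0; }
        })
        .window(TumblingEventTimeWindows.of(Time.seconds(3)))
        .apply(new CoGroupFunction<Tuple2<String, Integer>, Tuple2<String, Integer>, String>() {
            @Override
            public void coGroup(Iterable<Tuple2<String, Integer>> first,
                                Iterable<Tuple2<String, Integer>> second,
                                Collector<String> out) throws Exception {
                // Unlike a join, coGroup is called once per key and window with
                // both (possibly empty) groups, so one-sided keys are visible too.
                int leftCount = 0, rightCount = 0;
                for (Tuple2<String, Integer> ignored : first)  { leftCount++; }
                for (Tuple2<String, Integer> ignored : second) { rightCount++; }
                out.collect("left=" + leftCount + ", right=" + rightCount);
            }
        });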
        
    Connect
    DataStream,DataStream → ConnectedStreams

    "Connects" two data streams retaining their types. Connect allowing for shared state between the two streams.

    DataStream<Integer> someStream = //...
    DataStream<String> otherStream = //...
    
    ConnectedStreams<Integer, String> connectedStreams = someStream.connect(otherStream);
        
    CoMap, CoFlatMap
    ConnectedStreams → DataStream

    Similar to map and flatMap on a connected data stream.

    connectedStreams.map(new CoMapFunction<Integer, String, Boolean>() {
        @Override
        public Boolean map1(Integer value) {
            return true;
        }
    
        @Override
        public Boolean map2(String value) {
            return false;
        }
    });
    connectedStreams.flatMap(new CoFlatMapFunction<Integer, String, String>() {
    
       @Override
       public void flatMap1(Integer value, Collector<String> out) {
           out.collect(value.toString());
       }
    
       @Override
       public void flatMap2(String value, Collector<String> out) {
           for (String word: value.split(" ")) {
             out.collect(word);
           }
       }
    });
        
    Split
    DataStream → SplitStream

    Split the stream into two or more streams according to some criterion.

    SplitStream<Integer> split = someDataStream.split(new OutputSelector<Integer>() {
        @Override
        public Iterable<String> select(Integer value) {
            List<String> output = new ArrayList<String>();
            if (value % 2 == 0) {
                output.add("even");
            }
            else {
                output.add("odd");
            }
            return output;
        }
    });
                    
    Select
    SplitStream → DataStream

    Select one or more streams from a split stream.

    SplitStream<Integer> split;
    DataStream<Integer> even = split.select("even");
    DataStream<Integer> odd = split.select("odd");
    DataStream<Integer> all = split.select("even","odd");
                    
    Iterate
    DataStream → IterativeStream → DataStream

    Creates a "feedback" loop in the flow, by redirecting the output of one operator to some previous operator. This is especially useful for defining algorithms that continuously update a model. The following code starts with a stream and applies the iteration body continuously. Elements that are greater than 0 are sent back to the feedback channel, and the rest of the elements are forwarded downstream. See iterations for a complete description.

    IterativeStream<Long> iteration = initialStream.iterate();
    DataStream<Long> iterationBody = iteration.map (/*do something*/);
    DataStream<Long> feedback = iterationBody.filter(new FilterFunction<Long>(){
        @Override
        public boolean filter(Long value) throws Exception {
            return value > 0;
        }
    });
    iteration.closeWith(feedback);
    DataStream<Long> output = iterationBody.filter(new FilterFunction<Long>(){
        @Override
        public boolean filter(Long value) throws Exception {
            return value <= 0;
        }
    });
                    
    Extract Timestamps
    DataStream → DataStream

    Extracts timestamps from records in order to work with windows that use event time semantics. See Event Time.

    stream.assignTimestamps (new TimeStampExtractor() {...});
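
    As a hedged alternative sketch, using the assignTimestampsAndWatermarks API with a bounded-out-of-orderness extractor, assuming Tuple2<String, Long> elements whose second field is an epoch-millisecond event timestamp:

    DataStream<Tuple2<String, Long>> withTimestamps = stream
        .assignTimestampsAndWatermarks(
            new BoundedOutOfOrdernessTimestampExtractor<Tuple2<String, Long>>(Time.seconds(5)) {
                @Override
                public long extractTimestamp(Tuple2<String, Long> element) {
                    // Event-time timestamp carried inside the record itself.
                    return element.f1;
                }
            });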
                    