    Complexities

    Searching

    Algorithm | Data Structure | Time: Average | Time: Worst | Space: Worst
    --- | --- | --- | --- | ---
    Depth First Search (DFS) | Graph of |V| vertices and |E| edges | - | O(|E| + |V|) | O(|V|)
    Breadth First Search (BFS) | Graph of |V| vertices and |E| edges | - | O(|E| + |V|) | O(|V|)
    Binary search | Sorted array of n elements | O(log(n)) | O(log(n)) | O(1)
    Linear search (brute force) | Array | O(n) | O(n) | O(1)
    Shortest path by Dijkstra, using a min-heap as priority queue | Graph with |V| vertices and |E| edges | O((|V| + |E|) log |V|) | O((|V| + |E|) log |V|) | O(|V|)
    Shortest path by Dijkstra, using an unsorted array as priority queue | Graph with |V| vertices and |E| edges | O(|V|^2) | O(|V|^2) | O(|V|)
    Shortest path by Bellman-Ford | Graph with |V| vertices and |E| edges | O(|V||E|) | O(|V||E|) | O(|V|)
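
    To make the binary-search row concrete, here is a minimal iterative sketch (the function name and example array are illustrative, not from the source): each iteration halves the remaining interval, which is where the O(log(n)) time and O(1) extra space come from.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it is absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2              # midpoint of the remaining interval
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1                  # discard the left half
        else:
            hi = mid - 1                  # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -1
```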

    Sorting

    Algorithm | Data Structure | Time: Best | Time: Average | Time: Worst | Worst-Case Auxiliary Space
    --- | --- | --- | --- | --- | ---
    Quicksort | Array | O(n log(n)) | O(n log(n)) | O(n^2) | O(n)
    Mergesort | Array | O(n log(n)) | O(n log(n)) | O(n log(n)) | O(n)
    Heapsort | Array | O(n log(n)) | O(n log(n)) | O(n log(n)) | O(1)
    Bubble Sort | Array | O(n) | O(n^2) | O(n^2) | O(1)
    Insertion Sort | Array | O(n) | O(n^2) | O(n^2) | O(1)
    Selection Sort | Array | O(n^2) | O(n^2) | O(n^2) | O(1)
    Bucket Sort | Array | O(n+k) | O(n+k) | O(n^2) | O(nk)
    Radix Sort | Array | O(nk) | O(nk) | O(nk) | O(n+k)
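
    As an illustration of the quicksort row, a short out-of-place sketch (illustrative only, not a production implementation): a pivot that splits the input roughly in half gives the O(n log(n)) average case, while an unlucky pivot that is always the minimum or maximum element pushes the recursion depth to O(n) and the running time to O(n^2).

```python
def quicksort(items):
    """Out-of-place quicksort: average O(n log n), worst case O(n^2)."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]                  # middle element as pivot
    smaller = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    larger  = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]
```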

    Data Structures

    Data Structure | Indexing: Avg | Search: Avg | Insert: Avg | Delete: Avg | Indexing: Worst | Search: Worst | Insert: Worst | Delete: Worst | Space: Worst
    --- | --- | --- | --- | --- | --- | --- | --- | --- | ---
    Basic Array | O(1) | O(n) | - | - | O(1) | O(n) | - | - | O(n)
    Dynamic Array | O(1) | O(n) | O(n) | O(n) | O(1) | O(n) | O(n) | O(n) | O(n)
    Singly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n)
    Doubly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n)
    Skip List | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n log(n))
    Hash Table | - | O(1) | O(1) | O(1) | - | O(n) | O(n) | O(n) | O(n)
    Binary Search Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n)
    Cartesian Tree | - | O(log(n)) | O(log(n)) | O(log(n)) | - | O(n) | O(n) | O(n) | O(n)
    B-Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n)
    Red-Black Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n)
    Splay Tree | - | O(log(n)) | O(log(n)) | O(log(n)) | - | O(log(n)) | O(log(n)) | O(log(n)) | O(n)
    AVL Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n)
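
    To connect the Binary Search Tree row to code, here is a minimal unbalanced BST sketch (integer keys, names of my own choosing): the cost of search and insert is proportional to the tree height, which is O(log(n)) on average for random insertion orders but O(n) when the tree degenerates into a list (e.g. keys inserted in sorted order). Self-balancing variants such as AVL and Red-Black trees keep the height, and therefore the worst case, at O(log(n)).

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Insert key into the BST; cost is proportional to the tree height."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def contains(root, key):
    """Search for key; same height-proportional cost as insert."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None

root = None
for k in [8, 3, 10, 1, 6]:            # a reasonably balanced insertion order
    root = insert(root, k)
print(contains(root, 6), contains(root, 7))   # True False
```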

    Heaps

    Heap | Heapify | Find Max | Extract Max | Increase Key | Insert | Delete | Merge
    --- | --- | --- | --- | --- | --- | --- | ---
    Linked List (sorted) | - | O(1) | O(1) | O(n) | O(n) | O(1) | O(m+n)
    Linked List (unsorted) | - | O(n) | O(n) | O(1) | O(1) | O(1) | O(1)
    Binary Heap | O(n) | O(1) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(m+n)
    Binomial Heap | - | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n))
    Fibonacci Heap | - | O(1) | O(log(n))* | O(1)* | O(1) | O(log(n))* | O(1)

    (*) amortized cost
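
    The Binary Heap row maps directly onto Python's standard heapq module (a min-heap; max-heap behaviour is commonly simulated by negating keys), shown here as a small usage sketch: heapify builds a heap in O(n), and each push or pop costs O(log(n)).

```python
import heapq

data = [9, 4, 7, 1, 8, 3]
heapq.heapify(data)                 # build a valid min-heap in O(n)
heapq.heappush(data, 2)             # insert: O(log n)
print(heapq.heappop(data))          # extract-min: O(log n) -> prints 1

# Max-heap behaviour via negated keys (find-max / extract-max).
max_heap = [-x for x in [9, 4, 7, 1, 8, 3]]
heapq.heapify(max_heap)
print(-heapq.heappop(max_heap))     # 9, the maximum
```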

    Graphs

    Representation | Storage | Add Vertex | Add Edge | Remove Vertex | Remove Edge | Query
    --- | --- | --- | --- | --- | --- | ---
    Adjacency list | O(|V| + |E|) | O(1) | O(1) | O(|V| + |E|) | O(|E|) | O(|V|)
    Incidence list | O(|V| + |E|) | O(1) | O(1) | O(|E|) | O(|E|) | O(|E|)
    Adjacency matrix | O(|V|^2) | O(|V|^2) | O(1) | O(|V|^2) | O(1) | O(1)
    Incidence matrix | O(|V| ⋅ |E|) | O(|V| ⋅ |E|) | O(|V| ⋅ |E|) | O(|V| ⋅ |E|) | O(|V| ⋅ |E|) | O(|E|)
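
    A small sketch of the two most common representations from the table (the example graph is arbitrary): an adjacency list uses O(|V| + |E|) storage and enumerates a vertex's neighbours cheaply, while an adjacency matrix uses O(|V|^2) storage but answers "is there an edge (u, v)?" in O(1).

```python
# Undirected example graph on vertices 0..3.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

# Adjacency list: O(|V| + |E|) storage; edge query costs O(degree(u)).
adj_list = {v: [] for v in range(n)}
for u, v in edges:
    adj_list[u].append(v)
    adj_list[v].append(u)

# Adjacency matrix: O(|V|^2) storage; edge query costs O(1).
adj_matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    adj_matrix[u][v] = adj_matrix[v][u] = 1

print(adj_list[2])        # neighbours of vertex 2: [0, 1, 3]
print(adj_matrix[1][3])   # 0 -> no edge between vertices 1 and 3
```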

    Notation for asymptotic growth

    Letter | Bound | Growth
    --- | --- | ---
    Θ (theta) | upper and lower, tight [1] | equal [2]
    O (big-oh) | upper, tightness unknown | less than or equal [3]
    o (small-oh) | upper, not tight | less than
    Ω (big omega) | lower, tightness unknown | greater than or equal
    ω (small omega) | lower, not tight | greater than

    [1] Big O is an upper bound, while Omega is a lower bound. Theta requires both, which is why it is called a tight bound (it must be both an upper and a lower bound). For example, an algorithm taking Omega(n log n) takes at least n log n time but has no upper limit. An algorithm taking Theta(n log n) is far preferable, since it takes at least n log n time (Omega(n log n)) and no more than n log n time (O(n log n)).

    [2] f(n) = Θ(g(n)) means f (the running time of the algorithm) grows exactly like g when n (the input size) gets large. In other words, the growth rate of f(n) is asymptotically proportional to g(n).

    [3] Same idea, but here the growth rate is no faster than g(n). Big-O is the most useful notation in practice because it describes worst-case behavior.
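
    For completeness, the usual formal definitions behind these notes (the standard textbook formulation, not taken verbatim from the source) can be written as:

```latex
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : f(n) \le c\, g(n) \ \text{for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c\, g(n) \ \text{for all } n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))
```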

    In short, if an algorithm is __ then its performance is __:

    Algorithm | Performance
    --- | ---
    o(n) | < n
    O(n) | ≤ n
    Θ(n) | = n
    Ω(n) | ≥ n
    ω(n) | > n

    Big-O Complexity Chart

    The chart on the original page (an interactive chart built with MeteorCharts) plots the number of operations (y axis) required to obtain a result as the number of elements (x axis) increases. O(n!) is the worst complexity, already requiring 720 operations for just 6 elements, while O(1) is the best complexity, requiring only a constant number of operations for any number of elements.

    [Chart: "Big-O Complexity Chart" - operations (y axis) against number of elements (x axis, 0 to 100) for the series O(1), O(log(n)), O(n), O(n log(n)), O(n^2), O(2^n), and O(n!).]
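
    As a rough way to reproduce the kind of numbers the chart illustrates (the specific values of n below are just examples), the growth of each class can be tabulated directly:

```python
import math

# Rough operation counts for the complexity classes plotted in the chart.
for n in (6, 10, 20):
    print(f"n={n:>2}:  log n ~ {math.log2(n):.1f},  n log n ~ {n * math.log2(n):.0f},  "
          f"n^2 = {n**2},  2^n = {2**n},  n! = {math.factorial(n)}")

# n = 6 already gives n! = 720, matching the example in the text above.
```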

    Reference:

     http://bigocheatsheet.com/
    Original source: https://www.cnblogs.com/winscoder/p/3535525.html