  • [GeeksForGeeks] Minimum length unsorted subarray, sorting which makes the array sorted

    Given an array, find the minimum length unsorted subarray such that sorting this subarray makes
    the whole array sorted.

    For example, if the input array is [10, 12, 20, 30, 25, 40, 32, 31, 45, 50, 60], your program should
    find that the minimum length unsorted subarray lies between indices 3 and 7.

    Solution 1. O(n * logn) runtime, O(n) space

    1. Make a copy of the given array.

    2. Sort the copy.

    3. Find the first and last indices where the given array and its sorted copy don't match.
       If they match everywhere, the array is already sorted.
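
    The steps above can be sketched as follows (the class and method names are illustrative;
    the result {-1, -1} means the array is already sorted):

```java
import java.util.Arrays;

public class MinUnsortedSubarray {
    // Sorted-copy approach: returns {start, end} of the minimum length
    // unsorted subarray, or {-1, -1} if the array is already sorted.
    public static int[] findBySortedCopy(int[] arr) {
        int[] r = {-1, -1};
        if (arr == null || arr.length <= 1) {
            return r;
        }
        int[] sorted = arr.clone();   // step 1: make a copy
        Arrays.sort(sorted);          // step 2: sort the copy
        int lo = 0;
        while (lo < arr.length && arr[lo] == sorted[lo]) {
            lo++;                     // step 3: first mismatch from the left
        }
        if (lo == arr.length) {
            return r;                 // no mismatch: already sorted
        }
        int hi = arr.length - 1;
        while (arr[hi] == sorted[hi]) {
            hi--;                     // step 3: last mismatch from the right
        }
        r[0] = lo;
        r[1] = hi;
        return r;
    }

    public static void main(String[] args) {
        int[] arr = {10, 12, 20, 30, 25, 40, 32, 31, 45, 50, 60};
        System.out.println(Arrays.toString(findBySortedCopy(arr))); // [3, 7]
    }
}
```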

    Solution 2. O(n) runtime, O(1) space

    The core idea of this solution is to first find the start index i of the unsorted subarray, i.e.
    the first index where arr[i] < arr[i - 1], and use arr[i - 1] as the current max value.
    Then keep scanning the rest of the array: as long as arr[i] is smaller than the current max,
    arr[i] is not in its sorted place, so update the end index to i and keep track of the minimum
    value of the unsorted region. Otherwise arr[i] is at least the current max, so update the max
    to arr[i]. Finally, extend the start index leftward past any element that is greater than the
    minimum of the unsorted region, since such elements must also be moved by the sort.

    public int[] getMinLenUnsortedSubarray(int[] arr) {
        int[] r = {-1, -1};
        if (arr == null || arr.length <= 1) {
            return r;
        }
        // find the first index where the sorted order breaks
        int i = 1;
        for (; i < arr.length; i++) {
            if (arr[i] < arr[i - 1]) {
                break;
            }
        }
        if (i == arr.length) {
            // no descent found, the array is already sorted
            return r;
        }
        r[0] = i - 1;
        r[1] = i;
        int max = arr[i - 1];
        int min = arr[i];
        for (i++; i < arr.length; i++) {
            // any element smaller than the running max is out of place;
            // arr[i] < arr[i - 1] implies arr[i] < max, so one check suffices
            if (arr[i] < max) {
                r[1] = i;
                min = Math.min(min, arr[i]);
            } else {
                max = arr[i];
            }
        }
        // extend the start leftward past any element greater than the
        // minimum of the unsorted region
        while (r[0] > 0 && arr[r[0] - 1] > min) {
            r[0]--;
        }
        return r;
    }
  • Original post: https://www.cnblogs.com/lz87/p/7337204.html