  • Common Python commands in the deeplearning.ai assignments

    1. print

    test = "Hello World"
    print ("test:" + test)
    

    2. The difference between math and numpy: math functions only work on single elements, while numpy broadcasts over whole arrays.

    import math
    import numpy as np
    x = np.array([1, 2, 3])
    s = 1/(1+math.exp(-x))  # this line raises a TypeError: math.exp only accepts a single number
    s = 1/(1+np.exp(-x))    # this line works: np.exp broadcasts over the whole array
    

    3. Defining a function

    def sigmoid_derivative(x):
        s = 1/(1+np.exp(-x))
        ds = s*(1-s)
        return ds
    
    x = np.array([1, 2, 3])
    print ("sigmoid_derivative(x) = " + str(sigmoid_derivative(x)))
    

    4. shape and reshape: an array's dimensions are counted from the outermost [] inward: the number of elements inside the outermost [] is the size of the first dimension, and the number of elements inside the innermost [] is the size of the last dimension. Array indices start at 0.

    In reshape(x.shape[0], -1), the -1 leaves that dimension for numpy to work out on its own; all other dimensions must be specified explicitly (a short sketch after the example below illustrates this).

    image = np.array([[[ 0.67826139,  0.29380381],
            [ 0.90714982,  0.52835647],
            [ 0.4215251 ,  0.45017551]],
    
           [[ 0.92814219,  0.96677647],
            [ 0.85304703,  0.52351845],
            [ 0.19981397,  0.27417313]],
    
           [[ 0.60659855,  0.00533165],
            [ 0.10820313,  0.49978937],
            [ 0.34144279,  0.94630077]],
                      
           [[ 0.85304703,  0.52835647],
            [ 0.10820313,  0.45017551],
            [ 0.34144279,  0.90714982]]])
    
    print ("image[3][2][1]: " + str(image[3][2][1])) # image[3][2][1]: 0.90714982
    
    print ("image.shape = " + str(image.shape))  # image.shape = (4, 3, 2)
    
    vector = image.reshape(image.shape[0]*image.shape[1]*image.shape[2], 1)
    print ("vector.shape = " + str(vector.shape)) # vector.shape = (24, 1)
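
    To illustrate the -1 argument described above, a minimal sketch that flattens everything except the first dimension of the same image array (the name flat is just for illustration):

    flat = image.reshape(image.shape[0], -1)   # numpy infers the second dimension: 3*2 = 6
    print ("flat.shape = " + str(flat.shape))  # flat.shape = (4, 6)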
    

    5. Normalization: x_norm = np.linalg.norm(x, ord = 2, axis = 1, keepdims = True). Here ord=2 is the default and can be omitted; axis=1 computes the norm of each row vector (for a 1-D vector, axis can only be 0); keepdims=True keeps the array's dimensions so you don't end up with a shape like (2, ). To be safe, always write keepdims=True (a quick shape check follows after the output below).

    x = np.array([[0,3,4],[2,6,4]])
    x_norm = np.linalg.norm(x, ord=2, axis =1, keepdims=True)
    x_new = x/x_norm
    print ("x: " + str(x))
    print ("x_norm: "+str(x_norm))
    print ("x_new: "+str(x_new))
    
    Output:
    x: [[0 3 4]
     [2 6 4]]
    x_norm: [[ 5.        ]
     [ 7.48331477]]
    x_new: [[ 0.          0.6         0.8       ]
     [ 0.26726124  0.80178373  0.53452248]]
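
    As a quick check of the keepdims remark above, a minimal sketch showing the shape you get when keepdims is omitted:

    x_norm_flat = np.linalg.norm(x, axis = 1)                     # no keepdims
    print ("shape without keepdims: " + str(x_norm_flat.shape))   # (2,)
    print ("shape with keepdims: " + str(x_norm.shape))           # (2, 1)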
    

    6. Summation: x_sum = np.sum(x, axis = 1, keepdims = True). Here axis=1 sums over dimension 1 (counting from 0); for example, an array with shape (4, 2, 3) becomes (4, 1, 3) after summing (a sketch after the output below demonstrates this). For a 2-D array, dimension 1 runs along the row vectors.

    x_sum = np.sum(x) adds up all the elements of x into a single scalar. To avoid mistakes, try to always write out both the axis and keepdims arguments.

    x = np.array([[0,3,4],[2,6,4]])
    x_sum = np.sum(x, axis = 1, keepdims = True)
    print ("x_sum: "+str(x_sum))
    
    Output:
    x_sum: [[ 7]
     [12]]
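
    To back up the (4, 2, 3) -> (4, 1, 3) claim above, a minimal sketch (the array a is just a placeholder filled with ones):

    a = np.ones((4, 2, 3))
    a_sum = np.sum(a, axis = 1, keepdims = True)
    print ("a_sum.shape = " + str(a_sum.shape))  # a_sum.shape = (4, 1, 3)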
    

    7. Different kinds of multiplication:

    np.dot(x1, x2) is standard matrix multiplication for matrices; for 1-D vectors it multiplies corresponding elements and sums them (the inner product). The WX part of Z = WX + b is computed with np.dot.

    np.multiply(x1, x2) multiplies corresponding elements and returns an array of the same shape (element-wise multiplication).

    Here the time module is used to measure execution time.

    import time
    
    x1 = [9, 2, 5, 0, 0, 7, 5, 0, 0, 0, 9, 2, 5, 0, 0]
    x2 = [9, 2, 2, 9, 0, 9, 2, 5, 0, 0, 9, 2, 5, 0, 0]
    
    ### Vector dot product: multiply corresponding elements, then sum ###
    tic = time.process_time()
    dot = np.dot(x1,x2)
    toc = time.process_time()
    print ("dot = " + str(dot) + "\n ----- Computation time = " + str(1000*(toc - tic)) + "ms")
    
    ### Matrix product of x1 with the transpose of x2: an n*1 matrix times a 1*n matrix (outer product) ###
    tic = time.process_time()
    outer = np.outer(x1,x2)
    toc = time.process_time()
    print ("outer = " + str(outer) + "\n ----- Computation time = " + str(1000*(toc - tic)) + "ms")
    
    ### Element-wise multiplication, giving a 1*n vector ###
    tic = time.process_time()
    mul = np.multiply(x1,x2)
    toc = time.process_time()
    print ("elementwise multiplication = " + str(mul) + "\n ----- Computation time = " + str(1000*(toc - tic)) + "ms")
    
    ### Standard matrix multiplication (W dot x1) ###
    W = np.random.rand(3,len(x1)) # Random 3*len(x1) numpy array
    tic = time.process_time()
    dot = np.dot(W,x1)
    toc = time.process_time()
    print ("gdot = " + str(dot) + "\n ----- Computation time = " + str(1000*(toc - tic)) + "ms")
    
    Output:
    dot = 278
     ----- Computation time = 0.0ms
    outer = [[81 18 18 81  0 81 18 45  0  0 81 18 45  0  0]
     [18  4  4 18  0 18  4 10  0  0 18  4 10  0  0]
     [45 10 10 45  0 45 10 25  0  0 45 10 25  0  0]
     [ 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0]
     [ 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0]
     [63 14 14 63  0 63 14 35  0  0 63 14 35  0  0]
     [45 10 10 45  0 45 10 25  0  0 45 10 25  0  0]
     [ 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0]
     [ 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0]
     [ 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0]
     [81 18 18 81  0 81 18 45  0  0 81 18 45  0  0]
     [18  4  4 18  0 18  4 10  0  0 18  4 10  0  0]
     [45 10 10 45  0 45 10 25  0  0 45 10 25  0  0]
     [ 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0]
     [ 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0]]
     ----- Computation time = 0.0ms
    elementwise multiplication = [81  4 10  0  0 63 10  0  0  0 81  4 25  0  0]
     ----- Computation time = 0.0ms
    gdot = [ 14.98632469  18.30746169  17.30396991]
     ----- Computation time = 0.0ms
    

    8. Broadcasting: loss = np.sum((yhat - y)**2, keepdims = True). The ** operation squares each element; the element-wise operators +, -, *, / behave the same way. A small sketch follows.
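
    A minimal sketch of this element-wise behavior (yhat and y are made-up example vectors, reusing the np imported above):

    y = np.array([1, 0, 0, 1, 1])
    yhat = np.array([0.9, 0.2, 0.1, 0.4, 0.9])
    loss = np.sum((yhat - y)**2, keepdims = True)  # squares each difference element-wise, then sums
    print ("loss = " + str(loss))                  # roughly [0.43]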

    9. Initialization: w = np.zeros((dim, 1), dtype=float). Note that the shape must be passed as a tuple in parentheses, e.g. w = np.zeros((4,2)). A small sketch follows.
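
    A minimal sketch of the shape-as-a-tuple point (dim = 4 is an arbitrary value for illustration):

    dim = 4
    w = np.zeros((dim, 1), dtype=float)  # note the inner parentheses: the shape is a tuple
    print ("w.shape = " + str(w.shape))  # w.shape = (4, 1)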

    10. Plotting: matplotlib.

    import matplotlib.pyplot as plt
    
    # Plot learning curve (with costs)
    costs = np.squeeze(d['costs'])
    plt.plot(costs)
    plt.ylabel('cost')
    plt.xlabel('iterations (per hundreds)')
    plt.title("Learning rate =" + str(d["learning_rate"]))
    plt.show()  

    Scatter plots: plt.scatter(X[0, :], X[1, :], c=Y, s=40, cmap=plt.cm.Spectral). The first and second arguments are the x-axis and y-axis coordinates; c sets the point colors, usually given as a color code like 'b' (blue), but here an array such as Y whose values are 0 or 1 is used instead; s is the marker size, with larger values giving bigger points; cmap selects the colormap, and this page lists all of them: http://scipy-cookbook.readthedocs.io/items/Matplotlib_Show_colormaps.html. A small sketch follows.
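
    A minimal self-contained sketch (the random X with shape (2, m) and the 0/1 labels Y are made up for illustration, reusing the np and plt imported above):

    m = 200
    X = np.random.randn(2, m)                # two coordinates per point
    Y = (X[0, :] + X[1, :] > 0).astype(int)  # 0/1 labels
    plt.scatter(X[0, :], X[1, :], c=Y, s=40, cmap=plt.cm.Spectral)
    plt.show()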

    11. Image processing: scipy

    Resizing an image: my_image = scipy.misc.imresize(image, size=(num_px,num_px))
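
    Note that scipy.misc.imresize has been removed from newer SciPy releases; a rough equivalent using Pillow (assuming image is a (H, W, 3) uint8 numpy array and num_px comes from the surrounding assignment code) might look like this:

    from PIL import Image
    my_image = np.array(Image.fromarray(image).resize((num_px, num_px)))  # resize to num_px x num_px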
