  • Setting the Python version for pyspark

    liuf2@liuf2-virtual-machine ~/a/s/bin> ./pyspark                                                                                                     
    Python 2.7.15rc1 (default, Nov 12 2018, 14:31:15)                                                                                                    
    [GCC 7.3.0] on linux2                                                                                                                                
    Type "help", "copyright", "credits" or "license" for more information.                                                                               
    2019-03-26 23:57:06 WARN  Utils:66 - Your hostname, liuf2-virtual-machine resolves to a loopback address: 127.0.1.1; using 192.168.124.129 instead (o
    n interface ens33)                                                                                                                                   
    2019-03-26 23:57:06 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address                                                       
    2019-03-26 23:57:07 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Setting default log level to "WARN".                                                                                                                 
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).                                                         
    Welcome to                                                                                                                                           
          ____              __                                                                                                                           
         / __/__  ___ _____/ /__                                                                                                                         
        _\ \/ _ \/ _ `/ __/  '_/
       /__ / .__/\_,_/_/ /_/\_\   version 2.4.0
          /_/                                                                                                                                            
                                                                                                                                                         
    Using Python version 2.7.15rc1 (default, Nov 12 2018 14:31:15)                                                                                       
    SparkSession available as 'spark'.                                                                                                                   
    >>> exit() 

    This is the output from starting pyspark, still running under Python 2.7. The steps below change the default Python version that Spark uses.

    1. Edit the following configuration file

    liuf2@liuf2-virtual-machine ~/a/s/conf> vim spark-env.sh.template
    liuf2@liuf2-virtual-machine ~/a/s/conf> mv spark-env.sh.template spark-env.sh
    liuf2@liuf2-virtual-machine ~/a/s/conf> vim spark-env.sh

    Add at the bottom:
    export PYSPARK_PYTHON=/usr/bin/python3
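
    Step 1 hard-codes the interpreter path /usr/bin/python3, which varies by distribution. A minimal sanity check (an assumption-free way to find the path to put in spark-env.sh on your own machine):

    ```shell
    # Resolve python3 from PATH; the path written into spark-env.sh
    # should match what this prints on your system.
    PY3="$(command -v python3)"
    echo "python3 resolved to: ${PY3:-<not found>}"
    # ${VAR:?msg} aborts with an error if python3 was not found.
    "${PY3:?python3 not found on PATH}" --version
    ```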

    2. Edit the pyspark launcher script in Spark's bin directory

    liuf2@liuf2-virtual-machine ~/a/s/bin> vim pyspark
    In the else branch of the interpreter check near the top of the script, change

    PYSPARK_PYTHON=python

    to

    PYSPARK_PYTHON=python3

    so that the block reads:

    # Determine the Python executable to use for the executors:
    if [[ -z "$PYSPARK_PYTHON" ]]; then
      if [[ $PYSPARK_DRIVER_PYTHON == *ipython* && ! $WORKS_WITH_IPYTHON ]]; then
        echo "IPython requires Python 2.7+; please install python2.7 or set PYSPARK_PYTHON" 1>&2
        exit 1
      else
        PYSPARK_PYTHON=python3
      fi
    fi
    export PYSPARK_PYTHON

    Note the -z guard: this fallback only applies when PYSPARK_PYTHON is not already set, so the export added to spark-env.sh in step 1 already takes precedence over it.
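
    Editing bin/pyspark directly works, but the change is lost on every Spark upgrade. Since the launcher only falls back to its built-in default when PYSPARK_PYTHON is unset, an alternative sketch (using Spark's standard PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON environment variables) is to export the overrides from your shell or ~/.bashrc instead:

    ```shell
    # Exported values take precedence over the fallback in bin/pyspark,
    # because that script only assigns PYSPARK_PYTHON when it is empty.
    export PYSPARK_PYTHON=python3          # interpreter for the executors
    export PYSPARK_DRIVER_PYTHON=python3   # interpreter for the driver/REPL
    echo "PYSPARK_PYTHON=$PYSPARK_PYTHON"
    # ./pyspark   # would now start under python3 with no file edits
    ```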

    3. Restart pyspark

    liuf2@liuf2-virtual-machine ~/a/s/bin> ./pyspark
    Python 3.6.7 (default, Oct 22 2018, 11:32:17)
    [GCC 8.2.0] on linux
    Type "help", "copyright", "credits" or "license" for more information.
    2019-03-27 00:02:00 WARN  Utils:66 - Your hostname, liuf2-virtual-machine resolves to a loopback address: 127.0.1.1; using 192.168.124.129 instead (on interface ens33)
    2019-03-27 00:02:00 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
    2019-03-27 00:02:00 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /__ / .__/\_,_/_/ /_/\_\   version 2.4.0
          /_/
    
    Using Python version 3.6.7 (default, Oct 22 2018 11:32:17)
    SparkSession available as 'spark'.
  • Original post: https://www.cnblogs.com/tcppdu/p/10604841.html