  • Error summary

    20/12/12 15:49:47 ERROR SparkContext: Error initializing SparkContext.
    java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
        at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
        at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
        at apple.TFIDF$.main(TFIDF.scala:11)
        at apple.TFIDF.main(TFIDF.scala)
    20/12/12 15:49:47 INFO SparkContext: Successfully stopped SparkContext
    Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
        at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
        at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
        at apple.TFIDF$.main(TFIDF.scala:11)
        at apple.TFIDF.main(TFIDF.scala)

    val ss = SparkSession.builder().master("local").appName("hello").getOrCreate()
    val sc = ss.sparkContext
    sc.setLogLevel("ERROR")
    This error means the heap the driver JVM got is too small to start the SparkContext: in Spark 2.x the UnifiedMemoryManager requires the available system memory to be at least 1.5 × the 300 MB it reserves, i.e. 471859200 bytes (450 MB), while the default heap here only provided about 247 MB (259522560 bytes).
    For local testing, the simplest workaround is to set spark.testing.memory directly in the SparkConf in code:
    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    val sparkConf = new SparkConf().set("spark.testing.memory", "2147480000")
    val ss = SparkSession.builder().config(sparkConf).master("local").appName("tfidf").getOrCreate()
    val sc = ss.sparkContext
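    The error message itself points at the more general fix: give the driver JVM a larger heap. With spark-submit that is the --driver-memory flag (or spark.driver.memory in the Spark configuration); when running straight from an IDE the driver is the current JVM, so the heap has to be raised with a -Xmx VM option (for example -Xmx1g) before the program starts, since spark.driver.memory set in code cannot resize a JVM that is already running. The sketch below (a minimal example, not from the original post; the object name MemoryCheck is just for illustration) prints the heap the driver actually got so you can confirm the option took effect, and keeps spark.testing.memory as a commented-out fallback.

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    // Minimal sketch: check the driver JVM heap before building the session.
    // Launch with -Xmx1g as a VM option (IDE) or --driver-memory 1g (spark-submit).
    object MemoryCheck {
      def main(args: Array[String]): Unit = {
        // Unless spark.testing.memory is set, this is the value Spark checks against 471859200.
        val maxHeapBytes = Runtime.getRuntime.maxMemory
        println(s"Driver JVM max heap: $maxHeapBytes bytes (must be >= 471859200)")

        val conf = new SparkConf()
        // Fallback for small local runs: override the check instead of growing the heap.
        // conf.set("spark.testing.memory", "2147480000")

        val ss = SparkSession.builder()
          .config(conf)
          .master("local")
          .appName("tfidf")
          .getOrCreate()
        println(s"Spark version: ${ss.version}")
        ss.stop()
      }
    }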
  • Original article: https://www.cnblogs.com/ShyPeanut/p/14125001.html