
Spark 2.2 exception: ERROR SparkUI: Failed to bind SparkUI

The full error output:
19/03/19 11:04:18 INFO util.log: Logging initialized @5402ms
19/03/19 11:04:18 INFO server.Server: jetty-9.3.z-SNAPSHOT
19/03/19 11:04:18 INFO server.Server: Started @5604ms
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
19/03/19 11:04:18 ERROR ui.SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:317)
at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:333)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:…)
at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2237)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2229)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:368)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
…
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Cause:
Every Spark application starts its own SparkUI, which listens on port 4040 by default. If that port is taken, Spark increments the port number and retries, but only up to a default limit of 16 retries (spark.port.maxRetries). Once all 16 retries fail, the SparkUI cannot bind, the SparkContext fails to start, and the job is abandoned.
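To make the retry behavior concrete, here is a minimal Python sketch of the probing loop (illustrative only; Spark's actual implementation lives in Utils.startServiceOnPort and differs in details such as port wrap-around):

```python
import socket

def start_service_on_port(start_port, max_retries=16):
    """Illustrative sketch of Spark-style port probing: try start_port,
    and on 'address already in use' move to the next port, giving up
    after max_retries attempts. Not Spark's actual code."""
    for offset in range(max_retries):
        port = start_port + offset
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            sock.bind(("127.0.0.1", port))
            return sock, port  # bound successfully
        except (OSError, OverflowError):  # in use, or port > 65535
            sock.close()
            print(f"could not bind on port {port}, attempting port {port + 1}")
    raise OSError(
        f"failed after {max_retries} retries (starting from {start_port})"
    )
```

With max_retries=16 and a start port of 4040, this mirrors the log above: ports 4040 through 4055 are probed one by one, and once they are all busy the bind gives up with the "failed after 16 retries" error.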

Solution
Either raise the retry limit when initializing the SparkConf by adding conf.set("spark.port.maxRetries", "100"), or, when submitting the job with spark-submit, add --conf spark.port.maxRetries=100 to the launch command.
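Concretely, the submit-time form looks like this (the class name and jar are placeholders; 100 is just an example value):

```shell
# Submit-time form; the programmatic equivalent is
#   conf.set("spark.port.maxRetries", "100")
spark-submit \
  --conf spark.port.maxRetries=100 \
  --class com.example.MyApp \
  my-app.jar
```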

http://www.lryc.cn/news/120704.html
