2023-05-29 22:16:14 WARN Utils:66 - Your hostname, xfw-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.218.226.129 instead (on interface enp0s3)
2023-05-29 22:16:14 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2023-05-29 22:16:24 WARN NativeCodeLoader:60 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
  File "shiyan4.2.py", line 7, in <module>
    sc = SparkContext("local","Simple App")
  File "/home/hadoop/.local/lib/python2.7/site-packages/pyspark/context.py", line 136, in __init__
    conf, jsc, profiler_cls)
  File "/home/hadoop/.local/lib/python2.7/site-packages/pyspark/context.py", line 213, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.isEncryptionEnabled(self._jsc)
  File "/home/hadoop/.local/lib/python2.7/site-packages/py4j/java_gateway.py", line 1531, in __getattr__
    "{0}.{1} does not exist in the JVM".format(self._fqn, name))
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.isEncryptionEnabled does not exist in the JVM
Time: 2024-02-19 08:00:48  Views: 412
This is an error trace from running a Python script named "shiyan4.2.py". The call to SparkContext shows the script uses Apache Spark. Note that the SPARK_LOCAL_IP warning near the top is benign: the hostname resolves to the loopback address 127.0.1.1, so Spark simply binds to 10.218.226.129 instead; setting SPARK_LOCAL_IP only silences the warning. The actual failure is the Py4JError at the end: org.apache.spark.api.python.PythonUtils.isEncryptionEnabled does not exist in the JVM. This error typically means the pip-installed pyspark package is a different version from the Spark installation the script is actually talking to, so the Python side calls a JVM method that the older (or newer) Spark jars do not provide. The next step is to check that the pyspark version matches the installed Spark version, and that SPARK_HOME and PYTHONPATH point at the same installation.
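As a first diagnostic, compare the two versions before creating the SparkContext. The sketch below is illustrative: the address value and the helper `versions_match` are assumptions for this example, not part of the original script.

```python
import os

# Optional: set SPARK_LOCAL_IP before creating the SparkContext to silence
# the loopback-address warning. The address here is taken from the log above
# and is an assumption for this sketch.
os.environ.setdefault("SPARK_LOCAL_IP", "10.218.226.129")

def versions_match(pyspark_version, spark_version):
    """Hypothetical helper: compare the major.minor parts of the
    pip-installed pyspark version and the local Spark build version."""
    return pyspark_version.split(".")[:2] == spark_version.split(".")[:2]

# In practice, get the left value from `pip show pyspark` (or
# pyspark.__version__) and the right value from `spark-submit --version`.
print(versions_match("2.4.0", "2.4.5"))  # same major.minor: compatible
print(versions_match("3.1.2", "2.4.5"))  # mismatch: the likely cause here
```

If the versions disagree, either `pip install pyspark==<spark-version>` or point SPARK_HOME at a Spark build matching the pip package.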