Invoking stop() from shutdown hook
2023-06-08 20:00:25,905 INFO (AbstractConnector.java:318)- Stopped Spark@4978bcd5{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2023-06-08 20:00:25,906 INFO (Logging.scala:54)- Stopped Spark web UI at http://DESKTOP-UN228O4:4040
2023-06-08 20:00:25,919 INFO (Logging.scala:54)- MapOutputTrackerMasterEndpoint stopped!
2023-06-08 20:00:25,929 INFO (Logging.scala:54)- MemoryStore cleared
2023-06-08 20:00:25,929 INFO (Logging.scala:54)- BlockManager stopped
2023-06-08 20:00:25,935 INFO (Logging.scala:54)- BlockManagerMaster stopped
2023-06-08 20:00:25,937 INFO (Logging.scala:54)- OutputCommitCoordinator stopped!
2023-06-08 20:00:25,950 INFO (Logging.scala:54)- Successfully stopped SparkContext
2023-06-08 20:00:25,950 INFO (Logging.scala:54)- Shutdown hook called
2023-06-08 20:00:25,951 INFO (Logging.scala:54)- Deleting directory C:\Users\HUAWEI\AppData\Local\Temp\spark-87e84c73-3cf7-4db0-bf7d-ab42db0a5a54
Posted: 2024-03-19 12:41:04
This log is the normal output of a Spark application shutting down cleanly. In order, it stops the embedded HTTP server that serves port 4040, shuts down the Spark web UI, stops the MapOutputTrackerMasterEndpoint, clears the MemoryStore, and stops the BlockManager, the BlockManagerMaster, and the OutputCommitCoordinator. It then reports that the SparkContext was stopped successfully, runs the registered shutdown hook, and deletes the application's temporary scratch directory.
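The final "Shutdown hook called" / "Deleting directory ..." pair is the JVM shutdown-hook pattern: cleanup registered at startup runs automatically when the process exits. A minimal Python sketch of the same pattern, for illustration only (this is not Spark's actual code):

```python
import atexit
import shutil
import tempfile

# Create a scratch directory, as Spark does under the local temp dir.
scratch = tempfile.mkdtemp(prefix="spark-like-")

# Register a hook that removes it when the process exits -- the analogue
# of Spark's "Shutdown hook called" / "Deleting directory ..." log lines.
atexit.register(shutil.rmtree, scratch, ignore_errors=True)

print(f"scratch directory in use: {scratch}")
```

The hook runs on normal interpreter exit, mirroring how Spark's shutdown hook cleans up even when `stop()` is triggered implicitly rather than called by user code.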
Related questions
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.functions$.to_timestamp(Lorg/apache/spark/sql/Column;Ljava/lang/String;)Lorg/apache/spark/sql/Column;
	at org.example.Cleaning$.main(Cleaning.scala:32)
	at org.example.Cleaning.main(Cleaning.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/06/11 20:06:43 INFO spark.SparkContext: Invoking stop() from shutdown hook
This `NoSuchMethodError` means that at runtime the JVM could not find the `org.apache.spark.sql.functions.to_timestamp(Column, String)` overload your code was compiled against. It almost always indicates a version mismatch: the Spark version on the `spark-submit` classpath differs from the Spark version your project was built with. The two-argument `to_timestamp(Column, String)` overload was introduced in Spark 2.2.0, so on an older Spark 2.x runtime you can fall back to `unix_timestamp(col, fmt).cast("timestamp")` instead. Otherwise, make sure the Spark dependency version declared in your build matches the version installed on the cluster.
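One common way to avoid this mismatch in a Maven build is to pin the Spark dependency to the cluster's version and mark it `provided`, so the cluster's own jars are used at runtime. A sketch; the version and Scala suffix shown are assumptions, substitute whatever your cluster actually runs:

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <!-- Must match the Spark version installed on the cluster -->
  <version>2.4.8</version>
  <scope>provided</scope>
</dependency>
```

With `provided` scope, the jar you submit does not bundle its own Spark classes, which removes one frequent source of `NoSuchMethodError` at submit time.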
2023-06-10 16:51:11.329824: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library cudnn64_7.dll
2023-06-10 16:56:47.339190: W tensorflow/stream_executor/cuda/redzone_allocator.cc:312] Internal: Invoking ptxas not supported on Windows Relying on driver to perform ptx compilation. This message will be only logged once.
2023-06-10 16:56:47.537030: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library cublas64_100.dll
Traceback (most recent call last):
These are messages from TensorFlow running with GPU support on Windows. The two "Successfully opened dynamic library" lines show that cudnn64_7.dll and cublas64_100.dll were loaded correctly. The warning "Internal: Invoking ptxas not supported on Windows" means TensorFlow cannot invoke the ptxas compiler on Windows and instead relies on the GPU driver to perform PTX compilation; it is logged only once and is generally harmless. The actual failure is in the Traceback, which has been cut off here; the full stack trace is needed to determine the error. Please provide more context for a more precise answer.
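Before digging into the Traceback, it can help to confirm that the CUDA libraries named in the log are actually resolvable by the system loader. A small stdlib-only sketch (the library names are taken from the log above and are version-specific; other CUDA/cuDNN installs use different suffixes):

```python
import ctypes.util

def missing_cuda_libs(names):
    """Return the subset of shared-library names the system loader cannot find."""
    return [n for n in names if ctypes.util.find_library(n) is None]

# Names from the TensorFlow log above (cuDNN 7 / CUDA 10.0 era builds).
print(missing_cuda_libs(["cudnn64_7", "cublas64_100"]))
```

An empty list means both libraries are on the loader's search path; any names printed point to a missing or mis-pathed CUDA/cuDNN installation rather than a bug in the script itself.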