Executing WordCount example with default input data set. Use --input to specify file input. Printing result to stdout
Posted: 2024-09-25 08:06:58
"Executing WordCount example with default input data set" is the startup message of one of the example programs bundled with Flink (a stream-processing framework). The example demonstrates basic word counting; when run with no arguments it processes a small built-in data set. When you run `./bin/flink run examples/streaming/WordCount.jar`, Flink begins executing this WordCount program automatically.
`Use --input to specify file input` means that, to process your own data, you pass the `--input` option with a file path, for example `./bin/flink run examples/streaming/WordCount.jar --input /path/to/input/file.txt`.
`Printing result to stdout` indicates that the job's results are printed directly to standard output (normally the console), so you can see each word together with its count.
If the job completes successfully, you will see output similar to:
```shell
Job with JobID d189e57967370c46f106b48dd30c0cc7 has finished. Job Runtime: 238 ms
```
or
```shell
Job with JobID 94bbcabbc2e6a0cf6c534b3ad47d1fec has finished. Job Runtime: 288 ms
```
This indicates that the job has finished, along with its total runtime.
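Conceptually, the WordCount job tokenizes its input and counts occurrences of each word. The same computation can be illustrated locally with ordinary Unix tools (a sketch only, not Flink itself; the file path is made up):

```shell
# Local illustration only: this pipeline computes the same word counts as the
# Flink WordCount job, using plain Unix tools. The file path is hypothetical.
printf 'to be or not to be\n' > /tmp/wordcount-demo.txt

# Split on whitespace (one word per line), then count occurrences of each word
# and sort by descending count.
tr -s '[:space:]' '\n' < /tmp/wordcount-demo.txt | sort | uniq -c | sort -rn
```

For this sample input the pipeline reports `to` and `be` twice each, and `or` and `not` once each, which is the same result the Flink job would print to stdout.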
Related questions
```shell
CREATE TEMPORARY FUNCTION UUIDUDF AS 'com.haierubic.bigdata.commons.udf.UUIDUDF'
. . . . . . . . . . . . . . . . . > USING JAR 'oss://datalake-01.cn-beijing.oss-dls.aliyuncs.com/config/bigdata-hiveudf-2.1-jar-with-dependencies.jar';
Error: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [hdop_upbrain] does not have [TEMPUDFADMIN] privilege on [global=uuidudf] (state=42000,code=40000)
0: jdbc:hive2://10.204.11.45:10000> CREATE FUNCTION UUIDUDF AS 'com.haierubic.bigdata.commons.udf.UUIDUDF'
. . . . . . . . . . . . . . . . . > USING JAR 'oss://datalake-01.cn-beijing.oss-dls.aliyuncs.com/config/bigdata-hiveudf-2.1-jar-with-dependencies.jar';
INFO : Compiling command(queryId=hive_20230602122812_92858e15-5136-4e7d-9f51-3020f864aef2): CREATE FUNCTION UUIDUDF AS 'com.haierubic.bigdata.commons.udf.UUIDUDF' USING JAR 'oss://datalake-01.cn-beijing.oss-dls.aliyuncs.com/config/bigdata-hiveudf-2.1-jar-with-dependencies.jar'
INFO : Concurrency mode is disabled, not creating a lock manager
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling command(queryId=hive_20230602122812_92858e15-5136-4e7d-9f51-3020f864aef2); Time taken: 0.067 seconds
INFO : Concurrency mode is disabled, not creating a lock manager
INFO : Executing command(queryId=hive_20230602122812_92858e15-5136-4e7d-9f51-3020f864aef2): CREATE FUNCTION UUIDUDF AS 'com.haierubic.bigdata.commons.udf.UUIDUDF' USING JAR 'oss://datalake-01.cn-beijing.oss-dls.aliyuncs.com/config/bigdata-hiveudf-2.1-jar-with-dependencies.jar'
INFO : Starting task [Stage-0:FUNC] in serial mode
INFO : Added [/tmp/b11e4544-4a21-4dcf-87c5-fff8d91021e9_resources/bigdata-hiveudf-2.1-jar-with-dependencies.jar] to class path
INFO : Added resources: [oss://datalake-01.cn-beijing.oss-dls.aliyuncs.com/config/bigdata-hiveudf-2.1-jar-with-dependencies.jar]
INFO : Completed executing command(queryId=hive_20230602122812_92858e15-5136-4e7d-9f51-3020f864aef2); Time taken: 0.789 seconds
INFO : OK
INFO : Concurrency mode is disabled, not creating a lock manager
No rows affected (0.881 seconds)
```
The error message shows that your user hdop_upbrain does not have the TEMPUDFADMIN privilege, so it cannot create the temporary UDF named UUIDUDF (note that in your log the second statement, CREATE FUNCTION without TEMPORARY, completed successfully). You need to ask a Hive administrator, or a user who holds TEMPUDFADMIN, to grant the privilege to hdop_upbrain. The grant statement might look something like:
```
GRANT TEMPUDFADMIN ON DATABASE your_database TO USER hdop_upbrain;
```
Replace your_database with the database the grant should apply to. The exact syntax depends on your authorization setup; if Hive access control is managed by Apache Ranger (which a HiveAccessControlException often indicates), the privilege is usually granted through a Ranger policy rather than a SQL GRANT statement. If you still run into problems, share more details and I can help further.
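As the pasted log shows, the second statement, CREATE FUNCTION without the TEMPORARY keyword, did succeed for this user, so registering the UDF as a permanent function is a practical workaround when you lack TEMPUDFADMIN (the SELECT below is a hypothetical smoke test, assuming the UDF takes no arguments):

```sql
-- Workaround seen in the log above: creating a permanent (non-temporary)
-- function succeeded, so TEMPUDFADMIN was not required for it.
-- Run this in the database where the function should live.
CREATE FUNCTION UUIDUDF AS 'com.haierubic.bigdata.commons.udf.UUIDUDF'
USING JAR 'oss://datalake-01.cn-beijing.oss-dls.aliyuncs.com/config/bigdata-hiveudf-2.1-jar-with-dependencies.jar';

-- Hypothetical smoke test of the newly registered function:
SELECT UUIDUDF();
```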
Executing: /tmp/apt-key-gpghome.DI0VWrO2Tz/gpg.1.sh --keyserver keyserver.ubuntu.com --recv-keys 40976EAF437D05B5 3B4FE6ACC0B21F32
This is a Linux command that fetches public keys from Ubuntu's keyserver. Specifically, it downloads the two keys with IDs 40976EAF437D05B5 and 3B4FE6ACC0B21F32 from keyserver.ubuntu.com and imports them into the current GPG keyring (here it was invoked by apt-key, as the temporary apt-key-gpghome directory in the path suggests). This command is typically used when adding a software repository's signing key, so that packages downloaded from that repository can be verified as trusted.
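Note that `apt-key` itself is deprecated on current Debian and Ubuntu releases. A sketch of the recommended replacement is to fetch the key into a dedicated keyring and reference it from the source list; it needs root and network access, and the keyring path and repository line below are hypothetical examples:

```shell
# apt-key is deprecated; fetch the key into its own keyring file instead.
# Requires root and network access; file names below are examples only.
sudo gpg --no-default-keyring \
     --keyring /usr/share/keyrings/example-archive-keyring.gpg \
     --keyserver keyserver.ubuntu.com \
     --recv-keys 40976EAF437D05B5 3B4FE6ACC0B21F32

# Then reference that keyring explicitly in the repository definition:
# deb [signed-by=/usr/share/keyrings/example-archive-keyring.gpg] https://example.com/apt stable main
```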