get_module_logger

Posted: 2023-11-26 16:04:15 · Views: 34
`get_module_logger` is not part of the Python standard library's `logging` module; it is a name commonly given to a project-specific helper (it appears, for example, in Microsoft's qlib library) that wraps the standard `logging.getLogger` call and returns a logger object with the specified name. It lets you create a logger for a particular module or package, which can then be used to log messages at various severity levels (debug, info, warning, error, critical). The standard-library equivalent looks like this:

```python
import logging

# create a logger object named after the current module
logger = logging.getLogger(__name__)

# log a message with INFO level
logger.info('This is an info message')

# log a message with ERROR level
logger.error('This is an error message')
```

In this example, `__name__` is the name of the current module and is passed to `logging.getLogger` to create (or look up) a logger object, which is then used to emit messages at different severity levels. By default the messages go to the console, but the logging system can be configured to write them to a file or send them to a remote server.
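As a rough illustration of what such a helper usually does (a hypothetical sketch, not the definition from any particular library), `get_module_logger` typically just adds a level and a console handler on top of `logging.getLogger`:

```python
import logging

def get_module_logger(module_name: str, level: int = logging.INFO) -> logging.Logger:
    """Return a logger for `module_name` with a console handler attached.

    Hypothetical helper for illustration; real implementations (for example
    qlib's get_module_logger) differ in formatting and handler management.
    """
    logger = logging.getLogger(module_name)
    logger.setLevel(level)
    if not logger.handlers:  # avoid stacking duplicate handlers on repeated calls
        handler = logging.StreamHandler()
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(name)s %(levelname)s: %(message)s")
        )
        logger.addHandler(handler)
    return logger

# usage
logger = get_module_logger(__name__)
logger.info("module logger is ready")
```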

Related recommendations

```python
def parse_corpus(infile, outfile):
    '''parse the corpus of the infile into the outfile'''
    space = ' '
    i = 0

    def tokenize(text):
        return [lemma(token) for token in text.split()]

    with open(outfile, 'w', encoding='utf-8') as fout:
        # wiki = WikiCorpus(infile, lemmatize=False, dictionary={})  # gensim's WikiCorpus class for processing Wikipedia dumps
        wiki = WikiCorpus(infile, tokenizer_func=tokenize, dictionary={})  # gensim's WikiCorpus class for processing Wikipedia dumps
        for text in wiki.get_texts():
            fout.write(space.join(text) + '\n')
            i += 1
            if i % 10000 == 0:
                logger.info('Saved ' + str(i) + ' articles')
```

This raises the following error:

```
D:\软件\python\lib\site-packages\gensim\utils.py:1333: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected %s; aliasing chunkize to chunkize_serial" % entity)
Traceback (most recent call last):
  File "D:\pythonFiles\图灵\Python_project\self_learn\大语言模型\WikiExtractor.py", line 52, in <module>
    parse_corpus(infile, outfile)
  File "D:\pythonFiles\图灵\Python_project\self_learn\大语言模型\WikiExtractor.py", line 29, in parse_corpus
    for text in wiki.get_texts():
  File "D:\软件\python\lib\site-packages\gensim\corpora\wikicorpus.py", line 693, in get_texts
    for tokens, title, pageid in pool.imap(_process_article, group):
  File "D:\软件\python\lib\multiprocessing\pool.py", line 870, in next
    raise value
  File "D:\软件\python\lib\multiprocessing\pool.py", line 537, in _handle_tasks
    put(task)
  File "D:\软件\python\lib\multiprocessing\connection.py", line 211, in send
    self._send_bytes(_ForkingPickler.dumps(obj))
  File "D:\软件\python\lib\multiprocessing\reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
AttributeError: Can't pickle local object 'parse_corpus.<locals>.tokenize'
```

How can this be fixed?
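The error occurs because `tokenize` is defined inside `parse_corpus`: gensim's `WikiCorpus` processes articles in a multiprocessing pool, and locally defined functions cannot be pickled when they are sent to worker processes. A minimal sketch of a fix is to move the tokenizer to module level. This sketch assumes gensim 4.x, whose `tokenizer_func` is called with `(text, token_min_len, token_max_len, lower)`; the `lemma` call and `logger` from the original script are treated as stand-ins here:

```python
import logging

from gensim.corpora import WikiCorpus

logger = logging.getLogger(__name__)

def tokenize(text, token_min_len, token_max_len, lower):
    """Module-level tokenizer: picklable, so multiprocessing workers can receive it."""
    if lower:
        text = text.lower()
    tokens = [t for t in text.split() if token_min_len <= len(t) <= token_max_len]
    # apply the original lemmatizer here if needed, e.g. tokens = [lemma(t) for t in tokens]
    return tokens

def parse_corpus(infile, outfile):
    """Parse the Wikipedia dump in `infile` into one plain-text article per line in `outfile`."""
    with open(outfile, 'w', encoding='utf-8') as fout:
        wiki = WikiCorpus(infile, tokenizer_func=tokenize, dictionary={})
        for i, text in enumerate(wiki.get_texts(), start=1):
            fout.write(' '.join(text) + '\n')
            if i % 10000 == 0:
                logger.info('Saved %d articles', i)
```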

```
Traceback (most recent call last):
  File "DT_001_X01_P01.py", line 150, in DT_001_X01_P01.Module.load_model
  File "/home/kejia/Server/tf/Bin_x64/DeepLearning/DL_Lib_02/mmdet/apis/inference.py", line 42, in init_detector
    checkpoint = load_checkpoint(model, checkpoint, map_location=map_loc)
  File "/home/kejia/Server/tf/Bin_x64/DeepLearning/DL_Lib_02/mmcv/runner/checkpoint.py", line 529, in load_checkpoint
    checkpoint = _load_checkpoint(filename, map_location, logger)
  File "/home/kejia/Server/tf/Bin_x64/DeepLearning/DL_Lib_02/mmcv/runner/checkpoint.py", line 467, in _load_checkpoint
    return CheckpointLoader.load_checkpoint(filename, map_location, logger)
  File "/home/kejia/Server/tf/Bin_x64/DeepLearning/DL_Lib_02/mmcv/runner/checkpoint.py", line 244, in load_checkpoint
    return checkpoint_loader(filename, map_location)
  File "/home/kejia/Server/tf/Bin_x64/DeepLearning/DL_Lib_02/mmcv/runner/checkpoint.py", line 261, in load_from_local
    checkpoint = torch.load(filename, map_location=map_location)
  File "torch/serialization.py", line 594, in load
    return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
  File "torch/serialization.py", line 853, in _load
    result = unpickler.load()
  File "torch/serialization.py", line 845, in persistent_load
    load_tensor(data_type, size, key, _maybe_decode_ascii(location))
  File "torch/serialization.py", line 834, in load_tensor
    loaded_storages[key] = restore_location(storage, location)
  File "torch/serialization.py", line 175, in default_restore_location
    result = fn(storage, location)
  File "torch/serialization.py", line 157, in _cuda_deserialize
    return obj.cuda(device)
  File "torch/_utils.py", line 71, in _cuda
    with torch.cuda.device(device):
  File "torch/cuda/__init__.py", line 225, in __enter__
    self.prev_idx = torch._C._cuda_getDevice()
  File "torch/cuda/__init__.py", line 164, in _lazy_init
    "Cannot re-initialize CUDA in forked subprocess. " + msg)
RuntimeError: Cannot re-initialize CUDA in forked subprocess. To use CUDA with multiprocessing, you must use the 'spawn' start method
('异常抛出', None)
DT_001_X01_P01 load_model ret=1, version=V1.0.0.0
```
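The last line of the traceback states the fix: CUDA cannot be re-initialized in a process created with `fork`, so any worker process that loads the model onto the GPU has to be started with the `spawn` method. A minimal sketch, assuming the checkpoint is loaded inside a multiprocessing worker under your control (the `model.pth` path is a placeholder):

```python
import multiprocessing as mp
import torch

def worker(checkpoint_path):
    # CUDA is initialized only inside the spawned child, never in the parent before forking
    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
    state = torch.load(checkpoint_path, map_location=device)
    # ... build the detector and load `state` into it here ...

if __name__ == '__main__':
    # 'spawn' starts a fresh interpreter for the child instead of forking,
    # which is required when the child needs to use CUDA
    mp.set_start_method('spawn', force=True)
    p = mp.Process(target=worker, args=('model.pth',))  # placeholder checkpoint path
    p.start()
    p.join()
```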

How can this be solved?

```
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.15) or chardet (3.0.4) doesn't match a supported version!
  warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
Traceback (most recent call last):
  File "/home/data/minjie.yu/.local/bin/streamlit", line 5, in <module>
    from streamlit.web.cli import main
  File "/home/data/minjie.yu/.local/lib/python3.8/site-packages/streamlit/__init__.py", line 55, in <module>
    from streamlit.delta_generator import DeltaGenerator as _DeltaGenerator
  File "/home/data/minjie.yu/.local/lib/python3.8/site-packages/streamlit/delta_generator.py", line 36, in <module>
    from streamlit import config, cursor, env_util, logger, runtime, type_util, util
  File "/home/data/minjie.yu/.local/lib/python3.8/site-packages/streamlit/cursor.py", line 18, in <module>
    from streamlit.runtime.scriptrunner import get_script_run_ctx
  File "/home/data/minjie.yu/.local/lib/python3.8/site-packages/streamlit/runtime/__init__.py", line 16, in <module>
    from streamlit.runtime.runtime import Runtime as Runtime
  File "/home/data/minjie.yu/.local/lib/python3.8/site-packages/streamlit/runtime/runtime.py", line 29, in <module>
    from streamlit.proto.BackMsg_pb2 import BackMsg
  File "/home/data/minjie.yu/.local/lib/python3.8/site-packages/streamlit/proto/BackMsg_pb2.py", line 5, in <module>
    from google.protobuf.internal import builder as _builder
ImportError: cannot import name 'builder' from 'google.protobuf.internal' (/home/data/minjie.yu/.local/lib/python3.8/site-packages/google/protobuf/internal/__init__.py)
```
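The `*_pb2.py` files bundled with this streamlit version are generated by a newer protoc and expect `google.protobuf.internal.builder`, which (to my understanding) only ships with protobuf 3.20 and later, so the usual fix is `pip install --upgrade protobuf` in the same environment. A quick sketch to confirm which protobuf is actually being imported before upgrading:

```python
# check which protobuf installation is picked up and whether it ships the builder module
import google.protobuf

print(google.protobuf.__version__)  # streamlit's generated protos need a recent version here
print(google.protobuf.__file__)     # make sure this is the site-packages copy you expect

try:
    from google.protobuf.internal import builder  # present only in newer protobuf releases
    print("builder module found:", builder.__file__)
except ImportError:
    print("builder missing -> upgrade protobuf (pip install --upgrade protobuf)")
```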

```
Running wrapper {"command": "/opt/conda/bin/python3 -u /opt/nuclio/_nuclio_wrapper.py --handler main:handler --socket-path /tmp/nuclio-rpc-cipbe4b85k6ne6scq1gg.sock --platform-kind local --namespace nuclio --worker-id 0 --trigger-kind http --trigger-name myHttpTrigger --decode-event-strings"}
/opt/nuclio/_nuclio_wrapper.py:395: DeprecationWarning: There is no current event loop
  loop = asyncio.get_event_loop()
{"datetime": "2023-07-15 15:11:13,541", "level": "error", "message": "Caught unhandled exception while initializing", "with": {"err": "'return' outside function (main.py, line 64)", "traceback": "Traceback (most recent call last):\n  File \"/opt/nuclio/_nuclio_wrapper.py\", line 400, in run_wrapper\n    wrapper_instance = Wrapper(root_logger,\n  File \"/opt/nuclio/_nuclio_wrapper.py\", line 71, in __init__\n    self._entrypoint = self._load_entrypoint_from_handler(handler)\n  File \"/opt/nuclio/_nuclio_wrapper.py\", line 195, in _load_entrypoint_from_handler\n    module = __import__(module_name)\n  File \"/opt/nuclio/main.py\", line 64\n    return context.Response(body=json.dumps(results), headers={},\n    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\nSyntaxError: 'return' outside function\n", "worker_id": "0"}}
23.07.15 15:11:13.550 sor.http.w0.python.logger (E) Unexpected termination of child process {"error": null, "status": "exit status 1"}
panic: Wrapper process for worker 0 exited unexpectedly with: exit status 1

goroutine 13 [running]:
github.com/nuclio/nuclio/pkg/processor/runtime/rpc.(*AbstractRuntime).watchWrapperProcess(0xc0007f4300)
        /nuclio/pkg/processor/runtime/rpc/abstract.go:471 +0x445
created by github.com/nuclio/nuclio/pkg/processor/runtime/rpc.(*AbstractRuntime).startWrapper
        /nuclio/pkg/processor/runtime/rpc/abstract.go:244 +0x1c5
/nuclio/pkg/platform/local/platform.go:1168
Failed to deploy function ...//nuclio/pkg/platform/abstract/platform.go:197
```
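The root cause is in the user function, not in nuclio itself: `main.py` has a `return` statement at module level (line 64), so the module cannot even be imported. In a nuclio Python function the response has to be returned from inside the handler. A minimal sketch of a valid `main.py`, where `results` stands in for whatever the original code computes:

```python
import json

def handler(context, event):
    # compute whatever the function is supposed to produce;
    # `results` is a placeholder for the original logic
    results = {"status": "ok"}

    # the return belongs inside the handler function, not at module level
    return context.Response(
        body=json.dumps(results),
        headers={},
        content_type="application/json",
        status_code=200,
    )
```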

```
lidar_file path: /root/autodl-tmp/project/data/KITTI/object/testing/velodyne/000204.bin
lidar_file path: /root/autodl-tmp/project/data/KITTI/object/testing/velodyne/000205.bin
lidar_file path: /root/autodl-tmp/project/data/KITTI/object/testing/velodyne/000206.bin
lidar_file path: /root/autodl-tmp/project/data/KITTI/object/testing/velodyne/000207.bin
eval:  39%|█████████████████████████████▍          | 44/112 [00:06<00:07,  8.56it/s, mode=TEST, recall=0/0, rpn_iou=0]
Traceback (most recent call last):
  File "eval_rcnn.py", line 908, in <module>
    eval_single_ckpt(root_result_dir)
  File "eval_rcnn.py", line 771, in eval_single_ckpt
    eval_one_epoch(model, test_loader, epoch_id, root_result_dir, logger)
  File "eval_rcnn.py", line 694, in eval_one_epoch
    ret_dict = eval_one_epoch_rpn(model, dataloader, epoch_id, result_dir, logger)
  File "eval_rcnn.py", line 143, in eval_one_epoch_rpn
    for data in dataloader:
  File "/root/miniconda3/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 435, in __next__
lidar_file path: /root/autodl-tmp/project/data/KITTI/object/testing/velodyne/000208.bin
    data = self._next_data()
  File "/root/miniconda3/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1085, in _next_data
    return self._process_data(data)
  File "/root/miniconda3/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1111, in _process_data
    data.reraise()
  File "/root/miniconda3/lib/python3.8/site-packages/torch/_utils.py", line 428, in reraise
    raise self.exc_type(msg)
AssertionError: Caught AssertionError in DataLoader worker process 0.
```
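The underlying assertion message is hidden because it is raised inside a DataLoader worker process. A common way to surface it (a debugging sketch, assuming a standard PyTorch Dataset/DataLoader setup like the one built in eval_rcnn.py; `test_dataset` is a stand-in for that dataset object) is to run the loader single-process, so the failing sample raises directly in the main process:

```python
from torch.utils.data import DataLoader

# With num_workers=0 the dataset's __getitem__ runs in the main process, so the
# original AssertionError (e.g. a missing or malformed lidar/label file) is shown
# with its full message instead of being re-raised from a worker.
debug_loader = DataLoader(test_dataset, batch_size=1, shuffle=False, num_workers=0)

for i, batch in enumerate(debug_loader):
    pass  # iterate until the failing index raises, then inspect that sample's files
```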

```javascript
var createError = require('http-errors');
var express = require('express');
var path = require('path');
var cookieParser = require('cookie-parser');
var logger = require('morgan');

var indexRouter = require('./routes/index');
var usersRouter = require('./routes/users');
const UserRouter = require('./routes/admin/UserRouter');
const JWT = require('./util/JWT');
const NewsRouter = require('./routes/admin/NewsRouter');
const webNewsRouter = require('./routes/web/NewsRouter');
const ProductRouter = require('./routes/admin/ProductRouter');

var app = express();

// view engine setup
app.set('views', path.join(__dirname, 'views'));
app.set('view engine', 'jade');

app.use(logger('dev'));
app.use(express.json());
app.use(express.urlencoded({ extended: false }));
app.use(cookieParser());
app.use(express.static(path.join(__dirname, 'public')));

app.use('/', indexRouter);
app.use('/users', usersRouter);
app.use(webNewsRouter);

/*
  /adminapi/* - used by the admin back-office system
  /webapi/*   - used by the public corporate website
*/
app.use((req, res, next) => {
  // If the token is valid, call next(); if it has expired, return a 401 error.
  if (req.url === "/adminapi/user/login") {
    next();
    return;
  }
  // Optional chaining avoids a crash when the Authorization header is missing.
  const token = req.headers["authorization"]?.split(" ")[1];
  if (token) {
    var payload = JWT.verify(token);
    if (payload) {
      // Refresh the token on every authenticated request.
      const newToken = JWT.generate({
        _id: payload._id,
        username: payload.username
      }, "1d");
      res.header("Authorization", newToken);
      next();
    } else {
      res.status(401).send({ errCode: "-1", errorInfo: "token过期" });
    }
  } else {
    // No token supplied: respond instead of leaving the request hanging.
    res.status(401).send({ errCode: "-1", errorInfo: "token缺失" });
  }
});

app.use(UserRouter);
app.use(NewsRouter);
app.use(ProductRouter);

// catch 404 and forward to error handler
app.use(function (req, res, next) {
  next(createError(404));
});

// error handler
app.use(function (err, req, res, next) {
  // set locals, only providing error in development
  res.locals.message = err.message;
  res.locals.error = req.app.get('env') === 'development' ? err : {};

  // render the error page
  res.status(err.status || 500);
  res.render('error');
});

module.exports = app;
```

Latest recommendations


[Image Compression] GUI-based SVD (singular value decomposition) grayscale image compression [with Matlab source code, No. 4359].zip

All videos uploaded by Matlab领域 come with the corresponding complete code; everything runs and has been personally tested, so it is suitable for beginners.
1. Contents of the code archive: main function: main.m; called functions: the other .m files (no need to run them); screenshots of the run results.
2. Code version: Matlab 2019b; if it does not run, modify it according to the error messages, or message the author if you are unsure how.
3. How to run: Step 1: put all files into Matlab's current folder; Step 2: double-click to open main.m; Step 3: click Run and wait for the program to finish.
4. Simulation consulting: for other services, message the author or scan the QQ card in the video: 4.1 complete code for blog posts or resources; 4.2 reproduction of journal papers or references; 4.3 custom Matlab programming; 4.4 research collaboration.

node-v0.9.2-x86.msi

Node.js, often simply called Node, is an open-source, cross-platform JavaScript runtime environment that lets JavaScript code run outside the browser. Created by Ryan Dahl in 2009, it was designed for building high-performance web servers and networked applications. It is built on Google Chrome's V8 JavaScript engine and runs on Windows, Linux, Unix, Mac OS X, and other operating systems.

One of Node.js's defining traits is its event-driven, non-blocking I/O model, which makes it well suited to handling large numbers of concurrent connections; it therefore excels at real-time applications such as online games, chat applications, and real-time communication services. Node.js also has a modular architecture: through npm (the Node package manager), community members share and reuse code, which has greatly accelerated the growth of the Node.js ecosystem.

Node.js is not limited to server-side development. Over time it has also been used to build tool chains, desktop applications, IoT devices, and more. Because it can work with the file system, databases, and network requests, developers can write full-stack applications in JavaScript, which greatly improves development efficiency and convenience.

In practice, many large companies and organizations, such as Netflix, PayPal, and Walmart, have adopted Node.js as the platform for their web applications, using it to improve application performance, simplify development workflows, and respond to market demands more quickly.

[Dimension Measurement] Machine-vision-based measurement of target dimensions in images [with Matlab source code, No. 4087].zip

All videos uploaded by Matlab领域 come with the corresponding complete code; everything runs and has been personally tested, so it is suitable for beginners.
1. Contents of the code archive: main function: main.m; called functions: the other .m files (no need to run them); screenshots of the run results.
2. Code version: Matlab 2019b; if it does not run, modify it according to the error messages, or message the author if you are unsure how.
3. How to run: Step 1: put all files into Matlab's current folder; Step 2: double-click to open main.m; Step 3: click Run and wait for the program to finish.
4. Simulation consulting: for other services, message the author or scan the QQ card in the video: 4.1 complete code for blog posts or resources; 4.2 reproduction of journal papers or references; 4.3 custom Matlab programming; 4.4 research collaboration.

[Image Encryption] Double random phase image encryption and decryption [with Matlab source code, No. 4118].zip

All videos uploaded by Matlab领域 come with the corresponding complete code; everything runs and has been personally tested, so it is suitable for beginners.
1. Contents of the code archive: main function: main.m; called functions: the other .m files (no need to run them); screenshots of the run results.
2. Code version: Matlab 2019b; if it does not run, modify it according to the error messages, or message the author if you are unsure how.
3. How to run: Step 1: put all files into Matlab's current folder; Step 2: double-click to open main.m; Step 3: click Run and wait for the program to finish.
4. Simulation consulting: for other services, message the author or scan the QQ card in the video: 4.1 complete code for blog posts or resources; 4.2 reproduction of journal papers or references; 4.3 custom Matlab programming; 4.4 research collaboration.

Financial payments: a brief analysis of how to design demand-deposit products from fixed-term assets.docx

Financial payments: a brief analysis of how to design demand-deposit products from fixed-term assets.docx

zigbee-cluster-library-specification

The latest zigbee-cluster-library-specification document.

Documents on management modeling and simulation

Cite this version: Boualem Benatallah. Management of modeling and simulation. Thesis, Université Joseph Fourier - Grenoble I, 1996. In French. NNT: tel-00345357. HAL Id: tel-00345357. https://theses.hal.science/tel-00345357, submitted on 9 December 2008. HAL is a multidisciplinary open-access archive for the deposit and dissemination of scientific research documents, whether or not they are published. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.

Implementing a real-time data lake architecture: integrating Kafka with Hive

![Implementing a real-time data lake architecture: integrating Kafka with Hive](https://img-blog.csdnimg.cn/img_convert/10eb2e6972b3b6086286fc64c0b3ee41.jpeg)

# 1. Overview of Real-Time Data Lake Architecture

A real-time data lake is a modern data management architecture that lets an enterprise collect, store, and process large volumes of data with low latency. Unlike a traditional data warehouse, a real-time data lake does not depend on predefined schemas; it uses a flexible architecture that can handle data of many types and formats. This architecture gives enterprises the following advantages:

- **Real-time insight:** a real-time data lake gives the business access to the latest data, enabling better-informed decisions.
- **Data democratization:** a real-time data lake gives stakeholders of every kind…
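As a rough illustration of the ingestion path this excerpt describes (a sketch only; it assumes a local Kafka broker, a reachable HiveServer2, and the third-party kafka-python and PyHive packages, and uses hypothetical topic, database, and table names not taken from the excerpt), a consumer can stream events from a Kafka topic into a Hive table:

```python
import json

from kafka import KafkaConsumer   # pip install kafka-python
from pyhive import hive           # pip install 'pyhive[hive]'

# hypothetical endpoints and names, for illustration only
consumer = KafkaConsumer(
    "events",                          # Kafka topic to read from
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

conn = hive.connect(host="localhost", port=10000, database="datalake")
cursor = conn.cursor()

for message in consumer:
    event = message.value
    # append each event into a Hive table; a real pipeline would batch inserts
    # or land files in the lake's storage layer instead of row-by-row inserts
    cursor.execute(
        "INSERT INTO TABLE raw_events VALUES (%s, %s)",
        (event.get("id"), json.dumps(event)),
    )
```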

What is the difference between cloud-native architecture and SOA architecture?

Cloud-native architecture and SOA (service-oriented architecture) are two different architectural styles; the main differences are:

1. Different design philosophies: cloud-native architecture is "designed for the cloud", emphasizing portability, scalability, elasticity, and high availability of applications. SOA is "service-oriented", emphasizing the decoupling and reuse of business logic to improve a system's flexibility and maintainability.

2. Different implementation technologies: cloud-native architecture relies on technologies such as Docker, Kubernetes, and service meshes, focusing on containerization, automation, and microservices. SOA relies on technologies such as Web Services and message queues, focusing on service enablement and asynchronous communication.

3. Different application scenarios: cloud-native architecture suits cloud-computing environments, such as containerized deployment and microservices…

JSBSim Reference Manual

The JSBSim reference manual, which includes an introduction to JSBSim, the syntax for writing JSBSim XML configuration files, a programming guide, and some application examples. Parts of it are still unfinished and a complete version may never appear, but the content is still very valuable as a reference.