IEICE TRANS. FUNDAMENTALS, VOL.E100–A, NO.2 FEBRUARY 2017
LETTER
Special Section on Intelligent Transport Systems
Radar and Camera Data Association Algorithm for Sensor Fusion
Yohei OISHI†, Nonmember and Isamu MATSUNAMI†a), Member
SUMMARY This paper presents a method to accelerate target recognition processing in advanced driver assistance systems (ADAS). The histogram of oriented gradients (HOG) is an effective descriptor for object recognition in computer vision and image processing and is expected to replace conventional descriptors, e.g., template matching, in ADAS. However, the HOG does not capture the occurrences of gradient orientation on an object when the localized portion of the image, i.e., the region of interest (ROI), is not set precisely. In an automotive environment, where the target distance changes dynamically, the size and position of the ROI must be set precisely for each frame. We use radar to determine the size and position of the ROI for the HOG and propose a radar and camera sensor fusion algorithm. Experimental results are discussed.
key words: ADAS, ROI, radar, sensor fusion, UWB
1. Introduction
Advanced driver assistance systems (ADAS) are designed to help drivers; they increase car and road safety when designed with a safe human–machine interface. ADAS technology can be based on cameras, radar, lidar, car data networks, vehicle-to-vehicle systems, or vehicle-to-infrastructure systems. Next-generation ADAS will require more sophisticated sensors and sensor fusion systems. In existing driving safety systems, different sensors are used for specific purposes. For example, cameras are used to recognize relatively close targets, e.g., pedestrians and cars, whereas radars are used to detect distant targets. Cameras cannot detect distant targets effectively and perform poorly in bad weather and at night. In contrast, radar systems, which have high accuracy and perform well in poor weather, can be used in ADAS; however, radars have low recognition ability. Thus, sensor fusion is a potential solution to these problems.
Sensor fusion refers to the process of combining sensory data or data from dissimilar sources. However, sensor fusion significantly increases computational complexity. Moreover, research into sensor fusion for target recognition in ADAS, and into methods to accelerate the calculation, is limited.
Recently, sensing algorithms and systems that use multiple sensors have been reported [1]–[3]. However, most of them are not true sensor fusion but are based on independent processing; furthermore, no paper has addressed data association and fusion of camera and radar data. To contribute to the development of a next-generation ADAS that can realize an autonomous car, our research aims to develop a radar and camera sensor fusion system that is applicable in bad weather and at night. Setting the most suitable ROI from the fused data is therefore a prerequisite for high-accuracy, fast detection in all weather and at all times.

Manuscript received April 15, 2016.
Manuscript revised August 24, 2016.
†The authors are with Kitakyushu University, Kitakyushu-shi, 808-0135 Japan.
a) E-mail: i-matsunami@kitakyu-u.ac.jp
DOI: 10.1587/transfun.E100.A.510

In this paper, we propose a data association algorithm that improves target recognition accuracy and calculation time using an ROI reconstructed by combining imaging data, obtained from histogram of oriented gradients (HOG) processing and machine learning, with radar ranging data. We also evaluate its usefulness through experiments using a 24 GHz radar and camera measurement system.
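As a rough illustration of the idea of setting the ROI from radar ranging data (a sketch under a pinhole-camera assumption, not the authors' implementation; the focal length, target dimensions, and image size below are illustrative values, not those of the measurement system):

```python
import math

def roi_from_range(distance_m, azimuth_rad=0.0,
                   target_height_m=1.7, target_width_m=0.6,
                   focal_px=1000.0, image_w=1280, image_h=720):
    """Return an ROI (x, y, w, h) in pixels for a target at the given radar
    range and azimuth, assuming the camera optical axis is aligned with the
    radar boresight and the target is centered vertically in the frame."""
    h_px = focal_px * target_height_m / distance_m       # projected height
    w_px = focal_px * target_width_m / distance_m        # projected width
    cx = image_w / 2 + focal_px * math.tan(azimuth_rad)  # horizontal center
    cy = image_h / 2                                     # vertical center
    return (int(round(cx - w_px / 2)), int(round(cy - h_px / 2)),
            int(round(w_px)), int(round(h_px)))

# A 1.7 m target dead ahead at 17 m maps to a 100-pixel-tall window; at
# 34 m the window halves, so the ROI tracks the radar range every frame.
print(roi_from_range(17.0))   # (622, 310, 35, 100)
print(roi_from_range(34.0))
```

Because the radar supplies the range directly, the ROI scale is fixed per frame without an exhaustive multi-scale image search, which is where the calculation-time saving comes from.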
2. Data Association Method
2.1 DATMO Using UWB Radar
We developed a real-time method to detect and track moving
objects (DATMO) using a UWB automotive radar [4]–[6].
A UWB pulse transmitted at a pulse repetition interval (PRI) is received by a receiver as several signal components s_m from a target and several clutter components c_m, expressed approximately as

x_m(i) = s_m(i) + c_m(i),    (1)

where x_m represents the received signal (i.e., a power range profile), i (= 1, 2, 3, . . .) is the index of each range component (i.e., range-bin) in the power range profile, and m (= 1, 2, 3, . . .) is the index of the power range profiles transmitted at the PRI. Figure 1 shows the power
range profile when a target is driving at an arbitrary position in the measurement environment (Fig. 2). The profile contains many clutter components and unwanted radar echoes from the road surface and man-made structures; thus, the target echo is obscured by clutter and noise. Consequently, the signal-to-clutter ratio (SCR) is low, and detection performance is degraded by clutter because the target and clutter components are distributed over multiple range-bins. To suppress clutter, we developed weighted pulse integration (WPI), which uses the occurrence probability of the target and clutter in each range-bin [9]. Here, the occurrence probability w(i) for each range-bin is
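As a small numerical illustration of the signal model in Eq. (1) and of the idea of weighting range-bins by an occurrence probability before pulse integration (synthetic data under simplified assumptions; the clutter statistics, threshold, and weight definition here are illustrative, not the WPI formulation of [9]):

```python
import random

random.seed(0)
N_BINS, N_PULSES, TARGET_BIN = 8, 100, 3

def power_range_profile():
    """One profile x_m(i) = s_m(i) + c_m(i): random clutter power in every
    range-bin plus a steady target echo in bin TARGET_BIN."""
    c = [random.gauss(0.0, 0.5) ** 2 for _ in range(N_BINS)]      # c_m(i)
    s = [2.0 if i == TARGET_BIN else 0.0 for i in range(N_BINS)]  # s_m(i)
    return [si + ci for si, ci in zip(s, c)]                      # x_m(i)

profiles = [power_range_profile() for _ in range(N_PULSES)]

# Illustrative occurrence probability per bin: the fraction of pulses whose
# power exceeds a fixed threshold. Clutter-only bins rarely exceed it,
# while the target bin does on every pulse.
THRESH = 1.0
w = [sum(p[i] > THRESH for p in profiles) / N_PULSES for i in range(N_BINS)]

# Weighted pulse integration: average power per bin, scaled by w(i), so
# clutter bins are suppressed and the target's range-bin dominates.
integrated = [w[i] * sum(p[i] for p in profiles) / N_PULSES
              for i in range(N_BINS)]
best = max(range(N_BINS), key=integrated.__getitem__)
print(best)
```

In this toy setting the target bin keeps a weight near 1 while clutter-only bins are scaled down sharply, which is the qualitative effect the occurrence-probability weighting aims for.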
Copyright © 2017 The Institute of Electronics, Information and Communication Engineers