Multisensor Information Fusion by Query Refinement, Recent advances in Visual information s


A Spine X-Ray Image Segmentation Method Based on Multi-Scale Feature Fusion

Scoliosis is a three-dimensional structural deformity of the spine that affects 1%–4% of adolescents worldwide [1].

Diagnosis relies mainly on the patient's scoliosis angle. X-ray imaging is currently the first choice for diagnosing scoliosis, and segmenting the spine in X-ray images is the basis for subsequent measurement, registration, and three-dimensional reconstruction.

A number of spine X-ray image segmentation methods have appeared recently.

Anitha et al. [2-3] proposed methods that use custom filters to automatically extract vertebral endplates and morphological operators to automatically obtain contours, but these methods suffer from a certain degree of inter-observer error.

Sardjono et al. [4] proposed a physics-based method using a charged-particle model to extract the spinal contour, but it is complex to implement and of limited practical use.

Ye et al. [5] proposed a segmentation algorithm based on fuzzy C-means clustering, but the procedure is cumbersome and its practicality is limited.

All of the above methods segment only the vertebral bodies and cannot segment the overall contour of the spine.

Deep learning has many applications in image segmentation.

Long et al. proposed the Fully Convolutional Network (FCN) [6], which replaces the final fully connected layer of a convolutional neural network with a convolutional layer and applies deconvolution to the resulting feature maps to obtain pixel-level classification results.

Improving on the FCN architecture, Ronneberger et al. proposed U-Net [7], an encoder-decoder network for image segmentation.

Wu et al. proposed BoostNet [8] for object detection in spine X-ray images and a multi-view correlation network [9] for locating the spinal structure.

These methods do not segment the spine image directly; they only extract keypoint features and derive the overall spinal contour from the localization results.

Fang et al. [10] used an FCN to segment spinal CT slices and perform three-dimensional reconstruction, but the segmentation accuracy is relatively low.

Horng et al. [11] cut the spine X-ray image into patches, segmented individual vertebrae with a residual U-Net, and then reassembled the complete spine image, which makes the segmentation process overly cumbersome.

Tan et al. [12] and Grigorieva et al. [13] used U-Net to segment spine X-ray images and then measured the Cobb angle or performed three-dimensional reconstruction, but the segmentation accuracy is limited.

Although the above methods accomplish spine segmentation to some extent, two problems remain: (1) they only locate the vertebral bodies and compute the scoliosis angle, without performing complete spine segmentation of the image.

Research on a Fusion-Order-Sensitive Multi-Sensor GM-PHD Tracking Algorithm

This paper studies a fusion-order-sensitive multi-sensor GM-PHD (Gaussian Mixture Probability Hypothesis Density) tracking algorithm.

The algorithm fuses data from multiple sensors to track targets and estimate their positions, and an order-sensitive scheme improves its robustness and tracking accuracy.

Experiments show that the algorithm tracks targets effectively and reduces the impact of sensor dropout and sensor drift, giving it broad application prospects in multi-sensor tracking.

Keywords: multi-sensor; GM-PHD; order sensitivity; target tracking; data fusion

1. Introduction
In recent years, with the rapid progress of sensor technology, multi-sensor target tracking has also developed rapidly.

Multi-sensor target tracking fuses multi-source information about a target from the data of multiple sensors, which can greatly improve tracking accuracy and robustness.

However, problems such as sensor dropout and sensor drift become prominent in multi-sensor tracking; they reduce tracking accuracy and can even cause tracking to fail.

To improve the robustness and accuracy of multi-sensor target tracking, this paper proposes an order-sensitive multi-sensor tracking algorithm based on the GM-PHD (Gaussian Mixture Probability Hypothesis Density) filter.

On this basis, multi-source data fusion is used to gather multi-source information about the target and estimate its position, which significantly improves tracking accuracy and robustness.

2. Multi-Sensor Tracking Algorithm
2.1 GM-PHD principle
The GM-PHD algorithm is a target tracking algorithm based on probability density; it uses a Gaussian mixture model to describe target position and velocity information.

The core idea of GM-PHD is to infer the target state from the observed data and the track history.

2.2 Construction of the multi-sensor tracking algorithm
This paper optimizes existing multi-sensor tracking algorithms. First, data fusion is used to aggregate and process the data from multiple sensors.

Then, to address sensor dropout and drift, an order-sensitive algorithm is proposed.

When a sensor drops out, the algorithm adaptively adjusts the tracking model, improving tracking accuracy and robustness.
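To make the order-sensitive, dropout-tolerant update concrete, below is a minimal Python sketch of one sequential multi-sensor GM-PHD step. It is an illustration under simple assumptions (linear-Gaussian constant-velocity motion, position-only measurements, invented noise levels), not the algorithm from the paper; the dropout handling simply skips a sensor that returned nothing so the remaining sensors still refine the intensity.

```python
# Minimal sketch of a sequential multi-sensor GM-PHD update (assumptions: linear-Gaussian
# models, constant-velocity motion, position-only measurements). The sensor order and the
# "skip dropped sensors" rule stand in for the paper's order-sensitive scheme; all names
# and parameter values are illustrative, not taken from the paper.
import numpy as np

DT = 1.0
F = np.array([[1, 0, DT, 0], [0, 1, 0, DT], [0, 0, 1, 0], [0, 0, 0, 1]])  # CV motion model
Q = 0.01 * np.eye(4)                          # process noise
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])    # position-only measurement
P_S, P_D = 0.99, 0.9                          # survival / detection probabilities
CLUTTER = 1e-4                                # clutter intensity

def predict(components):
    """Propagate each Gaussian component (w, m, P) through the motion model."""
    return [(P_S * w, F @ m, F @ P @ F.T + Q) for (w, m, P) in components]

def update_one_sensor(components, measurements, R):
    """Standard GM-PHD measurement update for a single sensor."""
    updated = [((1 - P_D) * w, m, P) for (w, m, P) in components]  # missed detections
    for z in measurements:
        new_comps = []
        for (w, m, P) in components:
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            innov = z - H @ m
            lik = np.exp(-0.5 * innov @ np.linalg.solve(S, innov)) / np.sqrt(
                np.linalg.det(2 * np.pi * S))
            new_comps.append((P_D * w * lik, m + K @ innov, (np.eye(4) - K @ H) @ P))
        norm = CLUTTER + sum(w for (w, _, _) in new_comps)
        updated += [(w / norm, m, P) for (w, m, P) in new_comps]
    return updated

def multi_sensor_step(components, sensor_scans):
    """Apply the sensors sequentially; a sensor that returned no data (dropout) is skipped,
    so the remaining sensors still refine the intensity."""
    components = predict(components)
    for scan in sensor_scans:                 # scan = {"z": [...], "R": ..., "ok": bool}
        if not scan["ok"]:
            continue                          # sensor dropout: skip, do not corrupt the PHD
        components = update_one_sensor(components, scan["z"], scan["R"])
    return components

# Toy usage: one target near (10, 10), two healthy sensors and one dropped sensor.
comps = [(1.0, np.array([10.0, 10.0, 1.0, 0.0]), np.eye(4))]
scans = [
    {"ok": True,  "z": [np.array([11.0, 10.1])], "R": 0.5 * np.eye(2)},
    {"ok": False, "z": [],                        "R": 0.5 * np.eye(2)},  # dropout
    {"ok": True,  "z": [np.array([10.9, 10.0])], "R": 1.0 * np.eye(2)},
]
comps = multi_sensor_step(comps, scans)
print("expected target count:", sum(w for (w, _, _) in comps))
```

Because the sensors are applied one after another, the result depends on the processing order, which is exactly the sensitivity an order-aware scheme has to manage.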

3. Experimental Results and Analysis
To verify the effectiveness of the proposed order-sensitive multi-sensor GM-PHD algorithm, we conducted both simulation experiments and experiments on real data.

u-blox F9 HPS 1.30 Product Introduction Manual

u-blox F9 HPS 1.30
u-blox F9 high precision sensor fusion GNSS receiver
Protocol version 33.30
Interface description

Abstract
This document describes the interface (version 33.30) of the u-blox F9 firmware HPS 1.30 platform.

UBX-22010984 - R01
C1-Public

Document information
Title: u-blox F9 HPS 1.30
Subtitle: u-blox F9 high precision sensor fusion GNSS receiver
Document type: Interface description
Document number: UBX-22010984
Revision and date: R01, 16-Sep-2022
Disclosure restriction: C1-Public

u-blox or third parties may hold intellectual property rights in the products, names, logos and designs included in this document. Copying, reproduction, or modification of this document or any part thereof is only permitted with the express written permission of u-blox. Disclosure to third parties is permitted for clearly public documents only.
The information contained herein is provided "as is" and u-blox assumes no liability for its use. No warranty, either express or implied, is given, including but not limited to, with respect to the accuracy, correctness, reliability and fitness for a particular purpose of the information. This document may be revised by u-blox at any time without notice. For the most recent documents, visit .
Copyright © 2022, u-blox AG.

Contents (chapter overview)
1 General information
2 NMEA protocol
3 UBX protocol
4 RTCM protocol
5 SPARTN protocol
6 Configuration interface
Configuration defaults
Related documents
Revision history

1 General information
1.1 Document overview
This document describes the interface of the u-blox F9 high precision sensor fusion GNSS receiver.
The interface consists of the following parts:
• NMEA protocol
• UBX protocol
• RTCM protocol
• SPARTN protocol
• Configuration interface

Some of the features described here may not be available in the receiver, and some may require specific configurations to be enabled. See the applicable data sheet for availability of the features and the integration manual for instructions for enabling them.
Previous versions of u-blox receiver documentation combined general receiver description and interface specification. In the current documentation the receiver description is included in the integration manual.
See also Related documents.

1.2 Firmware and protocol versions
u-blox generation 9 receivers execute firmware from internal ROM or from internal code-RAM. If the firmware image is stored in a flash it is loaded into the code-RAM before execution. It is also possible to store the firmware image in the host system. The firmware is then loaded into the code-RAM from the host processor. (Loading the firmware from the host processor is not supported in all products.) If there is no external firmware image, then the firmware is executed from the ROM.
The location and the version of the boot loader and the currently running firmware can be found in the boot screen and in the UBX-MON-VER message. If the firmware has been loaded from a connected flash or from the host processor, it is indicated by the text "EXT". When the receiver is started, the boot screen is output automatically in UBX-INF-NOTICE or NMEA-Standard-TXT messages if configured using CFG-INFMSG. The UBX-MON-VER message can be polled using the UBX polling mechanism.
The following u-center screenshots show an example of a u-blox receiver running firmware loaded from flash.

The following information is available (✓) from the boot screen (B) and the UBX-MON-VER message (M):

B | M | Example | Information
✓ |   | u-blox AG - | Start of the boot screen.
✓ |   | HW UBX 9 00190000 | Hardware version of the u-blox receiver.
  | ✓ | 00190000 |
✓ | ✓ | EXT CORE 1.00 (61b2dd) | Base (CORE) firmware version and revision number, loaded from external memory (EXT).
  |   | EXT LAP 1.00 (12a3bc) | Product firmware version and revision number, loaded from external memory (EXT). Available only in some firmware versions. See below for a list of product acronyms.
✓ | ✓ | ROM BASE 0x118B2060 | Revision number of the underlying boot loader firmware in ROM.
✓ | ✓ | FWVER=HPG 1.12 | Product firmware version number, where:
• SPG = Standard precision GNSS product
• HPG = High precision GNSS product
• ADR = Automotive dead reckoning product
• TIM = Time sync product
• LAP = Lane accurate positioning product
• HPS = High precision sensor fusion product
• DBS = Dual band standard precision
• MDR = Multi-mode dead reckoning product
• PMP = L-Band Inmarsat point-to-multipoint receiver
• QZS = QZSS L6 centimeter level augmentation service (CLAS) message receiver
• DBD = Dual band dead reckoning product
• LDR = ROM bootloader, no GNSS functionality
✓ | ✓ | PROTVER=34.00 | Supported protocol version.
✓ | ✓ | MOD=ZED-F9P | Module name (if available).
✓ | ✓ | GPS;GLO;GAL;BDS | List of supported major GNSS (see GNSS identifiers).
✓ | ✓ | SBAS;QZSS | List of supported augmentation systems (see GNSS identifiers).
✓ |   | ANTSUPERV=AC SD PDoS SR | Configuration of the antenna supervisor (if available), where:
• AC = Active antenna control enabled
• SD = Short circuit detection enabled
• OD = Open circuit detection enabled
• PDoS = Short circuit power down logic enabled
• SR = Automatic recovery from short state enabled
✓ |   | PF=FFF79 | Product configuration.
✓ |   | BD=E01C | GNSS band configuration.

The "FWVER" product firmware version indicates which firmware is currently running. This is referred to as "firmware version" in this and other documents.
The revision numbers should only be used to identify a known firmware version. They are not necessarily numeric nor are they guaranteed to increase with newer firmware versions. Similarly, firmware version numbers can have additional non-numeric information appended, such as in "5.00B03".
Not every entry is output by all u-blox receivers. The availability of some of the information depends on the product, the firmware location and the firmware version.

The product firmware version and the base firmware version relate to the protocol version:

Product firmware version | Base firmware version | Protocol version
HPS 1.00 | EXT CORE 1.00 (500086) | 33.00
HPS 1.20 | EXT CORE 1.00 (a669b8) | 33.20
HPS 1.21 | EXT CORE 1.00 (e2b374) | 33.21
HPS 1.30 | EXT CORE 1.00 (a59682) | 33.30

1.3 Receiver configuration
u-blox positioning receivers are fully configurable with UBX protocol messages. The configuration used by the receiver during normal operation is called the "current configuration". The current configuration can be changed during normal operation by sending UBX-CFG-VALSET messages over any I/O port. The receiver will change its current configuration immediately after receiving a configuration message. The receiver will always use the current configuration only.
The current configuration is loaded from permanent configuration hard-coded in the receiver firmware (the defaults) and from non-volatile memory (user configuration) on startup of the receiver. Changes made to the current configuration at run-time will be lost when there is a power cycle, a hardware reset or a (complete) controlled software reset (see Configuration reset behavior).
See Configuration interface for a detailed description of the receiver configuration system, the explanation of the configuration concept and its principles and interfaces.
The configuration interface has changed from earlier u-blox positioning receivers. There is some backwards compatibility provided in UBX-CFG configuration messages. Users are strongly advised to only use the Configuration interface. See also Legacy UBX message fields reference.
See the integration manual for a basic receiver configuration most commonly used.
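As a small illustration of the UBX polling mechanism mentioned above, the sketch below builds, by hand, the poll frame for UBX-MON-VER (class 0x0A, ID 0x04): a poll is simply the message sent with an empty payload, terminated by the UBX 8-bit Fletcher checksum computed over class, ID, length and payload. The helper functions are written for this example and are not part of any u-blox software.

```python
# Illustrative sketch: building a UBX poll frame for UBX-MON-VER (class 0x0A, ID 0x04).
import struct

def ubx_checksum(body: bytes) -> bytes:
    """8-bit Fletcher checksum over class, ID, length and payload."""
    ck_a = ck_b = 0
    for byte in body:
        ck_a = (ck_a + byte) & 0xFF
        ck_b = (ck_b + ck_a) & 0xFF
    return bytes([ck_a, ck_b])

def ubx_frame(msg_class: int, msg_id: int, payload: bytes = b"") -> bytes:
    """Sync chars, class, ID, little-endian length, payload, checksum."""
    body = struct.pack("<BBH", msg_class, msg_id, len(payload)) + payload
    return b"\xb5\x62" + body + ubx_checksum(body)

poll_mon_ver = ubx_frame(0x0A, 0x04)       # empty payload polls UBX-MON-VER
print(poll_mon_ver.hex())                   # prints b5620a0400000e34
```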

Research on the Application of a Multi-Source Data Fusion Compressed Sensing Algorithm in Intelligent Transportation

Wang Xiaohong, China Merchants Chongqing Communications Research & Design Institute Co., Ltd., Chongqing 400067

Abstract: This paper reviews the state of research on multi-source data fusion in intelligent transportation networks and, using the basic theory of compressed sensing, develops data fusion and reconstruction algorithms for intelligent transportation.

Finally, it summarizes the application of compressed sensing to intelligent transportation networks for the purpose of data acquisition in such networks.

Data acquisition in an intelligent transportation network involves large volumes of heterogeneous data. This paper applies compressed sensing theory to data acquisition in intelligent transportation networks to reduce the data volume and improve the acquisition of useful data; the adopted reconstruction algorithm recovers the data, guaranteeing the accuracy and stability of signal reconstruction and thus the accuracy of the information transmitted over the network.
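The following is a hedged sketch of the compressed-sensing idea described above, not the paper's own algorithm: a sparse signal (standing in for redundant traffic data) is compressed with a random Gaussian measurement matrix and recovered with orthogonal matching pursuit (OMP). Signal length, measurement count and sparsity are arbitrary illustration values.

```python
# Minimal compressed-sensing sketch: random Gaussian sampling + OMP reconstruction.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                      # signal length, measurements, sparsity

x = np.zeros(n)                           # k-sparse "traffic" signal
x[rng.choice(n, k, replace=False)] = rng.normal(0, 10, k)

Phi = rng.normal(0, 1 / np.sqrt(m), (m, n))   # measurement matrix
y = Phi @ x                               # compressed samples sent over the network

def omp(Phi, y, k):
    """Greedy OMP: pick the column most correlated with the residual,
    then re-fit by least squares on the selected support."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        sub = Phi[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        residual = y - sub @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print("relative reconstruction error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```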

Finally, a simulation analysis is presented.

Keywords: data fusion; compressed sensing; reconstruction algorithm; intelligent transportation

Introduction
Intelligent transportation systems use a large number of sensors of different types, such as temperature, acoustic, vibration, and pressure sensors; the intelligent processing system combines many kinds of sensors working together.

In intelligent transportation system design, information about the external road environment is collected by a variety of sensors and analyzed by a processor, which then issues commands; the commands are finally transmitted wirelessly to the roadside infrastructure.

Traditional multi-source data fusion algorithms cannot meet the energy and communication bandwidth requirements of an intelligent transportation system with many sensor nodes.

This paper therefore adopts a compressed sensing algorithm with multi-source data fusion; simulation shows that it can effectively reduce the amount of data communicated between nodes.

Multi-sensor data fusion first arose abroad in military applications.

These military systems use sensors such as radar, infrared, laser, visible-light, and acoustic sensors, and solve situation assessment and threat assessment problems with various algorithms.

Compared with other developed countries, data fusion technology in China started relatively late.

China has begun to study the theory, methods, and implementation of data fusion and to apply sensor data fusion in detection, control, pattern recognition, fault diagnosis, navigation, and other fields.

Research on data fusion in wireless sensor networks (WSN) focuses on three aspects: routing protocols, data fusion algorithms, and data representation.

Multimodal Perception Fusion: Radar

Multimodal perception fusion is a technique that improves the performance of a perception system by integrating data from multiple sensors.

In the radar domain, multimodal perception fusion means fusing radar data with data from other sensors (such as cameras, lidar, and infrared sensors) to improve the accuracy and robustness of target detection, tracking, and recognition.

Radar is a sensor used primarily for target detection and tracking; its strength is the high accuracy with which it measures target range, velocity, and angle.

However, radar also has limitations: it cannot provide a target's appearance or information about the surrounding environment.

Fusing data from other sensors can compensate for these shortcomings and improve the performance of the perception system.

In multimodal perception fusion, the data from the different sensors must first be synchronized and calibrated to ensure consistency and accuracy.

Data fusion algorithms are then used to combine the information from the different sensors into more complete and accurate target information.

Commonly used data fusion methods include the Kalman filter, the particle filter, and correlation filters; the algorithm can be chosen and tuned according to sensor performance and the application scenario.
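As one concrete instance of the Kalman-filter option just mentioned, the sketch below fuses a radar measurement (position and radial velocity) and a camera measurement (position only) into a single constant-velocity track by applying two sequential Kalman updates per time step. The motion model, noise covariances and measurements are invented for illustration.

```python
# Hedged sketch of sequential radar + camera fusion with a standard Kalman filter.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity motion model
Q = np.diag([0.01, 0.01])                 # process noise

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Initial state: position 0 m, velocity 10 m/s.
x, P = np.array([0.0, 10.0]), np.eye(2)

for step in range(1, 4):
    # Predict.
    x, P = F @ x, F @ P @ F.T + Q
    # Radar: accurate range and radial velocity.
    z_radar = np.array([10.0 * dt * step + 0.05, 10.0 + 0.1])
    x, P = kf_update(x, P, z_radar, np.eye(2), np.diag([0.25, 0.04]))
    # Camera: position only, noisier, but contributes appearance context upstream.
    z_cam = np.array([10.0 * dt * step - 0.2])
    x, P = kf_update(x, P, z_cam, np.array([[1.0, 0.0]]), np.array([[1.0]]))
    print(f"step {step}: position={x[0]:.2f} m, velocity={x[1]:.2f} m/s")
```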

Multimodal perception fusion is widely applied in the radar domain.

For example, combining camera and radar data enables three-dimensional target localization and appearance recognition, improving detection accuracy; combining lidar and radar data enables simultaneous detection and tracking of ground and airborne targets, improving tracking robustness; and combining infrared and radar data enables detection of concealed and occluded targets in complex environments, improving recognition performance.

In summary, multimodal perception fusion can effectively improve the performance of radar perception systems, increasing the accuracy and robustness of target detection, tracking, and recognition.

As sensor technology continues to develop and converge, multimodal perception fusion will find ever broader application in the radar domain, providing strong support for intelligent transportation, security monitoring, autonomous driving, and other fields.

SensorFusion: Multi-Sensor Fusion Algorithm Design

With the continuous development of technology and the rapid spread of intelligent applications, multi-sensor fusion has become an important research direction in modern information processing.

Among its many applications, sensor fusion algorithms are widely used in autonomous driving, smart homes, and health monitoring.

This article discusses the design principles and key techniques of multi-sensor fusion (SensorFusion) algorithms.

1. Introduction
SensorFusion refers to fusing the data of multiple sensors to improve system performance and stability.

The goal of sensor fusion is to obtain more accurate and complete information from multiple sensors while reducing redundancy and noise between them.

Designing a sensor fusion algorithm involves data acquisition, data preprocessing, feature extraction, and data fusion.

2. Data Acquisition and Preprocessing
The first task of sensor fusion is to acquire sensor data.

Different sensors differ in data type and acquisition method, so the design must consider how to acquire sensor data effectively and preprocess it to filter out noise and useless information.

Common sensors include cameras, lidar, and infrared sensors.

The data acquired from each sensor must be calibrated and aligned to ensure accuracy and consistency.

3. Feature Extraction and Selection
Sensor data are usually large and complex; feature extraction and selection reduce the data volume and extract useful feature information.

Feature extraction derives representative and discriminative features from the raw data, such as edges and colors in an image; feature selection picks the task-relevant features from those extracted, making full use of limited computation and storage resources.

Feature extraction and selection methods include statistical methods, machine learning methods, and information-theoretic methods.

4. Data Fusion Algorithms
Data fusion integrates the information of multiple sensors, processing and analyzing the multi-source data with fusion algorithms to improve system performance and robustness.

Common data fusion algorithms include weighted averaging, the Kalman filter, and the particle filter.

4.1 Weighted averaging
Weighted averaging is the simplest and most commonly used data fusion method.

It assigns a weight to each sensor and computes a weighted average of the sensor data.

The weights can be assigned based on experience, accuracy, or other reliability measures.

Weighted averaging is suited to static environments and requires the sensors to be mutually independent and accurate.
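A minimal sketch of the weighted-average fusion just described, using inverse measurement variances as one possible reliability-based weighting; the readings and variances are made-up numbers.

```python
# Weighted-average fusion with inverse-variance weights (illustrative values).
import numpy as np

readings = np.array([20.1, 19.7, 20.4])    # e.g. three temperature sensors
variances = np.array([0.04, 0.09, 0.25])   # smaller variance -> more reliable sensor

weights = (1.0 / variances) / np.sum(1.0 / variances)
fused = np.sum(weights * readings)
fused_variance = 1.0 / np.sum(1.0 / variances)

print("weights:", np.round(weights, 3))
print(f"fused estimate: {fused:.2f}, fused variance: {fused_variance:.4f}")
```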

4.2 Kalman filter
The Kalman filter is an optimal filtering algorithm used for system state estimation.

Multisensor Data Fusion: Detection Fusion (Lecture Slides)

Detection Fusion
In the case of binary signals there are four possible decisions, and each has a corresponding decision probability: (1) if H0 is true and the decision is D0, the decision is correct.
2 Single-Sensor Statistical Decision Theory
In signal detection, in order to make the most reasonable decision we must first choose a reasonable decision rule. Although there are many common rules, they all belong to the same type of hypothesis testing problem. Hypothesis (假设): the various possibilities for the object of the decision. Detection (检验): making a decision based on observations of the data source output.
2.1 Decision Probabilities of a Binary Signal (二元信号的判决概率)
In the case of a binary signal, the signal source has two possible outputs, recorded as hypotheses H0 and H1. Against a background of noise interference, the output signal is mapped into the whole observation space Z with a specified probability, generating the observations (z|H0) and (z|H1). According to the decision rule, the observation space is divided into two decision regions Z0 and Z1. If an observation (z|H0) falls in region Z0, decision D0 is made. This is denoted by $P(D_0 \mid H_0) = \int_{Z_0} p(z \mid H_0)\,dz$.
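The decision regions and decision probabilities above can be illustrated with a small Monte Carlo sketch for the Gaussian shift-in-mean case, where the likelihood-ratio rule reduces to comparing the observation with a threshold; the signal level, noise variance and threshold are arbitrary choices for the example.

```python
# Illustrative sketch of the binary decision set-up: Gaussian observations under H0
# (noise only) and H1 (signal + noise), a threshold splitting the observation space
# into Z0 and Z1, and Monte Carlo estimates of the four decision probabilities.
import numpy as np

rng = np.random.default_rng(1)
A, sigma, gamma = 1.0, 1.0, 0.5           # signal amplitude, noise std, decision threshold

z_h0 = rng.normal(0.0, sigma, 100_000)    # observations when H0 is true
z_h1 = rng.normal(A, sigma, 100_000)      # observations when H1 is true

# For a Gaussian shift in mean, the likelihood-ratio test reduces to comparing z with
# a threshold gamma: decide D1 if z > gamma (region Z1), else D0 (region Z0).
p_d0_h0 = np.mean(z_h0 <= gamma)          # correct rejection
p_d1_h0 = np.mean(z_h0 > gamma)           # false alarm
p_d0_h1 = np.mean(z_h1 <= gamma)          # miss
p_d1_h1 = np.mean(z_h1 > gamma)           # detection

print(f"P(D0|H0)={p_d0_h0:.3f}  P(D1|H0)={p_d1_h0:.3f}")
print(f"P(D0|H1)={p_d0_h1:.3f}  P(D1|H1)={p_d1_h1:.3f}")
```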

Multimodal Biometrics

Multimodal biometrics refers to the use of multiple biometric recognition technologies to identify or verify an individual.

These biometric traits can include fingerprints, faces, irises, voiceprints, palm prints, and retinas.

The advantage of multimodal biometric recognition is that it improves recognition accuracy and reliability while reducing error rates and spoofing rates.

Multimodal biometric recognition is widely used in security, law enforcement, finance, and healthcare.

In security, it can be used in access control and smart home systems to improve safety and convenience.

In law enforcement, it can be used in criminal investigation and police administration to improve efficiency and accuracy.

In finance, it can be used for identity verification and transaction authorization to improve the security and reliability of transactions.

In healthcare, it can be used for patient identity verification and medical record management to improve the quality and reliability of medical services.

Compared with traditional single-trait biometric recognition, multimodal biometric recognition achieves higher accuracy and overcomes the limitations of single-trait systems.

However, its application still faces problems such as high cost and privacy protection, which must be addressed in practice.


Multimodal Information Fusion Perception: Science and Technology Program Guide

1. Introduction
In today's digital, intelligent era, multimodal information fusion perception is becoming a hot topic in artificial intelligence, intelligent manufacturing, and related fields.

Multimodal information fusion perception uses multiple sensors and information sources to fuse and perceive information of different types and from different origins, achieving more comprehensive and accurate information acquisition and understanding.

This guide discusses and gives guidance on the multimodal information fusion perception program from the perspectives of both depth and breadth.

2. In depth: key characteristics of multimodal information fusion perception
The core of the technology is the fusion of multiple information sources, including but not limited to images, audio, text, and video.

Such fusion gives a system richer and more complete information, improving its accuracy and reliability.

Multimodal information fusion perception involves sensors, data fusion algorithms, deep learning, pattern recognition, and other areas, so its research and application entail a certain complexity and challenge.

3. In breadth: application domains of multimodal information fusion perception
The technology has broad application prospects in intelligent manufacturing, intelligent transportation, smart healthcare, and smart homes.

Taking intelligent manufacturing as an example, fusing visual, acoustic, and other information enables real-time monitoring and intelligent control of the production process, improving production efficiency and product quality.

In intelligent transportation, multimodal information fusion perception enables comprehensive awareness of traffic conditions and intelligent scheduling, improving the efficiency and safety of transportation.

4. Summary and review
As an emerging technology, multimodal information fusion perception has very broad application prospects and far-reaching influence.

Through this discussion of both depth and breadth, we gain a more comprehensive and deeper understanding of the technology.

In future research and applications, we need to further deepen our understanding of multimodal information fusion perception, keep innovating, and promote its application in every domain.

5. Personal view
In my view, multimodal information fusion perception represents the future direction of artificial intelligence and intelligent manufacturing.

By fusing information of different modalities, we can perceive and understand the real world more completely and accurately, providing strong support for the intelligent transformation and upgrading of every field.

I am fully confident in the prospects of multimodal information fusion perception and believe it will play an increasingly important role in the future.

MULTISENSOR FUSION


How do constraints on communication bandwidth and processing limit architectures for fusion?
How does the brain create and modify its data representation?
How does the brain encode time, dynamics and use feedback?
How does the brain encode and process probabilities and uncertain knowledge?

MULTISENSOR FUSION
Architecture (US-JDL/UK-TFDF): Sensor 1, Sensor 2, Sensor 3 → Feature Space (data representations, task-specific, feedback)
Centralised: impractical, not scalable.
Decentralised: best, robust.
Fusion is an iterative dynamical process, continually refining estimates, representations ..
MULTISENSOR FUSION
Effective sensor fusion requires key elements (diagram: Sensor 1 + Sensor 2 + Sensor 3 + Sensor 4 combined):
- scalable
- dimensionality

Multi-sensor Information Fusion

Keywords: information fusion, neural networks, multi-sensor
I. INTRODUCTION
With the constant advance of China's modernization and the continuous improvement of people's living standards, people need a better production and living environment and pay more attention to the ecological environment. Real-time monitoring of environmental factors (water, dust, smoke, noise, air quality, etc.) has therefore become a focus of the environmental protection departments. They usually place a monitoring device at every monitoring point, and there are three ways to collect the data: manually, over dedicated lines, or through wireless sensor networks. All of these have significant drawbacks: the manual approach is inconvenient and wastes manpower and resources; dedicated lines are difficult to install in remote areas; and wireless sensor networks are strongly affected by weather and geographical conditions [1]. Aiming at these problems, some experts have made encouraging progress. Wireless sensor networks can solve the data transmission problem in an environment monitoring system [3]; however, the accuracy of the data returned from the field by wireless sensor networks is not very good, and when the external environment changes abruptly, clearly distorted data are returned. In recent years, multi-sensor information fusion technology has developed rapidly and has been applied in many sophisticated application areas [4]. In view of the currently low accuracy and reliability of environmental monitoring, we propose an environmental monitoring system based on multi-sensor information fusion, which can obtain more accurate information than any single sensor in a shorter period of time and at a smaller cost.

The paper is organized as follows. In Section 2, the system structure and the process model are discussed, and the adaptive one is chosen for the system. In Section 3, the method of multi-sensor information fusion based on neural networks is illustrated in detail, and we then propose an improved ART
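As a loose illustration of the neural-network fusion idea (a generic feed-forward regressor, not the improved ART network proposed in the paper), the sketch below trains a small MLP to map three noisy readings of the same environmental quantity to a fused estimate; the data, network size and train/test split are invented, and scikit-learn is used only for convenience.

```python
# Hedged sketch: fusing three noisy sensors of one environmental quantity with a small MLP.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
truth = rng.uniform(20.0, 30.0, 2000)                            # true quantity, e.g. temperature
X = truth[:, None] + rng.normal(0, [1.0, 1.2, 1.5], (2000, 3))   # three noisy sensors

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
model.fit(X[:1500], truth[:1500])                                # train on the first 1500 samples

pred = model.predict(X[1500:])
rmse_fused = np.sqrt(np.mean((pred - truth[1500:]) ** 2))
rmse_single = np.sqrt(np.mean((X[1500:, 0] - truth[1500:]) ** 2))
print(f"fused RMSE: {rmse_fused:.2f}   best single-sensor RMSE: {rmse_single:.2f}")
```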

Research on Multi-Sensor Information Fusion Technology

Multi-sensor information fusion technology is a technique that obtains better results by integrating information from multiple sensors.

It can effectively accomplish tasks that a single sensor cannot, such as environment perception, target detection, and localization.

This article discusses the concept, applications, challenges, and future development directions of multi-sensor information fusion.

1. The concept of multi-sensor information fusion
Multi-sensor information fusion integrates information from multiple types of sensors and uses artificial intelligence and machine learning algorithms to turn that information into more precise data and knowledge.

It can combine information from multiple data sources (such as visible light, infrared, sound, gas, and temperature) to obtain richer information and more complete data.

Through multi-sensor information fusion, the efficiency and accuracy of the sensors can be improved.

2. Applications of multi-sensor information fusion
1. Intelligent transportation: multi-sensor information fusion is already widely used in intelligent transportation.

By integrating multiple types of sensors (such as radar, video, infrared, microwave, and optical sensors), a traffic system can monitor traffic flow, vehicle speeds, and accidents in real time and implement intelligent traffic control.

2. Industrial production: in industry, multi-sensor information fusion helps enterprises detect equipment faults, monitor production processes, and optimize production efficiency.

Integrating information from different sensor types enables more accurate equipment condition monitoring and fault diagnosis.

3. Smart homes: multi-sensor information fusion helps smart home systems provide personalized home control.

For example, by integrating temperature, humidity, and light sensors, a system can automatically adjust indoor temperature and lighting, providing a more comfortable and safer home environment.

3. Challenges of multi-sensor information fusion
The application of multi-sensor information fusion still faces several challenges.

First, the information collected by different types of sensors does not necessarily match, so the sensor information must be standardized.

Second, sensors may influence one another, for example through interference or through the need to cooperate.

Finally, multi-sensor information fusion requires complex algorithms to integrate and analyze the data, so algorithmic complexity and computational cost must also be considered.

4. Future development directions of multi-sensor information fusion
Future development will place more emphasis on intelligence and autonomy.

Multi-Sensor Information Fusion, Chapter 1 (Xi'an Jiaotong University, Automation Program, Course Materials)

Chapter 1 Introduction
1.1 General concepts and definitions of multi-source information fusion
1.1.1 Definition
Multi-source information fusion, also called multi-sensor information fusion, was proposed in the 1970s; military applications were the source from which the technology was born.

In fact, the way humans and other animals perceive and understand objective things is itself a process of fusing multi-source information.

In this cognitive process, a human or animal first perceives an object in multiple ways and from multiple directions through sight, hearing, touch, smell, taste, and other senses (rather than relying on a single sense), obtaining a large amount of complementary and redundant information; the brain then combines and processes this perceptual information according to some unknown rules, arriving at a unified and coherent understanding of the object.

This process from perception to cognition is the multi-source information fusion process of a living organism.

People hope to imitate this process from perception to cognition with machines.

Thus a new interdisciplinary subject, multi-source information fusion, was born.

Because early research on fusion methods was aimed at data processing, information fusion is sometimes also called data fusion.

The term "sensor" is used here in a broad sense: it includes not only physical sensing systems of all kinds, but also information acquisition systems matched to the observed environment, and even the perceptual systems of humans or animals.

Although this interdisciplinary subject has been studied for twenty to thirty years, there is still no universally accepted definition.

This is because its applications are extremely broad, and each field gives its own definition according to its own understanding.

The definition of information fusion accepted by most researchers today was proposed by the U.S. Joint Directors of Laboratories (JDL) [1-3], which defines information fusion from the perspective of military applications.

Definition 1.1.1 Information fusion is a multi-level, multi-faceted process that detects, associates, combines, and estimates multi-source data in order to improve the accuracy of state and identity estimates and to provide timely and complete assessments of the battlefield situation and the severity of threats.

It can be seen from this definition that information fusion is a process that handles multi-source information at several levels, each of which reflects a different level of abstraction of the raw observation data.

Ocean-Related Multimodal Sensing Technologies

Ocean-related multimodal sensing technology uses multiple sensors to fuse different sensing modalities and obtain more comprehensive and accurate ocean information.

Commonly used ocean-related multimodal sensing technologies include the following:

1. Acoustic sensing: sonar and similar devices detect acoustic signals in the ocean and are used to measure physical parameters of the ocean, gather biological information, and detect underwater targets.

2. Optical sensing: optical instruments and cameras acquire images and video of the ocean and are used to observe the marine ecological environment and to detect and image underwater targets.

3. Electromagnetic sensing: electromagnetic sensors acquire electromagnetic signals in the ocean and are used to measure the ocean's electromagnetic properties, detect underwater targets, and communicate.

4. Chemical sensing: chemical sensors detect chemical constituents and pollutants in the ocean and are used for water quality monitoring and environmental protection.

5. Biological sensing: biosensors or biological assay methods monitor biological information in the ocean, for example measuring algae concentrations in the water and the migration and distribution of marine organisms.

By applying different sensors and technologies in combination, ocean-related multimodal sensing enables multidimensional, all-around monitoring and observation of the marine environment, providing important technical support for marine resource development, environmental protection, and ocean science research.

Data Fusion Technology in the Internet of Things

Chapter 9 IoT Data Fusion Technology

1. Definition of data fusion
Data fusion can be defined concisely as follows: data fusion is the data-processing process in which computer technology is used to analyze and synthesize, under certain criteria, sensed data acquired over time, in order to accomplish the required decision and assessment tasks.
Data fusion has three layers of meaning:
(1) The full data space: the data include the certain and the fuzzy, the full-space and the subspace, the synchronous and the asynchronous, the numerical and the non-numerical; they are complex, multi-dimensional and multi-source, and cover the full frequency band.

Limitations of pixel-level fusion (continued):
③ Weak analysis capability: effective understanding and analysis of imagery cannot be achieved.
④ Error-correction requirements: because low-level sensor information is uncertain, incomplete, or unstable, strong error-correction capability is required during fusion.
⑤ Poor interference resistance.
The specific methods of pixel-level fusion include algebraic methods, the IHS transform, the wavelet transform, the principal component transform (PCT), and the K-T transform.

Compared with single-source remote sensing image data, the information provided by multi-source remote sensing image data has the following characteristics:
(1) Redundancy: the multi-source remote sensing images represent, describe, or interpret the environment or target in the same way.
(2) Complementarity: the information comes from different degrees of freedom and is mutually independent.
(3) Cooperativity: different sensors depend on other information when observing and processing information.
(4) Layered information structure: the multi-source remote sensing information handled by data fusion can appear at different information levels; these abstraction levels include the pixel level, the feature level, and the decision level, and the layered structure and parallel processing mechanism also help guarantee real-time performance.

The pixel-level fusion model is shown in Figure 9.2.
Figure 9.2 Pixel-level fusion model

Advantages of pixel-level fusion: it retains as much information as possible and has the highest accuracy.
Limitations of pixel-level fusion:
① Low efficiency: because the volume of sensor data to be processed is large, processing takes a long time and real-time performance is poor.
② Restricted data for analysis: to allow pixel-by-pixel comparison, very high registration accuracy of the sensor information is required, and the images must come from a set of homogeneous sensors or share the same units.
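A tiny sketch of the simplest pixel-level (algebraic) fusion named above: two co-registered single-band images of the same scene are combined by per-pixel weighted averaging. The synthetic images and equal weights are assumptions made for the example; real use presupposes the accurate registration the text requires.

```python
# Pixel-level weighted-average (algebraic) fusion of two registered images (illustrative).
import numpy as np

rng = np.random.default_rng(0)
scene = np.tile(np.linspace(0, 1, 64), (64, 1))          # ground-truth radiance pattern

img_a = scene + rng.normal(0, 0.1, scene.shape)          # sensor A
img_b = scene + rng.normal(0, 0.1, scene.shape)          # sensor B, equally reliable

w_a, w_b = 0.5, 0.5                                      # equal weights for equal reliability
fused = w_a * img_a + w_b * img_b                        # per-pixel algebraic fusion

for name, img in (("A", img_a), ("B", img_b), ("fused", fused)):
    print(f"{name}: RMSE vs. scene = {np.sqrt(np.mean((img - scene) ** 2)):.4f}")
```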

A Survey of Multi-Sensor Distributed Track Fusion (Complete Version)

A Survey of Multi-Sensor Track Fusion

In the early 1970s, R. A. Singer et al. first posed the track fusion problem. They derived a "correlation equation" characterizing the probability that two tracks are associated, which in essence computes the Mahalanobis distance between the two tracks: tracks for which this statistic is below a threshold are treated as tracks to be fused, which is a hypothesis-testing problem. Their subsequent track fusion, however, carries an implicit assumption: the two local estimation errors for the same target from different sensors are mutually independent [1][2].

[1] R. A. Singer and A. J. Kanyuck, "Computer control of multiple site track correlation", Automatica, vol. 7, pp. 455-463, July 1971.
[2] R. A. Singer and A. J. Kanyuck, "Correlation of Multiple-Site Track Data", IEEE Transactions on Aerospace and Electronic Systems, vol. 6, no. 2, pp. 180-187, March 1970.

In reality, even leaving aside target maneuvers and measurement noise, the process noise is common to both trackers, so the local estimation errors are often highly correlated and this correlation cannot be ignored.

In 1979, J. Speyer took the correlation between estimates into account in the multi-sensor distributed estimation problem, but his approach does not apply to hypothesis-testing problems [3].

In addition, Willsky et al. also considered correlation in their work [4].

[3] J. L. Speyer, "Computation and Transmission Requirements for a Decentralized Linear-Quadratic-Gaussian Control Problem", IEEE Transactions on Automatic Control, vol. 24, no. 2, pp. 266-269, 1979.
[4] A. Willsky, M. Bello, D. Castanon, B. Levy, G. Verghese, "Combining and Updating of Local Estimates and Regional Maps Along Sets of One-Dimensional Tracks", IEEE Transactions on Automatic Control, vol. 27, no. 4, pp. 799-813, 1982.

In 1981, Y. Bar-Shalom et al. derived a simple recursion for the cross-covariance of the two local estimation errors and incorporated the cross-correlation into the hypothesis test statistic.
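An illustrative sketch of the track-to-track test described above: the statistic is the Mahalanobis distance between two local estimates, computed with the covariance of their difference including the cross-covariance term that Bar-Shalom's recursion provides. All numbers, including the chi-square gate, are example values.

```python
# Track-to-track association test with cross-covariance (illustrative numbers).
import numpy as np

x_a = np.array([100.0, 10.0])             # track from sensor A: [position, velocity]
x_b = np.array([102.5, 9.5])              # track from sensor B
P_a = np.diag([4.0, 0.5])                 # local error covariances
P_b = np.diag([6.0, 0.8])
P_ab = np.diag([1.5, 0.2])                # cross-covariance due to common process noise

diff = x_a - x_b
cov = P_a + P_b - P_ab - P_ab.T           # covariance of the estimate difference
d2 = diff @ np.linalg.solve(cov, diff)    # squared Mahalanobis distance

threshold = 5.99                           # chi-square gate, 2 dof, 95% (standard table value)
same_target = d2 <= threshold
print(f"test statistic: {d2:.2f}  ->  {'associate (fuse)' if same_target else 'reject'}")
```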

Multiple Ion Beam Imaging

Multiple Ion Beam Imaging (MIBI) is a high-resolution, high-sensitivity biomolecular imaging technique.

The technique exploits reactions between an ion beam and target molecules, converting the ion-beam signal into molecular information, and its high-resolution imaging capability allows molecular images to be visualized directly.

Because both the energy and the landing position of the ion beam can be precisely controlled, high spatial resolution imaging can be achieved at the cell and tissue level while maintaining high chemical resolution and high detection sensitivity.

MIBI can be used to study many areas of biology, such as neuroscience, oncology, and immunology.

It can provide three-dimensional positional information for biomolecules in cells and tissues, for example the localization of proteins, nucleic acids, and small molecules, enabling the study of how these biomolecules are distributed in cells and tissues and what functions they perform.

Unlike conventional optical imaging, the technique can resolve subcellular structures and the non-uniform distribution of complex molecules within cells.

It therefore provides an important tool for studying the complex structure of cells and tissues and their biochemical processes.

Multimodal Sensor Integration in Robot Perception

1. The Importance of Multimodal Sensor Integration in Robot Perception
With continuous technological progress, robots are being applied ever more widely, from industrial automation to everyday services, and their level of intelligence keeps rising.

Robot perception, one of the key technologies behind this intelligence, directly affects a robot's ability to understand and interact with its environment.

Multimodal sensor integration, the core of robot perception technology, integrates data from multiple types of sensors and can significantly improve a robot's ability to perceive complex environments.

1.1 Definition and role of multimodal sensor integration
Multimodal sensor integration fuses the data of multiple sensor types into a unified, more comprehensive perception system.

These sensors can include visual, tactile, auditory, and force sensors.

In this way a robot can obtain environmental information from different angles and dimensions and thus understand its surroundings more accurately.

1.2 Key technologies of multimodal sensor integration
The key technologies for multimodal sensor integration include data fusion, sensor calibration, sensor selection, and sensor network design.

Data fusion integrates the data of different sensors, removes redundancy and contradictions between them, and extracts the most valuable information.

Sensor calibration ensures that the data of different sensors are consistent in time and space, improving data reliability.

Sensor selection and sensor network design choose appropriate sensors and lay out a sensible sensor network according to the robot's application scenario and requirements.
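One small, concrete piece of the sensor calibration mentioned above is temporal alignment; the sketch below resamples a fast IMU stream at the timestamps of a slower camera stream by linear interpolation. Sensor rates, signals and the assumption of already offset-corrected clocks are all illustrative.

```python
# Hedged sketch: aligning two sensor streams of different rates onto a common time base.
import numpy as np

t_cam = np.arange(0.0, 1.0, 1 / 30)               # camera timestamps, 30 Hz
t_imu = np.arange(0.0, 1.0, 1 / 200)              # IMU timestamps, 200 Hz
imu_gyro_z = np.sin(2 * np.pi * 1.5 * t_imu)      # synthetic yaw-rate signal

# Resample the IMU signal at the camera timestamps so each image frame has a matching
# inertial reading (assumes the clocks are already offset-corrected).
gyro_at_frames = np.interp(t_cam, t_imu, imu_gyro_z)

print(f"{len(t_cam)} camera frames, each paired with an interpolated IMU sample")
print("first three pairs:", list(zip(np.round(t_cam[:3], 3), np.round(gyro_at_frames[:3], 3))))
```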

1.3 Application scenarios of multimodal sensor integration
Multimodal sensor integration is applied very widely in robot perception, including but not limited to the following:
- Industrial automation: on production lines, robots use multimodal sensor integration to perceive the position, shape, and material of workpieces, enabling precise grasping and manipulation.

- Service robots: multimodal sensor integration helps robots better understand human language and behavior and provide more natural, human-centered services.

- Autonomous driving: in self-driving vehicles, multimodal sensor integration combines data from cameras, radar, lidar, and other sensors to improve the vehicle's perception of its surroundings, enabling safer and more efficient driving.

2. Technical Challenges and Solutions for Multimodal Sensor Integration
Although multimodal sensor integration has great potential in robot perception, its practical application also faces many technical challenges.

A PhD Student's Paper on Information Fusion

Information Fusion: Enhancing Decision-Making through the Integration of Data and Knowledge

Introduction:
Information fusion, also known as data fusion or knowledge fusion, is a rapidly evolving field in the realm of decision-making. It involves the integration and analysis of data and knowledge from various sources to generate meaningful and accurate information. In this article, we will delve into the concept of information fusion, explore its key components, discuss its application in different domains, and highlight its significance in enhancing decision-making processes.

1. What is Information Fusion?
Information fusion is the process of combining data and knowledge from multiple sources to provide a comprehensive and accurate representation of reality. The goal is to overcome the limitations inherent in individual sources and derive improved insights and predictions. By assimilating diverse information, information fusion enhances situational awareness, reduces uncertainty, and enables intelligent decision-making.

2. Key Components of Information Fusion:
a. Data Sources: Information fusion relies on various data sources, which can include sensors, databases, social media feeds, and expert opinions. These sources provide different types of data, such as text, images, audio, and numerical measurements.
b. Data Processing: Once data is collected, it needs to be processed to extract relevant features and patterns. This step involves data cleaning, transformation, normalization, and aggregation to ensure compatibility and consistency.
c. Information Extraction: Extracting relevant information is a crucial step in information fusion. This includes identifying and capturing the crucial aspects of the data, filtering out noise, and transforming data into knowledge.
d. Knowledge Representation: The extracted information needs to be represented in a meaningful way for integration and analysis. Common methods include ontologies, semantic networks, and knowledge graphs.
e. Fusion Algorithms: To integrate the information from various sources, fusion algorithms are employed. These algorithms can be rule-based, model-based, or data-driven, and they combine multiple pieces of information to generate a unified and coherent representation.
f. Decision-Making Processes: The ultimate goal of information fusion is to enhance decision-making. This requires the fusion of information with domain knowledge and decision models to generate insights, predictions, and recommendations.

3. Applications of Information Fusion:
a. Defense and Security: Information fusion plays a critical role in defense and security applications, where it improves intelligence analysis, surveillance, threat detection, and situational awareness. By integrating information from multiple sources, such as radars, satellites, drones, and human intelligence, it enables effective decision-making in complex and dynamic situations.
b. Health Monitoring: In healthcare, information fusion is used to monitor patient health, combine data from different medical devices, and provide real-time decision support to medical professionals. By fusing data from wearables, electronic medical records, and physiological sensors, it enables early detection of health anomalies and improves patient care.
c. Smart Cities: Information fusion offers enormous potential for the development of smart cities. By integrating data from multiple urban systems, such as transportation, energy, and public safety, it enables efficient resource allocation, traffic management, and emergency response. This improves the overall quality of life for citizens.
d. Financial Markets: In the financial sector, information fusion helps in the analysis of large-scale and diverse datasets. By integrating data from various sources, such as stock exchanges, news feeds, and social media mentions, it enables better prediction of market trends, risk assessment, and investment decision-making.

4. Significance of Information Fusion:
a. Enhanced Decision-Making: Information fusion enables decision-makers to obtain comprehensive and accurate information, reducing uncertainty and improving the quality of decisions.
b. Improved Situational Awareness: By integrating data from multiple sources, information fusion enhances situational awareness, enabling timely and informed responses to dynamic and complex situations.
c. Risk Reduction: By combining information from diverse sources, information fusion improves risk assessment capabilities, enabling proactive and preventive measures.
d. Resource Optimization: Information fusion facilitates the efficient utilization of resources by providing a holistic view of the environment and enabling optimization of resource allocation.

Conclusion:
In conclusion, information fusion is a powerful approach to enhance decision-making by integrating data and knowledge from multiple sources. Its key components, including data sources, processing, extraction, knowledge representation, fusion algorithms, and decision-making processes, together create a comprehensive framework for generating meaningful insights. By applying information fusion in various domains, such as defense, healthcare, smart cities, and financial markets, we can maximize the potential of diverse information sources to achieve improved outcomes.


Multi-Sensor Information Fusion by Query Refinement

Shi-Kuo Chang (1) and Erland Jungert (2)
(1) Department of Computer Science, University of Pittsburgh - chang@
(2) Swedish Defense Research Agency (FOI) - jungert@lin.foi.se

Abstract: In recent years the fusion of multimedia information from multiple real-time sources and databases has become increasingly important because of its practical significance in many application areas such as telemedicine, community networks for crime prevention, health care, emergency management, e-learning, digital library, and field computing for scientific exploration. To support the retrieval and fusion of multimedia information from multiple real-time sources and databases, a novel approach for sensor-based query processing is described. Since most sensors can generate large quantities of spatial information within short periods of time, sensor-based query processing requires new techniques for query optimization. The sensor dependency tree is used to facilitate query optimization. Through query refinement one or more sensors may provide feedback information to the other sensors. The approach is also applicable to evolutionary queries that change in time and/or space, depending upon the temporal/spatial coordinates of the query originator. It provides significant improvements in the accuracy and efficiency of multi-sensor information fusion and accomplishes sensor data independence through the construction of an ontological knowledge base.

1. Sensor-based Query Processing for Information Fusion

In recent years the fusion of multimedia information from multiple real-time sources and databases has become increasingly important because of its practical significance in many application areas such as telemedicine, community networks for crime prevention, health care, emergency management, e-learning, digital library, and field computing for scientific exploration. Information fusion is the integration of information from multiple sources and databases in multiple modalities and located in multiple spatial and temporal domains. Generally speaking, the objectives of information fusion are: a) to detect certain significant events [Waltz90, White98], and b) to verify the consistency of detected events [Chong99, Klein93, Parker99].

As an example, Figure 1(a) is a laser radar image of a parking lot with a moving vehicle (encircled). The laser radar is manufactured by SAAB Dynamics in Sweden. It generates image elements from a laser beam that is split into short pulses by a rotating mirror. The laser pulses are transmitted to the ground in a scanning movement, and when reflected back to the platform on the helicopter a receiver collects the returning pulses that are stored and analyzed. The results are points with x, y, z coordinates and time t. The resolution is about 0.3 m. In Figure 1(a) the only moving vehicle is in the lower right part of the image with a north-south orientation, while all other vehicles have east-west orientation.

Figure 1(b) shows two video frames of a moving white vehicle (encircled): entering a parking lot in the middle of the upper left frame, and between some of the parked vehicles in the lower right frame. Moving objects can be detected from the video sequence [Jungert99c]. On the other hand, the approximate 3D shape of an object or the terrain can be obtained from the laser radar image [Elmqvist01].
Therefore the combined analysis of the laser radar image and the video frame sequence provides better information to detect a certain type of object and/or to verify the consistency of the detected object from both sources.

To accomplish the objectives of information fusion, novel sensor-based query processing techniques to retrieve and fuse information from multiple sources are needed. In sensor-based query processing, the queries are applied to both stored databases and real-time sources that include different types of sensors. Since most sensors can generate large quantities of spatial information within short periods of time, sensor-based query processing requires query optimization.

We describe a novel approach for sensor-based query processing and query optimization using the sensor dependency tree. Another aspect to consider is that queries may involve data from more than one sensor. In our approach, one or more sensors may provide feedback information to the other sensors through query refinement. Status information such as position, time and certainty can be incorporated in multi-level views and formulated as constraints in the refined query. In order to accomplish sensor data independence, an ontological knowledge base is employed.

Figure 1. (a) A laser radar image of a parking lot with a moving vehicle (encircled). (b) Two video frames showing a moving white vehicle (encircled) while entering a parking lot.

There is an important class of queries that requires more sophisticated query refinement. We will call this class of queries evolutionary queries. An evolutionary query is a query that may change in time and/or space. For example, when an emergency management worker moves around in a disaster area, a predefined query can be executed repeatedly to evaluate the surrounding area to find objects of threat. Depending upon the position of the person or agent (the query originator) and the time of the day, the query can be quite different. Our approach is also applicable to evolutionary queries that may be modified depending upon the temporal/spatial coordinates of the query originator.

This paper is organized as follows. The background and related research are described in Section 2. The notion of sensor data independence is discussed in Section 3, and the sensor dependency tree is introduced in Section 4. Section 4 describes simple query processing, and Section 5 illustrates the query refinement approach. Section 6 and Section 7 discuss view management and the sensor data ontological knowledge base, respectively. The σ-join-query examples are presented in Section 8. An empirical study is described in Section 9. Section 10 discusses the advantages of the proposed research and future research topics.

2. Background and Related Research

In our previous research, a spatial/temporal query language called ΣQL was developed to support the retrieval and fusion of multimedia information from real-time sources and databases [Chang98, Chang99, Chang2000c, Jungert2000]. ΣQL allows a user to specify powerful spatial/temporal queries for both multimedia data sources and multimedia databases, thus eliminating the need to write separate queries for each. ΣQL can be seen as a tool for handling spatial/temporal information for sensor-based information fusion, because most sensors generate spatial information in a temporal sequential manner [Jungert99a].
A powerful visual user interface called the Sentient Map allows the user to formulate spatial/temporal σ-queries using gestures [Chang2000a, Chang2000b].

For the empirical study we collaborated with the Swedish Defense Research Agency, which has collected information from different types of sensors, including laser radar, infrared video (similar to video but generated at 60 frames/sec), and a CCD digital camera. In our preliminary analysis, when we applied ΣQL to the fusion of the above described sensor data, we discovered that in the fusion process data from a single sensor yields poor results in object recognition. For instance, the target object may be partially hidden by an occluding object such as a tree, rendering certain types of sensors ineffective. Object recognition can be significantly improved if a refined query is generated to obtain information from another type of sensor, while allowing the target to be partially hidden. In other words, one (or more) sensor may serve as a guide to the other sensors by providing status information such as position, time and accuracy, which can be incorporated in multiple views and formulated as constraints in the refined query. In the refined query, the source(s) can be changed, and additional constraints can be included in the where-clause of the σ-query. This approach provides better object recognition results because the refined query can improve the result from the various sensor data, which also leads to a better result in the fusion process. A refined query may also send a request for new data and thus lead to a feedback process.

In early research on query modification, queries are modified to deal with integrity constraints [Stonebraker75]. In query augmentation, queries are augmented by adding constraints to speed up query processing [Grafe93]. In query refinement [Velez97] multiple-term queries are refined by dynamically combining pre-computed suggestions for single-term queries. Recently the query refinement technique was applied to content-based retrieval from multimedia databases [Chakrabarti2000]. In our approach, the refined queries are created to deal with the lack of information from a certain source or sources, and therefore not only the constraints can be changed, but also the source(s). This approach has not been considered previously in database query optimization, because usually the sources are assumed to provide the complete information needed by the queries.

In addition to the related approaches in query augmentation, there is also recent research work in agent-based techniques that is relevant to our approach. Many mobile agent systems have been developed [Baumann98, Baumer99, Lange99], and recently mobile agent technology has begun to be applied to information retrieval from multimedia databases [Kosch2001]. It is conceivable that sensors can be handled by different agents that exchange information and cooperate with each other to achieve information fusion. However, mobile agents are highly domain-specific and depend on ad-hoc, 'hardwired' programs to implement them. In contrast, our approach offers a theoretical framework for query optimization and is applicable to different types of sensors, thus achieving sensor data independence.

3. Sensor Data Independence

As mentioned in the previous sections, sensor data independence is an important new concept in sensor-based query processing.
In database design, data independence was first introduced in order to allow modifications of the physical databases without affecting the application programs [Ullman88]. It was a very powerful innovation in information technology. The main purpose was to simplify the use of the databases from an end-user's perspective while at the same time allowing a more flexible administration of the databases themselves [Date95].

In sensor-based information systems [Waltz90], no similar concept has yet been suggested, because this area is still less mature with respect to the design and development of information systems integrated with databases in which sensor data are stored. Another reason is that the users are supposed to be domain experts and consequently they have not yet requested sensor-based information systems with this property.

In current sensor-based information systems, detailed knowledge about the sensors is required in order to formulate queries concerning the various objects and their attributes registered by the sensors. Therefore sensor selection is left to the users, who supposedly are also experts on sensors. However, in real life this is not always the case. A user cannot be an expert on all sensors and all sensor data types. Therefore systems with the ability to hide this kind of low-level information from the users need to be developed. User interfaces also need to be designed to allow the users to formulate queries with ease and to request information at a high level of abstraction, thereby accomplishing sensor data independence. An approach to overcome these problems and to accomplish sensor data independence is described, through the use of the sensor dependency tree, the query refinement technique, the multi-level view database, and above all an ontological knowledge base for the sensors and the objects to be sensed.

4. The Sensor Dependency Tree

In database theory, query optimization is usually formulated with respect to a query execution plan where the nodes represent the various database operations to be performed [Jarke84]. The query execution plan can then be transformed in various ways to optimize query processing with respect to certain cost functions. In sensor-based query processing, a concept similar to the query execution plan is introduced. It is called the sensor dependency tree, which is a tree in which each node P_i has the following parameters:

object_i is the object type to be recognized
source_i is the information source
recog_i is the object recognition algorithm to be applied
sqo_i is the spatial coordinates of the query originator
tqo_i is the temporal coordinates of the query originator
aoi_i is the spatial area-of-interest for object recognition
ioi_i is the temporal interval-of-interest for object recognition
time_i is the estimated computation time in some unit such as seconds
range_i is the range of certainty in applying the recognition algorithm, represented by two numbers min, max from the closed interval [0,1]

These parameters provide detailed information on a computation step to be carried out in sensor-based query processing. The query originator is the person/agent who issues a query. For evolutionary queries, the spatial/temporal coordinates of the query originator are required. For other types of queries, these parameters are optional.

If the computation results of a node P1 are the required input to another node P2, there is a directed arc from P1 to P2. The directed arcs originate from the leaf nodes and terminate at the root node.
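For concreteness, a node of the sensor dependency tree might be represented by the following data structure. This is a minimal sketch in Python; the class name SensorNode, the field names, and the inputs list used to encode the directed arcs are our own illustrative choices, not part of the original system.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class SensorNode:
    # One node P_i of the sensor dependency tree (illustrative sketch).
    obj: Optional[str]                 # object_i: object type to be recognized (None for a raw source)
    source: str                        # source_i: information source, e.g. "LR", "IR", "CCD", or "ALL"
    recog: Optional[str]               # recog_i: recognition or fusion algorithm to be applied
    sqo: Optional[Tuple[float, float]] = None   # sqo_i: spatial coordinates of the query originator
    tqo: Optional[float] = None        # tqo_i: temporal coordinate of the query originator
    aoi: str = "aoi_all"               # aoi_i: spatial area-of-interest
    ioi: str = "ioi_all"               # ioi_i: temporal interval-of-interest
    time: float = 0.0                  # time_i: estimated computation time (e.g., seconds)
    certainty: Tuple[float, float] = (1.0, 1.0)  # range_i: (min, max) from the closed interval [0, 1]
    inputs: List["SensorNode"] = field(default_factory=list)  # nodes whose results this node requires

# The directed arc P1 -> P2 is encoded by listing P1 among the inputs of P2, e.g.:
lr = SensorNode(obj=None, source="LR", recog=None)
truck_lr = SensorNode(obj="truck", source="LR", recog="recog315",
                      time=10, certainty=(0.3, 0.5), inputs=[lr])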
The leaf nodes of the tree are the information sources such as laser radar, infrared camera, CCD camera and so on. They have parameters such as (none, LR, NONE, sqo_i, tqo_i, aoi_all, ioi_all, 0, (1,1)). Sometimes we represent such leaf nodes by their symbolic names such as LR, IR, CCD, etc.

The intermediate nodes of the tree are the objects to be recognized. For example, suppose the object type is 'truck'. An intermediate node may have parameters (truck, LR, recog315, sqo_i, tqo_i, aoi_all, ioi_all, 10, (0.3, 0.5)).

The root node of the tree is the result of information fusion, for example, a node with parameters (truck, ALL, fusion7, sqo_i, tqo_i, aoi_all, ioi_all, 2000, (0,1)), where the parameter ALL indicates that information is drawn from all the sources. In what follows, the spatial/temporal coordinates sqo_i and tqo_i of the query originator, the all-inclusive area-of-interest aoi_all and the all-inclusive interval-of-interest ioi_all will be omitted for the sake of brevity, so that the examples are easier to read.

Query processing is accomplished by the repeated computation and updating of the sensor dependency tree. During each iteration, one or more nodes are selected for computation. The selected nodes must not be dependent on any other nodes. After the computation, one or more nodes are removed from the sensor dependency tree. The process then iterates. As an example, by analyzing the initial query, the following sensor dependency tree is constructed:

(none, LR, NONE, 0, (1,1)) → (truck, LR, recog315, 10, (0.3, 0.5)) →
(none, IR, NONE, 0, (1,1)) → (truck, IR, recog144, 2000, (0.5, 0.7)) → (truck, ALL, fusion7, 2000, (0,1))
(none, CCD, NONE, 0, (1,1)) → (truck, CCD, recog11, 100, (0, 1)) →

This means the information is from the three sources - laser radar, infrared camera and CCD camera - and the information will be fused for recognizing the object type 'truck'.

Next, we select some of the nodes to compute. For instance, all three leaf nodes can be selected, meaning information will be gathered from all three sources. After this computation, the processed nodes are dropped and the following updated sensor dependency tree is obtained:

(truck, LR, recog315, 10, (0.3, 0.5)) →
(truck, IR, recog144, 2000, (0.5, 0.7)) → (truck, ALL, fusion7, 2000, (0,1))
(truck, CCD, recog11, 100, (0, 1)) →

We can then select the next node(s) to compute. Since LR has the smallest estimated computation time, it is selected and recognition algorithm 315 is applied. The updated sensor dependency tree is:

(truck, IR, recog144, 2000, (0.5, 0.7)) →
(truck, ALL, fusion7, 2000, (0,1))
(truck, CCD, recog11, 100, (0, 1)) →

In the updated tree, the LR node has been removed. We can now select the CCD node and, after its removal, select the IR node.

(truck, IR, recog144, 2000, (0.5, 0.7)) → (truck, ALL, fusion7, 2000, (0,1))

Finally, the fusion node is selected.

(truck, ALL, fusion7, 2000, (0,1))

After the fusion operation, there are no unprocessed (i.e., unselected) nodes, and query processing terminates.
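The select-compute-remove iteration just described can be sketched as a small scheduling loop. The sketch below assumes the SensorNode structure from the earlier example; run_algorithm is a hypothetical callback standing in for the actual recognition and fusion algorithms, and selecting the ready node with the smallest estimated computation time is only one possible selection policy.

def collect_nodes(root):
    # Gather all nodes of the sensor dependency tree, leaves included.
    nodes, stack, seen = [], [root], set()
    while stack:
        n = stack.pop()
        if id(n) not in seen:
            seen.add(id(n))
            nodes.append(n)
            stack.extend(n.inputs)
    return nodes

def process_tree(root, run_algorithm):
    # Repeatedly compute and update the sensor dependency tree (a sketch).
    pending = collect_nodes(root)
    results = {}                                   # results keyed by node identity
    while pending:
        pending_ids = {id(n) for n in pending}
        # A node is ready when it does not depend on any unprocessed node.
        ready = [n for n in pending if all(id(i) not in pending_ids for i in n.inputs)]
        # Select one ready node per iteration, cheapest first (several could be selected at once).
        node = min(ready, key=lambda n: n.time)
        inputs = [results[id(i)] for i in node.inputs]
        results[id(node)] = run_algorithm(node, inputs)
        pending = [n for n in pending if n is not node]   # drop the processed node from the tree
    return results[id(root)]                       # the fused result at the root node

In the truck example above, such a loop would first process the three source nodes, then the LR, CCD and IR recognition nodes in order of estimated time, and finally fusion7, matching the sequence of tree updates shown.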
5. Query Refinement

In the previous section, a straightforward approach to sensor-based query processing was described. This straightforward approach misses the opportunity of utilizing the incomplete and imprecise knowledge gained during query processing.

Let us re-examine the above scenario. After LR is selected and recognition algorithm 315 is applied, suppose the result of recognition is not very good, and only some partially occluded large objects are recognized. If we follow the original approach, the reduced sensor dependency tree becomes:

(truck, IR, recog144, 2000, (0.5, 0.7)) →
(truck, ALL, fusion7, 2000, (0,1))
(truck, CCD, recog11, 100, (0, 1)) →

But this misses the opportunity of utilizing the incomplete and imprecise knowledge gained by recognition algorithm 315. If the query is to find un-occluded objects and the sensor reports only an occluded object, then the query processor is unable to continue unless we modify the query to find occluded objects. Therefore a better approach is to refine the original query, so that the updated sensor dependency tree becomes:

(truck, IR, recog144, aoi-23, 2000, (0.6, 0.8)) → (truck, ALL, fusion7, aoi-23, 2000, (0, 1))

Figure 2. Flowchart for the query refinement algorithm.

This means that recognition algorithm 315 has detected objects in an area-of-interest aoi-23. After this is done, recognition algorithm 144 is applied to recognize objects of the type 'truck' in this specific area-of-interest. Finally, the fusion algorithm fusion7 is applied.

Given a user query in a high-level language (natural language, a visual language or a form), the query refinement approach is outlined below, where the words in italics indicate operations for the second (and subsequent) iterations. Its flowchart is illustrated in Figure 2.

Step 1. Analyze the user query to generate/update the sensor dependency tree based upon the ontological knowledge base and the multi-level view database, which contains up-to-date contextual information in the object view, the local view and the global view.

Step 2. If the sensor dependency tree is reduced to a single node, perform the fusion operation (if multiple sensors have been used) and then terminate query processing. Otherwise build/refine the σ-query based upon the user query, the sensor dependency tree and the multi-level view database.

Step 3. Execute the portion of the σ-query that is executable according to the sensor dependency tree.

Step 4. Update the multi-level view database and go back to Step 1.

As mentioned above, there is another class of queries that requires more sophisticated query refinement. An evolutionary query is a query that changes in time and/or space. Depending upon the position of the query originator and the time of day, the query can be different. In other words, queries and query processing are affected by the spatial/temporal relations among the query originator, the sensors and the sensed objects.

In query processing/refinement, these spatial/temporal relations must be taken into consideration in the construction/update of the sensor dependency tree. The temporal relations include "followed by", "preceded by", and so on. The spatial relations include the usual spatial relations, and special ones such as "occluded by", and so on [Lee92]. As mentioned above, if in the original query we are interested only in finding un-occluded objects, then the query processor must report failure when only an occluded object is found. If, however, the query is refined to "find both un-occluded and occluded objects", then the query processor can still continue.
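As a concrete illustration of how Step 2 might refine the remaining nodes after a partial detection, consider the sketch below. It reuses the SensorNode structure from Section 4; the function name and the specific refinement policy (drop sources judged no longer useful, focus every remaining node on the reported area-of-interest, and tighten the certainty range of the recognition nodes) are our own assumptions modeled on the aoi-23 example above, not the system's actual rules.

def refine_after_partial_detection(pending, reported_aoi, drop_sources=(), new_certainty=(0.6, 0.8)):
    # Refine the not-yet-processed nodes of the sensor dependency tree in place.
    refined = [n for n in pending if n.source not in drop_sources]   # e.g. drop the CCD node
    # (A full implementation would also remove a dropped node from its successors' inputs.)
    for node in refined:
        node.aoi = reported_aoi                      # e.g. "aoi-23", the area reported by recog315
        if node.obj is not None and node.source != "ALL":
            node.certainty = new_certainty           # e.g. (0.5, 0.7) -> (0.6, 0.8) for the IR node
    return refined

With drop_sources=("CCD",) and reported_aoi="aoi-23", the remaining nodes correspond to the two-node refined tree shown above, and the scheduling loop sketched in Section 4 can simply continue on the updated tree.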
6. Multi-Level View Database

A multi-level view database (MLVD) is needed to support sensor-based query processing. Status information is obtained from the sensors, including object type, position, orientation, time, accuracy and so on. The positions of the query originator and the sensors may also change. This information is processed and integrated into the multi-level view database. Whenever the query processor needs some information, it asks the view manager. The view manager also shields the rest of the system from the details of managing sensor data, thus contributing to sensor data independence.

The multiple views may include the following three views in a resolution pyramid structure: the global view, the local view and the object view. The global view describes where the target object is situated in relation to some other objects, e.g. a road from a map. This enables the sensor analysis program to find the location of the target object with greater accuracy and thus make a better analysis. The local view provides information such as whether the target object is partially hidden. The local view can be described, for example, in terms of Symbolic Projection [Chang96], or other representations. Finally, there is also a need for a symbolic object description. The views may include information about the query originator and can be used later on in other important tasks such as situation analysis.

The multi-level views are managed by the view manager, which can be regarded as an agent, or as middleware, depending upon the system architecture. The global view is obtained primarily from the geographic information system (GIS). The local view and the object view are more detailed descriptions of local areas and objects. The results of query processing, and the movements of the query originator, may both lead to the updating of all three views.

7. The Ontological Knowledge Base

For any single sensor, the sensed data usually do not fully describe an object; otherwise there would be no need to utilize other sensors. In the general case the system should be able to detect that some sensors are not giving a complete view of the scene and automatically select those sensors that can help the most in providing more information to describe the whole scene. In order to do so, the system should have a collection of facts and conditions, which constitute the working knowledge about the real world and the sensors. This knowledge is stored in the ontological knowledge base, whose content includes the object knowledge structure as well as sensor and sensor data control knowledge.

Figure 3. The ontological knowledge base.

An example of the ontological knowledge base is shown in Figure 3. It consists of three parts: the sensor part, describing the sensors, recognition algorithms and so on; the external conditions part, providing a description of external conditions such as the weather condition, the light condition and so on; and the sensed objects part, describing the objects to be sensed. Given the external condition and the object to be sensed, we can determine what sensor(s) and recognition algorithm(s) may be applied. For example, IR and Laser can be used at night (time condition), while CCD cannot be used. IR probably can be used in foggy weather, but Laser and CCD cannot be used (weather condition). However, such determination is often uncertain. Therefore certainty factors should be associated with the items in the ontological knowledge base to deal with the uncertainty.
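To make this concrete, the condition-dependent part of the sensor knowledge could be stored as a small fact table with certainty factors, as in the sketch below. The numeric certainty factors are invented for illustration and only mirror the qualitative statements above (IR and Laser usable at night, CCD not; IR probably usable in fog, Laser and CCD not); the real knowledge base also covers recognition algorithms and the sensed objects.

# A minimal sketch of condition-dependent sensor knowledge with certainty factors in [0, 1].
SENSOR_CONDITION_CF = {
    ("LR",  "night"): 0.9,   # laser radar can be used at night
    ("IR",  "night"): 0.9,   # infrared can be used at night
    ("CCD", "night"): 0.0,   # CCD cannot be used at night
    ("IR",  "fog"):   0.6,   # IR probably can be used in foggy weather
    ("LR",  "fog"):   0.1,   # laser radar cannot be used in fog
    ("CCD", "fog"):   0.1,   # CCD cannot be used in fog
}

def usable_sensors(condition, threshold=0.5):
    # Return the sensors whose certainty factor under the given condition meets the threshold.
    return [sensor for (sensor, cond), cf in SENSOR_CONDITION_CF.items()
            if cond == condition and cf >= threshold]

# usable_sensors("night") -> ["LR", "IR"]; usable_sensors("fog") -> ["IR"]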
8. Query Examples

An example of a σ-query is illustrated in Figure 4 [Chang98, Chang99]. The source R consists of time slices of 2D frames. To extract three pre-determined time slices from the source R, the query in mathematical notation is: σt(t1, t2, t3) R. The meaning of the σ-operator in the above query is "select", i.e. we want to select the time axis and three slices along this axis. The subscript t in σt indicates the selection of the time axis. In the SQL-like language the ΣQL query is expressed as:

SELECT t
CLUSTER t1, t2, t3
FROM R

A new keyword "CLUSTER" is introduced, so that the parameters for the σ-operator can be listed, such as t1, t2, t3. The word "CLUSTER" indicates that objects belonging to the same cluster must share some common characteristics (such as having the same time parameter value). A cluster may have a sub-structure specified in another (recursive) query. Clustering is a natural concept when dealing with spatial/temporal objects. To facilitate query formulation, an alias (the name of a variable) can be given to a cluster.

Figure 4. Example of extracting three time slices (frames) from a video source.

In what follows we present three examples to illustrate the processing of queries that involve one or more sensors, and the possibilities for query optimization.

Query 1: "Find two frames from a video frame sequence where a frame containing a white bird comes after a frame containing a red bird."

Using the query refinement approach, the steps are as follows.

Step 1. Generate the sensor dependency tree: (none, video, NONE, 0, (1,1)) → (white bird, video, recog192, 20, (0.5, 1)). Since the source is given, the sensor dependency tree is a chain.

Step 2. Build the query.

// Extract all the frame pairs OBJ1 and OBJ2 such that OBJ1 precedes OBJ2
// in the video sequence and OBJ1 contains an object that is a red bird
// and OBJ2 contains an object that is a white bird
SELECT object
CLUSTER * ALIAS OBJ1 OBJ2
FROM
{
  // extract all the video frames
  SELECT t
  CLUSTER *
  FROM video_source
}
WHERE OBJ1.type = "bird" AND OBJ1.color = "red"
AND OBJ2.type = "bird" AND OBJ2.color = "white"
AND OBJ1.t < OBJ2.t

Note: the clusters on the time axis must be open, while the clusters on the object axis must be closed, in order to keep the integrity of the frame.

Step 3. Execute the query. Since the source is known, we can execute the whole query to obtain the results.

Step 4. Update the multi-level view database.

Step 1 (2nd Iteration). Update the sensor dependency tree. Since it is empty and only one sensor was used, terminate.

Query 2: "Find cars in a region five hours before it is covered by a flood."

Step 1. Generate the sensor dependency tree. Initially the source can be any source. By checking the ontological knowledge base and the views, the appropriate source(s) is identified, such as a video source, a camera, and so on. Once the source(s) has been identified, the sensor dependency tree can be constructed. For this example, there is only one source: the video source.

Step 2. Build the query.

// Extract all the pairs OBJ1 and OBJ2 such that OBJ1 precedes OBJ2 by 5 hours
// in the frame sequence and OBJ1 contains at least a car and OBJ2 is the starting frame
// for a flood. Then extract all cars from OBJ1.
SELECT object
FROM
{
  SELECT object
  CLUSTER * ALIAS OBJ1 OBJ2
  FROM
  {
    // extract all the frames
    SELECT t
    CLUSTER *
    FROM video_source
  }
  WHERE OBJ1.type = 'car' AND OBJ2.type = 'starting_flood' AND
  OBJ1.t = OBJ2.t - 5 AND OBJ1.position inside OBJ2.region
}
WHERE object.type = 'car'

In the above query, the clusters on the time axis must be open, while the clusters on the object axis must be closed, in order to keep the integrity of the frame.

Step 3. Execute the query. Since the source is determined to be a video source, we can execute the query to obtain the results.

Step 4. Update the multi-level view database.

Step 1 (2nd Iteration). Update the sensor dependency tree. Since the sensor dependency tree is empty and only one sensor was used, we terminate query processing.
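For comparison with the multi-sensor example of Section 4, note that both Query 1 and Query 2 reduce to a single-source chain. Using the illustrative SensorNode structure and scheduling loop sketched earlier, Query 1's chain from Step 1 could be written as follows; there is no fusion node because only one sensor is involved.

video = SensorNode(obj=None, source="video", recog=None)           # leaf node: the raw video source
white_bird = SensorNode(obj="white bird", source="video", recog="recog192",
                        time=20, certainty=(0.5, 1.0), inputs=[video])
# process_tree(white_bird, run_algorithm) then reduces to a single application of recog192,
# after which the WHERE-clause constraints of the ΣQL query are evaluated by the query processor.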
Query 3: The same as Query 2 but using two sources: a video source and a flood meter. This is an example of multi-sensor query processing.

Step 1. Generate the sensor dependency tree. By checking the ontological knowledge base and the views, the source for the car information is determined to be the video source, and the source for the flood information is determined to be the flood meter.
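Assuming the same illustrative SensorNode structure, the two-source dependency tree for Query 3 might be sketched as below. The recognition and fusion algorithm names are hypothetical placeholders (the text does not name them); the structure simply follows the pattern of Section 4, with one recognition chain per source joined at a fusion root.

video = SensorNode(obj=None, source="video", recog=None)               # leaf: the video source
flood_meter = SensorNode(obj=None, source="flood_meter", recog=None)   # leaf: the flood meter
car = SensorNode(obj="car", source="video", recog="recog_car",         # hypothetical algorithm name
                 inputs=[video])
flood = SensorNode(obj="starting_flood", source="flood_meter",
                   recog="recog_flood",                                # hypothetical algorithm name
                   inputs=[flood_meter])
root = SensorNode(obj="car", source="ALL", recog="fusion_q3",          # hypothetical fusion step
                  inputs=[car, flood])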
