Object detection plays a significant role in various industrial and scientific domains, particularly in autonomous driving. It enables vehicles to detect surrounding objects, construct spatial maps, and navigate safely. To accomplish these tasks, a variety of sensors have been employed, including LiDAR, radar, RGB cameras, and ultrasonic sensors. Among these, LiDAR and RGB cameras are the most frequently used because of their complementary strengths. RGB cameras offer high-resolution images with rich color and texture information but tend to underperform in low light or adverse weather. In contrast, LiDAR provides precise 3D geometric data irrespective of lighting conditions, although it lacks the high spatial resolution of cameras. Recently, thermal cameras have gained significant attention, both in standalone applications and in combination with RGB cameras, because they perceive well under low-visibility or adverse weather conditions. Multimodal sensor fusion effectively overcomes the limitations of individual sensors. In this paper, we propose a novel multimodal fusion method that integrates LiDAR, a 360° RGB camera, and a 360° thermal camera to fully leverage the strengths of each modality. Our method employs a feature-level fusion strategy that temporally accumulates and synchronizes multiple LiDAR frames. This design not only improves detection accuracy but also enhances spatial coverage and robustness. The use of 360° images significantly reduces blind spots and provides comprehensive environmental awareness, which is especially beneficial in complex or dynamic scenes.
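The abstract's central technical idea is feature-level fusion over temporally accumulated, synchronized LiDAR frames. As a rough illustration of what such accumulation typically involves (not the authors' actual pipeline, which further fuses LiDAR features with 360° RGB and thermal imagery), the sketch below warps past point clouds into the current ego frame using known sensor poses and stacks them into a denser cloud; the names `Frame` and `accumulate_lidar` are hypothetical.

```python
# Minimal sketch (assumed, not the paper's implementation) of temporal LiDAR
# accumulation: past point clouds are motion-compensated into the current
# sensor frame via known poses, then concatenated for downstream detection.
from dataclasses import dataclass
from typing import List
import numpy as np


@dataclass
class Frame:
    points: np.ndarray   # (N, 3) LiDAR points in the sensor frame
    pose: np.ndarray     # (4, 4) sensor-to-world transform at capture time
    timestamp: float     # seconds; used to drop stale frames


def accumulate_lidar(frames: List[Frame], current: Frame,
                     max_age: float = 0.5) -> np.ndarray:
    """Warp past frames into the current sensor frame and stack the points."""
    world_to_current = np.linalg.inv(current.pose)
    clouds = [current.points]
    for f in frames:
        if current.timestamp - f.timestamp > max_age:
            continue  # too old to be temporally consistent with the current scene
        # Past sensor frame -> world -> current sensor frame (homogeneous coords)
        hom = np.hstack([f.points, np.ones((f.points.shape[0], 1))])
        warped = (world_to_current @ f.pose @ hom.T).T[:, :3]
        clouds.append(warped)
    return np.vstack(clouds)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cur = Frame(rng.normal(size=(100, 3)), np.eye(4), timestamp=1.0)
    prev_pose = np.eye(4)
    prev_pose[0, 3] = 2.0  # ego moved 2 m along x between frames
    prev = Frame(rng.normal(size=(100, 3)), prev_pose, timestamp=0.9)
    dense = accumulate_lidar([prev], cur)
    print(dense.shape)  # (200, 3): accumulated, motion-compensated cloud
```

In a feature-level scheme like the one described, the accumulated cloud would then be voxelized or projected and its features fused with image features from the 360° cameras, rather than fusing raw detections.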
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC12389630 | PMC |
| http://dx.doi.org/10.3390/s25164902 | DOI Listing |