CBFF-Net: A New Framework for Efficient and Accurate Hyperspectral Object Tracking

Publisher:
IEEE (Institute of Electrical and Electronics Engineers)
Publication Type:
Journal Article
Citation:
IEEE Transactions on Geoscience and Remote Sensing, vol. 61, 2023
Issue Date:
2023-01-01
Visual object tracking is a fundamental task in computer vision and has advanced rapidly in recent decades. With the development of snapshot hyperspectral (HS) sensors, efforts have been made to track objects in HS videos to overcome the inherent limitations of red-green-blue (RGB) images. Existing HS tracking algorithms extract deep features from each band separately, which discards the interaction information between bands; as a result, the discrimination ability of HS trackers is limited and the efficiency of existing HS algorithms is low. In this article, a novel algorithm, the cross-band feature fusion network (CBFF-Net), is proposed for HS object tracking to improve discrimination ability and reduce computational complexity. Specifically, the backbone and head network are built from modules of a transferred RGB object tracking network to carry out the HS tracking task while retaining the discrimination ability learned from RGB data. Moreover, a bidirectional multiple deep feature fusion (BMDFF) module is proposed to fuse the features extracted from different bands of the HS images, and a cross-band group attention (CBGA) module is introduced to learn interaction information across the bands. Experimental results demonstrate the superior performance of CBFF-Net, which runs at 24 frames/s (FPS).
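The abstract does not specify how the cross-band attention is computed. As a rough, hedged illustration of the general idea of attending over per-band deep features before fusing them, the following PyTorch sketch weights each band's feature map by a learned score and sums over the band dimension. All names, shapes, and design choices here are assumptions for illustration only, not the authors' CBGA or BMDFF implementation.

```python
# Minimal sketch (assumed, not the paper's code): attention over the band
# dimension of per-band deep features, followed by a weighted fusion.
import torch
import torch.nn as nn


class CrossBandAttentionSketch(nn.Module):
    """Toy cross-band attention for features of shape (B, N_bands, C, H, W)."""

    def __init__(self, channels: int, hidden: int = 64):
        super().__init__()
        # Map each band's pooled descriptor to a scalar score (hypothetical design).
        self.score = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, 1),
        )

    def forward(self, band_feats: torch.Tensor) -> torch.Tensor:
        # Global-average-pool each band's feature map to a C-dim descriptor.
        desc = band_feats.mean(dim=(-2, -1))           # (B, N, C)
        weights = self.score(desc).softmax(dim=1)      # (B, N, 1), normalized over bands
        # Weighted sum over bands yields a single fused feature map.
        fused = (band_feats * weights[..., None, None]).sum(dim=1)  # (B, C, H, W)
        return fused


if __name__ == "__main__":
    feats = torch.randn(2, 5, 256, 31, 31)  # e.g., 5 bands of deep features
    fused = CrossBandAttentionSketch(channels=256)(feats)
    print(fused.shape)  # torch.Size([2, 256, 31, 31])
```

In this sketch, the fused map has the same channel count as a single RGB-branch feature, which is one plausible way a transferred RGB backbone and head could consume HS features without architectural changes.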