Matching remote sensing images is a fundamental step in many image processing applications. Unlike ordinary images, remote sensing images often undergo complex, nonlinear background changes that make them difficult to match, and they pose further challenges such as scale variation, rotation, and differing viewing angles. The Scale-Invariant Feature Transform (SIFT) algorithm is a commonly used method for finding corresponding points between images, but it often produces many incorrect matches when applied to such data. Deep learning-based approaches, in contrast, can extract and compare mid- and high-level features for more accurate matching. Inspired by these advances, this work introduces a method that combines the SIFT algorithm with a Siamese deep neural network to improve the matching of remote sensing images.
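For reference, the initial matching stage that the paper builds on can be sketched with OpenCV's SIFT implementation and Lowe's ratio test; this is a minimal baseline, and the file names and the 0.75 ratio are illustrative assumptions, not the paper's tuned settings.

```python
import cv2

# Baseline SIFT matching with Lowe's ratio test (OpenCV).
# File names and the ratio value are illustrative, not from the paper.
img1 = cv2.imread("scene_a.tif", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("scene_b.tif", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match each descriptor to its two nearest neighbours and keep a match
# only when the best distance is clearly smaller than the second best.
matcher = cv2.BFMatcher(cv2.NORM_L2)
knn = matcher.knnMatch(des1, des2, k=2)
good = []
for pair in knn:
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])
```

Even with the ratio test, a set like `good` typically still contains many false correspondences on remote sensing imagery, which is the problem the proposed verification stage addresses.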
The proposed method modifies the conventional SIFT pipeline, tuning its parameters to raise the ratio of correct to incorrect correspondences. After keypoints are extracted and described, initial correspondences are established. For each matched keypoint, a local patch is then extracted based on the keypoint's position, scale, and orientation. These patch pairs are fed to a trained Siamese network that estimates the probability that a match is correct, and matches whose confidence falls below a threshold are rejected. This hybrid approach combines the strengths of traditional and deep learning-based techniques to improve accuracy. Its key contributions are optimized keypoint extraction to maximize true matches, a patch-based feature representation aligned with local image geometry, and a neural-network verification step that suppresses incorrect matches. In experiments on a dataset of 35 remote sensing image pairs, the proposed approach achieved an accuracy of 0.849, reducing false matches and increasing correct ones relative to the standard SIFT algorithm and deep learning-based baselines.
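A minimal sketch of the patch extraction and verification stages is given below. It assumes OpenCV keypoints from the matching stage above and a trained PyTorch Siamese model `siamese` whose forward pass maps two patch batches to match probabilities; the 64-pixel patch size, the 0.5 threshold, and the helper names are illustrative assumptions, not values from the paper.

```python
import cv2
import torch

PATCH = 64  # patch side length (assumed; the abstract does not state a size)

def extract_patch(img, kp, out=PATCH):
    """Cut a patch around a keypoint, normalised by its scale and orientation."""
    # Map a region of roughly kp.size pixels onto an out x out patch,
    # rotating by the keypoint angle so patches are orientation-aligned.
    scale = out / max(kp.size, 1e-6)
    M = cv2.getRotationMatrix2D(kp.pt, kp.angle, scale)
    # Shift so the keypoint lands at the patch centre.
    M[0, 2] += out / 2 - kp.pt[0]
    M[1, 2] += out / 2 - kp.pt[1]
    return cv2.warpAffine(img, M, (out, out))

def verify(siamese, img1, img2, kp1, kp2, matches, threshold=0.5):
    """Keep only matches the Siamese network scores above the threshold.
    `siamese` is assumed to map two patch batches to match probabilities."""
    kept = []
    for m in matches:
        p1 = extract_patch(img1, kp1[m.queryIdx])
        p2 = extract_patch(img2, kp2[m.trainIdx])
        t1 = torch.from_numpy(p1).float().div(255).view(1, 1, PATCH, PATCH)
        t2 = torch.from_numpy(p2).float().div(255).view(1, 1, PATCH, PATCH)
        with torch.no_grad():
            prob = siamese(t1, t2).item()
        if prob >= threshold:
            kept.append(m)
    return kept
```

With the baseline matches `good` from the earlier sketch, the final set would be obtained as `verify(siamese, img1, img2, kp1, kp2, good)`. Aligning each patch to the keypoint's scale and orientation is what lets the network compare corresponding structures despite rotation and scale differences between the two images.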
Type of Study: Research | Subject:
Paper Received: 2023/12/23 | Accepted: 2025/07/21 | Published: 2025/12/19 | ePublished: 2025/12/19