Volume 10, Issue 2 (2022; 8-1401), pages 63-88
Performance Evaluation of Local Feature Detectors in the Presence of Noise for Multi-Sensor Remote Sensing Image Matching
Negar Jovhari*, Amin Sedaghat, Nazila Mohammadi
University of Tabriz
Abstract: (2378 views)
Automatic, efficient, accurate, and robust image matching is a fundamental problem in remote sensing, photogrammetry, and computer vision. Over the past decades, a variety of algorithms built on the feature-based matching framework have been proposed, whose core components are the detection and description of local features. Understanding the characteristics of different matching algorithms across applications is essential and strongly influences the correct choice of a suitable algorithm for a specific application. Numerous studies have evaluated and compared many matching algorithms in various applications; nevertheless, research on the performance of different matching algorithms on multi-sensor images, particularly radar and optical imagery, remains very limited. This study evaluates the performance of prominent, widely used local feature detectors, namely SURF, KAZE, SIFT, PC, FAST, and Harris, for matching multi-sensor optical and radar images. To extract stable and uniformly distributed features from these algorithms, the uniform competency method is applied. In addition, the scale-independent version of the recent HOSS descriptor is used for feature description. The results indicate the superiority of the KAZE detector in the presence of successive noise levels and other geometric and radiometric differences.
Keywords: Multi-sensor images, Evaluation of local feature detectors, KAZE detector, Uniform competency algorithm, HOSS descriptor
Full Text [PDF 2332 kb]   (677 downloads)
Article Type: Research | Subject: Remote Sensing
Received: 1400/10/12 | Accepted: 1401/3/16 | Published: 1401/8/10 (Iranian calendar dates)
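The kind of comparison described in the abstract can be pictured with a short, self-contained sketch. The snippet below is not the paper's pipeline: it omits the uniform competency selection and the HOSS descriptor, and SURF and the phase-congruency (PC) detector are not part of core OpenCV, so only SIFT, KAZE, FAST, and Harris are shown. The file name, noise levels, pixel tolerance, and the simple nearest-keypoint repeatability count are illustrative assumptions.

import cv2
import numpy as np

def add_gaussian_noise(img, sigma):
    # Add zero-mean Gaussian noise with standard deviation `sigma`.
    noise = np.random.normal(0.0, sigma, img.shape)
    return np.clip(img.astype(np.float32) + noise, 0, 255).astype(np.uint8)

def make_detector(name):
    # Detectors available in core OpenCV; SURF and PC are not included here.
    return {
        "SIFT": cv2.SIFT_create(),
        "KAZE": cv2.KAZE_create(),
        "FAST": cv2.FastFeatureDetector_create(),
        "Harris": cv2.GFTTDetector_create(useHarrisDetector=True),
    }[name]

def repeatability(kps_ref, kps_noisy, tol=3.0):
    # Fraction of reference keypoints with a noisy-image keypoint within
    # `tol` pixels -- a crude stand-in for a full repeatability measure.
    if not kps_ref or not kps_noisy:
        return 0.0
    pts = np.array([kp.pt for kp in kps_noisy])
    hits = sum(np.linalg.norm(pts - np.array(kp.pt), axis=1).min() <= tol
               for kp in kps_ref)
    return hits / len(kps_ref)

if __name__ == "__main__":
    # "optical_band.tif" is a hypothetical input image.
    img = cv2.imread("optical_band.tif", cv2.IMREAD_GRAYSCALE)
    for name in ("SIFT", "KAZE", "FAST", "Harris"):
        det = make_detector(name)
        ref_kps = det.detect(img, None)
        for sigma in (5, 15, 30):  # successive noise levels
            noisy_kps = det.detect(add_gaussian_noise(img, sigma), None)
            print(f"{name:6s} sigma={sigma:2d} "
                  f"keypoints={len(ref_kps):5d} "
                  f"repeatability={repeatability(ref_kps, noisy_kps):.2f}")

Pairing each detector with a common descriptor and replacing the nearest-keypoint count with an overlap-based repeatability criterion would bring such a sketch closer to the evaluation protocol a study like this requires.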
Jovhari N, Sedaghat A, Mohammadi N. Performance Evaluation of Local Detectors in the Presence of Noise for Multi-Sensor Remote Sensing Image Matching. jgit 2022; 10(2): 63-88.
URL: http://jgit.kntu.ac.ir/article-1-867-fa.html

Republishing: This article may be redistributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License.
Engineering Journal of Geospatial Information Technology (scientific-research journal)