IMAGE FUSION QUALITY ASSESSMENT USING SHANNON ENTROPY AND HARTLEY USEFUL INFORMATION COEFFICIENT
Abstract
The article examines methods for improving the quality of images obtained from different sources of
information through multi-modal integration. Combining information from several modalities makes it
possible to exploit features that cannot be accurately interpreted when each modality is analyzed in
isolation. Recent research in this area is reviewed to support the relevance of the topic. The goal of
the work is to enhance the information content of images produced by merging data from diverse sources
and to obtain high-quality images suitable for accurate machine learning applications. To achieve this
objective, the authors address several tasks: they develop an approach for measuring image quality and
design algorithms that evaluate the quality of fusion results built from multi-modal information. These
algorithms are implemented within a software framework used to validate the proposed approach.
Evaluation experiments are based on the presented calculations of image information content and on the
effect of noise and blur on the entropy of the fused image. Experimental studies on data sets from open
sources show that the proposed method identifies the fusion strategy that yields the most informative
combined image. Shannon entropy quantifies the amount of information conveyed by an image, while the
Hartley coefficient of useful information helps estimate the amount of noise in an image. The article
also compares results at different noise levels and degrees of blur, demonstrating the effectiveness of
the different image-quality evaluation algorithms.
proposed approach, we analyze images obtained by combining data from two devices: an infrared camera
and a video camera capturing images in the visible range.
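The two measures named above can be illustrated with a minimal sketch. The snippet below computes Shannon entropy from an 8-bit grayscale histogram and, as a stand-in for the Hartley coefficient of useful information, normalizes that entropy by the Hartley measure log2(k) over the k gray levels actually present; this normalized form is an illustrative assumption, not the authors' exact formula.

```python
import numpy as np

def shannon_entropy(image, levels=256):
    """Shannon entropy (bits per pixel) of a grayscale image histogram."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins: the 0*log(0) terms contribute nothing
    return -np.sum(p * np.log2(p))

def hartley_useful_information(image, levels=256):
    """Shannon entropy normalized by the Hartley measure log2(k), where k is
    the number of gray levels actually present. Values near 1 indicate a
    near-uniform, noise-like distribution; lower values indicate structure."""
    k = len(np.unique(image))
    if k < 2:
        return 0.0  # a constant image carries no information
    return shannon_entropy(image, levels) / np.log2(k)

rng = np.random.default_rng(0)
flat = np.full((64, 64), 128, dtype=np.uint8)            # constant: zero entropy
noise = rng.integers(0, 256, (64, 64), dtype=np.uint8)   # uniform noise: near 8 bits
print(shannon_entropy(flat), shannon_entropy(noise))
print(hartley_useful_information(noise))
```

A fused image with high Shannon entropy but a moderate normalized value would, under these assumed definitions, suggest rich content rather than injected noise.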