Abstract
In this work we present two new methods for approximating the Kullback-Leibler (KL) divergence between two mixtures of Gaussians. The first method is based on matching between the Gaussian elements of the two Gaussian mixture densities. The second method is based on the unscented transform. The proposed methods are utilized for image retrieval tasks. Continuous probabilistic image modeling based on mixtures of Gaussians, together with the KL divergence as an image similarity measure, can be used for image retrieval with remarkable performance. The efficiency and performance of the proposed KL approximation methods are demonstrated on both simulated data and real image data sets. The experimental results indicate that our proposed approximations outperform previously suggested methods.
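To make the first, matching-based approximation concrete, the sketch below pairs each Gaussian component of one mixture with its closest component in the other, using the closed-form KL divergence between two Gaussians, and combines the per-component divergences with the mixture weights. This is a minimal illustration, not the paper's reference implementation: the specific matching criterion, the function names, and the NumPy-based helpers are assumptions made for the example.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N0 || N1) between two Gaussians."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(cov1_inv @ cov0)
        + diff @ cov1_inv @ diff
        - d
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    )

def kl_gmm_matching(weights_f, mus_f, covs_f, weights_g, mus_g, covs_g):
    """Matching-based approximation of KL(f || g) for two Gaussian mixtures.

    Each component f_i is matched to the component g_j that minimizes
    KL(f_i || g_j) - log(w_g_j); the per-component divergences are then
    weighted by the mixture weights of f. (Matching rule assumed here
    for illustration.)
    """
    total = 0.0
    for w_i, mu_i, cov_i in zip(weights_f, mus_f, covs_f):
        scores = [
            kl_gaussian(mu_i, cov_i, mu_j, cov_j) - np.log(w_j)
            for w_j, mu_j, cov_j in zip(weights_g, mus_g, covs_g)
        ]
        j = int(np.argmin(scores))
        total += w_i * (
            kl_gaussian(mu_i, cov_i, mus_g[j], covs_g[j])
            + np.log(w_i / weights_g[j])
        )
    return total

if __name__ == "__main__":
    # Two toy 2-component, 2-D mixtures (illustrative values only).
    w_f, w_g = np.array([0.6, 0.4]), np.array([0.5, 0.5])
    mus_f = [np.zeros(2), np.ones(2)]
    mus_g = [np.zeros(2) + 0.1, 1.2 * np.ones(2)]
    covs_f = [np.eye(2), 0.5 * np.eye(2)]
    covs_g = [np.eye(2), 0.7 * np.eye(2)]
    print(kl_gmm_matching(w_f, mus_f, covs_f, w_g, mus_g, covs_g))
```

A second approximation based on the unscented transform, as mentioned in the abstract, would instead propagate deterministically chosen sigma points of each component through the log-density ratio; it is not sketched here.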
Original language | English |
---|---|
Pages | 487-493 |
Number of pages | 7 |
DOIs | |
State | Published - 2003 |
Externally published | Yes |
Event | Ninth IEEE International Conference on Computer Vision - Nice, France; 13 Oct 2003 → 16 Oct 2003 |
Conference
Conference | Ninth IEEE International Conference on Computer Vision |
---|---|
Country/Territory | France |
City | Nice |
Period | 13/10/03 → 16/10/03 |