Kernelized Similarity Learning and Embedding for Dynamic Texture Synthesis

Publisher:
IEEE - Institute of Electrical and Electronics Engineers
Publication Type:
Journal Article
Citation:
IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 53, no. 2, pp. 824-837, 2023
Issue Date:
2023-02-01
Dynamic texture (DT) exhibits statistical stationarity in the spatial domain and stochastic repetitiveness in the temporal dimension, indicating that different frames of a DT share a high similarity correlation, which serves as critical prior knowledge. However, existing methods cannot effectively learn a synthesis model for high-dimensional DT from a small number of training samples. In this article, we propose a novel DT synthesis method that makes full use of similarity as prior knowledge to address this issue. Our method is based on the proposed kernel similarity embedding, which not only mitigates the high-dimensionality and small-sample issues but also models nonlinear feature relationships. Specifically, we first put forward two hypotheses that are essential for the DT model to generate new frames using similarity correlations. Then, we integrate kernel learning and the extreme learning machine into a unified synthesis model to learn kernel similarity embeddings for representing DTs. Extensive experiments on DT videos collected from the Internet and two benchmark datasets, i.e., Gatech Graphcut Textures and DynTex, demonstrate that the learned kernel similarity embeddings provide discriminative representations for DTs. Further, our method preserves the long-term temporal continuity of the synthesized DT sequences with excellent sustainability and generalization. Meanwhile, it generates realistic DT videos at higher speed and with lower computational cost than current state-of-the-art methods.
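To make the idea of a kernel-similarity-based, ELM-style synthesizer concrete, the following is a minimal sketch, not the paper's exact formulation: it assumes a Gaussian (RBF) kernel, a kernel ridge-regression-style dual solution as a stand-in for the kernelized extreme learning machine, and flattened frames as vectors; all function names and parameters (sigma, C) are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Pairwise Gaussian (RBF) kernel between rows of A and rows of B."""
    sq = (
        np.sum(A**2, axis=1)[:, None]
        + np.sum(B**2, axis=1)[None, :]
        - 2.0 * A @ B.T
    )
    return np.exp(-sq / (2.0 * sigma**2))

def fit_kernel_synthesizer(frames, sigma=1.0, C=1e3):
    """Fit a one-step frame predictor in the kernel (similarity) space.

    frames: (T, d) array of flattened video frames.
    Returns the training inputs and dual weights used for prediction.
    """
    X, Y = frames[:-1], frames[1:]               # predict frame t+1 from frame t
    K = gaussian_kernel(X, X, sigma)             # similarity matrix over training frames
    alpha = np.linalg.solve(K + np.eye(len(X)) / C, Y)  # regularized dual solution
    return X, alpha

def synthesize(X, alpha, seed_frame, n_new, sigma=1.0):
    """Autoregressively roll out n_new frames starting from a seed frame."""
    out, frame = [], seed_frame
    for _ in range(n_new):
        k = gaussian_kernel(frame[None, :], X, sigma)  # similarity embedding of current frame
        frame = (k @ alpha)[0]                         # next-frame prediction
        out.append(frame)
    return np.stack(out)

# Toy usage on random "frames" (64x64 pixels, 50 training frames).
rng = np.random.default_rng(0)
frames = rng.standard_normal((50, 64 * 64))
X, alpha = fit_kernel_synthesizer(frames, sigma=50.0, C=1e3)
new_frames = synthesize(X, alpha, frames[-1], n_new=10, sigma=50.0)
print(new_frames.shape)  # (10, 4096)
```

Because prediction only requires kernel evaluations against the stored training frames, the model size scales with the number of frames rather than the pixel dimension, which is one plausible reading of why a kernel similarity embedding suits the small-sample, high-dimensional setting described above.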