Title
Learning Linear Transformations for Fast Image and Video Style Transfer
Abstract
Given a random pair of images, a universal style transfer method extracts the feel from a reference image to synthesize an output based on the look of a content image. Recent algorithms based on second-order statistics, however, are either computationally expensive or prone to generate artifacts due to the trade-off between image quality and run-time performance. In this work, we present an approach for universal style transfer that learns the transformation matrix in a data-driven fashion. Our algorithm is efficient yet flexible to transfer different levels of styles with the same auto-encoder network. It also produces stable video style transfer results due to the preservation of the content affinity. In addition, we propose a linear propagation module to enable a feed-forward network for photo-realistic style transfer. We demonstrate the effectiveness of our approach on three tasks: artistic style, photo-realistic and video style transfer, with comparisons to state-of-the-art methods. The project website can be found at https://sites.google.com/view/linear-style-transfer-cvpr19.
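As a rough illustration of the core idea described in the abstract, stylization can be cast as a single learned linear map applied to encoder features. The sketch below is a minimal, assumed example: the function name, tensor shapes, and the identity-plus-noise matrix are illustrative only; in the paper the transformation matrix is predicted from content and style features by a learned module, which is not shown here.

```python
import torch

def apply_linear_transform(content_feat, T):
    """Apply a (C x C) transformation matrix to encoder features.

    content_feat: tensor of shape (C, H, W), e.g. auto-encoder features
    of the content image.
    T: tensor of shape (C, C), a style-specific linear map (assumed to be
    produced elsewhere; here it is simply passed in).
    """
    c, h, w = content_feat.shape
    flat = content_feat.reshape(c, h * w)   # flatten spatial dimensions
    transferred = T @ flat                  # same linear map at every pixel
    return transferred.reshape(c, h, w)

# Toy usage with random tensors (shapes chosen only for illustration).
feat = torch.randn(256, 32, 32)
T = torch.eye(256) + 0.01 * torch.randn(256, 256)
stylized_feat = apply_linear_transform(feat, T)
print(stylized_feat.shape)  # torch.Size([256, 32, 32])
```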
Year: 2019
DOI: 10.1109/CVPR.2019.00393
Venue: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019)
Field: Computer vision, Computer science, Artificial intelligence, Linear map
DocType: Conference
ISSN: 1063-6919
Citations: 3
PageRank: 0.40
References: 0
Authors: 4
Name              Order  Citations  PageRank
Xueting Li        1      14         3.22
Sifei Liu         2      227        17.54
Jan Kautz         3      3615       198.77
Yang Ming-Hsuan   4      15303      620.69