Investigation of the new problem of fast, controllable artistic text style transfer in terms of glyph deformation, and the proposal of a novel bidirectional shape matching framework to solve it
Development of a sketch module that matches the shape of the style to the glyph, transforming a single style image into paired training data at multiple scales and thus enabling robust glyph-style mappings to be learned
A Shape-Matching GAN that transfers text styles, with a scale-controllable module that lets users adjust the stylistic degree of the glyph through a continuous input parameter and generate diversified artistic text in real time
Abstract
Artistic text style transfer is the task of migrating the style from a source image to the target text to create artistic typography. Recent style transfer methods have considered texture control to enhance usability. However, controlling the stylistic degree in terms of shape deformation remains an important open challenge. In this paper, we present the first text style transfer network that allows for real-time control of the crucial stylistic degree of the glyph through an adjustable parameter. Our key contribution is a novel bidirectional shape matching framework to establish an effective glyph-style mapping at various deformation levels without paired ground truth. Based on this idea, we propose a scale-controllable module to empower a single network to continuously characterize the multi-scale shape features of the style image and transfer these features to the target text. The proposed method demonstrates its superiority over previous state-of-the-arts in generating diverse, controllable and high-quality stylized text.
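The abstract describes a single network conditioned on a continuous stylistic-degree parameter. One common way to feed such a scalar into a convolutional generator is to broadcast it as an extra constant input channel; the sketch below illustrates only that conditioning idea (the function name and mechanism are assumptions for illustration, not the paper's actual implementation):

```python
import numpy as np

def condition_on_scale(glyph: np.ndarray, scale: float) -> np.ndarray:
    """Append a constant scale map as an extra channel so one network
    can be conditioned on a continuous deformation level in [0, 1].
    Illustrative sketch only; the paper's mechanism may differ."""
    assert 0.0 <= scale <= 1.0, "scale must lie in [0, 1]"
    c, h, w = glyph.shape
    # Broadcast the scalar to a full-resolution constant map.
    scale_map = np.full((1, h, w), scale, dtype=glyph.dtype)
    return np.concatenate([glyph, scale_map], axis=0)

# The same glyph can be fed with different scales to request
# different stylistic degrees from a single network.
glyph = np.zeros((1, 64, 64), dtype=np.float32)
x_mild = condition_on_scale(glyph, 0.2)  # light deformation
x_bold = condition_on_scale(glyph, 0.9)  # heavy deformation
```

Because the scale enters as ordinary input data rather than a fixed hyperparameter, it can be varied at inference time, which is what makes real-time control of the deformation level possible.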
Journal/Conference
ICCV 2019
Subjects
cs.CV: Computer Vision and Pattern Recognition
Comment
Accepted by ICCV 2019. Code is available at this https URL