Video s3
    Details
    Presenter(s)
    Wenhan Yang (Peking University)
    Author(s)
    Shixing Yu (Peking University)
    Yiyang Ma (Peking University)
    Wenhan Yang (Peking University)
    Wei Xiang (Bigo Technology)
    Jiaying Liu (Peking University)
    Abstract

    In this paper, we explore a more general form of video frame interpolation: interpolation at an arbitrary time step. To this end, we process different time steps in a unified way with adaptively generated convolutional kernels, with the help of meta-learning. Specifically, we develop a dual meta-learned frame interpolation framework that synthesizes intermediate frames under the guidance of context information and optical flow, while taking the time step as side information. Extensive qualitative and quantitative evaluations demonstrate that our method not only achieves superior performance to state-of-the-art frame interpolation approaches but also offers the extended capability of interpolating at an arbitrary time step.
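
    The abstract describes the method only at a high level. As a rough illustration of the core idea of time-conditioned, adaptively generated kernels, the sketch below shows how a small meta-network could map the scalar time step t to the weights of a convolutional kernel that is then applied to fused context/flow features. All module names, shapes, and the PyTorch implementation are illustrative assumptions, not the authors' actual architecture.

```python
# Minimal, hypothetical sketch (not the paper's architecture): a meta-network
# maps the interpolation time step t in (0, 1) to convolutional kernel weights,
# which are then applied to a fused feature map standing in for the
# context-information and optical-flow guidance mentioned in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeConditionedKernelGenerator(nn.Module):
    """Generates a per-time-step conv kernel from the scalar time step t."""

    def __init__(self, in_ch=64, out_ch=64, k=3, hidden=256):
        super().__init__()
        self.in_ch, self.out_ch, self.k = in_ch, out_ch, k
        # Small MLP that outputs all kernel weights for a given t.
        self.mlp = nn.Sequential(
            nn.Linear(1, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, out_ch * in_ch * k * k),
        )

    def forward(self, feat, t):
        # feat: (B, in_ch, H, W); t: Python float in (0, 1)
        weight = self.mlp(feat.new_tensor([[t]]))           # (1, out*in*k*k)
        weight = weight.view(self.out_ch, self.in_ch, self.k, self.k)
        return F.conv2d(feat, weight, padding=self.k // 2)  # time-adaptive conv


if __name__ == "__main__":
    gen = TimeConditionedKernelGenerator()
    fused_features = torch.randn(1, 64, 32, 32)  # stand-in for context + flow features
    out_mid = gen(fused_features, t=0.5)         # temporal midpoint
    out_quarter = gen(fused_features, t=0.25)    # arbitrary time step
    print(out_mid.shape, out_quarter.shape)
```

    Because the same generator handles every t, a single model can, in principle, be queried at any intermediate time step rather than being trained for a fixed midpoint.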

    Slides
    • Meta-Interpolation: Time-Arbitrary Frame Interpolation via Dual Meta-Learning (PDF)