Video s3
    Details
    Presenter(s)
    S. Indrapriyadarsini
    Affiliation
    Shizuoka University
    Country
    Japan
    Abstract

    With the rising need for high-performance circuits, electronic design automation has gained significant attention. Analog circuit design optimization is complex, mainly because of its large search space and high non-linearity. Neural networks have been shown to be effective in solving highly non-linear problems, and the training algorithm plays an important role in their performance. In this paper, we propose a second-order adaptive modified Nesterov’s accelerated quasi-Newton (amNAQ) method for neural network training. The performance of the proposed method is evaluated on transistor sizing for a two-stage operational amplifier. The results indicate that the proposed method can efficiently determine the transistor sizes while satisfying the desired specifications, and that it converges faster and to better solutions than the first-order SGD, AdaGrad, and Adam methods and the second-order BFGS method.
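
    The amNAQ update rule itself, including its adaptive momentum schedule, is not reproduced on this page. As a rough, non-authoritative sketch, the Python snippet below illustrates the generic Nesterov-accelerated quasi-Newton step that such methods build on: the gradient and a BFGS-style inverse-Hessian update are evaluated at a momentum look-ahead point. The fixed momentum coefficient mu, the Armijo backtracking line search, and the toy quadratic loss are illustrative assumptions, not the paper's formulation or its circuit-sizing network.

    import numpy as np

    def loss(w, A, b):
        # Convex quadratic 0.5*w'Aw - b'w standing in for the training loss
        return 0.5 * w @ A @ w - b @ w

    def grad(w, A, b):
        return A @ w - b

    def armijo(look, d, g, A, b, alpha=1.0, c=1e-4, shrink=0.5):
        # Backtracking line search at the look-ahead point (a common choice;
        # the paper's own step-size rule may differ)
        f0, gd = loss(look, A, b), g @ d
        while loss(look + alpha * d, A, b) > f0 + c * alpha * gd and alpha > 1e-8:
            alpha *= shrink
        return alpha

    def naq_step(w, v, H, A, b, mu=0.8):
        # One Nesterov-accelerated quasi-Newton step with a fixed momentum mu
        look = w + mu * v                        # Nesterov look-ahead point
        g = grad(look, A, b)
        d = -H @ g                               # quasi-Newton search direction
        alpha = armijo(look, d, g, A, b)
        v_new = mu * v + alpha * d               # accelerated update vector
        w_new = w + v_new                        # equals look + alpha * d
        s, y = w_new - look, grad(w_new, A, b) - g
        if s @ y > 1e-10:                        # curvature guard before the BFGS update
            rho, I = 1.0 / (s @ y), np.eye(len(w))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        return w_new, v_new, H

    if __name__ == "__main__":
        # Toy demo: minimize a random well-conditioned quadratic
        rng = np.random.default_rng(0)
        M = rng.standard_normal((5, 5))
        A, b = M @ M.T + 5.0 * np.eye(5), rng.standard_normal(5)
        w, v, H = np.zeros(5), np.zeros(5), np.eye(5)
        for _ in range(40):
            w, v, H = naq_step(w, v, H, A, b)
        print("final loss  :", loss(w, A, b))
        print("optimal loss:", loss(np.linalg.solve(A, b), A, b))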

    Slides