Enhancing GANs With MMD Neural Architecture Search, PMish Activation Function, and Adaptive Rank Decomposition
Blog Article
Generative Adversarial Networks (GANs) have gained considerable attention owing to their impressive ability to generate high-quality, realistic images from a desired data distribution. This research introduces advancements in GANs by developing an improved activation function, a novel training strategy, and an adaptive rank decomposition method to compress the network. The proposed activation function, called Parametric Mish (PMish), automatically adjusts a trainable parameter to control the smoothness and shape of the activation function.
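To make the idea concrete, here is a minimal NumPy sketch of a parametric Mish-style activation. The exact parameterization used in the paper is not given above, so the form below, where a trainable parameter `beta` scales the softplus input and rescales the output so that `beta = 1` recovers standard Mish (`x * tanh(softplus(x))`), is an assumption for illustration only; the function name `pmish` is likewise hypothetical.

```python
import numpy as np

def softplus(x):
    # numerically stable softplus: log(1 + exp(x))
    return np.logaddexp(0.0, x)

def pmish(x, beta=1.0):
    # Hypothetical PMish form: x * tanh(softplus(beta * x)) / beta.
    # At beta = 1 this reduces to Mish: x * tanh(softplus(x)).
    # In a network, beta would be a trainable parameter learned
    # alongside the weights, shaping the activation's smoothness.
    return x * np.tanh(softplus(beta * x)) / beta
```

In a real implementation `beta` would be registered as a learnable parameter (e.g. one per layer or per channel) and updated by backpropagation together with the rest of the generator and discriminator weights.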
Our method employs a Neural Architecture Search (NAS) to discover the optimal architecture for image generation while using the Maximum Mean Discrepancy (MMD) repulsive loss for adversarial training. The proposed novel training strategy improves performance by progressively increasing the upper bound of the bounded MMD-GAN repulsive loss. Finally, the proposed Adaptive Rank Decomposition (ARD) method reduces the complexity of the network with minimal impact on its generative performance, thus enabling efficient deployment on resource-limited platforms.
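The loss above builds on the Maximum Mean Discrepancy, a kernel-based distance between two sample sets. The sketch below shows only the standard unbiased MMD² estimator with a Gaussian (RBF) kernel, not the repulsive rearrangement of its terms or the bounded variant whose upper bound the training strategy progressively increases; the function names and the fixed bandwidth `sigma` are illustrative choices.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    # pairwise Gaussian (RBF) kernel between rows of a and b
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2_unbiased(x, y, sigma=1.0):
    # unbiased estimate of squared MMD between samples x and y:
    # off-diagonal means of k(x,x) and k(y,y), minus 2 * mean of k(x,y)
    m, n = len(x), len(y)
    kxx = rbf_kernel(x, x, sigma)
    kyy = rbf_kernel(y, y, sigma)
    kxy = rbf_kernel(x, y, sigma)
    term_xx = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))
    term_yy = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
    return term_xx + term_yy - 2.0 * kxy.mean()
```

In an MMD-GAN, `x` would be discriminator features of real images and `y` those of generated images; the repulsive formulation reweights the attractive and repulsive kernel terms in the discriminator's objective rather than changing this estimator itself.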
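As a rough illustration of rank-based compression, the sketch below factors a weight matrix with an SVD and picks the rank adaptively as the smallest one retaining a target fraction of the squared singular-value energy. This energy criterion and the function name `low_rank_factor` are assumptions for illustration; the paper's ARD method may select ranks by a different rule.

```python
import numpy as np

def low_rank_factor(w, energy=0.95):
    # Factor w (m x n) into u_r (m x r) @ v_r (r x n), choosing r
    # adaptively as the smallest rank whose singular values retain
    # at least `energy` of the total squared spectral energy.
    u, s, vt = np.linalg.svd(w, full_matrices=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(cum, energy) + 1)
    u_r = u[:, :r] * s[:r]   # absorb singular values into left factor
    v_r = vt[:r, :]
    return u_r, v_r, r
```

Replacing one `m x n` layer with the two factors stores `r * (m + n)` parameters instead of `m * n`, which is a saving whenever the chosen rank `r` is small relative to the layer's dimensions.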
The effectiveness of these advancements is rigorously tested on standard benchmark datasets such as CIFAR-10, CIFAR-100, STL-10, and CelebA, where significant improvements over existing techniques are demonstrated. The implementation code is available at: https://github.com/PrasannaPulakurthi/MMD-PMish-NAS.