GANEx: A complete pipeline of training, inference and benchmarking GAN experiments
Conference object
Accepted version
Date
2019-10-21
Original version
Thambawita, Hammer, Riegler, Halvorsen: GANEx: A complete pipeline of training, inference and benchmarking GAN experiments. In: Gurrin CG, Jónsson BT, Peteri R (eds.): Proceedings of the International Conference on Content-Based Multimedia Indexing (CBMI 2019). IEEE conference proceedings, 2019. https://dx.doi.org/10.1109/CBMI.2019.8877387

Abstract
Deep learning (DL) is one of the standard methods in the field of multimedia research to perform data classification, detection, segmentation and generation. Within DL, generative adversarial networks (GANs) represent a new and highly popular branch of methods. GANs have the capability to generate, from random noise or conditional input, new data realizations within the dataset population. While generation is popular and highly useful in itself, GANs can also be used to improve supervised DL. GAN-based approaches can, for example, perform segmentation or create synthetic data for training other DL models. The latter is especially interesting in domains where little training data exists, such as medical multimedia. In this respect, performing a series of experiments involving GANs can be very time consuming due to the lack of tools that support the whole pipeline, such as structured training, testing and tracking of different architectures and configurations. Moreover, the success of generative models is highly dependent on hyper-parameter optimization and statistical analysis in the design and fine-tuning stages. In this paper, we present a new tool called GANEx for making the whole pipeline of training, inference and benchmarking GANs faster, more efficient and more structured. The tool consists of a special library called FastGAN, which allows designing generative models very fast. Moreover, GANEx has a graphical user interface to support structured experimenting, quick hyper-parameter configuration and output analysis. The presented tool is not limited to a specific DL framework and can therefore even be used to compare performance across frameworks.
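The structured experimenting and quick hyper-parameter configuration described above amount to enumerating, training and tracking many GAN configurations. A minimal sketch of such a sweep in plain Python follows; the parameter names and values are illustrative assumptions, not GANEx's or FastGAN's actual configuration schema:

```python
import itertools

def experiment_grid(search_space):
    """Yield one configuration dict per point in the Cartesian
    product of the given hyper-parameter value lists."""
    keys = sorted(search_space)  # fixed order for reproducible sweeps
    for values in itertools.product(*(search_space[k] for k in keys)):
        yield dict(zip(keys, values))

# Hypothetical GAN hyper-parameters to sweep over (illustrative only).
space = {
    "learning_rate": [1e-4, 2e-4],
    "latent_dim": [64, 128],
    "batch_size": [32],
}

configs = list(experiment_grid(space))
# 2 * 2 * 1 = 4 configurations, each of which would be trained,
# benchmarked and logged as one experiment run.
```

Each emitted dict could then be passed to a training routine and logged alongside its benchmark scores, which is the bookkeeping a pipeline tool automates.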
Publisher
IEEE

Series
International Workshop on Content-Based Multimedia Indexing, CBMI

Related items
Showing items related by title, author, creator and subject.
-
Why don't all high-trust networks achieve strong network benefits? A case-based exploration of cooperation in Norwegian SME networks
Gausdal, Anne Haugen; Svare, Helge; Möllering, Guido (Journal article; Peer reviewed, 2016) This paper explores the interactions between three focal constructs: network trust, network cooperation and network benefits. While positive interactions between these constructs are generally recognised, a deeper understanding ...
-
A neuro-inspired general framework for the evolution of stochastic dynamical systems: Cellular automata, random Boolean networks and echo state networks towards criticality
Pontes-Filho, Sidney; Lind, Pedro; Yazidi, Anis; Zhang, Jianhua; Hammer, Hugo Lewi; Mello, Gustavo; Sandvig, Ioanna; Tufte, Gunnar; Nichele, Stefano (Cognitive Neurodynamics; Journal article; Peer reviewed, 2020-06-11) Although deep learning has recently increased in popularity, it suffers from various problems including high computational complexity, energy-greedy computation, and lack of scalability, to mention a few. In this paper, ...
-
Within-network connectivity in the salience network after attention bias modification training in residual depression: Report from a preregistered clinical trial
Hilland, Eva; Landrø, Nils Inge; Harmer, Catherine; Maglanoc, Luigi Angelo; Jonassen, Rune (Frontiers in Human Neuroscience, December 2018, Volume 12, Article 508; Journal article; Peer reviewed, 2018-12-21) Alterations in resting state networks (RSNs) are associated with emotional and attentional control difficulties in depressed individuals. Attentional bias modification (ABM) training may lead to more adaptive emotional ...