RenderGAN: Generating Realistic Labeled Data

Article type: Methods article

Abstract

Deep Convolutional Neural Networks (DCNNs) show remarkable performance on many computer vision tasks. Due to their large parameter space, however, they require many labeled samples when trained in a supervised setting, and the cost of annotating data manually can render the use of DCNNs infeasible. We present a novel framework called RenderGAN that can generate large amounts of realistic, labeled images by combining a 3D model with the Generative Adversarial Network (GAN) framework. In our approach, image augmentations (e.g., lighting, background, and detail) are learned from unlabeled data such that the generated images are strikingly realistic while preserving the labels known from the 3D model. We apply the RenderGAN framework to generate images of barcode-like markers attached to honeybees. Training a DCNN on data generated by the RenderGAN yields considerably better performance than training it on various baselines.
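
The abstract outlines the core mechanism: a fixed 3D model renders idealized, fully labeled marker images, and learned augmentation stages (lighting, background, detail) are trained adversarially against unlabeled real images so that the labels from the renderer carry over to realistic outputs. The following is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation; the names (LearnedAugmentations, render, training_step), the toy renderer, and the network sizes are illustrative assumptions.

# Hypothetical sketch of the RenderGAN idea (not the authors' released code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnedAugmentations(nn.Module):
    """Label-preserving augmentation stages (blur/lighting/background/detail),
    approximated here by one small convolutional network."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1), nn.Tanh(),
        )

    def forward(self, rendered):
        # Residual formulation: the augmentations only perturb the rendered
        # image, so the labels encoded by the renderer are preserved.
        return torch.clamp(rendered + 0.3 * self.net(rendered), -1.0, 1.0)

class Discriminator(nn.Module):
    """Distinguishes real unlabeled images from augmented rendered images."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Flatten(), nn.Linear(32 * 16 * 16, 1),
        )

    def forward(self, x):
        return self.net(x)

def render(labels):
    """Stand-in for the fixed 3D marker model: maps sampled label parameters
    to idealized 64x64 images (in RenderGAN, a renderer for the bee tags)."""
    b = labels.size(0)
    tiled = labels.view(b, 1, 3, 4)  # 12 label parameters -> coarse 3x4 grid
    return F.interpolate(tiled, size=(64, 64), mode="nearest")

def training_step(aug, disc, real_images, opt_g, opt_d,
                  bce=nn.BCEWithLogitsLoss()):
    # Sample labels, render them, and apply the learned augmentations.
    labels = torch.rand(real_images.size(0), 12) * 2 - 1
    fake = aug(render(labels))
    # Discriminator update: real vs. augmented-rendered images.
    opt_d.zero_grad()
    d_loss = (bce(disc(real_images), torch.ones(real_images.size(0), 1)) +
              bce(disc(fake.detach()), torch.zeros(real_images.size(0), 1)))
    d_loss.backward()
    opt_d.step()
    # Generator update: only the augmentation stages are trained;
    # the renderer itself stays fixed.
    opt_g.zero_grad()
    g_loss = bce(disc(fake), torch.ones(real_images.size(0), 1))
    g_loss.backward()
    opt_g.step()
    return labels, fake.detach()  # labeled synthetic images

In this sketch, the (labels, image) pairs returned by training_step correspond to the labeled synthetic data that, per the abstract, is then used to train a supervised DCNN on the marker-decoding task.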

Most cited references (34 in total; most-cited shown below)

• ImageNet Large Scale Visual Recognition Challenge
• Adam: A Method for Stochastic Optimization
• Tracking individuals shows spatial fidelity is a key regulator of ant social organization.
  Ants live in organized societies with a marked division of labor among workers, but little is known about how this division of labor is generated. We used a tracking system to continuously monitor individually tagged workers in six colonies of the ant Camponotus fellah over 41 days. Network analyses of more than 9 million interactions revealed three distinct groups that differ in behavioral repertoires. Each group represents a functional behavioral unit, with workers moving from one group to the next as they age. The rate of interactions was much higher within groups than between groups. The precise information on the spatial and temporal distribution of all individuals allowed us to calculate the expected rates of within- and between-group interactions. These values suggest that the network of interaction within colonies is primarily mediated by age-induced changes in the spatial location of workers.

Author and article information

Journal
Frontiers in Robotics and AI (Front. Robot. AI)
Publisher: Frontiers Media S.A.
ISSN: 2296-9144
Published: 08 June 2018
Volume: 5, Article: 66

Affiliations
Fachbereich Mathematik und Informatik, Freie Universität Berlin, Berlin, Germany

Author notes

Edited by: Guanghui Wang, University of Kansas, United States

Reviewed by: Zhichao Lian, Nanjing University of Science and Technology, China; Xinlei Chen, Facebook, United States

*Correspondence: Leon Sixt, leon.sixt@fu-berlin.de

This article was submitted to Vision Systems Theory, Tools and Applications, a section of the journal Frontiers in Robotics and AI.

Article
DOI: 10.3389/frobt.2018.00066
PMC: 7805882
Record ID: 6013285e-90c2-4027-861c-207f023ce008
Copyright © 2018 Sixt, Wild and Landgraf.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 01 February 2018
Accepted: 17 May 2018

Page count
Figures: 7, Tables: 2, Equations: 6, References: 34, Pages: 9, Words: 5851

Funding
Funded by: Freie Universität (10.13039/501100007537)

Categories
Robotics and AI; Methods

Keywords: generative adversarial networks, unsupervised learning, social insects, markers, deep learning
