
      Robots That Use Language


          Abstract

          This article surveys the use of natural language in robotics from a robotics point of view. To use human language, robots must map words to aspects of the physical world, mediated by the robot's sensors and actuators. This problem differs from other natural language processing domains due to the need to ground the language to noisy percepts and physical actions. Here, we describe central aspects of language use by robots, including understanding natural language requests, using language to drive learning about the physical world, and engaging in collaborative dialogue with a human partner. We describe common approaches, roughly divided into learning methods, logic-based methods, and methods that focus on questions of human–robot interaction. Finally, we describe several application domains for language-using robots.
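The grounding problem the abstract describes — mapping words to physical objects and actions, mediated by the robot's (noisy) percepts — can be illustrated with a deliberately minimal sketch. Everything here is hypothetical and illustrative, not from the article: the percept format, the action vocabulary, and the `ground_command` function are invented for this example.

```python
# Illustrative sketch of language grounding (hypothetical, not the article's method):
# map a natural-language command to an action plus a perceived object.

# Hypothetical percepts: attribute dictionaries produced by the robot's sensors.
PERCEPTS = [
    {"id": "obj1", "color": "red", "shape": "block"},
    {"id": "obj2", "color": "blue", "shape": "ball"},
]

# Hypothetical verb-to-action vocabulary.
ACTIONS = {"pick": "PICK_UP", "grab": "PICK_UP", "push": "PUSH"}

def ground_command(command, percepts):
    """Ground a command like 'pick up the red block' to (action, object id)."""
    words = command.lower().split()
    # Find the first word that names a known action.
    action = next((ACTIONS[w] for w in words if w in ACTIONS), None)
    # Score each perceived object by how many of its attributes the command mentions.
    def score(obj):
        return sum(1 for value in obj.values() if value in words)
    target = max(percepts, key=score)
    if action is None or score(target) == 0:
        return None  # the request cannot be grounded in the current percepts
    return action, target["id"]
```

For example, `ground_command("pick up the red block", PERCEPTS)` resolves to `("PICK_UP", "obj1")`, while a command that mentions no known action or visible object fails to ground and returns `None` — a toy version of the mapping from words to sensors and actuators that the survey examines.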


Most cited references (170)


          Long Short-Term Memory

          Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, back propagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
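The gating mechanism this abstract describes can be sketched for a single scalar cell. This is a rough, hypothetical illustration only (real LSTMs use learned weight matrices over vectors): the weight layout and the `lstm_step` function are invented for the example.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM step for a single scalar cell (illustrative only).

    w maps each gate name to (w_x, w_h, b): scalar input weight,
    recurrent weight, and bias for that gate.
    """
    gates = {}
    for g in ("i", "f", "o", "g"):
        wx, wh, b = w[g]
        pre = wx * x + wh * h_prev + b
        # Candidate value "g" uses tanh; input/forget/output gates use sigmoid.
        gates[g] = math.tanh(pre) if g == "g" else sigmoid(pre)
    # Constant error carousel: the cell state is updated additively,
    # so error can flow through unchanged when the forget gate is open.
    c = gates["f"] * c_prev + gates["i"] * gates["g"]
    h = gates["o"] * math.tanh(c)
    return h, c
```

With the forget gate saturated open and the input gate closed, the cell state passes through each step unchanged — the "constant error flow" that lets the network bridge long time lags.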

            ImageNet: A large-scale hierarchical image database


              Deep visual-semantic alignments for generating image descriptions


                Author and article information

                Journal: Annual Review of Control, Robotics, and Autonomous Systems (Annu. Rev. Control Robot. Auton. Syst.)
                Publisher: Annual Reviews
                ISSN: 2573-5144
                Published: May 03 2020
                Volume: 3, Issue: 1, Pages: 25-55
                Affiliations
                [1] Department of Computer Science, Brown University, Providence, Rhode Island 02912, USA
                [2] School of Interactive Computing, Georgia Institute of Technology, Atlanta, Georgia 30332, USA
                [3] Sibley School of Mechanical and Aerospace Engineering, Cornell University, Ithaca, New York 14853, USA
                [4] Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, Maryland 21250, USA
                DOI: 10.1146/annurev-control-101119-071628
                © 2020
