Show simple item record

dc.creator: Gaffin, Douglas D.
dc.creator: Brayfield, Brad P.
dc.date.accessioned: 2016-06-20T19:48:55Z
dc.date.available: 2016-06-20T19:48:55Z
dc.date.issued: 2016-04-27
dc.identifier.citation: Gaffin DD, Brayfield BP (2016) Autonomous Visual Navigation of an Indoor Environment Using a Parsimonious, Insect Inspired Familiarity Algorithm. PLoS ONE 11(4): e0153706. doi:10.1371/journal.pone.0153706
dc.identifier.uri: https://hdl.handle.net/11244/42358
dc.description.abstract: The navigation of bees and ants from hive to food and back has captivated people for more than a century. Recently, the Navigation by Scene Familiarity Hypothesis (NSFH) has been proposed as a parsimonious approach that is congruent with the limited neural elements of these insects’ brains. In the NSFH approach, an agent completes an initial training excursion, storing images along the way. To retrace the path, the agent scans the area and compares the current scenes to those previously experienced. By turning and moving to minimize the pixel-by-pixel differences between encountered and stored scenes, the agent is guided along the path without having memorized the sequence. An important premise of the NSFH is that the visual information of the environment is adequate to guide navigation without aliasing. Here we demonstrate that an image landscape of an indoor setting possesses ample navigational information. We produced a visual landscape of our laboratory and part of the adjoining corridor consisting of 2816 panoramic snapshots arranged in a grid at 12.7-cm centers. We show that pixel-by-pixel comparisons of these images yield robust translational and rotational visual information. We also produced a simple algorithm that tracks previously experienced routes within our lab based on an insect-inspired scene familiarity approach and demonstrate that adequate visual information exists for an agent to retrace complex training routes, including those where the path’s end is not visible from its origin. We used this landscape to systematically test the interplay of sensor morphology, angles of inspection, and similarity threshold with the recapitulation performance of the agent. Finally, we compared the relative information content and chance of aliasing within our visually rich laboratory landscape to scenes acquired from indoor corridors with more repetitive scenery.
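The scene-familiarity mechanism the abstract describes (compare the current panoramic view pixel-by-pixel against stored training views, and turn toward the heading that minimizes the difference) can be sketched minimally as below. This is an illustrative reconstruction, not the authors' published code; the function names, the sum-of-absolute-differences metric, and the 36-heading scan resolution are assumptions for the sketch.

```python
import numpy as np

def familiarity(view, stored_views):
    """Smallest pixel-by-pixel difference between the current view and
    any stored training view (lower = more familiar)."""
    return min(np.abs(view - v).sum() for v in stored_views)

def best_heading(panorama, stored_views, n_headings=36):
    """Scan candidate headings by rotating the panorama column-wise and
    return (heading in degrees, familiarity score) for the most
    familiar rotation."""
    width = panorama.shape[1]
    scores = []
    for k in range(n_headings):
        shift = k * width // n_headings
        rotated = np.roll(panorama, shift, axis=1)
        scores.append(familiarity(rotated, stored_views))
    best = int(np.argmin(scores))
    return best * 360 / n_headings, scores[best]
```

An agent retracing a route would repeat this scan at each step, turn to the winning heading, and move forward, with no need to store the sequence in which the training views were acquired.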
dc.description.sponsorship: The authors received funding from a Research Council Faculty Investment Grant from the University of Oklahoma.
dc.format.extent: 25 pages
dc.format.extent: 12,933,802 bytes
dc.format.medium: application.pdf
dc.language: en_US
dc.relation.requires: Adobe Acrobat Reader
dc.subject.lcsh: Robot vision
dc.subject.lcsh: Insects -- Orientation
dc.title: Autonomous visual navigation of an indoor environment using a parsimonious, insect inspired familiarity algorithm
dc.type: Article
dc.type: Dataset
dc.type: text
dc.type: computer dataset
dc.description.peerreview: Yes
dc.identifier.doi: 10.1371/journal.pone.0153706
ou.group: College of Arts and Sciences::Department of Biology


Files in this item


