
    From ScienceDaily@1337:3/111 to All on Tue Jan 19 21:30:40 2021
    Appreciating a flower's texture, color, and shape leads to better drone landings

    Date:
    January 19, 2021
    Source:
    Delft University of Technology
    Summary:
    Researchers present an optical flow-based learning process that
    allows robots to estimate distances through the visual appearance
    (shape, color, texture) of the objects in view. This artificial
    intelligence (AI)-based learning strategy increases the navigation
    skills of small flying drones and entails a new hypothesis on
    insect intelligence.



    FULL STORY ==========================================================================
    If you ever saw a honeybee hopping elegantly from flower to flower or
    avoiding you as you passed by, you may have wondered how such a tiny
    insect has such perfect navigation skills. These flying insects' skills
    are partially explained by the concept of optical flow: they perceive
    the speed with which objects move through their field of view.
    Robotics researchers have tried to mimic these strategies on flying
    robots, but with limited success.


    ==========================================================================
    A team of researchers from TU Delft and the Westphalian University
    of Applied Sciences therefore presents an optical flow-based
    learning process that allows robots to estimate distances through
    the visual appearance (shape, color, texture) of the objects in
    view. This artificial intelligence (AI)-based learning strategy
    increases the navigation skills of small flying drones and entails
    a new hypothesis on insect intelligence. The article is published
    today in Nature Machine Intelligence.

    How do honeybees land on flowers or avoid obstacles? One would expect
    such questions to be mostly of interest to biologists. However, the rise
    of small electronics and robotic systems has also made these questions
    relevant to robotics and Artificial Intelligence (AI). Small flying
    robots for example are extremely restricted in terms of the sensors
    and processing that they can carry onboard. If these robots are to be
    as autonomous as the much larger self-driving cars, they will have to
    use an extremely efficient type of artificial intelligence -- similar
    to the highly developed intelligence possessed by flying insects.

    Optical flow

    One of the main tricks up the insect's sleeve is the extensive use
    of 'optical flow': the way in which objects move through their view.
    They use it to land on flowers and avoid obstacles or predators.
    Insects use surprisingly simple and elegant optical flow strategies
    to tackle complex tasks. For example, for landing, honeybees use the
    optical flow "divergence," which captures how quickly things get
    bigger in view. If a honeybee were to fall to the ground, this
    divergence would keep increasing, with for example the grass
    becoming bigger in view ever faster. However, while landing,
    honeybees employ a strategy of keeping the divergence constant by
    slowing down. The result is that they make smooth, soft landings.
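
    To make the constant-divergence strategy concrete, here is a minimal
    Python simulation sketch (not the authors' code). It assumes the
    divergence can be modeled as D = v / h (descent speed over height)
    and uses a simple proportional controller; the names and gains are
    illustrative only.

        # Toy constant-divergence landing, assuming D = v / h.
        def simulate_landing(h=10.0, v=0.0, d_set=0.3, gain=2.0, dt=0.01):
            t = 0.0
            while h > 0.01 and t < 60.0:
                divergence = v / h                     # observed divergence
                v += gain * (d_set - divergence) * dt  # track the setpoint
                h -= v * dt                            # descend
                t += dt
            return t, v

        t, v = simulate_landing()
        print(f"touched down after {t:.1f} s at {v:.3f} m/s")

    Because the descent speed must shrink in proportion to the height to
    hold the divergence constant, the drone approaches the ground ever
    more slowly and touches down gently, as the honeybees do.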

    "Our work on optical flow control started from enthusiasm about the
    elegant, simple strategies employed by flying insects" says Guido de
    Croon, professor of Bio-inspired Micro Air Vehicles and first author
    on the article. "However, developing the control methods to actually
    implement these strategies in flying robots turned out to be far from
    trivial. For example, our flying robots would not actually land, but
    they started to oscillate, continuously going up and down, just above
    the landing surface." Fundamental limitations


    ==========================================================================
    Fundamental limitations

    Optical flow has two fundamental limitations that have been widely
    described in the growing literature on bio-inspired robotics. The
    first is that optical flow only provides mixed information on
    distances and velocities -- and not on distance or velocity
    separately. To illustrate, if there are two landing drones and one
    of them flies twice as high and twice as fast as the other, then
    they experience exactly the same optical flow. However, for good
    control these two drones should actually react differently to
    deviations in the optical flow divergence. If a drone does not adapt
    its reactions to its height when landing, it will never arrive and
    instead starts to oscillate above the landing surface. Second, for
    obstacle avoidance it is very unfortunate that in the direction in
    which a robot is moving, the optical flow is very small. This means
    that in that direction, optical flow measurements are noisy and
    hence provide very little information on the presence of obstacles.
    Hence, the most important obstacles -- the ones that the robot is
    moving towards -- are actually the hardest ones to detect!
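
    The ambiguity is easy to verify with a toy calculation (a sketch
    assuming, as above, that the divergence can be modeled as D = v / h;
    the numbers are invented for illustration):

        # Two hypothetical drones: B flies twice as high and twice as
        # fast as A, yet both observe the same optical flow divergence.
        drones = {"A": (10.0, 1.0), "B": (20.0, 2.0)}  # (height m, speed m/s)
        for name, (height, speed) in drones.items():
            print(f"drone {name}: divergence = {speed / height:.2f} 1/s")
        # Both print 0.10 1/s: flow alone cannot separate h from v.
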
    Learning visual appearance as the solution

    "We realized that both problems of optical flow would disappear if
    the robots were able to interpret not only optical flow, but also
    the visual appearance of objects in their environment," adds Guido
    de Croon. "This would allow robots to see distances to objects in
    the scene similarly to how we humans can estimate distances in a
    still picture. The only question was: how can a robot learn to see
    distances like that?"

    The key to this question lay in a recent theory devised by De Croon,
    which showed that flying robots can actively induce optical flow
    oscillations to perceive distances to objects in the scene. In the
    approach proposed in the Nature Machine Intelligence article, the
    robots use such oscillations to learn what the objects in their
    environment look like at different distances. In this way, the robot
    can for example learn how fine the texture of grass is when looking
    at it from different heights during landing, or how thick tree bark
    is at different distances when navigating in a forest.
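
    The article describes this learning process at a high level; the
    sketch below is only one plausible reading of it, with an invented
    texture feature and invented names. The point it illustrates: during
    the induced oscillations the drone knows its commanded speed v and
    measures the divergence D, so h = v / D gives it distance labels for
    free, with which it can fit a mapping from appearance to distance.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy appearance cue: apparent texture scale shrinks with height.
        def texture_feature(height):
            return 1.0 / height + rng.normal(0.0, 0.005)

        # Self-supervised data gathered while oscillating: the commanded
        # speed v is known, the divergence D is measured, so h = v / D.
        heights = rng.uniform(0.5, 10.0, size=200)      # true (unknown) h
        features = np.array([texture_feature(h) for h in heights])
        v = 0.5                                         # commanded speed, m/s
        D = v / heights + rng.normal(0.0, 0.002, 200)   # measured divergence
        labels = v / D                                  # distance labels

        # Fit distance ~ 1 / feature with ordinary least squares.
        coef, intercept = np.polyfit(1.0 / features, labels, 1)
        est = coef / texture_feature(3.0) + intercept
        print(f"estimated height near 3 m: {est:.2f} m")

    Once fitted, such a mapping yields a distance estimate from
    appearance alone -- even while hovering, when optical flow carries
    no information at all.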

    Relevance to robotics and applications

    "Learning to see distances by means of visual appearance led to much
    faster, smoother landings than we achieved before," says Christophe
    De Wagter, researcher at TU Delft and co-author of the article.
    "Moreover, for obstacle avoidance, the robots were now also able to
    see obstacles in the flight direction very clearly. This not only
    improved obstacle detection performance, but also allowed our robots
    to speed up." The proposed methods will be very relevant to
    resource-constrained flying robots, especially when they operate in
    rather confined environments, such as flying in greenhouses to
    monitor crops or keeping track of stock in warehouses.

    Relevance to biology

    The findings are not only relevant to robotics, but also provide a
    new hypothesis for insect intelligence. "Typical honeybee
    experiments start with a learning phase, in which honeybees exhibit
    various oscillatory behaviors when they get acquainted with a new
    environment and related novel cues like artificial flowers," says
    Tobias Seidl, biologist and professor at the Westphalian University
    of Applied Sciences. "The final measurements presented in articles
    typically take place after this learning phase has finished and
    focus predominantly on the role of optical flow. The presented
    learning process forms a novel hypothesis on how flying insects
    improve their navigational skills, such as landing, over their
    lifetime. This suggests that we should set up more studies to
    investigate and report on this learning phase."

    ==========================================================================
    Story Source: Materials provided by Delft University of Technology.
    Note: Content may be edited for style and length.


    ==========================================================================


    Link to news story: https://www.sciencedaily.com/releases/2021/01/210119114409.htm

    --- up 5 weeks, 7 hours, 57 minutes
    * Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)