Dive of the RoboBee

via Harvard SEAS

In 1939, a Russian engineer proposed a “flying submarine” — a vehicle that can seamlessly transition from air to water and back again. While it may sound like something out of a James Bond film, engineers have been trying to design functional aerial-aquatic vehicles for decades with little success. Now, engineers may be one step closer to the elusive flying submarine.

The biggest challenge is conflicting design requirements: aerial vehicles require large airfoils like wings or sails to generate lift while underwater vehicles need to minimize surface area to reduce drag.

To solve this, engineers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) took a cue from puffins. The birds with the flamboyant beaks are among nature’s most adept hybrid vehicles, employing similar flapping motions to propel themselves through air and through water.

“Through various theoretical, computational and experimental studies, we found that the mechanics of flapping propulsion are actually very similar in air and in water,” said Kevin Chen, a graduate student in the Harvard Microrobotics Lab at SEAS. “In both cases, the wing is moving back and forth. The only difference is the speed at which the wing flaps.”
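The scaling Chen describes can be illustrated with a back-of-the-envelope calculation. If flapping force scales roughly with fluid density times wingbeat frequency squared (a simplified force model assumed here for illustration, with an assumed 120 Hz aerial wingbeat typical of insect-scale flyers), then producing a comparable force in water means slowing the wingbeat by a factor of the square root of the density ratio:

```python
import math

# Rough scaling: flapping force F ~ rho * f^2 for fixed wing geometry,
# so matching force across media implies f_water = f_air * sqrt(rho_air / rho_water).
RHO_AIR = 1.2       # kg/m^3, sea-level air (assumed)
RHO_WATER = 1000.0  # kg/m^3, fresh water (assumed)

def equivalent_flap_frequency(f_air_hz: float) -> float:
    """Wingbeat frequency giving a comparable flapping force underwater."""
    return f_air_hz * math.sqrt(RHO_AIR / RHO_WATER)

# Hypothetical 120 Hz aerial wingbeat
f_water = equivalent_flap_frequency(120.0)
print(f"{f_water:.1f} Hz")  # roughly 4 Hz -- orders of magnitude slower
```

The point of the sketch is only the order of magnitude: the same back-and-forth stroke works in both fluids, but the thousand-fold jump in density forces the underwater wingbeat down from hundreds of hertz to a few hertz.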

Coming from the Harvard Microrobotics Lab, this discovery can only mean one thing: swimming RoboBees.

For the first time, researchers at SEAS have demonstrated a flying, swimming, insect-like robot — paving the way for future dual aerial-aquatic robotic vehicles.

Read more: Dive of the RoboBee



The Latest on: Insect-like robot
  • The best robots from CES 2020: the cute, the cuddly and the confusing
    on January 10, 2020 at 3:47 am

    Robot pets are nothing new (Sony's dog-bot Aibo made its debut in 1999), but the ... Reachy’s bouncing antennae make it look vaguely insect-like, which is unfortunate because it’s a remarkable feat of ...

  • This tiny, soft robo-bug scoots with smarts and survives swats
    on December 18, 2019 at 11:48 am

    Still, we’re getting closer. This tiny insect-like robot is made of soft materials and weighs less than a gram, yet can move quickly and with some intelligence — and is robust enough to survive a ...

  • Attack on the killer robots
    on December 9, 2019 at 12:22 pm

    BERLIN — The killer robots of the future are facing an unlikely alliance of enemies ... could be deployed as swarms of millions of tiny, insect-like craft within the next five to 10 years. The swarms ...

  • Flying, insect-like robot flits closer to independent flight
    on June 26, 2019 at 2:13 pm

    The alternative is bottom-up. Start with something similar to the flying insect-like robots and figure out how to expand their capabilities. Not surprisingly, since they built the insect-like robot, ...

  • Tiny Spies: This Insect-Like Flying Robot is Smaller Than a Penny
    on May 15, 2019 at 8:36 am

    A team of engineers from the University of Southern California in Los Angeles built a four-winged flying robot called Bee+, which weighs just 95 milligrams and sports a footprint smaller than a penny.

  • A tiny four-winged robotic insect flies more like the real thing
    on May 15, 2019 at 2:05 am

    In recent years, aerodynamicists, engineers, and roboticists have attempted to copy insect-like flight by building tiny flying robots. The main thing they’ve discovered is just how difficult this is.

via  Bing News


Pars life-saving flying robot is now a reality


Earlier this year, RTS Lab unveiled its concept for Pars, an aerial robot that flies out over a large body of water to air-drop life preservers near drowning victims.

As with many design concepts, we weren’t sure whether this life-saving drone would ever become a reality, but the Iran-based company was recently able to fund a working prototype and even test its capabilities in open water. Based on these initial tests, it’s possible that this flying, GPS-guided lifeguard could be out there saving lives sooner than you think.

Over the course of four days in August of this year, the Pars development team visited the Caspian Sea to conduct a battery of tests on its brand-new prototype. The location was chosen in part for its proximity to the RTS Lab, but also because it has been the site of several tragic drownings in the past few years, including an incident that took the lives of six students this past summer. Among other attributes, the team tested the Pars’ stability during flight, the accuracy of the life preserver release mechanism, and the bot’s performance in both daytime and nighttime conditions. According to the researchers at RTS Lab, the prototype met their expectations perfectly.

The Pars was able to fly for 10 minutes at a top speed of 10 m/s (22.4 mph) before needing to recharge. This gives it a maximum range of 4.5 km (2.8 miles), making it ideal for emergencies occurring along coastlines and near ships at sea. It also proved to have a distinct advantage over its flesh-and-blood counterparts, since it can bypass treacherous waters with ease.

When conducting a trial rescue mission, the drone was able to reach a target 75 m (246 ft) away and drop its payload in about 22 seconds, while a human lifeguard took 91 seconds to swim to the same location. During testing at night, the Pars was also able to illuminate targets on the ground and make itself more visible to its controller on land using several bright LEDs.
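A quick sanity check on the reported figures (all numbers taken from the article) shows where the drone’s 22 seconds go: at the quoted 10 m/s top speed, only about 7.5 s of that is cruising, with the rest presumably covering takeoff, acceleration, aiming, and the drop itself:

```python
# Sanity-check the reported Pars trial-rescue figures (numbers from the article).
DRONE_SPEED = 10.0      # m/s, top speed
TARGET_DISTANCE = 75.0  # m, distance to the trial rescue target
DRONE_TIME = 22.0       # s, observed time to reach the target and drop the payload
SWIMMER_TIME = 91.0     # s, human lifeguard swimming to the same location

cruise_time = TARGET_DISTANCE / DRONE_SPEED   # time spent at top speed
overhead = DRONE_TIME - cruise_time           # takeoff, maneuvering, release
speedup = SWIMMER_TIME / DRONE_TIME           # drone vs. swimmer

print(f"cruise: {cruise_time:.1f} s, overhead: {overhead:.1f} s, "
      f"{speedup:.1f}x faster than the swimmer")
```

Even with that overhead, the drone beats the swimmer by roughly a factor of four, and the gap would widen at longer distances where the fixed overhead matters less.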

RTS Lab has pointed out that the drone’s speed, combined with its capacity to carry several life preservers, means it could attend to multiple people in one trip. With its built-in GPS, it can even be programmed to fly to a certain area, dispense life preservers to anyone in danger, and then automatically return to its base. Of course, the aerial bot won’t be able to pull anyone to safety just yet, but it could be sent out ahead of rescue crews to provide some initial aid. The researchers are also hoping it could give emergency teams a bird’s-eye view of the situation and help them plot a safe path to where they need to go.

Read more . . .


Smart as a bird: Flying rescue robot will autonomously avoid obstacles

Cornell researchers have created an autonomous flying robot that is as smart as a bird when it comes to maneuvering around obstacles.

Able to guide itself through forests, tunnels or damaged buildings, the machine could have tremendous value in search-and-rescue operations. Small flying machines are already common, and GPS technology provides guidance. Now, Ashutosh Saxena, assistant professor of computer science, and his team are tackling the hard part: how to keep the vehicle from slamming into walls and tree branches. Human controllers can’t always react swiftly enough, and radio signals may not reach everywhere the robot goes.

The test vehicle is a quadrotor, a commercially available flying machine about the size of a card table with four helicopter rotors. Saxena and his team have already programmed quadrotors to navigate hallways and stairwells. But in the wild, current methods aren’t accurate enough at large distances to plan a route around obstacles. Saxena is building on methods he previously developed to turn a flat video camera image into a 3-D model of the environment using such cues as converging straight lines, the apparent size of familiar objects and what objects are in front of or behind each other — the same cues humans unconsciously use to supplement their stereoscopic vision.

Read more . . .

via Cornell University – Bill Steele


‘Green Brain’ project to create an autonomous flying robot with a honey bee brain

The team will build models of the systems in the brain that govern a honey bee’s vision and sense of smell.

Scientists at the Universities of Sheffield and Sussex are embarking on an ambitious project to produce the first accurate computer models of a honey bee brain in a bid to advance our understanding of Artificial Intelligence (AI), and how animals think.

The team will build models of the systems in the brain that govern a honey bee’s vision and sense of smell. Using this information, the researchers aim to create the first flying robot able to sense and act as autonomously as a bee, rather than just carry out a pre-programmed set of instructions.

If successful, this project will meet one of the major challenges of modern science: building a robot brain that can perform complex tasks as well as the brain of an animal. Tasks the robot will be expected to perform, for example, will include finding the source of particular odours or gases in the same way that a bee can identify particular flowers.

It is anticipated that the artificial brain could eventually be used in applications such as search and rescue missions, or even mechanical pollination of crops.

Dr James Marshall, leading the £1 million EPSRC-funded project in Sheffield, said: “The development of an artificial brain is one of the greatest challenges in Artificial Intelligence. So far, researchers have typically studied brains such as those of rats, monkeys, and humans, but actually ‘simpler’ organisms such as social insects have surprisingly advanced cognitive abilities.”

Called “Green Brain”, and partially supported with hardware donated by NVIDIA Corporation, the project invites comparison with the IBM-sponsored Blue Brain initiative, which is developing brain modeling technologies using supercomputers with the ultimate goal of producing an accurate model of a human brain.

The hardware provided by NVIDIA is based on high-performance processors called “GPU accelerators” that generate the 3D graphics on home PCs and games consoles and power some of the world’s highest-performance supercomputers. These accelerators provide a very efficient way of performing the massive calculations needed to simulate a brain using a standard desktop PC – rather than on a large, expensive supercomputing cluster.
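The kind of data-parallel arithmetic these accelerators excel at can be sketched on a CPU with NumPy. Here a single update step of a leaky integrate-and-fire model is applied to 100,000 neurons at once; the neuron model and all parameters are illustrative stand-ins, not taken from the Green Brain project:

```python
import numpy as np

# One Euler step of a leaky integrate-and-fire model for N neurons at once.
# On a GPU the same element-wise update runs across thousands of cores;
# NumPy's vectorization mimics that data-parallel style on a desktop CPU.
def lif_step(v, i_in, dt=1e-3, tau=0.02,
             v_rest=-65.0, v_thresh=-50.0, v_reset=-70.0):
    v = v + dt / tau * (v_rest - v + i_in)  # leak toward rest plus input drive
    spiked = v >= v_thresh                  # boolean spike mask
    v = np.where(spiked, v_reset, v)        # reset the neurons that fired
    return v, spiked

rng = np.random.default_rng(0)
n = 100_000
v = np.full(n, -65.0)                       # start every neuron at rest
for _ in range(100):                        # 100 ms of simulated time
    v, spiked = lif_step(v, i_in=rng.normal(20.0, 5.0, size=n))
print(v.shape, spiked.dtype)                # (100000,) bool
```

Every operation in the loop body is element-wise across all neurons, which is exactly the memory-access pattern a GPU accelerator turns into near-linear speedups.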

“Using NVIDIA’s massively parallel GPU accelerators for brain models is an important goal of the project as they allow us to build faster models than ever before,” explained Dr Thomas Nowotny, the leader of the Sussex team. “We expect that in many areas of science this technology will eventually replace the classic supercomputers we use today.”

Green Brain’s researchers anticipate that developing a model of a honey bee brain will offer a more accessible method of driving forward our knowledge of how a brain’s cognitive systems work, leading to advances in understanding animal and human cognition. “Because the honey bee brain is smaller and more accessible than any vertebrate brain, we hope to eventually be able to produce an accurate and complete model that we can test within a flying robot,” said Dr Marshall.

“Not only will this pave the way for many future advances in autonomous flying robots, but we also believe the computer modelling techniques we will be using will be widely useful to other brain modelling and computational neuroscience projects,” added Dr Nowotny.

Read more . . .

via The University of Sheffield – EurekaAlert


Centeye Creates Insect-Like Flying Robots In A DC Basement

Centeye is dedicated to computer vision.

When we first wandered up to the suburban split-level that houses Centeye Inc., we were a bit confused. Could this be the place where a mad roboticist was building tiny robots with insect eyes and brains that could interact with their environment? We rang the doorbell and weren’t disappointed.

Founded by Geoffrey Barrows, Centeye is dedicated to computer vision. They make little electronic eyes that are cheap to reproduce and “see” only a few thousand pixels. He has a staff of two engineers who work with him on designing and building chips and has just released the open-source ArduEye board, a tiny Arduino-based board with a built-in camera.

Barrows does everything from his basement. Recent advances in fabrication allow him and his staff to design chips on a computer at home and then send the plans to manufacturers in Asia. They can then mass produce their eyes, driving down the cost per unit to a few dollars. They don’t need a big lab because everything is done remotely.

Their robots are proofs of concept, but they’re really cool. The little helicopters use Centeye eyes to remain stationary in space, and other models can avoid obstacles as they move. Because each eye takes in only a small part of the scene, little computing power is needed to process each bit of input. As with insects, the brain doesn’t have to work very hard to get a lot done.
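How little computation a few-thousand-pixel eye needs can be illustrated with a toy version of visual stabilization. This is not Centeye’s actual algorithm, just a minimal sketch: estimate how far a one-dimensional strip of pixels shifted between two frames by brute-force testing a handful of candidate displacements:

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=3):
    """Return the pixel shift s that best explains curr as prev moved by s."""
    m = max_shift
    b = curr[m : len(curr) - m]  # central window of the new frame
    errs = {s: np.abs(prev[m - s : len(prev) - m - s] - b).mean()
            for s in range(-m, m + 1)}
    return min(errs, key=errs.get)  # displacement with the lowest error

# A 16-pixel "eye" viewing a random scene that drifts 2 pixels between frames.
rng = np.random.default_rng(1)
scene = rng.random(32)
prev = scene[8:24]
curr = scene[6:22]  # same window, scene shifted right by 2 pixels
print(estimate_shift(prev, curr))  # 2
```

A full 2-D version over a few thousand pixels is still only a few thousand subtractions per candidate shift — the sort of workload a tiny onboard processor handles easily, which is the point of keeping the eyes low-resolution.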

Read more . . .

via TechCrunch – John Biggs
