BEAR: Berkeley Aerobot Team

Current Research Topics

We believe autonomous systems are much more than simple automata that perform a given task in the same routine ad nauseam. Simple waypoint navigation of a UAV is one such example: we believe a UAV should be a UAAV, an unmanned, aerial, and autonomous vehicle. We have showcased this capability by having a UAV sense its surroundings and compute, with a real-time trajectory generation layer based on a model predictive control scheme, an optimal trajectory that avoids nearby obstacles while still flying toward the original destination.
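As a concrete illustration of the trajectory generation layer, the following is a minimal receding-horizon sketch for a 2-D point-mass vehicle, written in Python. It uses random sampling of control sequences as a stand-in for the actual model predictive optimization; the model, costs, and names such as plan_step are assumptions for exposition, not the controller flown on the BEAR platforms.

import numpy as np

DT, HORIZON = 0.1, 10     # time step [s] and prediction horizon (steps)
A_MAX = 2.0               # acceleration bound [m/s^2]

def rollout(pos, vel, accels):
    """Predict the trajectory of a 2-D point mass under an acceleration sequence."""
    traj = []
    for a in accels:
        vel = vel + DT * a
        pos = pos + DT * vel
        traj.append(pos)
    return np.array(traj)

def cost(traj, goal, obstacles, r_safe=1.0):
    """Distance-to-goal cost plus a hinge penalty for entering obstacle safety radii."""
    goal_cost = np.sum(np.linalg.norm(traj - goal, axis=1))
    obs_cost = sum(np.sum(np.maximum(0.0, r_safe - np.linalg.norm(traj - o, axis=1)) ** 2)
                   for o in obstacles)
    return goal_cost + 100.0 * obs_cost

def plan_step(pos, vel, goal, obstacles, n_samples=500, rng=np.random.default_rng(0)):
    """Sample candidate control sequences; apply only the first input (receding horizon)."""
    best_a, best_c = None, np.inf
    for _ in range(n_samples):
        accels = rng.uniform(-A_MAX, A_MAX, size=(HORIZON, 2))
        c = cost(rollout(pos, vel, accels), goal, obstacles)
        if c < best_c:
            best_c, best_a = c, accels[0]
    return best_a

# Example: fly from the origin toward (10, 0) while avoiding an obstacle at (5, 0).
a_cmd = plan_step(np.zeros(2), np.zeros(2), np.array([10.0, 0.0]), [np.array([5.0, 0.0])])

At each control step the vehicle re-solves this small optimization and applies only the first input, which is what lets it react to newly sensed obstacles in real time.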
The BEAR Intelligent Autonomous Systems Group continues to take on challenging topics, providing solutions to real-world problems that require autonomous systems to perform their missions with minimal human intervention. Such goals require autonomous systems capable of sensing, reasoning, and acting in a highly intelligent manner.
Our research continues in that spirit today, on the following challenging topics.

Formation Flight

Formation flight is the primary movement technique for helicopter teams. A coordinated formation achieves flight integrity with less fuel consumption, increasing the probability of mission success. Even with such unique flight capabilities, helicopter teams are confronted by very challenging situations. The potential for accidents is increased by the need to fly in close formation under harsh conditions, including poor weather, extremely low altitudes, low visibility, extreme temperatures, noise, vibration, blasts, flashes, radiation, and battlefield air pollution. The stress exerted on aircrews increases dramatically under these adverse circumstances. Computer-assisted autonomous formation flight can therefore help to diminish battlefield stress, and reducing pilots' stress directly translates into an extended radius of action and fewer uncertainties during movements.

Even though helicopter formation flight is of critical importance in military operations, relatively little research has been done on the topic. Because helicopter dynamics are notoriously complex and uncertain, it was long infeasible to design an automatic controller even for a single helicopter. Recent advances in system identification and in the control of rotorcraft-based unmanned aerial vehicles (RUAVs), however, have given us insight into autonomous helicopter formation flight. Although several researchers have worked on stable helicopter formations, their concerns were restricted to homogeneous formations; automating helicopter formations remains challenging because formations used in the real world are more complicated and heterogeneous than those described in existing publications. Our technologies are designed to work with both manned and unmanned rotorcraft, since we expect the two to operate together in the future. We also firmly believe that it is important to develop tools for human-centered automation as a waypoint on the path to complete automation. A minimal leader-follower control sketch is shown below.
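To make the coordination idea concrete, here is a minimal leader-follower sketch in Python: each follower regulates its position toward a slot fixed in the leader's body frame. The kinematic model, gains, and function names are illustrative assumptions, not the controllers used in our flight experiments.

import numpy as np

def slot_position(leader_pos, leader_heading, offset_body):
    """Map a body-frame formation slot into the inertial frame."""
    c, s = np.cos(leader_heading), np.sin(leader_heading)
    R = np.array([[c, -s], [s, c]])   # 2-D rotation by the leader heading
    return leader_pos + R @ offset_body

def follower_velocity_cmd(follower_pos, leader_pos, leader_vel,
                          leader_heading, offset_body, k_p=0.8):
    """Feed forward the leader's velocity plus a proportional slot correction."""
    error = slot_position(leader_pos, leader_heading, offset_body) - follower_pos
    return leader_vel + k_p * error

# Example: hold a slot 5 m behind and 5 m to the right of the leader.
cmd = follower_velocity_cmd(np.array([0.0, -4.0]), np.array([10.0, 0.0]),
                            np.array([2.0, 0.0]), 0.0, np.array([-5.0, -5.0]))

Heterogeneous formations complicate this picture: slot assignments, sensing, and dynamic limits differ across vehicles, which is precisely where a simple law like the one above stops being sufficient.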


(Video clip: formation flight experiment for the ONR review, November 2002)

Vision-based Landing

The UC Berkeley team has been actively investigating imagery and the use of computer vision for their decreased probability of detection and enhanced situational awareness. Though radar, LIDAR, ultrasonic, and other time-of-flight sensors can provide very accurate distance estimates, they have several disadvantages. By their nature, time-of-flight sensors rely on self-emitted RF, laser, or ultrasonic signals, which adversaries can potentially detect or, worse, use to localize the vehicle by triangulation. Active sensing modalities also consume relatively large amounts of power. Visible-light or IR cameras, on the other hand, are passive and rely solely on reflected ambient light or heat, thereby lowering the probability of detection and reducing power consumption. The factors that make vision-in-the-loop difficult are achieving the accuracy of active sensors without time-of-flight measurements; the high bandwidth and computation required for real-time operation; and inherent photogrammetric ambiguities that make some computations ill-conditioned. Imagery, however, has benefits beyond low power and passivity: because the signal is more descriptive, carrying color and appearance information, it can be used for target recognition and for navigation from reference points in unknown environments, as well as for enhanced situational awareness.

Many landing scenarios demand the ability to land in an arbitrary environment where a landing site has not been previously selected. A natural extension of our earlier landing project was therefore to eliminate the requirement of a known target for autonomous landing. Whereas a target site would presumably be chosen to be safe for landing a helicopter, for example, clear of debris and foliage, these characteristics must now be gauged autonomously from the air, a very challenging problem. It requires the ability to accurately estimate elevation and terrain slope, to reliably classify vegetation, and to detect small debris; a simple slope-and-roughness check over an elevation map is sketched below.
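As a small illustration of the elevation and slope part of the problem, the following Python sketch fits a plane to a patch of a digital elevation map and thresholds its tilt and roughness. The DEM layout, cell size, and thresholds are assumptions for exposition, not the criteria used in our system.

import numpy as np

def analyze_patch(patch, cell_size=0.5):
    """Fit z = a*x + b*y + c to a square DEM patch; return (slope_deg, rms_residual)."""
    n = patch.shape[0]
    xs, ys = np.meshgrid(np.arange(n), np.arange(n))
    A = np.column_stack([xs.ravel() * cell_size, ys.ravel() * cell_size,
                         np.ones(n * n)])
    coef, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)
    slope_deg = np.degrees(np.arctan(np.hypot(coef[0], coef[1])))
    rms = np.sqrt(np.mean((A @ coef - patch.ravel()) ** 2))
    return slope_deg, rms

def is_landable(patch, max_slope_deg=5.0, max_rms=0.15):
    """Reject patches that are too tilted or too rough (debris, tall vegetation)."""
    slope_deg, rms = analyze_patch(patch)
    return slope_deg <= max_slope_deg and rms <= max_rms

# Example: a gently sloped 4 m x 4 m patch sampled at 0.5 m resolution.
patch = 0.02 * np.arange(8)[None, :] * np.ones((8, 1))
print(is_landable(patch))   # True: about 2.3 degrees of slope, zero roughness

The plane-fit residual serves only as a crude roughness proxy; in practice, vegetation classification and debris detection require appearance cues as well, as described below.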

As part of the SEC extension program with Boeing, the BEAR team has designed a system to land an autonomous helicopter in an unknown environment [Geyer, CDC 2004], which will be flown on the Maverick platform at the end of May 2005. The system's design was constrained by several factors: real-time operation; accurate terrain estimates, which in turn necessitate accurate position and attitude estimates; passivity; and arbitrary terrain. A naive choice might be a stereo vision system, but the error in depth estimated from two views grows with the square of the depth, prohibiting the use of stereo vision. The resulting system incorporates three major components: fusion of GPS, INS, and image measurements for increased position accuracy, which can operate over unknown terrain; a system that creates digital elevation maps in real time by integrating image measurements from multiple views (not just two) with a recursive filter (a minimal sketch appears below); and a site selection strategy based on appearance and the estimated elevation maps. The on-board computer continually evaluates and analyzes the observed terrain to determine sites suitable for landing, depending on factors such as the helicopter's landing constraints, the desired size of a clearing free of trees or tall grasses, and the maximum landing slope.

We predict that vision in autonomous systems will revolutionize control systems, much as radar helped revolutionize combat in World War II. We firmly believe that autonomous sensing is necessary to achieve this goal, and that computer vision will have to form a major component of many control systems. It must be recognized that sensing by vision is not appropriate for all circumstances: in very low light, where night vision is ineffective, or in situations where vision cannot deliver sufficient accuracy. Many challenges also remain, including guaranteeing robustness and achieving real-time computational speed. Nevertheless, one cannot help but notice that vision is prevalent everywhere in nature.
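The elevation-mapping component can be illustrated with a per-cell recursive update: each DEM cell keeps a scalar height estimate and variance, fused with each new per-view measurement by a Kalman-style update. This Python sketch is a simplified stand-in for the Recursive Multi-Frame Planar Parallax Algorithm of [Geyer, CDC 2004]; the class, noise values, and indexing are assumptions for exposition.

import numpy as np

class ElevationMap:
    """Grid of per-cell height estimates with scalar uncertainties."""
    def __init__(self, shape, prior_var=100.0):
        self.z = np.zeros(shape)               # elevation estimate [m]
        self.var = np.full(shape, prior_var)   # per-cell variance [m^2]

    def update(self, i, j, z_meas, meas_var):
        """Fuse one height measurement into cell (i, j)."""
        k = self.var[i, j] / (self.var[i, j] + meas_var)   # Kalman gain
        self.z[i, j] += k * (z_meas - self.z[i, j])
        self.var[i, j] *= (1.0 - k)

# Each new view contributes measurements; cell uncertainty shrinks monotonically.
dem = ElevationMap((64, 64))
for z in [3.1, 2.9, 3.0]:          # repeated observations of one cell
    dem.update(10, 20, z, meas_var=4.0)

Integrating many views this way is what lets a single moving camera avoid the depth-squared error growth of two-view stereo.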

Hypothetical expanding-box search pattern superimposed on the Victorville area and the flight zone.
View of reconstructed terrain near Victorville, CA, using the Recursive Multi-Frame Planar Parallax Algorithm.

Swarms: SmartBAT Development

The SWARMS project's goal is to develop a framework and methodology for the analysis of swarming behavior in biology and the synthesis of bio-inspired swarming behavior for engineered systems. We will be actively engaged in addressing questions such as: Can large numbers of autonomously functioning vehicles be reliably deployed in the form of a “swarm” to carry out a prescribed mission and to respond as a group to high-level management commands? Can such a group successfully function in a potentially hostile environment, without a designated leader, with limited communications between its members, and/or with different and potentially dynamically changing “roles” for its members?

The BEAR group has started developing a large number of easily deployable and retrievable fixed-wing platforms based on commercially available flying-wing aircraft. We will soon be able to demonstrate swarm-like coordination of the SmartBATs, as shown in the following image, by providing a fully decentralized online optimization capability to each vehicle, so that each one actively computes a trajectory that keeps it in a safe formation en route to the destination. A sketch of such a decentralized coordination step follows.
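The decentralized computation can be sketched in Python as follows: each SmartBAT derives its own velocity command from locally sensed neighbor states, with no central planner. The terms, gains, and names below are assumptions for exposition, not the optimization actually run on the vehicles.

import numpy as np

def local_command(own_pos, own_slot, neighbors, goal,
                  k_goal=0.5, k_form=0.8, k_sep=2.0, r_sep=3.0):
    """neighbors: list of (pos, slot) pairs within communication range."""
    cmd = k_goal * (goal - own_pos)                 # head toward the destination
    for pos, slot in neighbors:
        # Consensus on formation: drive inter-vehicle offsets toward the
        # difference of the assigned formation slots.
        cmd += k_form * ((pos - own_pos) - (slot - own_slot))
        # Short-range repulsion keeps the formation collision-free.
        d = np.linalg.norm(own_pos - pos)
        if d < r_sep:
            cmd += k_sep * (own_pos - pos) / (d + 1e-6)
    return cmd

# Example: a wingman holding a 5 m lateral slot while the swarm heads east.
cmd = local_command(np.array([0.0, 5.0]), np.array([0.0, 5.0]),
                    [(np.array([0.0, 0.0]), np.array([0.0, 0.0]))],
                    goal=np.array([100.0, 5.0]))

Because every vehicle runs the same local rule, the group needs no designated leader and degrades gracefully as members join or drop out.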

