Saturday, May 14, 2011

Controlling Robotic Arms Is Child's Play

"The input device contains various movement sensors, also called inertial sensors," says Bernhard Kleiner of the Fraunhofer Institute for Manufacturing Engineering and Automation IPA in Stuttgart, who leads the project. The individual micro-electromechanical systems themselves are not expensive. What the scientists have spent time developing is how these sensors interact."We have developed special algorithms that fuse the data of individual sensors and identify a pattern of movement. That means we can detect movements in free space," summarizes Kleiner.

What may at first appear to be a trade show gimmick is in fact a technology that offers numerous advantages in industrial production and logistical processes. The system could be used to simplify the programming of industrial robots, for example. To date, this has been done with the aid of laser tracking systems: An employee demonstrates the desired motion with a hand-held baton that features a white marker point. The system records this motion by analyzing the light reflected from a laser beam aimed at the marker. Configuring and calibrating the system takes a lot of time. The new input device should eliminate the need for these steps in the future -- instead, employees need only pick up the device and show the robot what it is supposed to do.

The system has numerous applications in medicine, as well. Take, for example, gait analysis. Until now, cameras have made precise recordings of patients as they walk back and forth along a specified path. The films reveal to the physician such things as how the joints behave while walking, or whether incorrect posture in the knees has been improved by physical therapy. Installing the cameras, however, is complex and costly, and patients are restricted to a predetermined path. The new sensor system can simplify this procedure: Attached to the patient's upper thigh, it measures the sequences and patterns of movement -- without limiting the patient's motion in any way.

"With the inertial sensor system, gait analysis can be performed without a frame of reference and with no need for a complex camera system," explains Kleiner. In another project, scientists are already working on comparisons of patients' gait patterns with those patterns appearing in connection with such diseases as Parkinson's.

Another medical application for the new technology is the control of active prostheses containing numerous small actuators. Whenever the patient moves, the prosthesis in turn also moves; this makes it possible for a leg prosthesis to roll the foot while walking. Here, too, the sensor could be attached to the patient's upper thigh and could analyze the movement, helping to regulate the motors of the prosthesis. Research scientists are currently working on combining the inertial sensor system with an electromyographic (EMG) sensor. Electromyography is based on the principle that when a muscle tenses, it produces an electrical voltage which a sensor can then measure by way of an electrode. If the sensor is placed, for example, on the muscle responsible for lifting the patient's foot, the sensor registers when the patient tenses this muscle -- and the prosthetic foot lifts itself. EMG sensors like this are already available but are difficult to position.

"While standard EMG sensors consist of individual electrodes that have to be positioned precisely on the muscle, our system is made up of many small electrodes that attach to a surface area. This enables us to sense muscle movements much more reliably," says Kleiner.


Source

Tuesday, May 10, 2011

Robotics: A Tiltable Head Could Improve the Ability of Undulating Robots to Navigate Disaster Debris

Researchers at the Georgia Institute of Technology recently built a robot that can penetrate and "swim" through granular material. In a new study, they show that varying the shape or adjusting the inclination of the robot's head affects the robot's movement in complex environments.

"We discovered that by changing the shape of the sand-swimming robot's head or by tilting its head up and down slightly, we could control the robot's vertical motion as it swam forward within a granular medium," said Daniel Goldman, an assistant professor in the Georgia Tech School of Physics.

Results of the study will be presented on May 10 at the 2011 IEEE International Conference on Robotics and Automation in Shanghai. Funding for this research was provided by the Burroughs Wellcome Fund, National Science Foundation and Army Research Laboratory.

The study was conducted by Goldman, bioengineering doctoral graduate Ryan Maladen, physics graduate student Yang Ding and physics undergraduate student Andrew Masse, all from Georgia Tech, and Northwestern University mechanical engineering adjunct professor Paul Umbanhowar.

"The biological inspiration for our sand-swimming robot is the sandfish lizard, which inhabits the Sahara desert in Africa and rapidly buries into and swims within sand," explained Goldman."We were intrigued by the sandfish lizard's wedge-shaped head that forms an angle of 140 degrees with the horizontal plane, and we thought its head might be responsible for or be contributing to the animal's ability to maneuver in complex environments."

For their experiments, the researchers attached a wedge-shaped block of wood to the head of their robot, which was built with seven connected segments, powered by servo motors, packed in a latex sock and wrapped in a spandex swimsuit. The doorstop-shaped head -- which resembled the sandfish's head -- had a fixed lower length of approximately 4 inches, height of 2 inches and a tapered snout. The researchers examined whether the robot's vertical motion could be controlled simply by varying the inclination of the robot's head.

Before each experimental run in a test chamber filled with quarter-inch-diameter plastic spheres, the researchers submerged the robot a couple inches into the granular medium and leveled the surface. Then they tracked the robot's position until it reached the end of the container or swam to the surface.

The researchers investigated the vertical movement of the robot when its head was placed at five different degrees of inclination. They found that when the sandfish-inspired head with a leading edge that formed an angle of 155 degrees with the horizontal plane was set flat, negative lift force was generated and the robot moved downward into the media. As the tip of the head was raised from zero to 7 degrees relative to the horizontal, the lift force increased until it became zero. At inclines above 7 degrees, the robot rose out of the medium.

"The ability to control the vertical position of the robot by modulating its head inclination opens up avenues for further research into developing robots more capable of maneuvering in complex environments, like debris-filled areas produced by an earthquake or landslide," noted Goldman.

The robotics results matched the research team's findings from physics experiments and computational models designed to explore how head shape affects lift in granular media.

"While the lift forces of objects in air, such as airplanes, are well understood, our investigations into the lift forces of objects in granular media are some of the first ever," added Goldman.

For the physics experiments, the researchers dragged wedge-shaped blocks through a granular medium. Blocks with leading edges that formed angles with the horizontal plane of less than 90 degrees resembled upside-down doorstops, the block with a leading edge equal to 90 degrees was a square, and blocks with leading edges greater than 90 degrees resembled regular doorstops.

They found that blocks with leading edges that formed angles with the horizontal plane less than 80 degrees generated positive lift forces and wedges with leading edges greater than 120 degrees created negative lift. With leading edges between 80 and 120 degrees, the wedges did not generate vertical forces in the positive or negative direction.
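The thresholds reported in the drag experiments amount to a simple rule relating the leading-edge angle to the sign of the lift. The function below just encodes the 80-degree and 120-degree boundaries stated above; it is a summary of the reported results, not the researchers' model.

```python
# Encode the reported relationship between a wedge's leading-edge angle
# (measured from the horizontal plane) and the sign of the lift it generates
# while being dragged through a granular medium.

def lift_sign(leading_edge_deg):
    if leading_edge_deg < 80:
        return "positive"   # rises: upside-down doorstop shapes
    if leading_edge_deg > 120:
        return "negative"   # driven downward: regular doorstop shapes
    return "neutral"        # no net vertical force between 80 and 120 degrees
```

This is consistent with the robot result above: its 155-degree sandfish-inspired head generated negative lift and drove the robot downward when held flat.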

Using a numerical simulation of object drag and building on the group's previous studies of lift and drag on flat plates in granular media, the researchers were able to describe the mechanism of force generation in detail.

"When the leading edge of the robot head was less than 90 degrees, the robot's head experienced a lift force as it moved forward, which resulted in a torque imbalance that caused the robot to pitch and rise to the surface," explained Goldman.

Since this study, the researchers have attached a wedge-shaped head on the robot that can be dynamically modulated to specific angles. With this improvement, the researchers found that the direction of movement of the robot is sensitive to slight changes in orientation of the head, further validating the results from their physics experiments and computational models.

Being able to precisely control the tilt of the head will allow the researchers to implement different strategies of head movement during burial and determine the best way to wiggle deep into sand. The researchers also plan to test the robot's ability to maneuver through material similar to the debris found after natural disasters and plan to examine whether the sandfish lizard adjusts its head inclination to ensure a straight motion as it dives into the sand.

This material is based on research sponsored by the Burroughs Wellcome Fund, the National Science Foundation (NSF) under Award Number PHY-0749991, and the Army Research Laboratory (ARL) under Cooperative Agreement Number W911NF-08-2-0004.


Source

Saturday, May 7, 2011

Robot Engages Novice Computer Scientists

A product of CMU's famed Robotics Institute, Finch was designed specifically to make introductory computer science classes an engaging experience once again.

A white plastic, two-wheeled robot with bird-like features, Finch can quickly be programmed by a novice to say "Hello, World," or do a little dance, or make its beak glow blue in response to cold temperature or some other stimulus. But the simple look of the tabletop robot is deceptive. Based on four years of educational research sponsored by the National Science Foundation, Finch includes a number of features that could keep students busy for a semester or more thinking up new things to do with it.

"Students are more interested and more motivated when they can work with something interactive and create programs that operate in the real world," said Tom Lauwers, who earned his Ph.D. in robotics at CMU in 2010 and is now an instructor in the Robotics Institute's CREATE Lab."We packed Finch with sensors and mechanisms that engage the eyes, the ears -- as many senses as possible."

Lauwers has launched a startup company, BirdBrain Technologies, to produce Finch, and now sells the robots online at www.finchrobot.com for $99 each.

"Our vision is to make Finch affordable enough that every student can have one to take home for assignments," said Lauwers, who developed the robot with Illah Nourbakhsh, associate professor of robotics and director of the CREATE Lab. Less than a foot long, Finch easily fits in a backpack and is rugged enough to survive being hauled around and occasionally dropped.

Finch includes temperature and light sensors, a three-axis accelerometer and a bump sensor. It has color-programmable LED lights, a beeper and speakers. With a pencil inserted in its tail, Finch can be used to draw pictures. It can be programmed to be a moving, noise-making alarm clock. It even has uses beyond a robot; its accelerometer enables it to be used as a 3-D mouse to control a computer display.

Robot kits suitable for students as young as 12 are commercially available, but often cost more than the Finch, Lauwers said. What's more, the idea is to use the robot to make computer programming lessons more interesting, not to use precious instructional time to first build a robot.

Finch is a plug-and-play device, so no drivers or other software need to be installed beyond what is used in typical computer science courses. Finch connects with and receives power from the computer over a 15-foot USB cable, eliminating batteries and off-loading its computation to the computer. Support for a wide range of programming languages and environments is coming, including graphical languages appropriate for young students. Finch currently can be programmed with the Java and Python languages widely used by educators.

A number of assignments are available on the Finch Robot website to help teachers drop Finch into their lesson plans, and the website allows instructors to upload their own assignments or ideas in return for company-provided incentives. The robot has been classroom-tested at the Community College of Allegheny County, Pa., and by instructors in high school, university and after-school programs.

"Computer science now touches virtually every scientific discipline and is a critical part of most new technologies, yet U.S. universities saw declining enrollments in computer science through most of the past decade," Nourbakhsh said."If Finch can help motivate students to give computer science a try, we think many more students will realize that this is a field that they would enjoy exploring."


Source

Friday, May 6, 2011

EEG Headset With Flying Harness Lets Users 'Fly' by Controlling Their Thoughts

Creative director and Rensselaer MFA candidate Yehuda Duenyas describes the "Infinity Simulator" as a platform similar to a gaming console -- like the Wii or the Kinect -- writ large.

"Instead of you sitting and controlling gaming content, it's a whole system that can control live elements -- so you can control 3-D rigging, sound, lights, and video," said Duenyas, who works under the moniker"xxxy.""It's a system for creating hybrids of theater, installation, game, and ride."

Duenyas created the "Infinity Simulator" with a team of collaborators, including Michael Todd, a Rensselaer 2010 graduate in computer science. Duenyas will exhibit the new system in the art installation "The Ascent" on May 12 at the Curtis R. Priem Experimental Media and Performing Arts Center (EMPAC).

Ten computer programs running simultaneously link the commercially available EEG headset to the computer-controlled 3-D flying harness and various theater systems, said Todd.

Within the theater, the rigging -- including the harness -- is controlled by a Stage Tech NOMAD console; lights are controlled by an ION console running MIDI show control; sound through MAX/MSP; and video through Isadora and Jitter. The "Infinity Simulator," a series of three C programs written by Todd, acts as an intermediary between the headset and the theater systems, connecting and conveying all input and output.

"We've built a software system on top of the rigging control board and now have control of it through an iPad, and since we have the iPad control, we can have anything control it," said Duenyas."The 'Infinity Simulator' is the center; everything talks to the 'Infinity Simulator.'"

The May 12 "The Ascent" installation is only one experience made possible by the new platform, Duenyas said.

"'The Ascent' embodies the maiden experience that we'll be presenting," Duenyas said."But we've found that it's a versatile platform to create almost any type of experience that involves rigging, video, sound, and light. The idea is that it's reactive to the users' body; there's a physical interaction."

Duenyas, a Brooklyn-based artist and theater director, specializes in experiential theater performances.

"The thing that I focus on the most is user experience," Duenyas said."All the shows I do with my theater company and on my own involve a lot of set and set design -- you're entering into a whole world. You're having an experience that is more than going to a show, although a show is part of it."

The"Infinity Simulator" stemmed from an idea Duenyas had for such a theatrical experience.

"It started with an idea that I wanted to create a simulator that would give people a feeling of infinity," Duenyas said. His initial vision was that of a room similar to a Cave Automated Virtual Environment -- a room paneled with projection screens -- in which participants would be able to float effortlessly in an environment intended to evoke a glimpse into infinity.

At Rensselaer, Duenyas took advantage of the technology at hand to explore his idea, first with a video game he developed in 2010, then -- working through the Department of the Arts -- with EMPAC's computer-controlled 3-D theatrical flying harness.

"The charge of the arts department is to allow the artists that they bring into the department to use technology to enhance what they've been doing already," Duenyas said."In coming here (EMPAC), and starting to translate our ideas into a physical space, so many different things started opening themselves up to us."

The 2010 video game, also developed with Todd, tracked the movements -- pitch and yaw -- of players suspended in a custom-rigged harness, allowing players to soar through simulated landscapes. Duenyas said the game (also called the "Infinity Simulator") and the new platform are part of the same vision.

EMPAC Director Johannes Goebel saw the game on display at the 2010 GameFest and discussed the custom-designed 3-D theatrical flying rig in EMPAC with Duenyas. Working through the Arts Department, Duenyas submitted a proposal to work with the rig, and his proposal was accepted.

Duenyas and his team experimented -- first gaining peripheral control over the system, and then linking it to the EEG headset -- and created the Ascent installation as an initial project. In the installation, the Infinity Simulator is programmed to respond to relaxation.

"We're measuring two brain states -- alpha and theta -- waking consciousness and everyday brain computational processing," said Duenyas."If you close your eyes and take a deep breath, that processing power decreases. When it decreases below a certain threshold, that is the trigger for you to elevate."

As a user rises, their ascent triggers a changing display of lights, sound, and video. Duenyas said he wants to hint at transcendental experience, while keeping the door open for a more circumspect interpretation.

"The point is that the user is trying to transcend the everyday and get into this meditative state so they can have this experience. I see it as some sort of iconic spiritual simulator. That's the serious side," he said."There's also a real tongue-in-cheek side of my work: I want clouds, I want Terry Gilliam's animated fist to pop out of a cloud and hit you in the face. It's mixing serious religious symbology, but not taking it seriously."

The humor is prompted, in part, by the limitations of this earliest iteration of Duenyas' vision.

"It started with, 'I want to have a glimpse of infinity,' 'I want to float in space.' Then you get in the harness and you're like 'man, this harness is uncomfortable,'" he said."In order to achieve the original vision, we had to build an infrastructure, and I still see development of the infinity experience is a ways off; but what we can do with the infrastructure in a realistic time frame is create 'The Ascent,' which is going to be really fun, and totally other."

Creating the"Infinity Simulator" has prompted new possibilities.

"The vision now is to play with this fun system that we can use to build any experience," he said."It's sort of overwhelming because you could do so many things -- you could create a flight through cumulus clouds, you could create an augmented physicality parkour course where you set up different features in the room and guide yourself to different heights. It's limitless."


Source

Wednesday, May 4, 2011

Robots Learn to Share: Why We Go out of Our Way to Help One Another

Altruism, the sacrificing of individual gains for the greater good, appears at first glance to go against the notion of "survival of the fittest." But altruistic gene expression is found in nature and is passed on from one generation to the next. Worker ants, for example, are sterile and make the ultimate altruistic sacrifice by not transmitting their genes at all in order to ensure the survival of the queen's genetic makeup. The sacrifice of the individual in order to ensure the survival of a relative's genetic code is known as kin selection. In 1964, biologist W.D. Hamilton proposed a precise set of conditions under which altruistic behavior may evolve, now known as Hamilton's rule of kin selection. Here's the gist: If an individual family member shares food with the rest of the family, it reduces his or her personal likelihood of survival but increases the chances of family members passing on their genes, many of which are common to the entire family. Hamilton's rule simply states that whether or not an organism shares its food with another depends on its genetic closeness (how many genes it shares) with the other organism.
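Hamilton's rule is usually written rB > C: altruism is favored when the relatedness r between actor and recipient, times the benefit B to the recipient, exceeds the cost C to the actor. A minimal check of the rule:

```python
# Hamilton's rule: an altruistic act is favored when r * B > C,
# where r is genetic relatedness, B the benefit to the recipient,
# and C the cost to the altruist. Example numbers are illustrative.

def altruism_favored(r, benefit, cost):
    return r * benefit > cost

# Sharing food at cost 1 that gives a full sibling (r = 0.5) a benefit
# of 3 satisfies the rule; the same act toward a non-relative (r = 0)
# does not.
sibling = altruism_favored(0.5, 3, 1)
stranger = altruism_favored(0.0, 3, 1)
```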

Testing the evolution of altruism using quantitative studies in live organisms has been largely impossible because experiments need to span hundreds of generations and there are too many variables. However, Floreano's robots evolve rapidly using simulated gene and genome functions and allow scientists to measure the costs and benefits associated with the trait. Additionally, Hamilton's rule has long been a subject of much debate because its equation seems too simple to be true. "This study mirrors Hamilton's rule remarkably well to explain when an altruistic gene is passed on from one generation to the next, and when it is not," says Keller.

Previous experiments by Floreano and Keller showed that foraging robots doing simple tasks, such as pushing seed-like objects across the floor to a destination, evolve over multiple generations. Those robots not able to push the seeds to the correct location are selected out and cannot pass on their code, while robots that perform comparatively better see their code reproduced, mutated, and recombined with that of other robots into the next generation -- a minimal model of natural selection. The new study by EPFL and UNIL researchers adds a novel dimension: once a foraging robot pushes a seed to the proper destination, it can decide whether it wants to share it or not. Evolutionary experiments lasting 500 generations were repeated for several scenarios of altruistic interaction -- how much is shared and to what cost for the individual -- and of genetic relatedness in the population. The researchers created groups of relatedness that, in the robot world, would be the equivalent of complete clones, siblings, cousins and non-relatives. The groups that shared along the lines of Hamilton's rule foraged better and passed their code onto the next generation.
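The selection scheme described above -- better foragers reproduce, with mutation and recombination -- can be sketched as a minimal genetic loop. The genome encoding, fitness stand-in, population size, and rates below are illustrative assumptions, not the EPFL/UNIL experimental setup.

```python
import random

# Minimal sketch of the selection loop described above: robots with higher
# foraging fitness pass on mutated, recombined copies of their "code."
# Genome encoding, fitness function, and rates are assumed for illustration.

random.seed(0)

def fitness(genome):
    return sum(genome)  # stand-in for seeds pushed to the correct location

def evolve(pop, generations=50, mutation_rate=0.05):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]        # selection: top half survives
        children = []
        for _ in range(len(pop)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(len(a))    # recombination: one-point crossover
            child = a[:cut] + b[cut:]
            child = [g if random.random() > mutation_rate else random.random()
                     for g in child]          # mutation: reset a few genes
            children.append(child)
        pop = children
    return pop

population = [[random.random() for _ in range(8)] for _ in range(20)]
evolved = evolve(population)
```

In the study, an extra "share or keep" decision gene would ride along in each genome, so selection acts on altruistic behavior as well as foraging skill.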

The quantitative results matched the predictions of Hamilton's rule surprisingly well, even in the presence of multiple interactions. Hamilton's original theory takes a limited and isolated vision of gene interaction into account, whereas the genetic simulations run in the foraging robots integrate effects of one gene on multiple other genes, with Hamilton's rule still holding true. The findings are already proving useful in swarm robotics. "We have been able to take this experiment and extract an algorithm that we can use to evolve cooperation in any type of robot," explains Floreano. "We are using this altruism algorithm to improve the control system of our flying robots and we see that it allows them to effectively collaborate and fly in swarm formation more successfully."

This research was funded by the Swiss National Science Foundation, the European Commission ECAgents and Swarmanoids projects, and the European Research Council.


Source