An Indiana University cognitive scientist and collaborators have found that posture is critical in the early stages of acquiring new knowledge.
The study, conducted by Linda Smith, a professor in the Department of Psychological and Brain Sciences in the IU Bloomington College of Arts and Sciences, in collaboration with a roboticist from England and a developmental psychologist from the University of Wisconsin-Madison, offers a new approach to studying how “objects of cognition,” such as words or memories of physical objects, are tied to the position of the body.
“This study shows that the body plays a role in early object name learning, and how toddlers use the body’s position in space to connect ideas,” Smith said. “The creation of a robot model for infant learning has far-reaching implications for how the brains of young people work.”
Using both robots and infants, researchers examined the role bodily position plays in the brain’s ability to “map” names to objects. They found that consistency in the body’s posture and spatial relationship to an object, while the object was viewed and its name spoken aloud, was critical to successfully connecting the name to the object.
The new insights stem from the field of epigenetic robotics, in which researchers are working to create robots that learn and develop like children, through interaction with their environment. Morse, the roboticist on the team, applied Smith’s earlier research to create a learning robot whose cognitive processes emerge from the physical constraints and capacities of its body.
“A number of studies suggest that memory is tightly tied to the location of an object,” Smith said. “None, however, have shown that bodily position plays a role or that, if you shift your body, you could forget.”
To reach these conclusions, the study’s authors conducted a series of experiments, first with robots programmed to map the name of an object to the object through a shared association with a posture, then with children aged 12 to 18 months.
In one experiment, a robot was first shown an object to its left, then a different object to its right, and the process was repeated several times to build an association between each object and one of the robot’s two postures. Next, with no objects in place, the robot’s view was directed to the left-hand location and it was given a command that elicited the same posture it had held while viewing that object. The two objects were then presented in their original locations without naming, and finally in different locations as their names were repeated. This caused the robot to turn and reach toward the object now associated with the name.
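The training-and-test procedure just described amounts to binding a name to an object only indirectly, through the posture that was active when each was experienced. As a rough illustration (this is a toy associative sketch, not the authors’ actual epigenetic-robotics model; the object labels and the name “modi” are invented for the example):

```python
from collections import defaultdict

class PostureBinder:
    """Toy model: names and objects are never linked directly,
    only through the posture active when each was experienced."""

    def __init__(self):
        self.posture_to_object = defaultdict(set)
        self.posture_to_name = defaultdict(set)

    def see_object(self, posture, obj):
        self.posture_to_object[posture].add(obj)

    def hear_name(self, posture, name):
        self.posture_to_name[posture].add(name)

    def object_for(self, name):
        # Collect every object that shares a posture with the name.
        candidates = set()
        for posture, names in self.posture_to_name.items():
            if name in names:
                candidates |= self.posture_to_object[posture]
        # An unambiguous binding exists only if exactly one object
        # was ever paired with the name's posture(s).
        return candidates.pop() if len(candidates) == 1 else None

binder = PostureBinder()
# Training: object A always appears on the left, object B on the right.
for _ in range(5):
    binder.see_object("left", "A")
    binder.see_object("right", "B")
# Naming: the left posture is re-elicited while "modi" is heard.
binder.hear_name("left", "modi")
print(binder.object_for("modi"))   # -> A

# Control: if both objects appear in both locations, the posture cue
# is ambiguous and the binding fails, as in the robot's failed trials.
binder.see_object("left", "B")
print(binder.object_for("modi"))   # -> None
```

The control case mirrors the paper’s key manipulation: with no posture uniquely tied to the target object, the name has nothing to bind to.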
The robot consistently indicated a connection between the object and its name across 20 repetitions of the experiment. But in subsequent tests where the target and another object were placed in both locations — so as not to be associated with a specific posture — the robot failed to recognize the target object. When the experiments were replicated with infants, there were only slight differences in the results: the infant data, like the robot’s, implicated posture in connecting names to objects.
“These experiments may provide a new way to investigate the way cognition is connected to the body, as well as new evidence that mental entities, such as thoughts, words and representations of objects, which seem to have no spatial or bodily components, first take shape through spatial relationship of the body within the surrounding world,” Smith said.
Smith’s research has long focused on creating a framework for understanding cognition that differs from the traditional view, which separates physical actions such as handling objects or walking up a hill from cognitive actions such as learning language or playing chess.
Additional research is needed to determine whether this study’s results apply to infants only, or more broadly to the relationship between the brain, the body and memory, she added. The study may also provide new approaches to research on developmental disorders in which difficulties with motor coordination and cognitive development are well-documented but poorly understood.
Scientists have developed an octopus-like robot that can zoom through water with ultra-fast propulsion and acceleration never before seen in man-made underwater vehicles.
Most fast aquatic animals are sleek and slender to help them move easily through the water, but cephalopods such as the octopus are capable of high-speed escapes by filling their bodies with water and then quickly expelling it to dart away.
Inspired by this, scientists from the University of Southampton, Massachusetts Institute of Technology (MIT) and the Singapore-MIT Alliance for Research and Technology built a deformable octopus-like robot with a 3D printed skeleton with no moving parts and no energy storage device other than a thin elastic outer hull.
The 30cm-long self-propelling robot is inflated with water and then rapidly deflates, shooting the water out through its base to power its outstanding propulsion and acceleration, despite starting from a non-streamlined shape. As the robot contracts, it can achieve more than 2.6 times the thrust of a rigid rocket performing the same manoeuvre.
It works like blowing up a balloon and then releasing it to fly around the room. However, the 3D printed polycarbonate skeleton inside keeps the balloon taut and the final shape streamlined, while fins on the back keep it going straight. The robot is capable of accelerating up to ten body lengths in less than a second. In recent laboratory tests, the robot accelerated a one-kilogram payload up to 6mph in less than a second. This is comparable to a Mini Cooper carrying an additional 350kg of weight (bringing the total weight of the car to 1,000kg) accelerating from a standstill to 60mph in one second – underwater.
This performance is unprecedented in man-made underwater vehicles.
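As a rough sanity check on the figures quoted above (the masses, speeds, and one-second timings are as stated in the article; the rest is simple unit conversion and kinematics):

```python
# Quoted figures: a 1 kg payload reaches 6 mph in under a second,
# compared with a 1,000 kg car reaching 60 mph in one second.
MPH_TO_MS = 0.44704   # exact definition: 1 mph in metres per second
G = 9.81              # standard gravity, m/s^2

v_robot = 6 * MPH_TO_MS        # ~2.68 m/s
a_robot = v_robot / 1.0        # assuming the full second: ~2.7 m/s^2

v_car = 60 * MPH_TO_MS         # ~26.8 m/s
a_car = v_car / 1.0            # ~26.8 m/s^2, roughly 2.7 g

print(f"robot: {v_robot:.2f} m/s, {a_robot:.2f} m/s^2 ({a_robot/G:.2f} g)")
print(f"car:   {v_car:.2f} m/s, {a_car:.2f} m/s^2 ({a_car/G:.2f} g)")
```

Note that the article’s comparison scales both the mass (1 kg to 1,000 kg) and the speed (6 mph to 60 mph), so the car analogy is illustrative rather than a like-for-like acceleration figure.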
Dr Gabriel Weymouth, Lecturer for the Southampton Marine and Maritime Institute at the University of Southampton and lead author of the study, says:
“Man-made underwater vehicles are designed to be as streamlined as possible, but with the exception of torpedoes, which use massive amounts of propellent, none of these vehicles achieve speeds of even a single body length per second or accelerations of 0.1g, despite significant mechanical complexity.
“Rigid bodies always lose energy to the surrounding water, but the rapidly shrinking form of the robot actually uses the water to help propel its ultra-fast escape, resulting in 53 per cent energy efficiency, which is better than the upper estimates for fast-starting fish.”
The researchers calculate that making the robot bigger would improve its fast-starting performance, which could have applications in the development of artificial underwater vehicles that can match the speed, manoeuvrability and efficiency of their biological inspirations. The understanding this study provides could also have an impact in other engineering fields where drag is critical, such as airplane wing design, and to the study of different shape-changing biological systems.
WildCat is a four-legged robot being developed to run fast on all types of terrain. So far WildCat has run at about 16 mph on flat terrain using bounding and galloping gaits. The video shows WildCat’s best performance so far. WildCat is being developed by Boston Dynamics with funding from DARPA’s M3 program. For more information about WildCat visit: www.BostonDynamics.com
What has two arms, two legs and 28 hydraulically actuated joints? Its name is ATLAS, and it could be in charge of saving your life someday.
The Pentagon’s Defense Advanced Research Projects Agency is always unveiling new gizmos and gadgets, but the latest effort from DARPA seems more out of a sci-fi movie than a science lab. The Defense Department’s experimental unit is looking to see what kind of robot it can develop to someday provide humanitarian aid and disaster relief, and it’s outsourcing that mission to scientists from the likes of MIT and other prestigious institutions participating in its latest Virtual Robotics Challenge (VRC).
When the VRC got underway last week, DARPA provided the six competing teams with access to ATLAS, a 6-foot-2-inch, 330-pound (1.88 m, 150 kg) robot that ideally will be able to enter disaster areas someday and provide relief in instances where direct human involvement is impractical or impossible.
“During the first 24 hours [of Fukushima], there were several opportunities for intervention to help make the disaster less severe,” DARPA’s Dr. Gill Pratt told Ars Technica, “but unfortunately, people could not go in to that zone because the radiation was too high, and as a result, the disaster was worse than it could have been.”
In order to see what solutions could exist in the event of another nuclear disaster, this year’s VRC will ask scientists to work through December 2013 on programming the ATLAS skeleton to interact with its environment in the most efficient way possible. The builders behind the robot’s body, Boston Dynamics, say ATLAS “is a high mobility, humanoid robot designed to negotiate outdoor, rough terrain” with the ability for bipedal locomotion and the other perks that come with a few pairs of limbs.
“In extremely challenging terrain, ATLAS is strong and coordinated enough to climb using hands and feet, to pick its way through congested spaces,” Boston Dynamics said.
Boston Dynamics and DARPA have provided several copies of the ATLAS robot to this year’s VRC competitors in hopes of seeing who can put the best set of brains inside the hunk of metal. Competitors this year include the Institute for Human and Machine Cognition, Worcester Polytechnic Institute, TRACLabs Inc, Caltech’s Jet Propulsion Laboratory, Virginia Tech and the Massachusetts Institute of Technology.
“The Virtual Robotics Challenge was a proving ground for teams’ ability to create software to control a robot in a hypothetical scenario,” Dr. Pratt said in a statement issued by DARPA. “Now these seven teams will see if their simulation-honed algorithms can run a real machine in real environments. And we expect all teams will be further refining their algorithms, using both simulation and experimentation.”
In a video released by DARPA, ATLAS walks briskly, avoids obstacles, climbs stairs and even withstands the force of a mid-sized wrecking ball swung at its torso. After all, the six teams will have to prove that their personally programmed ATLAS can withstand any event that a future army of robots might have to endure if unleashed in the wild: for humanitarian purposes, of course.
A climbing robot that grasps the micro-texture of the surface using special feet and special motions. The development team includes U Penn, Stanford, Berkeley, Carnegie Mellon and Boston Dynamics. The work was funded by DARPA.