Key Takeaways
- In a study about perception by touch, SEAS researchers found that people form goals and use different styles of movement when they encounter unfamiliar objects.
- The study offers insight into designing user experiences in robotics, gaming, virtual tours and more.
To estimate the weight of a rock, you pick it up. Is it rough, or smooth? You run a finger over it. We’re constantly gathering information through our sense of touch, which is closely connected to how we move.
Patterns of movement that humans use to explore the physical world have long been studied in contexts ranging from human psychology to robotic perception. Psychologists call these movements “exploratory procedures” – using touch to reach a goal, such as identifying a property like weight or roughness.
Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) wondered whether other, new patterns of movement emerge when people encounter unfamiliar objects without any explicitly defined goals. A study led by robotics researcher and artist Buse Aktaş, former SEAS graduate student and current research group leader at the Max Planck Institute for Intelligent Systems, blended interactive art and observational science to examine how open-ended touch gives rise to new, distinctive categories of movement.
Published in PLOS One, the results could lead to advances in human-robot or human-machine interactions, or more sophisticated uses of human touch responses in medical, industrial, or artistic applications.
“I’m very interested in how engineering design principles map to human behavior – especially if it’s in a creative context, and if we give creative freedom to our subjects,” said Aktaş, who led the project during her Ph.D. work with Professor Robert Howe, the Abbott and James Lawrence Professor of Engineering at SEAS.
Art installation-like experiment
In their study, Aktaş and team designed an art installation-like experiment to systematically observe how people touch and interact with unfamiliar objects. For the study design, they enlisted the expertise of co-author Roberta Klatzky of Carnegie Mellon University, a leading expert in the psychology of haptics (perception by touch).
During data collection, 40 participants cycled through three stations set up with various objects, some familiar – like a potato chip bag and a rolling pin – and some abstract, like a purple geometric form with a black strip. The third station was biomorphic, consisting of an intestine-like tubular structure with soft spikes.
Each structure would periodically stiffen and soften using pumped air, so that the researchers could see how participants responded to unexpected physical state changes. All interactions were recorded and analyzed.
Following the study, Aktaş created a related art installation at the Harvard Art Lab.
Detailed analysis revealed that participants performed “distinct and reliably observable interactive procedures,” according to their paper. These actions could all be linked to “self-determined goals” each participant devised upon encountering the strange objects, such as information-gathering, manipulation or play.
“I think one of the biggest takeaways was that people, even if you give them no goal, make up their own goals,” Aktaş said.
New categories of physical interactions
Four categories of physical interactions emerged. In “passive observational” interactions, users learned about the properties of the objects with little to no touch, such as hovering their hands, or stepping back to observe. “Active perceptual” interactions gave participants more information through touch, such as pressing, lifting, or rubbing. “Constructive” actions were performed to create new shapes or arrangements, like stacking, coiling, folding, flattening, or making knots or dents. Finally, “hedonic” actions elicited sensory experiences, such as stroking, flicking or massaging.
The researchers found that patterns of movement varied based on the type of object. People performed more “constructive” interactions at the abstract station, but were more passive and observational at the station with the potato chip bag, suggesting that prior knowledge about objects shaped the type and amount of physical interaction that followed.
They also found that state changes, such as stiffening and softening, tended to lengthen the amount of time people spent physically interacting with an object.
The categories and observations could offer a foundation for designing interactive experiences more intentionally – for example, in an immersive video game, distinguishing between passive observation and active perception could inform the balance of visual and tactile cues, the researchers wrote.
The results could also hold value for designing interaction protocols for “safe, fluent, intuitive and rich” human-machine collaboration, the researchers wrote. This could have implications for the growing use of virtual tours and walkthroughs in art museums or real estate, as well as computer games where open-ended exploration is core to the experience.
As robotic systems increasingly embed intelligence through smart and responsive materials, eliciting human interactions through material design is another potential direction for the work.
Other paper co-authors were Paris Myers and Emily Salem.
Topics: Design, Materials Science & Mechanical Engineering, Robotics
Press Contact
Anne J. Manning | amanning@seas.harvard.edu