
Robots are becoming classroom tutors. But will they make the grade?

Mechanical mentors try to find their place as teacher’s helper

Pondering a tablet screen displaying a town scene, a pre-K student tilts her head to the side and taps her lip thoughtfully.

“What are we trying to find?” asks the plush, red and blue robot called Tega that’s perched on the desk beside the girl. The bot resembles a teddy bear–sized Furby.

“We are trying to find lavender-colored stuff,” the girl explains. Lavender is a new vocabulary word. “OK!” Tega chirps.

The girl uses her forefinger to pan around the scene. She eventually selects an image of a girl — not wearing purple. The game puts a red mark through her choice: wrong.

The girl slumps down in her chair, head dropped to her chest as Tega says, “I’m sure you will do better next time. I believe in you.”

The robot, which MIT researchers are testing with students in a Boston-area public school, tilts toward the girl, who leans in close so that her cheek is right next to Tega’s.

Now it’s the robot’s turn. “Time to perform!” it says. The scene on-screen shifts, as though the bot is telepathically controlling the tablet. “Hmm …”

Tega looks up at its partner, as though seeking confirmation that it’s doing this right, and the girl cups the bot’s cheeks encouragingly. The robot looks back at the screen. The girl rests her hand in the robot’s soft fur and murmurs, “I believe in you.”

This kind of tight connection is typical of child-robot interactions, says MIT social robotics and human-robot interaction researcher Cynthia Breazeal. Her team is investigating how this turn-taking robot can help students learn. Kids have a “special kind of affinity” with robots, she says.

When working with social robots, like this furry Tega, kids will chat with, hug, pet and otherwise treat the machines like living beings.

 H.W. Park et al/Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction

Although adults might quickly become disenchanted with machines that aren’t very perceptive or don’t speak more than scripted sentences, children are liable to chat with, listen to and otherwise treat even basic robots as sentient, social beings, says Tony Belpaeme, a social roboticist at Ghent University in Belgium. Researchers like Breazeal and Belpaeme are trying to leverage that connection to create robots that engage with kids as tutors and peer learners.

These robots aren’t meant to replace human teachers, says Paul Vogt, a social robotics and language development researcher at Tilburg University in the Netherlands. But customizable, endlessly patient automatons could provide students with one-on-one attention in crowded classrooms. That extra support may be especially helpful for children with special needs or for students who are learning in a different language than they’re used to, says Belpaeme, who is studying how robots can help immigrant children in Europe pick up a second language.

Robots might also help homeschooled students, proponents say, or teach in areas where human experts are in short supply. English-speaking robots are slated to enter some 500 Japanese classrooms this year for exactly that purpose. Hundreds of Chinese kindergarten classes also have adopted educational robots. But in Western countries, these devices have yet to invade classrooms.

Like any expensive educational technology, however, robots may never make it into every classroom. Computer and cognitive scientist Brian Scassellati of Yale University and colleagues have had success with a device named Keepon that looks like two stacked yellow tennis balls with eyes and a nose. “When we produce them in the lab, they’re probably costing us about $200 total,” he says. But many researchers use the humanoid Nao robot, which costs several thousand dollars a pop, raising the question of how many schools will be able to afford the classroom helpers.

Relatively inexpensive robots such as Keepon, which costs researchers about $200 to produce, may be an affordable option for schools on tight budgets.

 D. Leyzberg et al/Proceedings of the Annual Meeting of the Cognitive Science Society 2012

“There’s a lot of hype about robots,” says Goren Gordon, a natural and artificial curiosity researcher at Tel Aviv University. At this point, most testing has been short-term and conducted with small groups of children, so little is known about the potential risks involved when young kids keep close company with automatons. Yet early testing suggests that robots could help students learn new skills and promote good study habits and positive attitudes toward learning. Researchers still have a lot to figure out about best practices and potential impacts if educational robots are going to achieve tenure.

Here to help

Before grading robots on their teaching abilities, consider why automated educators might work better as physical rather than virtual entities. It turns out that a robot’s body may be just as important as its brain. A review of 33 studies examining how adults and children respond to physically present robots, videos of those robots and animated versions of them found that people generally view physical robots more positively, and find them more persuasive, than their on-screen counterparts. Jamy Li of the University of Twente in the Netherlands reported these results in 2015 in the International Journal of Human-Computer Studies.

“There’s something about robots that sets them apart from a computer,” Belpaeme says. “The exact same content delivered by a robot somehow makes our brains sit up and pay attention…. We don’t yet know why that is.” Still, roboticists have exploited that attention-grabbing edge to build machines that relay information on everything from math to nutrition and sign language.

Of course, a well-rounded education is about far more than learning facts. It’s also about developing good study habits and attitudes toward education that will make students lifelong learners. In this area, robots have proved useful.

On a very basic level, robots can make schoolwork more fun, proponents assert. “If kids enjoy learning, they’re going to learn more,” Belpaeme says. “It’s really as simple as that.” Researchers at the University of Wisconsin–Madison witnessed robots’ power to make schoolwork fun when they designed a bot named Minnie to support children’s reading at home. Minnie, described last August in Science Robotics, comments on a book as the child reads aloud, shows emotional responses to stories and summarizes plot points to support reading comprehension.

Educational robots have been designed to help students learn a wide range of topics, from handwriting to mathematics. This robot, named Minnie, supports children on reading assignments.

Sarah Maughan (2018)

Roboticist Bilge Mutlu and learning researcher Joseph Michaelis randomly assigned 24 students, ages 10 to 12, to spend two weeks reading aloud either alone or with Minnie. Afterward, the solo readers gave the activity more mixed reviews, reporting, for example, “I didn’t not like it, but I didn’t, like, really enjoy it.” Only four said the activity motivated them to read more. Kids in the robot group said reading to Minnie was “fun” and “a cool experience.” Seven students said they felt more motivated to read.

Robots can also encourage specific reasoning strategies, such as thinking aloud, which is supposed to help students craft more deliberate, organized plans for multistep problem-solving. Computer scientist Chien-Ming Huang of Johns Hopkins University and colleagues programmed a Nao robot to nod along with a child’s speech and remind students who lapse into silence to keep going.

To test whether this supportive robot helped students learn, researchers randomly assigned 26 kids who were about 11 years old to solve math word problems while thinking aloud, with or without the robot’s encouragement. From a pretest to a posttest taken about one week later, children who trained with the robot improved their scores by an average of 52 percent; solo students improved by an average of 39 percent, the researchers reported last March in Chicago at the International Conference on Human-Robot Interaction, or HRI 2018.

For a more deep-rooted effect on students’ educational experiences, robots can model certain beliefs about learning, like a growth mind-set: the idea that success comes from effort and perseverance, rather than inherent ability.

In one experiment, 33 children ages 5 to 9 solved geometric puzzles called tangrams with a Tega. Half the kids partnered with a robot that made growth mind-set comments, such as, “You are not afraid of a challenge. I like that!” Other students worked with a bot that stated facts only: “You solved the puzzle.”

Before and after working with the robot, each child completed an assessment that rated growth mind-set from 0 to 10. The growth mind-set cohort’s scores increased slightly on average, from 7.63 to 8.06, while the neutral bot group’s average dropped from 6.94 to 6.59, Breazeal and colleagues reported in Vienna at HRI 2017.

“As soon as you put a robot in the classroom, everything changes,” says Brian Scassellati of Yale University. “The kids are excited, they’re engaged.” The trick is getting devices like this Nao robot to hold students’ attention after the novelty is gone.

 Courtesy of B. Scassellati/Yale Univ.

Personalization problems

Although robots show the potential to positively influence students, tailoring a bot’s behavior to an individual is still a major challenge. Roboticists have created machines that can make some simple decisions, like choosing when to encourage a student to take a break. In a study presented at HRI 2017, Scassellati’s team found that when robots offered breaks as a reward for good work, or an opportunity to refocus if a student was struggling, children learned more than if the robot called time-outs at regular intervals.

Designing robots that track student performance to adjust pacing and choose what to teach next is trickier. Some robots have been programmed to adjust activity difficulty based on student proficiency, but researchers have had trouble showing that these bots help students learn more than generic robots do.

What if robots could go beyond responding to performance by keeping tabs on how a student is feeling? Gordon and colleagues at MIT explored this idea by creating a Tega robot that analyzed facial expressions for levels of engagement and valence, which is basically “the goodness of the emotion,” Gordon says. For instance, happiness has positive valence and anger has negative. While working with students on a Spanish-language learning game that involved packing for a trip to Spain, the robot offered various types of feedback, from an excited “Woo-hoo, you’re trying so hard!” to game-related comments like, “The suitcase looks heavy.”

“The robot slowly learns which … behaviors result in high valence and high engagement,” and becomes more likely to use those behaviors at the right time, Gordon says. In three to seven sessions over two months, two groups of nine preschoolers worked with either this adaptable Tega or a nonadaptive Tega. From the first to final session, kids in the personalized group generally became more positive about the interaction, with their valence increasing an average of seven points on a scale of −100 to 100. In the impersonal group, valence dropped an average of 18 points, researchers reported in Phoenix in 2016 at an Association for the Advancement of Artificial Intelligence conference.
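The learning loop Gordon describes works much like a multi-armed bandit algorithm from machine learning. As a rough illustration only (the behavior names, reward formula and epsilon-greedy strategy here are assumptions, not details of the actual Tega system), a bare-bones version might look like this in Python:

import random

# Hypothetical sketch of the bandit-style personalization described above:
# the robot tries different feedback behaviors, scores each attempt by the
# child's measured engagement and valence, and gradually favors whatever
# scores best. Behavior names and numbers are illustrative assumptions.
BEHAVIORS = ["excited_praise", "game_comment", "calm_prompt"]

class BehaviorLearner:
    def __init__(self, behaviors, epsilon=0.2):
        self.epsilon = epsilon                     # how often to explore
        self.totals = {b: 0.0 for b in behaviors}  # summed reward per behavior
        self.counts = {b: 0 for b in behaviors}    # times each behavior was used

    def choose(self):
        # Occasionally try a random behavior; otherwise pick the one with
        # the best average reward so far.
        if random.random() < self.epsilon:
            return random.choice(list(self.totals))
        return max(self.totals,
                   key=lambda b: self.totals[b] / max(self.counts[b], 1))

    def update(self, behavior, engagement, valence):
        # Reward combines the two signals the robot tracked.
        self.counts[behavior] += 1
        self.totals[behavior] += engagement + valence

learner = BehaviorLearner(BEHAVIORS)
for _ in range(100):
    b = learner.choose()
    # Stand-ins for the real facial-expression readings:
    engagement = random.uniform(0.0, 1.0)   # 0 = bored, 1 = rapt
    valence = random.uniform(-1.0, 1.0)     # negative to positive emotion
    learner.update(b, engagement, valence)
print(learner.counts)  # the favored behaviors accumulate the most uses

Over many simulated sessions, behaviors that tend to precede smiles and focused attention get chosen more often, which is the gist of how an adaptive robot could learn to deliver encouragement at the right time.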

Robots attuned to students’ thoughts and feelings may make better tutors and learning companions if they can offer the right level of personalization without becoming predictable. But some educators are concerned about the amount of data machines would have to collect and store to do that job right. Human teachers may be able to get a general read on a student’s state of mind. But a robot designed to exhaustively analyze every facial expression or game move a child makes may be able to gather such detailed information on kids that it would constitute an invasion of privacy.

This concern was raised in a series of focus groups with certified and student teachers discussing educational robots. Some participants worried that companies might try to buy that student data from schools, Sofia Serholt, a child-robot interaction researcher at Chalmers University of Technology in Sweden, and colleagues reported in November 2017 in AI & Society.

Robotics ethicist Amanda Sharkey also notes that kids might feel compelled to share private information with a robot peer that acts like a friend. One remedy might be to require robots to continually disclose what information they collect and who they share it with, says Sharkey, of the University of Sheffield in England.

Social savvy

If privacy concerns about oversharing with robots could be addressed, kids’ comfort with robotic companions could be a strong force for good in the classroom. Scassellati recalls one first-grade boy who worked on English language skills with a robot. “He was so afraid to talk in class, he was so worried about making mistakes,” Scassellati says. But when the student worked one-on-one with a nonjudgmental, patient robot peer, “the first time he made a mistake … and the robot corrected him, he paused for a second, and then he went on, and it was OK.”

Gordon similarly recalls an especially shy student, who “only whispered in your ear; he didn’t talk at all,” he says. But “after the fourth or fifth interaction [with a robot], he started hugging the robot. Every three or four minutes, just stopped and started hugging the robot.”

Capitalizing on this potential for child-robot kinship could help keep students invested even after the novelty effect wears off, so that educational robots don’t end up collecting dust in a corner, Michaelis says. To that end, researchers have begun investigating how robots programmed to be more convivial can better hold students’ attention and improve learning. 

Social robotics researcher Ginevra Castellano of Uppsala University in Sweden and colleagues programmed iCat, a yellow robot with a feline face, to express empathy, testing whether that would keep kids engaged with the robot over the long term. Over five weeks, iCat worked through weekly chess exercises with 16 children, ages 8 and 9, in Portugal. The robot, described in 2014 in the International Journal of Social Robotics, monitored the game status and students’ facial expressions and offered advice or emotional support when students looked unhappy. After the first and final interactions, kids filled out questionnaires that rated their feelings of social presence with the robot — that is, how much working with iCat felt like interacting with an intelligent, emotional being — from 1 to 5.

In an earlier study with a similar setup but a nonempathetic iCat robot, kids generally rated their sense of social presence between 2 and 4, and these scores declined between the first and fifth interactions. The empathetic iCat kept the kids at a high level of social presence — between 4 and 5 — from the first through the final session.

Students who played chess with an iCat robot designed to express empathy reported feeling as if they were having a real social interaction.

 I. Leite/Int. J. of Soc. Robotics 2014

But robots’ sociability can be a double-edged, distracting sword, as Belpaeme’s team discovered when using a sociable Nao robot to teach 7- and 8-year-olds in the United Kingdom a strategy for identifying prime numbers. Twelve kids worked with this robot, which used social behaviors such as calling the child by name and making eye contact. Another 11 students worked with an asocial bot. From a pretest to a posttest, kids who worked with the asocial bot improved their scores on a 12-point test by an average of 2.18 points; the social robot group improved by an average of 1.34 points, researchers reported in Portland, Ore., at HRI 2015.

The socially adept bot may have diverted attention away from the lesson; children spent about 45 percent more time looking at the social robot than the asocial one.

There are other reasons not to make the robots too engaging. Huang likens the dilemma to concerns about excessive screen time, which may put young children at risk for speech delay (SN Online: 5/12/17). “Obviously we have good intentions for these educational robots,” he says, “but the long-term side effects … are unclear.” Some teachers in Serholt’s focus groups expressed similar concerns that kids who spend too much time chatting with robots may lose some ability to decode human facial expressions or the youngsters may adopt more robotic mannerisms.

For Sharkey, “the main concern would be that [kids] come to prefer interacting with the robot.” A robot that’s always encouraging and never disagrees would probably be easier company than other kids. A child who spends more time hanging around agreeable machines than with peers may not develop the social skills necessary to navigate interpersonal conflict, Sharkey says.

Bridges left to cross

So far, investigations of student-robot interactions have typically lasted a couple of weeks or months at most. “What we would want to get up to is a full academic year,” Breazeal says. Roboticists also need to test their technology with children from more diverse backgrounds. Belpaeme and colleagues recently ran an experiment with tutoring robots that helped about 200 children learn a second language. Compared with most educational robot studies, 200 students is a staggering number, says Huang, but “in the real world, this is like nothing.”

Amid questions about how they should or shouldn’t behave, today’s robots are still pretty limited in what they can do. Educational robots are typically designed to work on very specific tasks. The robots still have trouble understanding the high-pitched and grammatically spotty speech of little kids and don’t have the dexterity to participate in many physical learning activities such as science lab experiments.

“We are still a long way” from educational robots that can interact with students like real people, says Ana Paiva, an artificial intelligence researcher at the University of Lisbon in Portugal. Still, it’s difficult to watch a kid doting on a fluffy Tega or making small talk with a seemingly interested Nao and not imagine a future where robots might join teachers and students in class photos.



By Maria Temming / Science News Technology Writer

Maria Temming is the technology writer at Science News. Maria has undergraduate degrees in physics and English from Elon University and a master's degree in science writing from MIT. She has written for Scientific American, Sky & Telescope and NOVA Next. She’s also a former Science News intern.

(Source: sciencenews.org; February 12, 2019; https://tinyurl.com/y5bdcnej)
