Robotic Learning Companions for Early Language Development

Jacqueline M. Kory
MIT Media Lab
20 Ames St.
Cambridge, MA 02139
jakory@media.mit.edu

Sooyeon Jeong
MIT
77 Mass. Ave.
Cambridge, MA 02139
sooyeon@mit.edu

Cynthia L. Breazeal
MIT Media Lab
20 Ames St.
Cambridge, MA 02139
cynthiab@media.mit.edu

ABSTRACT
Research from the past two decades indicates that preschool is a critical time for children's oral language and vocabulary development, which in turn is a primary predictor of later academic success. However, given the inherently social nature of language learning, it is difficult to develop scalable interventions for young children. Here, we present one solution in the form of robotic learning companions, using the DragonBot platform. Designed as interactive, social characters, these robots combine the flexibility and personalization afforded by educational software with a crucial social context, as peers and conversation partners. They can supplement teachers and caregivers, allowing remote operation as well as the potential for autonomously participating with children in language learning activities. Our aim is to demonstrate the efficacy of the DragonBot platform as an engaging, social learning companion.

Categories and Subject Descriptors
H.1.2 [Information Systems]: User/Machine Systems – human factors, human information processing; I.2.9 [Robotics]

Keywords
Education; learning; play; robotic learning companion; sociable robots

1. MOTIVATION
Research from the past two decades has revealed that a primary predictor for school-aged children's learning and academic success is the early development of oral language knowledge and vocabulary skills [2,7]. Critical in developing these skills is early exposure not only to a rich set of words but also to a large volume of words [2]. The context in which these words are encountered is equally important – that is, language is inherently social. Infants can learn to differentiate phonemes when hearing a person speak, but not from an audiovisual recording [3]; similarly, children can learn some vocabulary when watching television, but they may not learn grammar rules or complex sentence structures [5]. Interactivity and the shared context of speaker and listener are crucial. Any interventions targeting young children's language development must account for this social context.

In our research, we have found that sociable robots, used as robotic learning companions, can provide the necessary social setting for language learning [1,4,8]. Social robots leverage the ways people already communicate with one another – that is, social cues that humans easily interpret – to create more intuitive interfaces for interaction. Their behaviors may incorporate speech, nonverbal behaviors such as mimicry, gaze following, and synchrony, movement, or expressions of affect. Combined with educational software, which can provide a student-paced learning experience, customized curricula, and more individual attention than teachers can allocate in a classroom setting, we argue that our robotic learning companions could provide an effective, engaging, and scalable educational experience for young students.

Figure 1: Two social, interactive robotic characters, called dragonbots, used as language learning companions for young children.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. Copyright is held by the owner/author(s). ICMI '13, December 9-13, 2013, Sydney, NSW, Australia. ACM 978-1-4503-2129-7/13/12. http://dx.doi.org/10.1145/2522848.2531750

2. ROBOT PLATFORM
We are using the DragonBot platform, designed by Adam Setapen and collaborators [1,6], as our robotic learning companion (Figure 1). The robot is based on "squash and stretch" principles of animation [9], creating more natural and organic motion and allowing for a range of expressive body movements while keeping the actuator count low. A smartphone runs the software controlling the robot and provides a screen for the robot's animated face. The phone's camera, microphone, speaker, and wireless capabilities are used to support remote presence interactions. A custom tele-operation interface that runs on either a tablet or a laptop computer allows researchers, caregivers, or teachers to speak and act "as" the robot.
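To make the "squash and stretch" idea concrete, the helper below sketches the classical volume-preserving version of the principle in 2D: stretching a body along one axis narrows it along the other by the reciprocal, so apparent volume stays constant. This is only an illustrative sketch; it is not part of the published DragonBot software.

```python
def squash_stretch(stretch: float) -> tuple[float, float]:
    """Return (x_scale, y_scale) for a 2D 'squash and stretch' pose.

    Classical animation keeps apparent volume constant: stretching
    vertically by `stretch` narrows the body horizontally by the
    reciprocal, so x_scale * y_scale == 1.  Hypothetical helper, not
    the DragonBot motion system.
    """
    if stretch <= 0:
        raise ValueError("stretch must be positive")
    return (1.0 / stretch, stretch)


# A bounce might stretch to 1.3x on the way down and squash to 0.8x
# on impact; each pose preserves volume:
down_pose = squash_stretch(1.3)
impact_pose = squash_stretch(0.8)
```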

Figure 2: The tablet-based tele-operation interface for the dragonbot, which allows remote presence interactions.
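One way a tele-operation interface like this might work is by serializing each operator action (trigger an expression, direct gaze, play a recorded phrase) as a small message sent to the phone on the robot. The sketch below is purely hypothetical: the action names and JSON schema are our assumptions, not the actual DragonBot protocol.

```python
import json

def make_command(action: str, **params) -> str:
    """Serialize one operator action as a JSON message for the robot.

    Hypothetical schema for illustration only; the real DragonBot
    tele-operation protocol is not described in this paper.
    """
    allowed = {"express", "look_at", "say", "stream_voice"}
    if action not in allowed:
        raise ValueError(f"unknown action: {action}")
    return json.dumps({"action": action, "params": params})


# An operator triggering a "surprise" expression, then a recorded phrase:
msg1 = make_command("express", emotion="surprise", intensity=0.8)
msg2 = make_command("say", clip="greeting_01")
```

Keeping the wire format this small is one plausible way to let the same commands come from either a human operator or, later, an autonomous controller.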

The robots can display different emotions and internal states through their body motion and facial expressions, including agreement, disagreement, surprise, interest, confusion, and shyness, among others. They can follow or direct a child’s gaze by turning their bodies and moving their eyes. Speech is available through pre-recorded audio tracks or real-time voice streaming and voice pitch shifting from the tele-operation interface.
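The voice pitch shifting mentioned above can be illustrated with a minimal sketch. The naive resampling below raises or lowers pitch but also changes the clip's duration; real-time operator-voice pipelines would typically use a phase vocoder or PSOLA to preserve duration, and nothing here reflects the actual DragonBot implementation.

```python
import numpy as np

def naive_pitch_shift(samples: np.ndarray, factor: float) -> np.ndarray:
    """Shift pitch by `factor` via linear-interpolation resampling.

    factor > 1 raises pitch (and shortens the clip); factor < 1
    lowers it (and lengthens the clip).  Illustrative sketch only.
    """
    n_out = int(len(samples) / factor)
    # Fractional read positions in the input signal.
    positions = np.arange(n_out) * factor
    return np.interp(positions, np.arange(len(samples)), samples)


# A 440 Hz tone at 8 kHz, shifted up one octave (to 880 Hz when
# played back at the same sample rate):
sr = 8000
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
shifted = naive_pitch_shift(tone, 2.0)
```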


3. LEARNING COMPANION The robots are versatile and could support a variety of interactions and curricula. For example, in upcoming work, we use plastic animals as well as tablet-based games as conversation props or play scenarios. In an initial study, a DragonBot played a digital tablet-based “food-sharing” game with preschool-age children and their parents [1]. This study revealed that through the interaction, children engaged in communicative and social behaviors, as well as language mimicry [1]. Parents guided children’s behavior and reinforced the robot as a social actor in the interaction, without prompting. This highlights how educational technologies, like these robots, are not designed to be replacements for parents or teachers – quite the opposite. The goal is to supplement what caregivers are already doing and scaffold or model beneficial behaviors that caregivers may not know to use.


4. CONCLUSION
Our goal is to demonstrate the efficacy of the DragonBot platform as an engaging and social learning companion, peer, and conversation partner for young children. Future work includes refining the robot's repertoire of behaviors, introducing more autonomy, developing more extensive curricula and games to support joint child-robot language activities, and formally evaluating the robot's abilities to support language development at a local preschool.



5. ACKNOWLEDGMENTS
We thank Natalie Freed, Adam Setapen, and the Personal Robots Group for their work developing the DragonBot platform. This research was supported by the National Science Foundation (NSF) under Grants 122886 and CCF-1138986, and by a Graduate Research Fellowship under Grant No. 1122374. Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the authors and do not represent the views of the NSF.

6. REFERENCES
[1] Freed, N. A. 2012. "This is the fluffy robot that only speaks French": language use between preschoolers, their families, and a social robot while sharing virtual toys. Master's Thesis. Massachusetts Institute of Technology.
[2] Hart, B. and Risley, T. R. 1995. Meaningful Differences in the Everyday Experience of Young American Children. Paul H. Brookes Publishing, Baltimore, MD.
[3] Kuhl, P. K. 2007. Is speech learning 'gated' by the social brain? Developmental Science, 10, 1, 110-120.
[4] Movellan, J., Eckhardt, M., Virnes, M. and Rodriguez, A. 2009. Sociable robot improves toddler vocabulary skills. In Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction (La Jolla, CA, March 9-13, 2009). ACM, New York, NY, 307-308.
[5] Naigles, L. R. and Mayeux, L. 2001. Television as incidental language teacher. In Handbook of Children and the Media, D. G. Singer and J. L. Singer, Eds. Sage Publications, Thousand Oaks, CA, 135-152.
[6] Setapen, A. M. 2012. Creating robotic characters for long-term interaction. Master's Thesis. Massachusetts Institute of Technology.
[7] Snow, C. E., Porche, M. V., Tabors, P. O. and Harris, S. R. 2007. Is Literacy Enough? Pathways to Academic Success for Adolescents. Paul H. Brookes Publishing, Baltimore, MD.
[8] Tanaka, F. and Matsuzoe, S. 2012. Children teach a care-receiving robot to promote their learning: Field experiments in a classroom for vocabulary learning. Journal of Human-Robot Interaction, 1, 1.
[9] Thomas, F. and Johnston, O. 1981. Disney Animation: The Illusion of Life. Walt Disney Productions.