

      The Effects of Video Instructor’s Body Language on Students’ Distribution of Visual Attention: an Eye-tracking Study

      proceedings-article
      Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI)
      Human Computer Interaction Conference
      4 - 6 July 2018
      Video lectures, Social signals, Eye tracking, Embodied pedagogical agents

            Abstract

            Previous studies have shown that the instructor’s presence in video lectures has a positive effect on learners’ experience. However, it increases the cost of video production and may increase learners’ cognitive load. An alternative to the instructor’s presence is the use of embodied pedagogical agents that display limited but appropriate social signals. In this extended abstract, we report a small experimental study into the effects of a video instructor’s behaviour on students’ learning experience, with the long-term aim of better understanding which of the instructor’s social signals should be applied to pedagogical agents. We used eye-tracking technology and data-visualisation techniques to collect and analyse students’ distribution of visual attention in relation to the instructor’s speech and body language. Participants also answered questions about their attitudes towards the instructor. The results suggest that the instructor’s gaze directed towards the lecture slides, or a pointing gesture towards the slides, is not enough on its own to shift viewers’ attention; however, the combination of the two is effective. An embodied pedagogical agent should therefore display multimodal behaviour, combining gaze and gestures, to direct learners’ visual attention towards the relevant material effectively. Furthermore, to make learners attend to the lecturer’s speech, the instructional agent should make use of pauses and emphasis.
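
            The “distribution of visual attention” described above is commonly quantified as the share of total fixation time each viewer spends in predefined areas of interest (AOIs), such as the instructor region versus the slide region. The paper does not publish its analysis code; the following is a minimal, hypothetical sketch of that standard computation, with AOI labels and the `attention_distribution` helper invented for illustration.

            ```python
            # Hypothetical sketch (not the authors' code): compute each viewer's
            # distribution of visual attention as the proportion of fixation time
            # spent in each area of interest (AOI), e.g. "instructor" vs. "slides".
            from collections import defaultdict

            def attention_distribution(fixations):
                """fixations: list of (aoi_label, duration_ms) tuples
                from an eye tracker, after mapping fixation coordinates to AOIs.
                Returns {aoi_label: proportion of total fixation time}."""
                totals = defaultdict(float)
                for aoi, duration_ms in fixations:
                    totals[aoi] += duration_ms
                grand_total = sum(totals.values())
                return {aoi: t / grand_total for aoi, t in totals.items()}

            # Toy example: 500 ms on the instructor, 500 ms on the slides in total.
            sample = [("instructor", 300), ("slides", 500), ("instructor", 200)]
            print(attention_distribution(sample))
            # {'instructor': 0.5, 'slides': 0.5}
            ```

            Comparing these proportions across experimental conditions (gaze only, gesture only, gaze plus gesture) is one way the attention-shift effects reported above could be measured.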

            Content

            Author and article information

            Contributors
            Conference
            July 2018
            Pages 1–5
            Affiliations
            [0001] Beijing University of Posts and Telecommunications
            [0002] Queen Mary University of London
            [0003] Tokyo University of Agriculture and Technology
            Article
            10.14236/ewic/HCI2018.101
            © Zhang et al. Published by BCS Learning and Development Ltd. Proceedings of British HCI 2018. Belfast, UK.

            This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

            Proceedings of the 32nd International BCS Human Computer Interaction Conference
            HCI
            32
            Belfast, UK
            4 - 6 July 2018
            Electronic Workshops in Computing (eWiC)
            Human Computer Interaction Conference

            ISSN: 1477-9358. BCS Learning & Development

            Self URI (article page): https://www.scienceopen.com/hosted-document?doi=10.14236/ewic/HCI2018.101
            Self URI (journal page): https://ewic.bcs.org/
            Categories
            Electronic Workshops in Computing

            Applied computer science, Computer science, Security & Cryptology, Graphics & Multimedia design, General computer science, Human-computer interaction
            Video lectures, Social signals, Eye tracking, Embodied pedagogical agents

