ALC2022_Poster_AIChatBotExcel.pdf (339.77 kB)

AI Chatbot to teach Microsoft Excel

poster
Posted on 25.03.2022, 08:58; authored by Huiyu Zhang, Linda Fang, Huang Miao, Ester Goh
At Temasek Polytechnic, an independent learning week, termed FLEX Week, provides students with opportunities for competency development and enrichment beyond the classroom. Many face-to-face workshops had to move online because of the COVID-19 pandemic. In October 2020, the School of Informatics and Information Technology (IIT) partnered with the School of Applied Sciences (ASC) to develop an Artificial Intelligence (AI) Chatbot using Google Dialogflow. It was to act as an online learning assistant that taught Microsoft Excel to first- and second-year ASC students from the Diploma in Chemical Engineering (CHE) during FLEX Week, 19 to 23 July 2021. The Chatbot was designed to guide students in the use of Microsoft Excel (version 2013) and help them acquire the skills required to pass the Microsoft Excel Expert Certification. Its menu allowed users to select what to learn. The Chatbot provided scaffolds by recommending learning tasks that matched learners' prior knowledge and were organised in order of increasing difficulty; 149 one-minute video lessons were prepared. It also provided just-in-time answers to students' content questions, but did not attempt any emotional understanding of the intent. Students could access the lessons from their laptop or handphone by clicking a URL, whereupon the dialog window would pop up. This paper examines the challenges of this pilot study and provides recommendations for the design. Data came from a post-workshop survey with 44 respondents. The Chatbot was generally perceived positively as a learning assistant, and its learning menu helped students plan what to learn. However, a major challenge was sustaining the use of the Chatbot for independent learning, so that students resisted the temptation to turn first to their human tutor, peers or the internet for answers. A second challenge was avoiding conversational dead-ends.
While the Natural Language Processing (NLP) engine powering the Chatbot is mature and reliable, the curated database of training phrases (intents/questions) and responses (answers) gave each intent nine additional phrasing variations to further mitigate this challenge. Even so, it was a major turn-off when the intent of a user interaction was not identified correctly. A third challenge was viewing the responses in a small default messenger window, which filled either a handphone screen or a quarter of a laptop screen. To address the first challenge, animated responses (screenshots showing a sequence of steps, compiled into a Graphics Interchange Format (GIF) animation) would be more helpful and could replace text-based answers. For the second challenge, more question variations expressed in "student lingo" could be included. If the Chatbot does not detect the intent, an email would be triggered to the human agent (i.e. the tutor), who could then respond via email and submit new training phrases to be included in that intent. The third challenge would require modifying the code/scripts to expand the window to occupy the whole webpage on a computer screen. The learning menu could be limited to three levels of hierarchy. Once the design is improved, analytics can be applied to the history or logs of student interactions, for example to identify the most visited topics or concepts, which point to areas of doubt requiring further elaboration in class, and to find the seasonality of usage so that additional help sessions can be scheduled in these key periods. This would help the academic staff and developers better understand the learner's journey with the Chatbot.
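The proposed escalation path for unmatched intents could be implemented in a Dialogflow fulfillment webhook. The sketch below is a minimal illustration, not the authors' implementation: the tutor address is hypothetical, the email sender is injected so no SMTP server is assumed, and only the fallback branch is shown. It inspects the intent name in a Dialogflow ES fulfillment request and, on the Default Fallback Intent, drafts an email to the tutor while telling the student their question has been passed on.

```python
from email.message import EmailMessage

TUTOR_EMAIL = "tutor@example.edu"  # hypothetical address for illustration


def handle_fulfillment(request_json, send=None):
    """Handle a parsed Dialogflow ES fulfillment request body.

    If the matched intent is the Default Fallback Intent, forward the
    unrecognised question to the human tutor by email; otherwise fall
    through to normal intent handling (elided here).
    """
    query = request_json["queryResult"]
    intent = query["intent"]["displayName"]
    text = query.get("queryText", "")

    if intent == "Default Fallback Intent":
        msg = EmailMessage()
        msg["To"] = TUTOR_EMAIL
        msg["Subject"] = "Chatbot could not answer a question"
        msg.set_content(f"Unmatched student question: {text!r}")
        # A real deployment would pass something like smtplib.SMTP(...).send_message;
        # injecting the sender keeps this sketch testable without a mail server.
        if send is not None:
            send(msg)
        reply = ("Sorry, I could not find an answer. "
                 "Your question has been sent to a tutor.")
    else:
        reply = f"(normal handling for intent {intent})"

    # Dialogflow reads the bot's reply from the "fulfillmentText" field.
    return {"fulfillmentText": reply}
```

The tutor's follow-up (replying to the student and adding the question as a new training phrase for the intent) would remain a manual step, as described above.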
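The analytics proposed above (most visited topics, seasonality of usage) can be sketched with simple counting over interaction logs. The log format and topic names below are invented for illustration; the actual Dialogflow history would need to be exported into a comparable shape.

```python
from collections import Counter
from datetime import datetime

# Hypothetical log format: (ISO timestamp, matched intent/topic name)
logs = [
    ("2021-07-19T09:05:00", "VLOOKUP basics"),
    ("2021-07-19T09:20:00", "Pivot tables"),
    ("2021-07-20T14:02:00", "VLOOKUP basics"),
    ("2021-07-20T14:30:00", "VLOOKUP basics"),
    ("2021-07-21T20:15:00", "Conditional formatting"),
]


def most_visited(records, n=3):
    """Top-n topics by hit count: candidates for elaboration in class."""
    return Counter(topic for _, topic in records).most_common(n)


def usage_by_hour(records):
    """Hit counts per hour of day, to spot peaks for extra help sessions."""
    return Counter(datetime.fromisoformat(ts).hour for ts, _ in records)
```

For example, `most_visited(logs)` would surface "VLOOKUP basics" as the dominant area of doubt, while `usage_by_hour(logs)` shows when students actually consulted the Chatbot.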

History

Journal/Conference/Book title

Applied Learning Conference 2022, 20-21 January 2022, Online

Publication date

2022-01