Chatbots are software applications that allow interaction through text messages and are commonly used to answer FAQs on various platforms. In educational settings, chatbots have been used to identify students' struggles during their university journey in a bid to encourage student retention [1], as well as to provide factual, administrative answers (e.g. "When is Assignment 1 due?") in university courses. It is rare, however, for a chatbot to function as a tutor, providing context-based scientific answers to student questions in an online learning course. In this study, we implemented a chatbot in Chem Quest (CQ), a non-compulsory, non-credit-bearing online chemistry preparatory course for matriculated students in the Chemical Engineering and Food Technology (CEFT) cluster at the Singapore Institute of Technology (SIT). The chatbot is designed to answer content-related questions that students may ask while completing the tutorial questions on CQ in the absence of an instructor. It is worth noting that the chatbot was designed not to provide direct solutions to the tutorial questions, but rather to scaffold students' learning and guide them towards the solution of the problem. Furthermore, the questions answered are context-based (e.g. "Why do I divide by 3 instead of 8?") rather than factual questions that can be easily answered by search engines. Herein we present the chatbot usage-related findings of this innovative project.
A proof-of-concept of the chatbot [2] was implemented on Chatlayer, a conversational Artificial Intelligence (AI) platform [3], as a Software-as-a-Service (SaaS) solution. The chatbot was offered alongside CQ to 194 matriculated CEFT students for two months starting July 2021. The chatbot is equipped with a knowledge base that stores all CQ-relevant textual information, from which answers to students' questions are retrieved. Twelve (12) paid participants were involved in this study, with no specific instructions on the type of questions to ask the chatbot. The chatbot usage statistics were analysed to study the level of student-chatbot interaction.
33% of the students offered CQ completed the voluntary online course, and 24 students interacted with the chatbot at least once (including the study participants). These student-chatbot interactions resulted in a total of 123 recorded conversations, where each conversation is defined as a continuous interaction in which user text messages are sent within 15 minutes of the chatbot's last message. A total of 979 messages were sent by students to the chatbot, averaging nearly 8 messages and 5.3 minutes per conversation. During these conversations, certain topics received more questions than others ("Matter and Energy" in this case), giving insight into students' understanding of particular topics and prompting further enhancement of the content of those topics. Despite the high volume of messages sent, 90% of user messages were correctly understood by the chatbot. Importantly, 53% of user messages were sent outside the office hours of a typical online instructor (9 am to 5 pm on weekdays, with a lunch break), suggesting that the chatbot's 24/7 availability is a useful feature for students. Finally, students who rated the support given by the chatbot gave an average neutral score of 3 out of 5. The usage analytics thus show, through various metrics, students' level of interaction with an available chatbot during an online course.
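To illustrate how the conversation counts and per-conversation averages above could be derived from raw chat logs, the sketch below applies the 15-minute rule described in this study. It is a minimal illustration only, not the actual Chatlayer analytics pipeline; the log format (timestamp, sender) and the helper names are assumptions.

```python
from datetime import datetime, timedelta

# Illustrative sketch only; the study relied on the platform's own usage analytics.
# Each log entry is assumed to be a (timestamp, sender) pair, sender being "user" or "bot".
SESSION_GAP = timedelta(minutes=15)

def split_into_conversations(messages):
    """Group chronologically ordered messages into conversations.

    A user message starts a new conversation when it arrives more than
    15 minutes after the chatbot's last message; otherwise it continues
    the current conversation.
    """
    conversations, current, last_bot_time = [], [], None
    for timestamp, sender in messages:
        if (sender == "user" and current and last_bot_time is not None
                and timestamp - last_bot_time > SESSION_GAP):
            # More than 15 minutes since the bot's last message: close the conversation.
            conversations.append(current)
            current, last_bot_time = [], None
        current.append((timestamp, sender))
        if sender == "bot":
            last_bot_time = timestamp
    if current:
        conversations.append(current)
    return conversations

def summarise(conversations):
    """Report total user messages, average messages and average minutes per conversation."""
    user_counts = [sum(1 for _, s in conv if s == "user") for conv in conversations]
    durations = [(conv[-1][0] - conv[0][0]).total_seconds() / 60 for conv in conversations]
    return {
        "conversations": len(conversations),
        "user_messages": sum(user_counts),
        "avg_messages_per_conversation": sum(user_counts) / len(conversations),
        "avg_minutes_per_conversation": sum(durations) / len(conversations),
    }

# Hypothetical usage with made-up timestamps:
logs = [
    (datetime(2021, 7, 5, 20, 3), "user"),
    (datetime(2021, 7, 5, 20, 3, 5), "bot"),
    (datetime(2021, 7, 5, 20, 10), "user"),
    (datetime(2021, 7, 5, 20, 10, 4), "bot"),
    (datetime(2021, 7, 6, 9, 30), "user"),  # more than 15 minutes later: new conversation
    (datetime(2021, 7, 6, 9, 30, 3), "bot"),
]
print(summarise(split_into_conversations(logs)))
```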
While the chatbot was able to provide instant answers to students' questions (notably the 53% of questions asked outside of office hours), further improvement to the chatbot is necessary given the neutral rating of its level of support. To derive further insights into these results, the complementary qualitative portion of this research project will be analysed, with the aim of developing a chatbot competent in supporting students through their online Chemistry course.
Journal/Conference/Book title
Applied Learning Conference 2022, 20-21 January 2022, Online