My HCII REU Experience
Hello! My name is Jordan, and I spent summer 2020 participating in a remote research experience for undergraduates (REU) through Carnegie Mellon University. There, I was a researcher for the Human-Computer Interaction Institute, and was lucky to be placed in Dr. Vincent Aleven’s lab, researching intelligent tutoring systems for middle school students. Specifically, I worked on designing a dashboard for one of these systems, Lynnette. This is what my summer looked like:
Week One
Although remote, the first couple of days of the internship felt largely as expected, with lots of introductions, icebreakers, and protocols. After orientation, the new interns were split into groups to participate in a contest to design the t-shirts for the summer. My group, Team JAKA (an abbreviation of our names), spent the hour focusing on designs that would highlight the uniqueness of this summer while also showing that we were staying connected through the internship. Since I had Adobe Illustrator installed, I drafted our designs:
Another team’s (amazing!) design was selected for the shirt, but the t-shirt contest was a fun way to get to know some of the other researchers.
I also got to meet with my research group for the first time within the first couple of days. I learned about my assignment for the summer — designing a student dashboard for the Lynnette tutoring system — and began reading previous research on Lynnette and dashboards. I attended a crash course Zoom meeting on developing analytics and detectors for tutoring systems such as Lynnette, and even though a chunk of it went over my head, I got my feet wet in what the implementation of my dashboard could entail.
Week Two
This week, I continued to feel my way around Lynnette, finishing readings about the tutoring system and researching dashboards. That research involved collecting examples of dashboards from various well-known online education tools, such as Khan Academy, Udemy, Codecademy, and edX.
Something I tried to keep in mind while doing this background work was that Lynnette is specifically for middle school students, whereas some of the aforementioned tools, like Udemy, are optimized for older students and professional adults. Additionally, I was continually reminded that the dashboard should be oriented toward mastery and not just progress, which would likely affect design decisions and the questions asked in the upcoming survey for the middle school students.
Week Three
This week, I began putting together a survey to be sent to middle school students who have previously used Lynnette. It was quite difficult to come up with questions at first. With some guidance, I was able to come up with general objectives for the survey and then use those to generate questions.
I also helped lead the REU’s participation in #ShutDownSTEM and #ShutDownAcademia on June 10th; the day was intended to be used to take action to confront anti-Black racism in academia and STEM fields. Our agenda included reading about racism in technology and human-computer interaction as well as at our institutions, writing emails to legislators and school administrators, and discussing what commitments we’d make going beyond the end of the day.
Toward the end of the week, I met with Jeff, another intern in my group. His project involves making Lynnette more engaging, so we came up with a couple of questions that would hopefully give us a sense of what middle school students find engaging in an online algebra tutor. I've been trying to follow Pew Research guidelines on effective surveys, which has shaped how the response options look: for example, instead of yes/no questions, offering a scale from "strongly agree" to "strongly disagree" to capture richer insight.
Week Four
Week four involved changing the survey/questionnaire to an interview format, as Jeff happened to already have the necessary clearances to interview the middle school students. My mentor and I felt that interviews would provide richer information than a simple online survey. Most of the questions translated easily to an interview setting; while creating the online survey, I had been very conscious of the fact that making it long and tedious would reduce the quality of the data I gathered, and I wanted to ask in-depth, open-ended questions anyway. With an interview, the students could ask clarifying questions and build upon their answers more easily than they would if typing out a response. For example, "How do you know that you're learning something?" is especially suited for an interview — it's a very open-ended question, so there was potential for confusion with an online survey. However, I still wanted to include the question so I could get ideas on what metrics to add to the dashboard to make students feel like they were learning.
Week Five
Somehow this was the halfway point of my internship! Wild!! This week, I focused on creating hypotheses for my prototypes. By this, I mean that I tried to come up with a focus for what each prototype would be testing. The dashboard design provided by the interns from last summer included both a bar graph (displaying current mastery of the various algebra skills tracked) and a line graph (showing the progression of the student’s mastery over the number of problem attempts). It made sense to me to test how the students felt about these different ways of presenting related data; would the middle schoolers only care about a bar graph, showing their current mastery? Would they be interested and motivated by a line graph showing their progress with more attempts? Would they enjoy having access to both graphs, or would this feel overwhelming or confusing to the students?
To find the answers to these questions, I began making similar prototypes in Marvel, each emphasizing the bar graph, line graph, or both.
Additionally, this week I finally received everything I needed to complete my Pennsylvania Act 153 clearances, which would allow me to join Jeff in conducting the interviews. Definitely good news, as this way I could target my questions precisely toward what I was trying to determine through my research.
Week Six
During this week, I finished the interview protocol and changed the focus of the prototypes I would be testing in the interviews. I liked the idea of receiving feedback on the different types of graphs, but I was worried that focusing on only this aspect of the dashboard wouldn’t provide me with enough information to know what direction to go in when creating the design for the dashboard. Because of this, I decided to revisit some of the literature on educational dashboards. In Bodily et al., 2017, the authors mention that the important aspects of a dashboard are to:
- Help students understand what happened in their learning activities, and
- Help students know what to do going forward.
The second point seemed like a good area to learn more about. Presenting data in the form of bar and line graphs seemed to fall under the umbrella of “helping students understand what happened”, but I wondered if there was a simple way to go a step further and guide students on what to focus on the next time they worked on problems. Since Lynnette already tracks skill mastery, I wondered if simple “recommendations” based on the least-mastered skills could benefit students. I tweaked the prototypes to test students’ thoughts on these hypothetical recommendations, and Jeff and I finished the protocol for our first interview with a middle school student.
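To make the recommendation idea concrete, here is a minimal sketch of what such a feature might look like, assuming (hypothetically — the actual Lynnette internals weren't part of my work this summer) that the tutor exposes per-skill mastery estimates as values between 0 and 1:

```python
def recommend_skills(mastery, threshold=0.8, limit=2):
    """Suggest the least-mastered skills that fall below a mastery threshold.

    mastery: dict mapping skill name -> estimated mastery in [0, 1].
    Returns up to `limit` skill names, least-mastered first.
    """
    # Keep only skills the student hasn't yet mastered.
    below = [(skill, score) for skill, score in mastery.items() if score < threshold]
    # Sort so the weakest skills come first.
    below.sort(key=lambda pair: pair[1])
    return [skill for skill, _ in below[:limit]]

# Invented example mastery estimates, purely for illustration:
mastery = {
    "combine like terms": 0.92,
    "distribute": 0.55,
    "isolate the variable": 0.70,
}
print(recommend_skills(mastery))  # → ['distribute', 'isolate the variable']
```

The skill names, threshold, and function name here are all illustrative assumptions; the point is just that a simple "focus on these next" list can be derived directly from mastery data the tutor already tracks.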
Week Seven/Eight
Weeks seven and eight blended together; with my mentors, I planned what the last few weeks of the internship would look like. Due to the limited timeframe, I decided to focus my efforts on additional testing of the prototypes (to gather more robust feedback from more students) rather than scrambling to begin implementing the prototype. Additionally, as Jeff and I continued to conduct user interviews, we tweaked the protocol. For example, most of the interviews were surprisingly swift: I had originally been concerned about exceeding our allotted time with the students, but many of them answered my questions succinctly and zoomed through the prototypes, exploring them even before I finished my script of instructions. For the three additional interviews we had, I added several new questions to get the most out of our time, along with more specific tasks for the students to attempt with the prototype.
Weeks Nine/Ten+
Wild to have finally arrived at the end of the REU </3. Weeks nine and ten were all about wrapping up our interview learnings and presenting our work. Jeff and I conducted our last couple of interviews, and I created a (very) rough draft of my presentation for feedback from Tomo and Vincent. They provided some very rich feedback, helping me to refine a description of my research and how to present my findings in an academic setting. Below, I’ve added the slides from my presentation (short and sweet, we only had five minutes to present!):
After giving my presentation and watching everyone else's — I was surprised at the breadth of research being conducted; one student was literally working on fabricating exoskeleton-like tubing — the REU was over. I had a nice time talking and saying goodbye to other researchers in the lab over on gather.town, a fun, refreshing switch from Zoom. It was peculiar to feel the bittersweetness of wrapping up another internship and then simply close my laptop and take off my headphones; no flights home, no handshakes, no in-person farewell events. A few days later, I finished putting together the most up-to-date version of the prototype in Figma for researchers after me to use, and my summer was done.
I'm truly thankful for such a positive internship experience in spite of being remote in the middle of a pandemic. A significant part of this is thanks to the insight and patience of my mentors, Tomo Nagashima and Dr. Vincent Aleven; I'm very grateful to have been placed with them and to have had the chance to pursue interesting research. I can't wait to see what this next year has in store!