What we learned from the first group trial of adapptED

TL;DR → We’re all busy, I get it. Here’s a summary of what happened:

  • We ran a term-long trial of adapptED together with an after-school program. Nine students were involved across grades 3, 4 and 5, with varying abilities.
  • Students met for 45 minutes each week in a computer lab where they worked on the adapptED platform. A teacher was available and provided support on concepts if they got stuck.
  • Pre/post tests were conducted to test the hypothesis that the adapptED platform would improve maths knowledge.
  • Results: an average score increase of 8%, with p < 0.01 (statistically significant).
  • Conclusion: adapptED can be effective in improving student outcomes in a small-group, mixed-ability environment. Most students didn’t practise during the week, so we expect the improvement to exceed 8% with regular homework completion.

Context and background on adapptED

My personal mission is to make effective learning available for all students. adapptED, the learning platform my company is building, is one piece of the puzzle.

Ultimately, for adapptED to have a large impact I believe it needs to operate in schools, where students spend most of their time. However, before it can succeed in a full classroom environment, many questions need to be answered and validated:

  • Can adapptED improve learning for individual students?
  • Can adapptED improve learning in a group environment?
  • What additional teacher tools are needed to support a classroom?
  • How can a teacher successfully run a class where students are working on different content at the same time?

Rather than jumping into a full 25–30 student classroom, we are adopting a phased approach:

  1. Make it work in a 1-on-1 setting
  2. Make it work in a small group setting
  3. Make it work in a classroom setting

Since July 2017 I have run a tutoring and coaching organisation in Sydney called Sandbox Learning Australia. We help students improve confidence, understanding and fluency in a range of school subjects through evidence-based strategies, committed tutors and supporting technology.

Sandbox Learning provides an opportunity to test and refine the adapptED platform to improve student learning. We are able to prototype new adapptED features, validate them with students in real settings and then iterate based on feedback. You can read more about the history of adapptED here.

Our focus to date has been on the 1-on-1 experience. Through 2018 we progressively rolled out new functionality and features on adapptED including multiple choice format questions, parent profiles, and an algorithm that tracks how quickly students forget concepts so that tutors know when it is time for revision.
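As a rough illustration of the revision-timing idea, one common approach is to model recall with an exponential forgetting curve and flag a concept for revision once estimated recall drops below a threshold. The actual adapptED algorithm isn’t described here, so the model, function names and parameter values below are purely illustrative:

```python
import math

def recall_probability(days_since_review: float, stability: float) -> float:
    """Estimated probability a student still recalls a concept,
    modelled as an exponential forgetting curve: recall decays
    with time, more slowly for well-learned (high-stability) concepts."""
    return math.exp(-days_since_review / stability)

def needs_revision(days_since_review: float, stability: float,
                   threshold: float = 0.7) -> bool:
    """Flag a concept for revision once estimated recall drops
    below the threshold."""
    return recall_probability(days_since_review, stability) < threshold
```

In a scheme like this, `stability` would grow each time a student successfully revises a concept, pushing the next revision further out — the core idea behind spaced repetition.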

By early 2019 we had become confident that adapptED significantly helps students learn in 1-on-1 tutoring, over and above regular tutoring. Our data showed that students who regularly completed their adapptED homework performed better than those who attended tutoring but didn’t use adapptED.

With phase 1 completed, we have turned our attention to small group settings. After contacting a number of schools, we secured an after-school pilot with a primary school. We ran a term-long trial of adapptED together with an after-school program. Nine students were involved across grades 3, 4 and 5. Students met for 45 minutes each week in a computer lab, where they worked on the adapptED platform. A teacher was available and provided support on concepts if they got stuck.

How the program ran and what we learned

Setup and early weeks

The first week of the program involved getting students set up on the adapptED platform and administering a diagnostic quiz we call a Maths Checkup. While most of the students were set up in advance, 2 students signed up on the day. This caused issues, as the tutor was trying to get them registered while helping the other students begin their diagnostic tests. Once registered and signed in, the students had no issues with the diagnostic tests.

Learning: Ensure all students are set up prior to the first lesson

The early weeks of class were frantic. We hadn’t accounted properly for how much energy 8–10 year olds would have after school. It was encouraging that the students were keen but they weren’t prepared to wait. Most of the time they would call out as soon as they had a problem or got stuck. Putting up their hands didn’t help much as they would just shout with hands up. With hindsight we should have established clear class rules from the outset about what was and wasn’t acceptable behaviour.

Learning: Establish class norms from the outset. You need to be firm about what will and won’t be tolerated. This needs to be balanced with appropriate time to have fun at the start and end of the lesson.

It was important to have headphones so that the students could listen to the help videos. During week 2 we could only purchase 5 pairs of headphones for the 9 students, as the local store ran out. This led to a few kids getting stuck, then bored, then disruptive. From the next week we had headphones for each computer, which improved focus considerably.

Learning: Ensure you have all the infrastructure needed from the outset.

The middle weeks

By week 5 the students had settled into the program. They knew to come into class, log in to adapptED and start working on their assigned concepts. In general, behaviour had improved and students were putting up their hands when they needed help, without shouting. However, when they put up their hands they would then stop working, delaying progress in class. Asking them why led to the insight that they feared losing their spot in the help queue if they moved on to other concepts.

Learning: We needed a ‘digital handup’ feature that would allow students to request help without them losing their spot in the queue.

The class contained students across grades 3, 4 and 5. Furthermore, even within a grade students were at very different levels, as you find in any class. For example, the top student in grade 4 may be at a grade 6 level, while the bottom student may be well below grade level. The adapptED platform was able to cater to this effectively since each student could work on different concepts and move at their own pace. Throughout the term, however, we often noticed that different students would have similar questions. There appears to be an opportunity to make this process more efficient by grouping students with similar misunderstandings.

Learning: There should be a way to identify and group students with common misunderstandings.

The final weeks

In response to earlier learnings we created a digital handup feature. By clicking a button students could log a help request that was automatically relayed to the teacher in a first-in-first-out (FIFO) queue. The prototype of this worked reasonably well in achieving the goal of letting students continue to solve problems while waiting for help.

After testing it out for a lesson, I noticed that in many instances students would submit a handup but, by the time the teacher came to them, they had sorted it out themselves. This suggested future improvements, such as clearing the help request if the student completes the concept. I’m also interested in exploring ways to incorporate peer learning into the program.

Learning: Are there ways to incorporate peer learning to address a handup?

Towards the end of the program it was clear that few students were practising at home during the week. Those who were retained more of their learning, while those who weren’t needed to repeat concepts week after week because they had forgotten them.

Learning: We need to find a better way to bring parents into the process so they work with their children to practise during the week.

Results and discussion

The chart above shows the test results for the class. On average, student scores increased by 8% between the pre and post tests. One student was not present during the final lesson and so did not sit the post-test. Of the 8 students who sat both tests, 7 improved while 1 performed worse.

The table below shows the raw score data. Each test contained 40 questions.

A common mistake that analysts and organisations make when running experiments is to focus purely on the difference in the means (averages). However, it is very possible that the difference is just due to chance rather than the effect you are testing. Therefore it is important to check whether the result is statistically significant.

The experiment used a repeated-measures design, so a paired Student’s t-test was selected. In short, a paired Student’s t-test looks at the change in score between the pre and post tests for each student, and estimates how likely a difference of that size would be under the null hypothesis that it is due to chance rather than to using adapptED.

With a p-value of less than 0.01, there is less than a 1% likelihood of seeing a score improvement this large if it were due to random chance alone. Given this, we are confident, despite the small sample size, that the score improvement was due to using adapptED.
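To show the mechanics of the test, here is a minimal sketch of the paired t-statistic calculation. The pre/post scores below are hypothetical, illustrative numbers only — not the real trial data:

```python
import math

def paired_t_statistic(pre, post):
    """Paired Student's t-statistic: the mean of the per-student
    score differences divided by its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the differences (n - 1 in the denominator).
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical pre/post scores (out of 40) for 8 students.
pre  = [18, 22, 25, 15, 30, 20, 24, 27]
post = [22, 25, 28, 19, 33, 24, 23, 31]

t = paired_t_statistic(pre, post)
# The two-tailed critical value for df = 7 at alpha = 0.01 is about
# 3.499 (standard t tables); |t| above this cutoff implies p < 0.01.
# For these illustrative numbers, t is about 5.0, well above the cutoff.
print(f"t = {t:.2f}, significant at 1%: {abs(t) > 3.499}")
```

In practice you would use a library routine such as `scipy.stats.ttest_rel` to obtain an exact p-value; the hand comparison against the critical value simply makes the logic of the test visible.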

Where to from here?

While these results are encouraging, we need to confirm them in further trials. Ideally we would run such programs in multiple settings and see whether the results generalise.

Could you help us conduct extra pilots?

If you are connected with schools — as a student, parent, teacher or general community member — and could make an introduction, I’d love to hear from you. You can reach me at jesse@sandboxlearning.com.au

Happy Learning!

EdTech entrepreneur, passionate about improving education impact through tech and research-driven practice. Former consultant and engineer. Harvard MBA.