Measuring learning results

Recently, I taught a course on content curation for the Basque Government Library Service (I had already worked for them several times in the past). The course was delivered to six different groups between May and July, with a total of 178 students. Each of the groups worked on it for three weeks.

Satisfaction survey

Here you can see how the participants rated the course after finishing it:

I think these are results to be proud of. The highest rating was for my performance as a teacher (A: 9.34 out of 10), and the lowest was a B (7.59 for course length: some of the less experienced students would have appreciated a bit more time). The rest of the items were rated between A and B, with an overall satisfaction score of 8.71. Outstanding.

About the course

I asked the students to do three different activities. I wanted them to work on a content curation project, completing all the phases at least once. They had instructional materials available (text documents, images, and videos) and all the help they could need from me. The platform we used was Google Classroom, but I also asked them to share the results of their work on Twitter.

The experience was intense, and everybody was very involved. In my case, for example, these are the numbers:

  • About 950 emails exchanged (course development and management).
  • About 700 private messages with the students within the platform.
  • About 230 public messages within the platform.
  • 355 practical activities reviewed. I provided individual feedback for all of them.

That’s a lot of time invested in providing the best possible learning experience to everyone. But if you look at the survey results, apparently it was worth the effort.

Beyond the survey

Even though the students have shown their satisfaction with the course, this is not the only factor the client should consider when assessing whether the expense was worthwhile (not just in this particular case, but in any other). I believe they should try to answer the following two questions:

  • Did the students acquire the skills the course aimed for?
  • If so, will their professional performance improve because of that?

The second one is out of my hands (except when they hire me to do the follow-up), because it happens after the course. But to answer the first one, I have to rely on the survey and on what I saw during the course.

In this particular experience, almost two-thirds of the participants (65%) succeeded with all the proposed activities. They were capable of completing a full content curation process, and did it using a sufficient variety of tools.

So, what happened to the other 35%? The answer is something like this:

  • 20% didn’t finish the course because of the limited time they could devote to it (but all of them worked on some phases successfully).
  • 5% dropped out because they didn’t feel confident doing what they were asked to do.
  • 10% registered but never actually worked on the course.

As for the first group, I am confident that most of them will keep moving forward on the topic. It probably won’t be as fast as it could be, but it will happen anyway. They know the basics, and all the instructional materials and other resources (messages, shared work, etc.) are at their disposal.

It’s sad not being able to convince someone who is stuck that they can do it. When that happens, I always offer more help and try to be supportive. I tell them that the most important thing is to start, and then to keep going, one step at a time. Sometimes I succeed and they end up finishing the course, but not always (and I have to accept that).

And finally, every online course has a percentage of people who never even try it, for different reasons. I don’t like it, but it happens (and it’s commonly more than 10%).

In short

So it’s still too early to know the degree of professional improvement the course will bring. But those who finished, and those who didn’t but plan to keep moving forward, have work to do: plan, start developing the project, keep improving their abilities, and gradually build new services or enrich the ones they were already offering. As I’ve said, it has to be one step at a time.

The results we have so far are encouraging: the completion rate was good, and the participants showed a high level of satisfaction with the content, the support, and the experience as a whole. If most of them are happy, shouldn’t I be pleased? 🙂