We're now about 6 weeks into our experiment of transitioning our university instruction to an online environment. Because we transitioned in the middle of the semester, I've been waiting to finally have some exam results based entirely on online instruction. This is a three-part post discussing the effectiveness of the three courses I am now teaching online.
A big question on everyone's mind is: are students learning online? I have two courses with which to evaluate that question: one techniques course (Advanced GIS) and one traditional course (Statistical Problem Solving in Geography). My evaluation is both anecdotal and quantitative.
First, a little background...
I am teaching three courses this semester: an online graduate class in GIS Management, an advanced GIS course, and a quantitative geography course.
The Advanced GIS course is one I've taught for 13 years, and it focuses mostly on the advanced tools in ArcGIS Pro.
Statistical Problem Solving in Geography is a 200 level quantitative geography class, based on my textbook An Introduction to Statistical Problem Solving in Geography.
When all this COVID-19 stuff hit the fan, I was able to adapt fairly well: I've been teaching online for 15 years, and I also produce a lot of instructional videos. So, I decided to open my courses up to instructors around the world, and we now have hundreds of people participating in my FOSS4g and Statistical Problem Solving in Geography courses. However, I still maintain a private course for my students at Salisbury University, where I can administer exams, practicums, and lab assignments.
Statistical Problem Solving in Geography
We'll start out looking at my statistics class. This is your basic meat-and-potatoes quantitative geography class that introduces geography students to statistical analysis. Many geography students are a bit intimidated by mathematics and quantitative analysis, so for this class you have to tread somewhat lightly, making sure to illustrate how all the concepts apply to their discipline. You'll see from my syllabus that we don't water anything down, and actually get students to perform multivariate regression.
The now-online version of this class has video lectures and exercises, and I also provide the students with video recitations to help them complete the laboratory assignments. To help the students during this time, we hold weekly mental health check-ins, just to see how everyone is doing - this is my favorite part of the class, as I really miss all the students, and I think they miss one another.
We finally had an exam covering material that was delivered entirely online. This exam covered inferential statistics (the central limit theorem; one-, two-, and three-or-more-sample difference tests; and goodness-of-fit tests). There were two exams: a traditional exam and a hands-on practicum. The practicum was open book, while the traditional exam was closed book. Here were the results:
Closed-Book Exam
There's really no way to fully protect against online cheating. But you can mitigate it in two ways: (1) tell the students that the results will be used in a statistical test to see whether the exam results differ from other years, and hope they buy in to it; (2) make it a timed test, so students either know the answer or they don't - if they start looking things up, they'll run out of time.
This was the same exam I've given the last three semesters (I had to make some slight modifications to accommodate the online nature), so I was able to compare it to different semesters:
What I found was no real statistical difference among the three semesters in which the exam was administered. Now, with a p-value of 0.1, if you really want to get technical and perform a pairwise comparison of the semesters, you'll find that the Fall 2019 scores were statistically lower than the Spring 2020 scores. So, not only were the results of our closed-book, online exam statistically similar to last year's Spring class results, they were actually statistically better than last Fall's. So, in terms of outcomes, it appears the students are learning just as much online as they did when the course was live.
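The post doesn't show the underlying scores or name the specific test, but the comparison is an omnibus "do the semester means differ?" question, for which a one-way ANOVA is a natural choice. Here's a minimal pure-Python sketch with entirely hypothetical scores (both the function and the data are illustrative, not the actual exam results):

```python
from statistics import mean

def one_way_anova(*groups):
    """F statistic for a one-way ANOVA across two or more groups."""
    grand = mean(x for g in groups for x in g)
    k = len(groups)                      # number of groups (semesters)
    n = sum(len(g) for g in groups)      # total observations (students)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical exam scores - the real data isn't published in this post
fall_2019   = [68, 72, 75, 64, 70, 77, 69, 73]
spring_2019 = [74, 78, 81, 70, 76, 72, 79, 75]
spring_2020 = [76, 80, 84, 71, 78, 74, 81, 77]

f = one_way_anova(fall_2019, spring_2019, spring_2020)
print(f"F = {f:.2f}")  # compare against an F critical value with (2, 21) df
```

With real scores, you would compare F against the critical value for your chosen significance level (or get a p-value from an F distribution, e.g. via scipy.stats), and follow a borderline omnibus result with pairwise comparisons of the semesters.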
I also gave an online practicum. This is open book, where I give the students a bunch of geographic situations, and they have to determine which test to perform, perform the test, and then interpret the results. For example, the following is one of the questions on the exam:
A fisheries company is testing a new hormone treatment on salmon. They want to determine if salmon growth is significantly greater than ordinarily expected when utilizing the hormone treatment. The average growth in salmon is typically 0.75 kg per year.
Given the following table, state the null and alternative hypothesis, choose the appropriate test, and calculate the results. Upon completion of the test, indicate your conclusion about the introduction of the hormone treatment on the growth rate of the salmon.
So, in reality, these are not trivial questions, and I don't explicitly tell them to perform a matched-pairs test - they have to figure that out on their own. As for the results, you'll see that there is no statistical difference between the practicum results from last Spring and this year's. This year had a much larger variance than last year, but both had a fairly normal distribution.
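To see how the matched-pairs approach plays out on a question like the salmon one, here's a minimal sketch with entirely hypothetical before/after weights (the exam's actual table isn't reproduced in this post):

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical one-year before/after weights (kg) for ten treated salmon;
# these numbers are made up for illustration, not the exam's data.
before = [3.1, 2.8, 3.4, 3.0, 2.9, 3.3, 3.2, 2.7, 3.5, 3.0]
after  = [4.1, 3.7, 4.3, 3.9, 3.6, 4.2, 4.0, 3.5, 4.4, 3.8]

# Matched-pairs test on the growth d_i = after_i - before_i:
#   H0: mean growth = 0.75 kg    Ha: mean growth > 0.75 kg (one-tailed)
diffs = [a - b for a, b in zip(after, before)]
d_bar = mean(diffs)              # mean observed growth
s_d = stdev(diffs)               # sample standard deviation of the differences
n = len(diffs)
t = (d_bar - 0.75) / (s_d / sqrt(n))
print(f"mean growth = {d_bar:.2f} kg, t = {t:.2f} with {n - 1} df")
```

A large positive t (compared against the t critical value with n - 1 degrees of freedom) would lead the student to reject the null hypothesis and conclude the hormone treatment increased growth beyond the usual 0.75 kg per year.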
The students like that the lectures are online. They can watch the lecture, and if they don't understand something, they can rewind the video and watch it again.
It's not all roses...
This is good news as far as the efficacy of my online version vs. a live version for a lower-level undergraduate course: students are learning, and performing fairly well. But there are some disconcerting things I've noticed, all anecdotal.
For one, our weekly check-ins really illustrate how much the students miss being together (and I miss them, too!). Also, as we converse online, I get a sense that the students are operating on adrenaline - I know I am. Again, that is anecdotal, but they say this lockdown is taking its toll, and they are growing discouraged. I am, too. Maybe you feel like me - I just want this thing to end! I wonder: are we able to do this because we are only keeping this pace up for 7 weeks? What if it were an entire semester? Personally, I'd feel like I was burning out. Not from school, mind you, but from all the other pressures.
One thing I learned is that the students need structure. I started our online endeavor at a sort of go-at-your-own pace. Some liked it, but most did not. They really want the structure of being told when you want them working on something. I can't blame them: the go-at-your-own-pace approach risks having students fall behind, or even work too fast. So, in midstream, I put together a task list on a week-by-week basis. This was also good, as it showed me that I was rushing certain chapters as well.
In conclusion, and moving on to the next course...
I'm actually very proud of my students, and how they have handled this transition. They are real troopers, and are showing a lot of tenacity. Remember, we sprung this on them at a moment's notice. That is a hard thing to recover from. But, at the end of the day, they are learning, and they are applying what they are learning with real life data sets.
So, now on to the next course: Advanced GIS. In my next post, I'll answer the question: can you teach someone FOSS4g using online videos, and can they complete the same assignments they did with software they have been using for years? (Spoiler alert: yes, you can!)