a blog about technology management

CourseEval3 block for moodle 1.5.x

I whipped together a block for the new RPI interface to CourseEval3 (what we use for our course evaluations).  You can find it on my moodle mods page.  The RPI provides a data stream for the current user (in this case a javascript table) to external portals.  I look forward to seeing whether it improves our response rate in the winter trimester and spring semester.  Our students are constantly going to moodle, so it's a logical place to put it.
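If you're curious how a block like this hangs together, here's a minimal sketch in the moodle 1.5.x block style. The class name, string keys, and especially the stream URL and its parameter are hypothetical placeholders, not the real CourseEval3 RPI endpoint; the actual block on my mods page differs.

```php
<?php
// Minimal sketch of a moodle 1.5.x block that embeds an external
// javascript data stream (a per-user table of pending evaluations).
// The $streamurl below is a made-up placeholder for illustration only.
class block_courseeval extends block_base {

    function init() {
        // Title and version, per the 1.5 block API.
        $this->title   = get_string('blockname', 'block_courseeval');
        $this->version = 2005111600;
    }

    function get_content() {
        global $USER;

        if ($this->content !== NULL) {
            return $this->content;
        }

        $this->content = new stdClass;

        // The RPI returns a document.write()-style javascript table
        // for the given user. Hypothetical URL and parameter name.
        $streamurl = 'https://courseval.example.edu/rpi/stream.php?user='
                   . urlencode($USER->username);

        $this->content->text   = '<script type="text/javascript" src="'
                               . s($streamurl) . '"></script>';
        $this->content->footer = '';

        return $this->content;
    }
}
?>
```

Since the RPI does the user lookup and renders the table itself, the block stays tiny: it only has to emit the script tag into its content area.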


Focus on Course Evaluations

I made a new category, Evaluation, for this blog to make it easier to find my posts on online course evaluations. It was interesting to look back to this old post and see where my thought process was at the time. This is one of the things I like about blogs and blogging — seeing what I was thinking. We’ve since gone with CoursEval 3 for our evaluation software. After having some great conversations with colleagues I’m working on a proposal for Educause 2006 on this subject. Fingers crossed!


edublogs and course evaluations

Thanks to Sean, who pointed out that edublogs is meant for educators rather than students. I had better read those FAQs!

I also wanted to link to my annotated bibliography version of the articles in my previous post. The annotations will give you an idea of what was studied in some of the articles.

Blogging has taken a real hit with fall semester looming. I will return!


Blog lag

My blogging sure has taken a hit as of late. Things have been busy at work (migrating from Blackboard 5.5 to moodle 1.5, along with all of our moodle customizations, plus taking course evaluations online with a new set of questions) and at home (our 2nd story is about to be removed and rebuilt anew).

The online course evaluations have been very interesting. It's a real intersection of students, faculty, and technology. You have student attitudes toward course evaluations: are they anonymous? Do the faculty care? Does my opinion matter? Then there are faculty attitudes toward evaluations: what if only students with negative opinions fill them out? Non-tenured faculty worry about tenure decisions. And the technology factor of being online adds a new variable: response rate. Doing paper evals during class gives you a captive audience; the evals are optional and the faculty member has to leave the room, but the time allotted varies. Handwriting also comes into play for anonymity. Doing them online makes it easier to not do them. There hasn't been extensive research on doing course evaluations online, but there are some articles I've found.

First of all, some effective practices are emerging. The TLT Group's Flashlight Program BeTA Project has some insights into successful online evaluations. What's interesting is that several of the articles I found on the subject echo similar findings. Generally, institutions start doing online evaluations awkwardly. Sometimes things go badly; they try a few approaches to improve response rates and eventually find things that work. Those practices match the BeTA findings above quite closely.

What I find interesting is that the institutional culture around evaluations seems to influence their success when taken online. I've learned from people smarter than me that the social aspects can overwhelm a technology project. This is why Dr. Pike used Bolman and Deal's four frames (structural, political, human resource, symbolic) when approaching the course evaluation redesign last summer (see our paper for more).

Back to some resources if you're looking to move your institution to online course evaluations. I've tried to link to each one and note the institution. Some focus on response rates, some are more general. Some have bibliographies that can lead you down more paths.


Innovations Conference presentation

I almost forgot to mention here that I presented at the first Innovations in the Scholarship of Teaching and Learning at the Liberal Arts Colleges conference. Originally Diane Pike and I were to present together, but the Midwest Sociological Society annual meeting fell on the same weekend and Diane was an organizer. So it was up to me!

I presented a brief (20 minute) summary of the paper we submitted to the conference. The paper is titled "Student Evaluation of Courses: Kicking and Screaming into the 21st Century." It was a progress update on the ongoing online course evaluation project. The second pilot is currently in progress, closing April 29. It's been a great project and I've learned a lot.

The progress we've made would not have been possible without the excellent programming work of Robert Bill. His Python code let us run the process without having to rush into buying a product. Assuming the faculty approve both the new form and the online delivery, Flashlight Online looks like our best bet. It's a nonprofit dedicated to teaching and learning, as well as a resource for best practices, which makes it a good fit for us. No product will be perfect, but Flashlight has the right approach to the endeavor and makes a good partner.


