Do Coursera staff really monitor the discussion boards? I am increasingly thinking that they do not; if they do, that is even worse, since no response of any kind has been provided for the people who have been complaining for weeks about abusive feedback. I'm not just talking about bland, unhelpful, vague, or inaccurate feedback, but instead about abusive language and mean-spiritedness of the worst kind. Here are a couple of examples people have complained about at the discussion board:
I believe you are either awfully young, typing for a parent who has no time to do it herself, or simply have received an inferior education. My guess is you lived in one of the Carolina’s where you neither spoke nor wrote a high quality of English. YOU CAN CHANGE THAT, if you work hard at it. If there were a zero to give, that would be your grade.
Well, I just have to say this. What the fuck? You completely force your arguments as if you were trying to fit a square into a circle. (Might this be another homosexual sign to you?)

Then there are the one-word comments:
Ug.

Of all the mean-spirited feedback I have seen reported, I would say the most bizarre and strangely cruel is this one:
One, two, three, four, five, six, seven, eight, nine, ten, eleven, twelve, thirteen, fourteen, fifteen, sixteen, seventeen, eighteen, nineteen, twenty, twenty one, twenty two, twenty three, twenty four, twenty five, twenty six, twenty seven, twenty eight, twenty nine, thirty.

Yes, this is because our comments are supposed to be 30 words long. The software does not police this (hence the abundant one-word and two-word comments: "good!" or "liked it!") - but the idea that someone would deliberately put in a comment like this to meet the word count shows that there are some serious problems with the feedback culture in the class. Even if it is just a tiny percentage of the feedback overall, Coursera is going to have to find a way to do something about this; you cannot mandate participation in a peer feedback system as a requirement for the grade/certificate while allowing this kind of thing to go on unchecked and unattended. We are "graded" on our participation, and that participation grade consists of completing the peer feedback assignment. Someone who submits a comment that reads "One, two, three, four, five, ..." gets a full participation grade. Of course, a GREAT solution would be simply to get rid of the grading entirely - but I don't think Coursera has any intention of doing that.
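To make the enforcement gap concrete: here is a minimal sketch (my own illustration, not anything Coursera actually runs) of the kind of word-count check the software apparently lacks. It would at least block the "good!" and "liked it!" comments, but notice that it happily passes the counted-to-thirty filler, which is why a bare word count cannot fix the feedback culture on its own.

```python
def meets_minimum(comment: str, min_words: int = 30) -> bool:
    """Return True if the comment contains at least min_words
    whitespace-separated words. A hypothetical validator, for
    illustration only."""
    return len(comment.split()) >= min_words

# A one-word comment fails the check...
print(meets_minimum("Ug."))     # False

# ...but thirty filler words sail right through:
filler = ("one two three four five six seven eight nine ten "
          "eleven twelve thirteen fourteen fifteen sixteen "
          "seventeen eighteen nineteen twenty twenty-one "
          "twenty-two twenty-three twenty-four twenty-five "
          "twenty-six twenty-seven twenty-eight twenty-nine thirty")
print(meets_minimum(filler))    # True
```

The point of the sketch is that length is the only thing such a check can see; catching deliberate filler would require something closer to content moderation, which is exactly what seems to be missing here.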
By far the biggest problem, though, is vague and/or inaccurate feedback… and that's a much harder problem to solve. It's much like the problem with the poor quality of the essays overall; yes, there are inappropriate essays (blank essays, essays only a few words long, plagiarized essays, even spam essays) that need to be flagged - but the larger problem is the bewildering number of essays that are of such poor quality that it gets very discouraging to spend time on them. Without some kind of additional instructional component to the class, I am just not convinced that this often unreliable and/or unhelpful anonymous peer feedback can really help people to improve their writing.
Of course, to get a sense of what is going on overall, Coursera would need to ask us how things are going - for Week 4 (most recent week completed), it appears that 2500 people turned in essays (compared to Week 1, when apparently 5000 people turned in essays), while there are maybe a hundred or so people (just a guess) who participate at the discussion boards. So, without gathering feedback from us week by week about our experience (self-assessment of our own writing, self-assessment of our improvement in writing, feedback about the feedback we are receiving, etc.), there's really no way to know what's going on overall. Yet Coursera is collecting no feedback of any kind, except for the chaotic comments at the discussion board and the grades assigned by the peer reviewers.
One discussion board thread proposes: "Peer Grading Exposed as Milgram Experiment." I have to admit, that made me laugh. But it's not a happy laugh. I really hope Coursera does something about this. Since they added a flagging system for the discussion board, maybe they will eventually do that for the inappropriate essays and feedback, too. It would also really be nice for there to be some kind of communication about all this, as opposed to the outdated and stale content that currently appears on the homepage Announcements (the last announcement was made on August 14, ten days ago). I participate pretty regularly at the discussion boards (although, admittedly, less than I used to; it's not the most fun place to spend time), and it's been weeks since I saw a discussion board comment that came from a Coursera staff person or a member of the course's instructional staff. Is it a Milgram Experiment... or, shudder, Lord of the Flies...?