* Indeed, I need to insert a caveat here: I have read the retraction letter and related materials but not the original article, nor is this my area of expertise. I am just discussing what it means for other folks in this business.
The student may have falsified data, altering existing data rather than doing the work he was supposed to have done--surveys, etc. That the issue involved was attitudes about gays only makes it more salacious and salient for observers. That it happened while social science funding at the National Science Foundation is under attack makes it even worse.
The questions being raised about co-authors and advisers are the ones that concern me most. Some folks are saying that this guy's adviser and/or co-author failed the discipline by not discovering the fraud.
My problem with this is: what do we expect advisers and co-authors to do? As I have been in all four spots here (the co-author joining a project, the guy asking folks to join a project, the advisee, and the adviser), I have to think a bit about this. And when I think aloud, I type here.
- The point of co-authoring is to have a division of labor so that the various folks involved are not duplicating each other's efforts that much. One does not expect one's co-author to lie/cheat/steal, or else one would not choose that person. Engaging in intensive oversight of co-authors makes little sense (back to that in a second).
- The job of an adviser is to train, direct, and provide feedback. Certainly, the adviser should read the work of the advisee with care, but the relationship involves trust. I didn't ask my students to provide me with plane tickets, hotel bills, or photos of interviews, nor did I plumb the dark depths of their datasets. I did read their work to make sure that their efforts were sound and such, but it again is a relationship of trust. One tries to verify, but only to a modest degree.
This fraud was revealed because other students wanted to use the data, and once they worked with it extensively, it became clear that something was wrong. They notified the adviser and the co-author, both pressed the student to explain, and the student provided inadequate explanations. This might all have been ugly, but the system kind of worked.
The point really is that fire alarm forms of oversight are largely reactive and public. Someone notices a problem that already happened, complains, and then folks react. That this system is in place serves as a deterrent insofar as a person's academic career is trashed if they do something that activates the alarm.
If we used police patrol oversight--constant patrolling and monitoring--we might be better able to deter, but at the cost of much time and money (grant money for profs to accompany students while they are doing field work?). This kind of oversight can be quieter (or not) and can be more preventive.
We, of course, really do not know enough to judge much of this. But we can think about the process and how we could do it better, just as some are thinking more today about the pressures on grad students to publish quickly.
One other thing: what about the money? This was supposedly funded research, so either the student didn't really have the money to do the work, or he did but didn't spend it on the survey firm (which raises the question of what he did spend it on), or the money is still sitting in a research account. And, yes, this is a big deal. Fraud over ideas? Bad. Fraud over ideas and abusing research accounts? Much worse--as it brings in cops, auditors, the IRS, etc.
If anyone has suggestions for how to mentor or co-author better without fostering distrust, let me know.