investigating the nature of fact in the digital age

On Research

In Academic reflection on April 4, 2013 at 9:12 am

WHILE doing the Professional Research Methods and Evaluation course at RMIT, I gave a lot of thought to the nature of academic research. I’ll concentrate on that rather than on professional research because the “pure” nature of academic study appeals to me more – with some caveats.

Academic research is clearly an inexact science, but its aim to reveal the facets of some kind of sacred truth or ultimate understanding of the world really is a noble one. Whether the subject is “hard” (engineering, chemistry) or “soft” (the humanities), academic research has gleaned insights into the world and ourselves that have driven us forward, for good or ill.

That said, I’d like to offer some critical thoughts.

“Did we really need to spend three years and $100,000 to learn THAT?”

The results of some research seem obvious, while some research topics are plain crazy. Three examples of eyebrow-raising studies (all winners of the Ig Nobel Prize):

Suicide rates are linked to the amount of country music played on the radio (“The Effect of Country Music on Suicide”, Social Forces, 1992).

Dog fleas can jump higher than cat fleas (“A Comparison of Jump Performances of the Dog Flea, Ctenocephalides canis (Curtis, 1826) and the Cat Flea, Ctenocephalides felis felis (Bouché, 1835)”, Veterinary Parasitology, 2000).

Rats can’t always tell the difference between Japanese spoken backwards and Dutch spoken backwards (“Effects of Backward Speech and Speaker Variability in Language Discrimination by Rats”, Journal of Experimental Psychology: Animal Behavior Processes, vol. 31, no. 1, January 2005).

The media love to criticise what they see as irrelevant, useless research, and often present academic research in a poor light, oversimplifying complicated work to the point that any value it has becomes invisible. Perhaps there is great value in knowing that rats can’t always (can’t always!?) tell the difference between backwards-spoken Japanese and backwards-spoken Dutch. We can’t know until we know, if you know what I mean.

On the other hand, it seems obvious that suicide rates can be linked to country music (especially Billy Ray Cyrus) without research telling us so.

Is there value in confirming apparently common-sense observations? Should something be considered not true until observed/proven/validated by research? Should grants be given out for any old research project just to be able to tick off that, yes, we now know that less smoking and more exercise lead to a longer life (I’ve seen that study)?

Good research is vital, but bad research – that is, research based on flawed assumptions, imprecise questions or questionable motives – can do great damage. When public confidence in academic research is eroded to the point that people claim that scientists warning about climate change are only in it for the research funding, we’re in for a world of pain.

Was Darwin a plagiarist?

At university it is drummed into us that in all assignments we submit for assessment we must acknowledge the work of others, including “thoughts, ideas, definitions or theories”. Neglecting to do so is considered a breach of academic integrity, a serious matter.

If you’ve been reading and thinking about an issue for a long time you will have absorbed many concepts and themes and can easily believe those ideas are your own. It is also possible that you will reach conclusions that others in the same field of study are coming to, will come to, or have already come to.

Several years ago I wrote a newspaper column about Twitter. At the time there were two common analyses of the microblogging platform. The first focused on the potential of Twitter to become a substantial source of crowd-sourced citizen journalism that could supplant traditional, professional journalism. The other concentrated on how many followers users had: whether you were a “thought leader” on Twitter, how much influence you had among the cognoscenti in your field, and how you used the platform to extend and strengthen your “personal brand”.

In the column I presented my theory that Twitter is just another broadcast medium, like radio: a minority of voices (equivalent to radio broadcasters) are “listened to” by a majority (the audience). To get value from Twitter you don’t need to be one of the big voices; you just need to be a wise listener and, if inclined, you can chip in with the occasional contribution (as listeners do on radio talkback).

Since that column was published, this has become a common third analysis of Twitter. Had I read an early version of this analysis before writing the column and forgotten about it, or did I come up with it independently? It’s unlikely I was the first person to think of it but I feel sure I’d never encountered it before. Had I unwittingly plagiarised someone? If, in a research paper, you present a similarly “original” idea that someone else has already come up with, are you guilty of plagiarism?

Before starting a research project, a thorough literature review should uncover whether particular theories or findings have already been published. But sometimes a new idea can coalesce in the ether and drop to earth, in different places, simultaneously.

English scientist Isaac Newton and German mathematician and philosopher Gottfried Leibniz developed calculus around the same time. Charles Darwin and Alfred Russel Wallace both “discovered” the theory of evolution. History is full of such examples of simultaneous invention. When a set of facts is in wide circulation it’s likely that more than one mind will put two and two together and come up with similar conclusions around the same time. In a super-fast, digitally connected world it’s likely that new ideas and inventions will arise simultaneously more often, in more places, created by more people. Original ideas will be harder to come by, and the competition will be over who gets them in front of the public, or their academic peers, first.

Alfred Russel Wallace arrived at the theory of natural selection independently of Darwin, but whose name is associated with it?

Should we tolerate this?

When setting out the intellectual/philosophical framework of their research context and method, the authors of many papers blithely accept the conclusions of previous research without critiquing them. “Following the analysis of Bifur, Bofur and Bombur, we base our research framework on their finding that …”

This practice is a useful convenience, but very few academic papers offer simple black-and-white conclusions. There’s usually a lot of “on the one hand” followed by “but on the other”. Yet in paper after paper such qualifications are glossed over and a simple conclusion is plucked out to support a position, then used to build new conclusions which, in turn, are simplified and taken up by subsequent researchers in their own work.

Nuance and inconveniently conflicting views are thereby erased in the hope of building a singular truth. In mechanical engineering such discrepancies are known as “tolerance”: the permissible “wiggle room”, the small amount of error within which a designed object will still function. (For example, beyond a certain tolerance a bolt will not screw into a nut; metal needs to be able to withstand certain amounts of weight or pressure.)

It’s an inexact but necessary process: we shouldn’t – and can’t – reinvent or test every element of every piece of research we rely on. But when doing our own research we need to be aware of the wiggle room we build into our set-ups and arguments (context and method) and acknowledge it. As researchers we should also be aware that allowing too much tolerance weakens the whole edifice. As precision declines, so do solidity and reliability. This applies as much to academic research and argument as it does to bridges and skyscrapers.

Some thoughts about abstruse language in academic papers (I’m looking at you, Cultural Studies)

Journalism students are taught to avoid using jargon because it is the enemy of clear expression. But much academic writing seems to revel in being obscure. If you’ve put years of study and effort into your research, why make the results of your work difficult for a broad audience to understand by using words in combinations not found in any dictionary or guide to grammar?

Academics argue – as do all industry specialists who develop a secret, specific language of their own – that existing language and vocabulary cannot capture the depth or complexity of the concepts they need to express. In that case, I’d suggest, such research approaches the realm of the mystical, and its “results” are unavailable to anyone whose world-view is rooted in the practical and the solid. What tangible benefit can come out of this kind of research? How can it be used? Is it just talk and wordplay and esoteric insight for its own sake? Is its only value that it can be cited by the next study in a continuing procession of abstruse studies written by other members of the same club? I wonder.

I blame the French. Foucault, Latour, Baudrillard and Bourdieu would be easier to understand if they’d spent more time in primary school learning Euclidean geometry rather than existentialism. Once you get the French writing and thinking as those blokes do, everyone in the Cultural Studies business starts doing it because they don’t want to appear dumber than the French.

There’s another possibility: perhaps I’m dumber than the French and those who follow in their intellectual footsteps. Still, even if these studies do mean something and have great (though hidden) practical value, my bottom line is this: I’m not willing to spend the time to work it out. The studies’ conclusions aren’t worth the effort of unpacking the language. As far as I can tell.

When he addressed us in Week 9, Tony Jaques suggested that research should build on previous findings, teach us something new and lead to action. He was referring mainly to professional research rather than theoretical, cultural studies-style research, but the point was a good one.

Tony Jaques also recalled an occasion when a paper he wrote was criticised as “too journalistic”. He took this to mean that the paper was readable and comprehensible – both good things – rather than highfalutin and obscure.

Despite my rant at the top of this paper about the failings of journalism, I’ve come to realise that my preference in research methods sits somewhere in the middle of the continuum that stretches from quality journalism to research rooted in obscure theories accessible only to certain members of an academic club. Real numbers, trends and behaviour you can identify, results you can use to improve understanding and to improve policy (government, corporate, social): these are the kinds of research practice that interest me.

Lights! Camera! Action Research!

I was always marked highly for the Systems Theory papers I wrote when I dabbled in the subject at Monash a decade ago – even though, as far as I could tell, my essays meant nothing. I was making it all up as I went along, couching wads of nonsense in language that made sense only to the kind of person who habitually wears a colander on their head, with an array of wires connected to a tin-foil body suit beneath their clothes.

Did the lecturer (the late, great Frank Fisher) not dare to mark me down because he wasn’t sure whether my papers contained great insight or utter drivel? I don’t know. The thing is, I now realise that what I was doing was Action Research, which turns out to be a legitimate kind of research, not a subsection of the airy-fairy French wankery referred to above. Cool.

I quite liked doing the Systems Theory version of action research but do wonder: how can we be sure everyone is on the same page if experience-based research is acceptable? Won’t subjectivity dominate? What are the common parameters that enable comparison of studies? Are such questions relevant?

I’d like to end with a passage from Jean McNiff’s Action Research for Professional Development:

In traditional forms of research – empirical research – researchers do research on other people. In action research, researchers do research on themselves. Empirical researchers enquire into other people’s lives. Action researchers enquire into their own. Action research is an enquiry conducted by the self into the self. You, a practitioner, think about your own life and work, and this involves you asking yourself why you do the things that you do, and why you are the way that you are.

Permission to rabbit on after a bit of self-reflection? Granted. Plus we’ll validate your parking as thanks for coming to the conference.

  1. Some thoughts on your thoughts:

    Yes, research questions should always be contestable (Booth et al again), and yes, orthodox sociology has a knack of working over the apparently self-evident, but how high fleas jump is probably pretty important if you’re thinking about transmission of flea-borne disease (am I thinking about plagues here?). What do we know about headlined research; what don’t we know? (That said, what worries me is research which, at least at first blush, seems to be reinventing the wheel. It seems to be more and more common. This has more to do with an exploding publishing industry, and a diminution of historical knowledge of fields of scholarship, than anything else, I think.)

    To put alongside the reasonable point about jargon, Booth et al identify as an ethical problem the practice of simplifying what is actually complex — just as much a problem as complicating the straightforward. They’re possibly concerned with positivism and empiricism, where the problem is not a ‘world-view rooted in the practical and the solid’ but an ignorance of the institutional production of sense around the practical and the solid.

    French wankery? Yeah, but consider that there are many Foucaults, just as many Latours, etcetera. Any worthwhile use of Foucault’s work (and it’s not a unity, by the way) would identify what difference it does make to consider things genealogically, for instance. Check out Graham Thompson reviewing work in International Studies drawing on Foucault, as an example (‘The paradoxes of liberalism: can the international financial architecture be disciplined’, in Economy & Society, vol. 40, no. 3, 2011) — quite a useful account of where we are today, I’d say. And really, it’s not fair to lump the French together, all noses in their primary school existentialism primers. Of the many Foucaults there are, the one I find useful helps us get outside philosophy: quite a different proposition, therefore, from Baudrillard.

    And, final observation, there are quite a few different journalisms too, aren’t there?
