
Like, Wow! – The search for extraterrestrial intelligence and Humanity’s inescapable fear of cosmic loneliness

“Late night for Doctor Jerry Ehman

6EQ and it’s bigger than it came in”

SETI vs the Wow! Signal, The Dandy Warhols, 2012

The Wow! Signal referenced here by Millennial Alt Rockers the Dandy Warhols is considered by many to be the strongest contender yet detected for a message beamed to Earth from an extraterrestrial civilization.

Looking through his printouts of data one evening, volunteer analyst Jerry Ehman saw something remarkable – a strong and coherent signal that had all the hallmarks of originating from an artificially engineered source in deep space. Dandies frontman Courtney Taylor-Taylor only captures half of the signal in his 2012 lyrics – but in his defence “6EQUJ5” doesn’t scan so well in 4/4 time. What those letters and numbers describe is the changing signal strength in a narrow band of radio wave energy received by the Ohio State University Big Ear radio telescope as it scanned across the night sky on August 15, 1977. Not the random chirps and squawks of cosmic background noise or radio interference that the facility had been recording since it was first turned to the search for alien messages four years earlier, but a clear and substantial signal that systematically rose and fell in intensity over 72 seconds.

Stunned by what he was seeing, Ehman circled the record on his printout and added the notation “Wow!” in red pen to mark his reaction – thereby creating the catchy moniker by which the signal is still known, even in serious scientific discussion. Some commentators have suggested that if he’d written what he’d actually been thinking, we’d now be calling it the ‘Holy shit!’ signal – but I’ve never had the pleasure of meeting Ehman myself, so I can’t judge how excitable he might be.


Scan of a color copy of the original computer printout, complete with Dr Ehman’s excited notation that gives the Wow! Signal its name.

 

Why the ‘shock and awe’ response to this brief radio signal? Well, put yourself in Ehman’s no-doubt-sensible shoes. Imagine listening to 4 years of static hiss and the occasional random squawk coming from your home sound system, then suddenly having your speakers burst into life with 72 seconds of music at full volume.

Only it wasn’t music. Or at least, we can’t say whether it was music or not. Each of those six alphanumeric characters in the record simply reflects the total energy received by the detectors over a 10-second period. We have no way of breaking that down to say whether there was any kind of modulation to the frequency or amplitude of the radio waves over time – something that might represent the complexity of real information – or if this was just a burst of energy (albeit curiously narrowly focused). So in essence, the speakers roared into life, but we’re not sure whether it was Mozart’s Eine kleine Nachtmusik, Eminem belting out “The Real Slim Shady”, or Dylan Thomas reciting Under Milk Wood. Or indeed, nothing more than intergalactic feedback.
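For the curious, the encoding is simple enough to unpack in a few lines. On the Big Ear printouts each character recorded the signal-to-noise ratio for one integration bin, with intensities above 9 rolling over into letters (A for 10, B for 11, and so on) – so that ‘U’ marks a signal some thirty times the background baseline. A minimal sketch:

```python
# A minimal sketch of the Big Ear printout encoding: one character per
# integration bin, rolling over from 9 into the alphabet (A = 10, B = 11...,
# so 'U' means a signal 30-31 times the background baseline).
def decode(char: str) -> int:
    """Map one printout character back to its intensity band."""
    return int(char) if char.isdigit() else ord(char.upper()) - ord("A") + 10

print([decode(c) for c in "6EQUJ5"])   # [6, 14, 26, 30, 19, 5] - rise, peak, fall
```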

The problem we run up against with applying any kind of deeper interrogation to the Wow! Signal is that the plan of the SETI (Search for Extraterrestrial Intelligence) programme of the time was essentially ‘let’s see if we pick up any signals, then think about how we might analyse them and look for information later’. An understandable deficiency, perhaps – after all, if you’ve never seen a candidate signal before and don’t even know whether or not you’ll detect one, it’s probably not your top priority to invest the limited resources that you have in working out the details of what to do with one. Remember, back in the 70s the search for extraterrestrial life was basically thought of – and funded – a bit like Bill Murray and Dan Aykroyd’s parapsychology research lab in the opening act of the original 1984 Ghostbusters movie. There’s a reason why Dr Ehman was a ‘volunteer’ analyst – very few people were actually getting paid to do this stuff as their day job.

In its practical application, unfortunately, this strategy is a bit like going to a bar with the vague idea of picking up girls, but not getting any farther than a plan of ‘if one comes up to talk to us, we’ll work out what to say then’. If a dark-eyed vision of feminine beauty then draws herself up on the barstool next to you and starts speaking huskily in French, it’s too late to start wondering what she’s saying and making plans of how you’ll respond.

At the end of the day though, the Wow! Signal fit every criteria set by the SETI scientists in their “what to expect in a contact with an intelligent extraterrestrial communicator” guide for young spotters. No one-off surge that could be dismissed as an artifact of 20th century electronics, no random scatter of values that could reflect some radical malfunction of the experimental apparatus – the signal progressed in stately fashion from strong, to stronger, to the strongest value ever recorded by Big Ear in its entire 22 years listening to the skies between 1973 and 1995, and then equally steadily decayed away back down to background levels – exactly what would be seen if the stationary telescope was slowly being scanned across a point source in the distant reaches of deep space by the rotation of the Earth. Hackers playing a malicious joke on the Big Ear team can be ruled out because – sit down for a minute to process this one – it was before the development of the internet, at a time when computers were individual monoliths of mute silicon and wire rather than the networked hive minds of the modern day.
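Just how compelling that signature is becomes clearer with a toy model (my own illustration here, not the Big Ear pipeline): a point source drifting through an idealised Gaussian antenna beam as the Earth turns produces exactly this smooth rise and fall, sampled below in six 12-second bins to echo the six characters of the record.

```python
# Toy model of a fixed radio beam sweeping past a point source as the Earth
# rotates. The Gaussian beam shape and 72-second transit are idealisations.
import math

TRANSIT = 72.0          # seconds for the source to cross the beam
SIGMA = TRANSIT / 4     # arbitrary beam width, chosen for illustration

def beam_response(t: float) -> float:
    """Relative received power at time t, peaking at mid-transit."""
    return math.exp(-((t - TRANSIT / 2) ** 2) / (2 * SIGMA ** 2))

for i in range(6):
    t = (i + 0.5) * 12.0    # centre of each 12-second integration bin
    print(f"bin {i}: {beam_response(t):.2f}")   # 0.25 0.61 0.95 0.95 0.61 0.25
```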

Wait – did I say every criteria? Every criterion, that is, except for one crucial element: whoever it was never got back to us. Despite re-scanning the relevant areas of the sky many times (somewhat problematically, because the Big Ear telescope had two detectors, each focused on a slightly different area of the sky, we can’t be sure which one the signal came from), both with Big Ear and with more sensitive telescopes of the era and of modern times, nothing – not the slightest apparition of a comparable signal – has been seen again. So unless it was the equivalent of an alien civilization being caught whispering “Shhhh – they’re listening – don’t call me on this number”, it does become increasingly hard to credit it as an intelligent communication with every passing year.

Whatever the Wow! Signal was though, what this opens up is the interesting question of just why it is that we are so obsessed with the idea of who or what might be out there among the stars.

Humanity has always populated its Universe with creatures of the imagination – fellow travellers that we have imbued with such agency that we have built stories, mythologies, and even religions around them (and in L. Ron Hubbard’s case, all three). In earlier centuries the ‘outside’ domain where these others might wait for us started in the terrestrial sphere – blank spaces on the map filled with dragons, eldritch creatures, and kingdoms of gold – but exploration has doggedly filled in that vacant territory, and the cracks and crevices and distant spaces of our own world have now been so thoroughly tested that the location of possible ‘others’ has been pushed far from our own neighbourhood, into the realm of different worlds in the far depths of space.

That’s not to say that our deep desire to find a partner has been diminished by this shift in our horizons.

I’m not even talking about fictional imaginings here. Big Ear, after all, was just one cog in a substantial and coordinated investigation that has occupied the energies of serious scientific players since the 1960s. Perhaps even more telling of our human obsession, in the modern era, tech billionaire Yuri Milner has recently committed US$100 million of his own money to a new, large-scale SETI initiative. That’s not just idle curiosity, that’s someone really willing to buy a full-price ticket on the fairground ride – investing 1700 person-years of equivalent resource (at the average Australian salary – proportionally more if you wanted to outsource it to a call centre in India) in the exercise.
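For anyone who wants to check the back of my envelope there, here’s the arithmetic – the exchange rate and salary are round 2015-ish assumptions of mine, not Milner’s accounting:

```python
# Back-of-the-envelope check on the "1700 person-years" figure, using my own
# round assumptions for the 2015-ish exchange rate and average salary.
commitment_usd = 100e6      # Milner's pledged US$100 million
usd_per_aud = 0.77          # assumed exchange rate
avg_salary_aud = 77_000     # assumed average Australian full-time salary

person_years = (commitment_usd / usd_per_aud) / avg_salary_aud
print(f"{person_years:,.0f} person-years")   # ~1,700
```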

What makes Milner and the SETI community think all this investment – time, money, whole careers of activity in some cases from talented and active scientists – is worth it? Do we have any real reason to believe that there are others out there wondering, like us, at the mysteries of the Universe? Or is it just an existential feeling that, as captured in the words of punk-pop balladeer Feargal Sharkey in his 1985 single A Good Heart, “Anything is better than being alone”?

Looking to geological history for insight on this question, life appears to have evolved pretty much as soon as it could have here on our own planet. The oldest sedimentary rocks preserved on Earth contain within them unmistakable fabrics revealing the presence of bacteria living 3,700 million years ago. Earlier still, even though their body forms have been erased by the tectonic recycling of the crust, isotopic ratios of carbon reveal the telltale signature of biological processing by ancient organisms as far back as we have rocks to measure them in. Over time, these early inhabitants gave rise to multicellular life, vertebrate skeletons, and ultimately, the emergence of all the glorious complexity and variety of our worldly domain. And, of course, our own sentience – and the accompanying blessing (or curse) of wonder at our existence.

As the late paleontologist and prolific essayist Stephen Jay Gould was fond of observing though, there is a real and fundamental question as to what would happen if we re-wound the clock and let the experiment start all over again. I’m not talking here about peripheral issues like whether humans would have tails (or, in the prosaic words of comedian Rowan Atkinson, whether we would perhaps have a differently shaped gear stick on the Mini Metro). We don’t even know something as fundamental as whether life of any sort would evolve, or whether the Earth would instead remain a sterile ball of silicate rock.

As anyone who has ever tried to bleach a shower curtain can tell you, once life gets going it is remarkably persistent and self-moderating. But that initial quickening – the fundamental transition of inorganic chemistry into living organisms…was it a one-off event of miraculous unlikelihood here on Earth? Or is it inevitable if you put carbon, energy and liquid water together? There, surely, is one of the most fundamental questions at the heart of the mystery of the Universe.

Many theoretical concepts have been developed in this space, but empirical testing is rendered problematic by the issue of pathetic statistics: we’ve basically only got a sample set of one to look at – our own home (and history) here on Earth.

This is one of the reasons why Mars assumes such scientific interest. Ever since 1877, when Italian astronomer Giovanni Schiaparelli pointed his telescope at the Red Planet and reported seeing channels – ‘canali’, a word soon inflated in the popular imagination into canals built by Martian engineers – we Earthlings have been titillated by the possibility of life on Mars. Subsequent probing of our neighbour by observation missions and unmanned landers has clarified that, while Schiaparelli was well wide of the mark, the dry valleys of Mars may indeed have a tale to tell on the evolution of early life.

Why the big deal though? What possible relevance could the presence of life (either now or in the distant past) out there on the frigid surface of Mars have to us here on Earth? The key is that the Red Planet represents only the second place we’ve really had the opportunity to explore, even in passing. If life also developed there, then you go from a single point of data and the corresponding possibility of life originating by near-miraculous happenstance to the (still statistically dubious, obviously) situation of ‘well, every viable place we’ve looked, life developed’ – which would strengthen our expectations that it may also exist elsewhere in the Universe.


Approximate true-color mosaic image of Burns Cliff in Endurance Crater on Mars, captured by the NASA rover Opportunity. Proof that life once existed on the Red Planet’s surface would assume huge significance in our thinking about our place in the Universe, by demonstrating that the creation of life is replicable, and that our own existence is more than the outcome of a cosmic lottery win of unimaginable unlikelihood.

 

So what about that wider universe then? In the words of Douglas Adams:

“Space is big. Really big. You just won’t believe how vastly, hugely, mind-bogglingly big it is.”

Gaze up into the night sky (as the Big Ear team were probably fond of doing in between their volunteer shifts crunching data back in 1977), and the points of light you see mark out just some of the uncounted billions of stars in the Milky Way galaxy and, in the further distance, billions more galaxies just like our own. We have enough experience now with the careful observations of celestial mechanics necessary to say that most, if not all, of these distant stars are probably orbited by their own families of planets. Some proportion of those will presumably sit, like our own comfortable residence, in the so-called ‘Goldilocks zone’ around their respective suns – not too hot, not too cold – where liquid water is stable. If we assume that some proportion of those potential alien domiciles see life kick-started as it was here on Earth (however that happens), some proportion of those biological incubators see the emergence of multicellular life, and some proportion of these see the development of some form of sentience…the powerful and attractive logic of extraterrestrial civilisations out there – alien eyes staring up at alien suns – becomes obvious.
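That chain of ‘some proportions’ is, of course, just the famous Drake equation in prose form. To see how quickly the numbers stack up – or collapse – here’s a sketch in which every factor is a placeholder guess of mine rather than a measured value:

```python
# A Drake-style back-of-the-envelope for the Milky Way. Every factor below
# is a placeholder assumption for illustration, not a measured quantity.
stars_in_galaxy = 200e9     # rough Milky Way star count
f_with_planets  = 0.9       # fraction of stars hosting planets
f_goldilocks    = 0.1       # fraction with a world where liquid water is stable
f_life_starts   = 0.01      # fraction where chemistry becomes biology (unknown!)
f_multicellular = 0.1       # fraction where complex life follows
f_sentient      = 0.01      # fraction that produce minds that wonder

worlds = (stars_in_galaxy * f_with_planets * f_goldilocks
          * f_life_starts * f_multicellular * f_sentient)
print(f"{worlds:,.0f} worlds with alien eyes on alien suns")   # 180,000
# Nudge f_life_starts down a few orders of magnitude, though, and the answer
# heads rapidly towards a very lonely 1.
```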

Which brings us to the Fermi Paradox: when you put it like this, logical argument would seem to suggest that many technologically advanced civilizations might exist in the universe, yet this expectation sits uncomfortably against our complete lack of observational evidence for them. Or, as put more pithily by the great Nobel Prize-winning physicist Enrico Fermi himself – “Where is everybody?”

For all our uncounted generations of staring heavenwards and looking for a sign, all the millions of dollars invested in serious SETI research over the past 50 years, what have we got to show for it? No invitations to intergalactic councils. No imperious threats of our imminent destruction. Not even a poignant “My name is Ozymandias, King of Kings: look on my works, ye Mighty, and despair!” from some long-vanished civilization.

For a point of comparison, the newly upgraded Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO) in the United States picked up two black holes colliding pretty much the first time it was turned on for a test run earlier this year, and detected another collision just last month. Going by those statistics, collisions between black holes – events so exotic that until recently black holes themselves were nothing more than abstract cosmological theory and fodder for science fiction imaginings – appear to be vastly more numerous out there than advanced alien civilisations.

Actually, speaking of science fiction, for my money it’s probably 20th century writer and futurist Arthur C. Clarke whose musings on this point best capture the philosophical implications of the search for extraterrestrial life:

“Two possibilities exist: either we are alone in the Universe or we are not. Both are equally terrifying.”

So to turn full circle back to the curious event that kicked off this discussion in the first place – what was the Wow! Signal? Was this Jor-El beaming out the sum total of Krypton’s knowledge as his world collapsed, in the hope that our distant civilization would receive it and carry on his work? And we’re caught here on Earth saying “hang on, I’ll just get my pencil…oh, they’ve gone.” Or was it possibly nothing more than some previously unknown natural radio-wave phenomenon reaching us from deep space – still a mystery to be explained, to be sure, but lacking the radical overtones of extraterrestrial contact?

Well, perhaps…but then again – I’m sure I’m not the first person to notice this, but the Wow! Signal was received the day before Elvis Presley ‘died’. Coincidence? Or the King being called home?


Bling, Rings, and Blind Spots – Alan Bond and the Capacity for Human Self Deception

“They say time is the great healer. But the extent of Australian gullibility when it comes to the life and crimes of Alan Bond seemingly knows no bounds.”

Ian Verrender – ABC News business editor

If all the world is indeed a stage, then the late Alan Bond bestrode it like a personification of 1980s glitz and excess – a walking moral tale…only re-written with a post-modern ironic twist so that rather than being brought low by hubris, the hero rides off into the sunset with the money, the girl, and the applause of the good townsfolk he’s just sold the monorail to still ringing in his ears. It’s probably fair to say that Sophocles – the ancient Greek master of tragedy and comeuppance – wouldn’t approve.

Coverage of Bond’s recent death must have consumed a veritable forest of newsprint, and depending on which side of the line the writers sat (or perhaps whether they had ever invested money with him), he was held up as either a colourful larrikin who dragged plucky little Australia onto the world stage, or an inveterate shyster and self-serving hypocrite. In a way, rather than the stereotypical “somewhere in the middle”, Bond was really both – simultaneously occupying the polar extremes of description like the quantum behaviour of an obscure subatomic particle, and equally impossible to classify by the rules of normal behaviour.


The late Alan Bond – tarnished Australian treasure, visionary entrepreneur, criminal fraudster par excellence. We may not see his like again in this more worldly age. And if you do, check your wallet…and possibly count your teeth.

Amongst the commentary and hyperbole filling the airwaves though, the element that most intrigued me was the widely reported ‘denial’ of a state funeral for Bond. To be fair, this was never a serious proposal instigated by political authority of any sort, so far as I could determine – but a substantial sector of the national media seemed to grab hold of the idea from somewhere and present it as if it should have been taken seriously; as if Bond was somehow an exemplar, a national treasure to be held up and emulated by all Australians.

To fully appreciate the Bond story, you need to think back to the heady exuberance of the 1980s. The deregulation of the Hawke-Keating years in Australia heralded a new era in finance and banking in the formerly sleepy and somewhat conservative markets of this Great Southern Land. Foreign banks moved in like dubious relatives after a lottery win, flooding the nation with cash. Money was not so much easy as full-on sexually aggressive, with bankers knocking on the door of any two-man outfit with an ABN, begging businessmen to borrow more. And boy, was Bond happy to oblige.

In an era of little financial transparency and minimal disclosure rules, Bond and his trusted lieutenants worked the system with verve and exuberance, shaking up the Australian business world with a series of audacious mergers and takeovers…and not coincidentally paying themselves massive personal success fees every time they notched up a deal.

And therein lay the secret to Bond’s success – a fortune built around the classic swindler’s trick (or investment banking fund management strategy, depending on which side of the financial divide you fall) of creaming off massive fees from his shareholders for every deal – good or bad. They might lose their shirts or win big from his market betting, but Bond always made sure to save his own skin, whatever the outcome.

Even that wasn’t enough for Bond though, and when the great speculative bubble inevitably burst in 1987, he was caught out shuffling, borrowing, and outright stealing millions from the companies across his empire to keep the good times rolling. Bond’s personal holding company eventually collapsed owing more than half a billion dollars, and Bond reached something of a personal nadir by declaring himself bankrupt in 1992 with debts of $600 million…although having a trusted and well-paid accountant at his disposal, this development didn’t seem to impact on his lifestyle or personal comfort to any great degree. By all accounts Bond’s personal business dealings with Switzerland involved a lot more than clocks and chocolate.

Above and beyond the usual chicanery and bastardry of high finance though, Bond was also convicted three times on serious criminal fraud charges (one of which, admittedly, was overturned on appeal) – including the capping majesty of stealing $1.2 billion from Bell Resources, really putting him in a class of his own when it comes to Australian crime. If you’ve ever tried to negotiate with someone from this wide and sunburnt land and wondered why “my word is my Bond” doesn’t cut much ice – the record of the good Alan is probably your explanation.

And that’s really where my curiosity over Australia’s continued bimodal views on Bond starts. Okay, in the excesses of the 80s – when a whole generation of conservative middle-class culture seemed to catch up with the entrepreneurial vibe in a rush, and Alan Bond was the largest and flashiest of the showmen who seemed to be standing on every street corner producing endless streams of money out of the air – the average investor could be forgiven for taking a punt on Mr Showmanship. After all, he seemed to be capturing the zeitgeist of the era – and a shed load (a boat shed, as it turned out) of money along with it.

But post the 1987 stock market crash and corruption scandals? Post the fraud convictions ($1.2 billion of other people’s money, let’s remember – that’s ‘billion’, with a B) and the sheer ludicrous front of the “sorry, I’m brain damaged and can’t remember anything” defence?

Yes, unbelievably in my book, when Bond came out of prison and re-launched himself on the world looking to wring a fortune out of the now-booming minerals sector (presumably his opening patter was along the lines of “Brain damage is much better thanks – now let me tell you about these diamond mines in Africa”), there were people willing to invest money with him again. I mean…what?

It seemed a selective amnesia surrounded Bond like a fog – people could remember he had made himself incredibly wealthy, but somehow forgot that he did it by making ludicrously risky bets in the business world with lots of other people’s money – salting away the windfall and stiffing his investors with a shiny toad-like grin when the market turned. This has all the intellectual merit of asking the Pied Piper of Hamelin to babysit because you remember something about him being good with kids.

Basically, Alan Bond succeeded and thrived – and cemented himself a place in the Australian psyche – because we want to believe. We simply want to believe. Especially at our most vulnerable, when we need a miracle to appear, we are desperate to believe. Our own internal narrator says it should – it’s the most classic of plot lines, after all. In act two it all goes wrong for the hero – the battle seems lost, your true love’s plane is missing, your puppy has cancer. Then act three dawns – boom – the cavalry arrives, everything turns out for the best, and you ride off into the sunset.

We all get ludicrous phishing spam clogging our email in-trays – and I know your first thought as you hit delete is probably “Why do they bother with this? Surely everybody must be able to see straight through this transparent drivel?”

And yet, according to a recent report from Ultrascan AGI (admittedly not a disinterested source, given they are in the business of Open Source Criminal Network disruption), losses from Nigerian 419 Advance Fee Fraud (prosaically named for section 419 of the Nigerian criminal code dealing with fraud) totaled $12.7 billion worldwide in 2013. That’s a figure to draw a blush from even the Cheshire Cat-like visage of our Mr Bond.

What makes outwardly sensible and often well educated people part with their money to these seemingly obvious scams? Well, while I can’t exactly say I’ve been in that position myself, I have been close enough to peek through the door and see the magic kingdom on the other side. In the weeks after being told I had become a luxury my employer of the time couldn’t afford, while working through the accompanying grieving process of anger and frustration – wondering how the mortgage was going to get paid and where my next loaf of bread was coming from (okay, I was still planning on it being artisanal handmade bread – but a guy’s got to have standards, right?) – those emails actually started to look pretty good on some levels.

I was still a couple of weeks of serious sleep deprivation, serotonin imbalance and (I’d like to think) a lobotomy away from “yippee, it came true – here’s my bank account and IBAN code” – but those mystery benefactors, previously unknown deceased relatives, and Iraqi generals burdened with over-full bank accounts they wanted help with gave me a momentary frisson of hope when they arrived – not belief as such, but a tiny spark of ‘wouldn’t that solve everything?’ that had me looking forward to them.

There are a lot of people out there at any point in time in desperate, desperate trouble. There’s presumably a subset of those people so emotionally or physically drained that they’re unhinged enough to believe, maybe just for a second, in the salvation narrative. And that’s all the spammer needs – the lucky happenstance of having that 10,000th email hit the inbox of someone ready to believe in miracles.

It’s not just criminal fraud that relies on these principles. People don’t buy lotto tickets because they support opera, or sports gear for kids, or any of the myriad of borderline welfare programmes funded by the state lottery organization.

Nor, I would venture, do most people buy them with any serious expectation they will win. We buy them for the story we sell ourselves – the moment of “mmmm…but wouldn’t it be nice” fantasy. At the risk of outing myself as an irretrievable nerd (or more worryingly, as lacking imagination) by using a Star Trek metaphor for the second post in a row – charismatic risk-taking Captain James T. Kirk would buy a ticket without a second thought (indeed, Chris Pine’s devil-may-care Kirk from the rebooted film series would probably expect to win). Spock would calculate the odds, realize that the expected return on the ticket falls so far short of its price as to make the purchase statistically indefensible, and raise a quizzical half-alien eyebrow over these strange human foibles. I get that. Most of the time I pass on the lotto ticket myself and buy lunch instead…but if the jackpot is over $10 million, the fantasy is big and bold enough for me and I’ll happily put my money down.
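For the record, Spock’s arithmetic runs something like this – with invented odds and prizes (real lotteries are, if anything, less generous):

```python
# Spock's expected-value calculation, sketched with made-up numbers: a
# $10 million jackpot, $1 tickets, and 1-in-45,000,000 odds of division one.
jackpot = 10_000_000
ticket_price = 1.00
p_win = 1 / 45_000_000

expected_winnings = p_win * jackpot     # ignoring the minor prize divisions
print(f"expected winnings per $1 ticket: ${expected_winnings:.2f}")   # $0.22
print(f"expected loss per ticket: ${ticket_price - expected_winnings:.2f}")
```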

To quote the late Terry Pratchett, our species should be labelled not as Homo sapiens – thinking man – but rather Pan narrans – the storytelling ape. Man is not, after all, the only animal to think (any owner of a cat can tell you the little deviants are continually plotting evil deeds of varying magnitude) – but we are unique in our ability to tell stories – to imagine life could be different than the way it is.

This truth – this brilliant point of difference at the root of our success as a species – also lies at the heart of every fraud ever committed. The secret of the successful grifter, after all, is not to fool a mark – but rather to give them the opportunity to fool themselves.

That back door vulnerability in the mental fortress of the human psyche has been exploited throughout history – Alan Bond is far from the first inveterate rogue to achieve popular fame and love from a public who should really know better.

The framing character in Peter Carey’s “True History of the Kelly Gang” (yes, I know this is a work of historical fiction – but it’s a great yarn, and if you’ve got this far I assume you can appreciate that I’m a sucker for powerful narrative drive) is school teacher Thomas Curnow. Curnow brings about the fabled bandit’s ultimate undoing, slipping away from Glenrowan to warn authorities of Kelly’s plot to ambush the police train, but in later years relates his bitterness that it is the criminal bushranger Ned who the public remember and revere.

Likewise in Andrew Dominik’s 2007 drama ‘The Assassination of Jesse James by the Coward Robert Ford’ – it’s outlaw James the contemporary public eulogise, not Ford – who ultimately suffers the stereotypical gunslinger’s death, shot down by yet another violent misfit seeking notoriety.

And let’s not forget – Bilbo Baggins made no attempt to track down the actual owner of that ring he found in the goblin tunnel, so Gollum was quite within his rights to feel legally aggrieved about the blatant theft by appropriation.

We all root for the underdog at heart. Ultimately, it’s the understandable natural tendency of the human mind to favour the interesting character, the rough diamond. What we can learn from the misplaced lionization of Alan Bond though is that alongside that internal narrative humming as a backbeat to our thoughts, we shouldn’t lose sight of that other great human power – logic.

Fool me once, shame on you – fool me twice, shame on me.

Delusions of Inadequacy: A review of Curtis White’s ‘The Science Delusion’

In the opening scenes of Mel Brooks’ classic 1974 Western parody Blazing Saddles, Burton Gilliam’s racist overseer Lyle demands that his black railroad labourers perform “a good ol’ nigger work song”. When the obliging work gang, led by Cleavon Little’s Bart, bursts into an elegant a cappella version of Cole Porter’s ‘I Get a Kick Out of You’, Lyle stops them, and he and his cowboy colleagues show them what a ‘real’ negro song is supposed to be like, with a spirited rendition of ‘De Camptown Ladies’ – complete with minstrel dancing and derogatory mispronunciation – while the black labourers look on dumbfounded, unable to relate to the racist caricature played out before them.

The stereotyping skewered so effectively in this scene bears close parallels to the views presented by Curtis White in his book ‘The Science Delusion’.

White’s stated aim is to critically examine the ‘privileged place’ of science in secular western society. His thesis though is built around a definition of ‘science’ that I fail to recognize, and a crude caricature of the scientist – a cold, detached savant unable to appreciate human emotions or true beauty – that seems to have been based largely on watching re-runs of The Big Bang Theory.


Bart and his fellow railroad labourers responding to overseer Lyle’s entreaty: “Now come on boys, where’s your spirit? I don’t hear no singin’. When you were slaves, you sang like birds. Go on. How ’bout a good ole nigger work song?”
Still image taken from Mel Brooks’ 1974 comedy “Blazing Saddles”.

At the heart of White’s failed analysis lies the equation of science with a moral and political philosophy. It isn’t enough that science produces new ideas and insights to inspire discussion – White wants to be told what to think about them, complaining that science doesn’t inform us how to judge its discoveries. “Far too many scientists”, he writes, “leave the ethical meaning of their work to people bereft of moral imagination”.

But as any serious student of science could tell you, White is tilting at windmills of his own imagination here. Science is nothing more nor less than an ordered way of assessing problems and testing ideas – a systematic approach to problem-solving and self-criticism. It is explicitly not a code of Bushido by which to live.

Driving White’s critique though is not a dislike of science itself – indeed, as becomes apparent in the later chapters of his work, White actually both understands the nature of the scientific method, and appreciates the beauty to be found in science through the challenge it poses to the existing order. Rather, the overriding narrative is one of visceral disrespect for the practitioners of science, expressed through sweeping generalizations that, if you replaced the word ‘scientists’ with ‘Asians’, might have him suspended from the faculty at Illinois State University. It got to the point where I kept expecting arguments to be prefaced by a paraphrasing of the old vituperative racist’s standby: “Don’t get me wrong – some of my best friends are scientists, but…”

It is not that White’s criticisms are entirely without merit – indeed, many of his individual targets are well chosen. Richard Dawkins and the late Christopher Hitchens can, as White pointedly observes, be overbearing and unnecessarily dogmatic in their commentary. Likewise, Sebastian Seung and others in the public vanguard of neuroscience do often lapse across the dividing line between science communication and science fiction. And yes, I’m comfortable conceding White’s point that many scientists – even very good ones – are probably mediocre to appalling poets, falling at the first hurdle when trying to use evocative language and imagery to capture the beauty and nuance of their work.

At the same time though, I don’t get a sense that White is looking to establish a level playing field in these regards – there is certainly no indication he would expect a poet writing on the nature of life and the universe to have a firm grasp of virology or particle physics.

Such uneven treatment is a notable and distracting element throughout the text. White calls on Tom Waits lyrics, pop culture movies, and novels as sources of rhetorical strength in his own writing, but would deny his antagonists any written form not as formally codified and structured as the 19th century German philosophical treatises he is so fond of.

White is also quick to point out sloppy structure and inadequate definition of terms in the writing of those he seeks to criticize…but then takes equal liberties himself, and all too readily forgives the linguistic sins of those whose work he would co-opt to his purposes. He takes umbrage, for example, at Jim Watkins’ description of humans as ‘products’ of evolution – suggesting with a sniff of derision that such terminology places us on a par with the output of some cosmic factory conveyor belt. A virtually identical linguistic allusion, however – the world being a ‘product’ of the self in Schelling’s philosophical arguments – is later adopted by White himself without comment.

Such uneven handling might be forgivable in an undergraduate thesis, but White is a career wordsmith – a novelist, essayist, and academic. He obviously cares about, and has considerable mastery of, the English language – continually waving the flag for well-structured and elegant communication throughout this book – so such double standards are at best sloppy, and at worst self-serving hypocrisy.

With his fundamental thesis stretched painfully thin and his narrative structure riddled with such inconsistencies, White frequently over-reaches in seeking to generalize his individual criticisms to a comprehensive attack on the essence of science – as where he argues that because a scientist (specifically in this instance, neuroscientist Seung) has produced a bad philosophical argument, science cannot relate to philosophy or art. The logical corollary of this is that if Salman Rushdie burns the toast, writing can have no relationship to cooking.

If scientific writers and commentators have failed to appreciate the artistic and sociological implications of their ideas – and rest assured, many have – it is because of their personal inadequacies, not because of anything to do with the nature of science. There is no ‘Science Pope’ sitting on a throne issuing encyclicals about how scientists should relate to other ideas and viewpoints.

When he writes that Christopher Hitchens “reduces religion to a series of criminal anecdotes…[ignoring] virtually all of the real history of religious thought, as well as historical and textual scholarship”, White himself willfully overlooks the contextual framing of Hitchens’ writing. Hitchens is not offering up this critique of religion de novo, but as a deliberate counterpoint to the historically promulgated view that we should think only of the positive philosophical and sociological functions of religion, and ignore the darker side of its use as a justification for war, social injustice, and other abuses.

Hitchens and Dawkins are not writing to engage with the enlightened philosophers and thinkers of the world. The two are the bulldogs of the humanist viewpoint, deliberately setting out to take on and worry the one-eyed commentators expounding conservative religious polemics and attempting to dominate the cultural airwaves. Rather than taking the moral high ground by constructing an artful, comprehensive and syntactically complete thesis and sitting back to watch it be ignored or misrepresented by their opponents, they plunge in with the rhetoric of broad engagement – taking on their intellectual opponents in hand-to-hand combat.

I might not approve of everything Dawkins says, but I am grateful for his occupation of this position. Dawkins is the Charles Bronson of scientific writers – an enforcer out there brawling in the street to control the baying mob so the rest of us can get on with our work.

By far the most distracting of White’s narrative straw men though is his placement of words into the mouths of his intellectual sparring partners to make them into obvious cartoonish buffoons, so that he can then ride to the rhetorical rescue with a devastating rejoinder. Yes, I know the Socratic dialogue (an imagined discussion with a naive or foolish companion) is a valid and effective rhetorical device with a long and rich history in Philosophy – one I appeal to myself on occasion. I’m just not sure that Socrates ever used the approach to so transparently belittle his contemporaries or score cheap personal points. Fundamentally, to paraphrase US Senator Lloyd Bentsen in his 1988 Vice-presidential debate smack-down of Dan Quayle: Professor White, you’re no Socrates.


Putting foolish words into the mouth of your opponent does not enhance your own intellectual credibility. Reproduced from Bill Watterson’s Calvin and Hobbes cartoon strip, originally published January 18th, 1987.

Behind the claims to an intellectual high ground, White’s true motivation perhaps emerges as something closer to sour grapes in a series of unguarded comments around the demarcation he sees between science and what he would stake out as his own home ground – art and literature.

He criticises Hitchens’ “privileged position on the New York Times best-seller list” as if this was some hereditary title Hitchens had unfairly usurped from its rightful holder, and complains of science “being given every kind of opportunity to make its case to the public, including high-tech presentations and best selling books”, while philosophers sit unread and unremarked upon on unvisited library shelves. White seems to imagine there is an endless array of publishers and TV commissioning editors beating down the doors of theoretical physicists, desperate for new books on their work and its relationship to the fabric of reality. In the real world though, I’m afraid that if Schelling and his fellow academic philosophers can’t get their own TV series, it’s not because those nasty scientists have formed a cabal to dominate the Western cultural conversation and blacklisted them.

In seeking to establish the boundaries between the qualities of art and science, White speaks of transcendence – but his framing of the term is deeply flawed. His juxtaposition of Beethoven with the industrial design team at Procter & Gamble is manifestly ludicrous (although not as over the top as his analogy of the company motto “GE: Imagination at work” with the historically laden Nazi concentration camp slogan “Arbeit macht frei”). Under such a framing we might equally compare Isaac Newton with George Formby to establish (with apologies to fans of “When I’m Cleaning Windows”) a countervailing majesty of science over art.

If ‘art’ is the common factor, White would lump together the good, bad and indifferent – Beethoven, Picasso, and Hitchcock with internet memes and LOLcats. It is actually the transcendence of genius of which his examples speak, and I would be far more generous than White in my attribution of this to other fields. Contrary to his criticism, there is unquestionably a form of genius in convincing an already overweight, well-fed consumer that they really want to eat a hamburger right now. Or in convincing an intelligent human being to take up a habit like smoking when they know it will shorten and decrease the quality of their life. Yes, it may be an evil genius…but genius, nonetheless.

White’s view ultimately comes across as narrowly bourgeois – creation is only worthwhile if it occurs in the critical space occupied by him and his coterie. Things that touch, move, or inspire the masses are worthless. We (I would happily include myself among White’s great unwashed masses) are not even allowed to appreciate and value art unless it’s for the reasons he wants us to. According to White, the ‘wrong’ view of art “is the assumption now even of arts councils and, as far as I know, the artists they fund.”

The artists they fund? Did I mention the odour of sour grapes?

Wow. So now it’s not just the scientists, but a conspiracy running right to the heart of the all-powerful arts council Illuminati! Did White have Dan Brown ghost-write this chapter?

With presumably unintended irony, the nature of what I would define as science is actually not far removed from White’s phrasing of Schiller’s definition of art: “It refuses the world as something already determined…[offering] a welcoming openness to change”.

I find myself suspecting that a much more positive contribution to public discourse could have been laid on this foundation, but if White’s fundamental drive is to encourage active thought and challenging of ideas in society, this book has probably done a good job of alienating a substantial slice of his potential constituency – namely those critical thinkers who would describe themselves as scientists.

Therein lies the tragedy of this work – I know I’m not White’s target audience…but I could have been. There are many areas in which I have some sympathy with White. I share his appreciation for the Romantic spirit – with the best art (like the best science) to be found in the overturning of paradigms, and the challenging of comfortable authorities.

And when he can rise above the vitriol and histrionics, he’s a good writer – even with a pleasingly dry sense of humour. Anyone who can skewer his opponent’s philosophical knowledge with the pithy “Dawkins knows sweet nothing about Foucault” has my utmost respect. Yes, this does rely on a mispronunciation of Foucault, but I imagine you’re there well ahead of me on this one.

Part of me even thinks it might well be enjoyable to share a bottle of good wine with White and talk about his ideas…but in the face of his espoused views on scientists, I fear this might be a bit like Louis Armstrong sitting down for a chat with Benito Mussolini – a big fan of jazz music according to his son Romano, but otherwise unlikely to have much common ground with the great African-American musician.

At the end of all the analysis I find myself left, disappointingly, as Bart – the hero of Blazing Saddles – looking on bemused from the sideline as White crow dances and sings his minstrel song of science – ultimately showing more about his own biased views than offering any kind of serious analysis.

Sticky Fingers: Changing Old Noise to New Data in the Course of Scientific Discovery

“I suppose you won’t be able to find one of your famous Clues on the thing?”

“Shouldn’t think so, sir. Not with all these fingerprints on it.”

Terry Pratchett, “Feet of Clay”

Captured in Pratchett’s satirical writing here is a key concept underpinning the advancement of science: In recognising the deficiencies of our understanding, we identify pathways to more fundamental, deeper insight.

If you might indulge me in illustrating this concept: The science of geochronology is only a little over a hundred years old. The genesis of this field – the direct measurement of the age of Earth materials in the millions and even billions of years, putting a timescale to the grinding wheels of geological process – is probably traceable most directly to the New Zealander (albeit that we should add the prefix ‘Colonial’ to that label, given he undertook his post-graduate education and scientific career at Cambridge University in Britain) Ernest Rutherford – one of the great figures of 19th and early 20th century physics. Scientists don’t often ascend to the pantheon of cultural heroes, so the fact that Rutherford’s distinguished portrait graces the $50 note of his country of birth is probably as effective a mark as any of the degree to which he bestrode the world stage, and the respect in which he is still held.

With his finger on the scientific pulse of the Edwardian age – and in particular the atomic theory at the heart of his own cutting-edge research – Rutherford was quick to appreciate the significance of the then-new phenomenon of radioactivity, discovered by Henri Becquerel and famously characterised by Rutherford’s contemporaries Marie and Pierre Curie. In the words of the great man himself:

“The helium observed in the radioactive minerals is almost certainly due to its production from the radium and other radioactive substances contained therein. If the rate of production of helium from known weights of the different radioelements were experimentally known, it should thus be possible to determine the interval required for the production of the amount of helium observed in radioactive minerals, or, in other words, to determine the age of the mineral.”

Ernest Rutherford – Silliman Lectures at Yale, 1905

With those words, a scientific revolution began.
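The beauty of the idea is how little arithmetic it actually requires. Here’s a minimal sketch of the principle – the decay constants are the modern accepted values, but the sample abundances are invented purely for illustration:

```python
# Rutherford's helium clock in miniature. Each uranium-238 decay chain sheds
# 8 alpha particles (helium nuclei), uranium-235 sheds 7, and thorium-232
# sheds 6, so helium accumulates at a predictable rate.
LAMBDA_U238  = 1.551e-10    # decay constants, per year
LAMBDA_U235  = 9.849e-10
LAMBDA_TH232 = 4.948e-11

def helium_age(he, u238, u235, th232):
    """Age in years from He and parent abundances (all in mol per gram).
    Linear approximation - fine while the age << the parent half-lives."""
    production = (8 * LAMBDA_U238 * u238 +
                  7 * LAMBDA_U235 * u235 +
                  6 * LAMBDA_TH232 * th232)    # mol He per gram per year
    return he / production

# A hypothetical, perfectly helium-retentive mineral grain:
age = helium_age(he=1.0e-10, u238=2.0e-9, u235=1.45e-11, th232=4.0e-9)
print(f"{age / 1e6:.1f} million years")   # ~26.5
```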

Rutherford quickly set to work encouraging collaborators in the fields of chemistry, physics, and geology to put that principle into practice, but it didn’t take long for the community to recognise that his original elegant concept wasn’t going to be the simple path to greater knowledge that they had hoped for. The problem was that helium – the simple, easily extracted product of radioactive alpha decay – wasn’t fully retained in the mineral structures they were testing. In the words of John Strutt, one of the key figures in this early research:

“[helium ages provide only] minimum values, because helium leaks out from the mineral, to what extent it is impossible to say.”

R. J. Strutt (1910), Proceedings of the Royal Society of London

Some process – unknown at the time – was allowing the helium to escape from the crystals. Like a water clock with a leak in it, then, the system offered no reliable way to calculate a true age.

The key to our story here is that the contemporary paradigm within which these scientists were working was that the only age that mattered was the time at which a sample crystallised – nothing else entered their world view. When helium dating returned values that were clearly far too young and too inconsistent to reflect such formation ages, the method was consequently abandoned, with the scientific community pursuing other isotopic systems – notably the pairing of uranium isotopes with their ultimate stable decay product of lead – as the pathway to a temporal understanding of Earth evolution.

Ninety years later, though, at the turn of the 21st century, helium dating was back on the scene and a hot property (quite literally, as it turns out – but more of that later) in the field of geochronology – and it remains so right up to the present day. Why? Have we just forgotten the lessons of the past?

To understand the answer to that question, you need to appreciate that an isotopic ‘age’ is fundamentally just a ratio of chemical species – namely the abundance of a radioactive parent isotope – the ticking clock of the system – and the product of its decay within your sample. Ultimately, this is just a number – nothing more or less…unless you have a physical event you can relate that number to. If I toss you a rock and say “this rock is 7,000 years old” – what does that mean? Is it 7,000 years since the rock crystallised? 7,000 years since it was knocked from a large boulder upstream? That it has spent 7,000 years tumbling back and forth in the surf? 7,000 years lying on the beach? All these ‘ages’ might have meaning – telling us something interesting about the history of this particular sample – but unless you know which one I mean, the manifold possibilities obscure the potential insight.


Nice looking piece of rock – so how old is it? And how would we tell?
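For the curious, the decay algebra behind that number is short – this is standard textbook radioactivity, not any particular laboratory’s recipe:

```python
# The number at the bottom of every isotopic 'age': given a measured
# daughter/parent ratio D/P and the parent's decay constant lambda,
#   t = ln(1 + D/P) / lambda
import math

LAMBDA_U238 = 1.551e-10     # uranium-238 decay constant, per year

def isotopic_age(d_over_p: float, lam: float) -> float:
    """Years since the system last began accumulating its daughter product."""
    return math.log(1.0 + d_over_p) / lam

# e.g. a mineral carrying one lead-206 atom for every two uranium-238 atoms:
print(f"{isotopic_age(0.5, LAMBDA_U238) / 1e9:.2f} billion years")   # 2.61
```

But as the beach pebble shows, the equation only tells you how long the clock has been running – not which event pressed the start button.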

To address this confusion from the perspective of helium, let’s drill down from the scale of rocks and hammers to the sub-microscopic world of a crystal lattice. The comforting solidity and discrete character of the everyday is replaced by a dynamic constellation of atomic structures held in place by overlapping and interfering clouds of electrons and opposing forces – a seething maelstrom of movement and change. As those particles spin and vibrate, the force balances governing their interactions rise and fall, bonds parting and re-forming in the blink of a conceptual eye as their stability waxes and wanes. Take a moment to watch this video clip from Dr Erik Laegsgaard at Aarhus University.

Scanning Tunneling Microscope imagery of atomic-scale diffusion in titanium dioxide, created by Dr Erik Laegsgaard, Aarhus University. Recorded at 300 kelvin (about room temperature), and at 8.6 seconds/frame.

Each of those glowing orange orbs is actually an atom of oxygen resolved by advanced scanning tunnelling microscopy of a sample of titanium dioxide. To my thinking, this movie is mind blowing – this is not a cartoon, or a fancy computer model – this is an actual resolved record of real individual atoms, in solid material, at room temperature. Reflect for a moment on just how we see those atoms behave as the movie advances through time. Rather than locked in place like mosaic tiles set in mortar, they skitter back and forth – momentarily held in the embrace of one bond, but then twisting away across the crystalline dance floor to some new partnership. The movement is random and unpredictable – particles as likely to jump one way as any other.

This atomic diffusion is what was responsible for Strutt’s anomalous ‘leakage’. Although the movements are individually random, if a concentration of something is building up inside the crystal (as with the helium produced by alpha decay in the example of our geochronometer), then more of those random movements will carry atoms out of the crystal structure than back into it – simply because there are more helium atoms inside than outside. It follows that this diffusion will prevent the build-up of your daughter product (helium), keeping the isotopic age stuck stubbornly at zero.
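You can watch this statistical one-way valve emerge from pure randomness with a toy simulation (my own illustration – real diffusion is three-dimensional, and governed by the temperature dependence discussed below):

```python
# Toy 1-D random walk: each helium atom starts at the centre of a 'crystal'
# and hops randomly left or right; individually aimless hops still add up
# to a steady net leak once atoms reach the surface.
import random

HALF_WIDTH = 25             # hops from the crystal's centre to its surface
N_ATOMS, N_HOPS = 1000, 5000

escaped = 0
for _ in range(N_ATOMS):
    x = 0
    for _ in range(N_HOPS):
        x += random.choice((-1, 1))
        if abs(x) >= HALF_WIDTH:    # reached the surface
            escaped += 1
            break                   # once out, gone for good
print(f"{escaped / N_ATOMS:.0%} of the helium leaked away")
```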

So how then do we stop diffusion happening and allow our ticking clocks to record time? How do we set the geological stopwatch running? The simple answer is temperature – you cool things down. The rate at which diffusion occurs depends exponentially on temperature, following the Arrhenius relation familiar from chemistry. In essence, this means that even a small change in temperature leads to a very large change in diffusivity, and the transition from rapid diffusion – so rapid that all the daughter product produced by radioactive decay is lost – to negligible diffusion, where all that daughter product is retained, occurs across a very narrow temperature range.
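To put rough numbers on that sensitivity, here is the Arrhenius relation, D = D0 exp(-Ea/RT), in action – the activation energy and frequency factor below are round placeholders of approximately the right order for helium in common minerals, not calibrated constants:

```python
# The Arrhenius relation in action. EA and D0 are illustrative placeholders,
# not calibrated values for any specific mineral.
import math

R  = 8.314        # gas constant, J/(mol K)
EA = 140_000.0    # activation energy, J/mol (placeholder)
D0 = 5.0e-3       # frequency factor, m^2/s (placeholder)

def diffusivity(temp_celsius: float) -> float:
    return D0 * math.exp(-EA / (R * (temp_celsius + 273.15)))

for t in (40, 70, 100, 130):
    print(f"{t:>4} C : D = {diffusivity(t):.1e} m^2/s")
# Each 30 C step shifts D by a factor of ~30-100: the geological stopwatch
# flips from 'leaking freely' to 'sealed' across a narrow temperature window.
```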

Rather than the aberrant or spoiled data Strutt took them to be then, helium ages, once we understand this process and calibrate its thermal sensitivity, become sensitive records of the temperature change associated with dynamic geological history.

How does this help us?

When Gil Grissom finds a gun at the scene of a murder in CSI (yes, I know Grissom left the show after series 9, but I always thought he had excellent style as an on-screen scientist, and geologically speaking, his tenure is pretty much still within error of the present), his first thought isn’t “I must find out how old this gun is” – no – there are far more dynamic aspects of the weapon’s history he would like to see resolved. When was it bought? How long ago was it fired? Who pulled the trigger?

Similarly, if we focus purely on the crystallisation age of our samples, as Strutt, Rutherford, and their contemporaries were, there are many potential insights we will miss.

When were our samples last thrust beneath the crushing weight of an uplifting mountain range? When did they last feel the rush of superheated steam carrying rich mineral endowment through subterranean fluid conduits, or the frictional warmth induced by an active fault boundary radiating through the crust? When did erosion wear away their weighty overburden to exhume our rocks from the hot interior of the Earth? With the thermal ages provided by helium dating and its correlatives, these dynamic episodes come within our grasp.

What was simply noise becomes, when we understand and can translate its origin, a sensitive new record of dynamic geological processes.

Unlike Pratchett’s protagonists, our FBI database is ready, and the fingerprints of geological systems are waiting to reveal themselves to our careful detective work.

Old Men and the Sea – the curious persistence of willful disbelief in Anthropogenic Climate Change

Imagine yourself, for a moment, adrift in a storm-tossed wooden lifeboat. Yours is the only vessel in sight – the only refuge in the heaving sea stretching to the horizon all around you. With a sinking heart – rightly concerned by the potential consequences – you realise the water level in the bottom of the boat is rising. You have nowhere else to go.

Now, the environment in which you find yourself may well be the source of this water – the persistent rain, the sea spray washing over the sides, perhaps even the marine bivalve Teredo navalis – shipworms, as they were known in the days of grand wooden ships plying the seven seas – chewing their way through the hull of your fragile boat. That doesn’t mean that the signal fire you lit in the stern might not also be causing a leak. It’s not like you have a leakage budget to work within – “it’s okay, I’m going to take on a gallon of water an hour, so I can shave some more wood out of the sides and stoke the fire, and the rain will ease up to compensate”. Aware that fire is known to consume wood, and that, as your boat is made of wood, you could reasonably infer that your cheery blaze might be a factor – and one over which you have control – the prudent thing to do, at least until you were pretty darned sure of things, would be to douse it.

Of course, despite my obvious and melodramatic allegory here, we’re not really talking about drifting lifeboats. Rather, in a summer in which the Australian Bureau of Meteorology has found it necessary to add new colours to the temperature scale on their national synoptic charts, changing climate is probably a fair topic for engaged conversation.

Although I’m a professional scientist, and try to keep myself pretty well informed, I’m under no illusion that I can offer a fair and valid critique of understanding in this area. If that’s what you’re after, I heartily suggest you check out the US Global Change Research Program (www.globalchange.gov). Me? I’ll put my hand up right now and tell you I don’t fully understand the physics of greenhouse warming, the consequences of changing landscape albedo for a solar energy budget, the details of orbital precession, or the design and function of supercomputer models of climate sensitivity. Unlike a number of (usually self-nominated) commentators on climate science though, my philosophy in these circumstances is not to go ahead and shoot my mouth off anyway – at least not without a few glasses of good wine inside me – so this is not principally an essay about the rights and wrongs of understanding on anthropogenic climate change.

Instead I want to talk about the (to me, anyway) curious fact that vested interests and enthusiastic amateurs from all walks of life – politicians, newspaper columnists, school teachers, Jeremy Clarkson – seem possessed of an unshakeable belief that their understanding of climate change and its causes should be given equal weight to that of, say, Roger Revelle, or the IPCC.

I’m not talking here about debate over how we, individually and as a society, should respond to climate change – what steps we should take, how the cost should be borne. Here opinion and debate clearly should be entertained as we move towards a social contract. But the facts of the matter – the understanding of physical phenomena – do not submit to willpower or popularity. You don’t get to vote by SMS on whether anthropogenic carbon dioxide emissions are a driver of dangerous levels of climatic warming or just a combination of snuggly global duvet and healthy plant food.

By way of analogy to the problem here – diesel has a greater energy content than unleaded petrol. I know that’s true because I read it on the internet. Logically then, if I start putting diesel into my car, it will be more powerful and go further on a tank. Right?

The fact that I used the word logically probably tells you everything you need to know about why it’s important that I listen to my mechanic rather than trying to fix my car myself.

Let’s pursue that metaphor a little further. Think about it – if this was your car we were talking about (and I might venture here that the climatic system of our entire planet might be a bit more important than that – even if you do wash yours more often than I manage and rotate the tyres every 6 months) I doubt you’d be up for self-diagnosis – or even taking the advice of Alan Jones – when your engine started knocking. No. I suspect that, like me, you would far rather trust the judgement and experience of auto mechanics who have trained for years and devoted themselves professionally to the diagnosis and correction of engine problems. Even if you did roll up to the workshop door with a worrying knot in your stomach over what they might find under the hood, and just how eye-wateringly expensive it might be to fix.

Yes, some mechanics are better than others, and there may even be shonky ones out there that don’t know what they’re doing, or worse, who are criminally intent on defrauding you by exaggerating or inventing problems. If you do your research though, and find out who other mechanics respect and what they think of each other’s work, I’m pretty confident you could probably do a good job of picking the right person to deal with any engine trouble you might have.

The problem, in its essence, is that ‘opinion’ is a complex and chimeric beast. It covers a spectrum from tastes or preferences, through views on issues of common concern (the ethical and political questions of the day), to views grounded in technical expertise – and here I’d include legal or scientific opinions. The common thread is that all these areas admit a degree of subjectivity and uncertainty – but not all are equal.

You can’t really argue about the first kind of opinion. It would be ludicrous for me to tell you that you were wrong to prefer sticky date pudding to cheesecake. Where this issue starts to go off the rails though is that we sometimes take opinions of the ethical and even the expertise-based sort to be unarguable in the same way such questions of taste are.

The silly – even embarrassing – thing here, at least for somebody coming from a Western philosophical tradition, is that Plato pretty much had this distinction sewn up 2400 years ago.

Today though, all too often – whether by design or, I suspect more usually, simple ignorance – we seem to have forgotten this lesson.

Bob Brown, former leader of the Australian Greens and Federal Senator, argued long and vociferously against nuclear power throughout his career, despite not being a nuclear physicist. All well and good – but Meryl Dorey – leader of the Australian Vaccination Network (don’t be fooled by the name – this is a group vehemently opposed to childhood vaccination in all its forms) – has used Brown’s record to argue that she should, in a similar vein, be listened to in regards to the healthcare of our children, despite having no medical qualifications of any stripe. The crucial difference between the two is that Dr Brown never represented himself as an authority on the physics of nuclear fission. He was always, entirely appropriately, commenting on policy responses to science, not the underlying scientific understanding. Dorey, in contrast, essentially tries to represent that her views should factor in debate regarding the biology of vaccination and immune response itself – that her personal biases should be weighted equally with expert and scientifically validated opinion.

So – back to climate change – let’s take on board Plato’s distinction for a minute and ignore the opinions of the Nick Minchins and Lord Moncktons of the world. What do professional climate scientists – those experts who have devoted themselves to understanding the detailed interactions of climatic systems and earned the respect of their critical peers – understand to be happening to our climate?

First and foremost, our planet is warming up. Using any of a wide range of indicators (ocean heat content, sea surface temperatures, sea level, temperatures in the lower and middle troposphere, the rates at which glaciers and ice sheets are melting), the overall temperature of the Earth and the corresponding energy in our climate system are increasing.

According to a study recently published by a team of scientists from the Potsdam Institute for Climate Impact Research, there are now on average five times as many months with record-breaking high temperatures at measured locations worldwide as could be expected without significant and ongoing warming. In parts of Europe, Africa and southern Asia, the figures are even worse, with instances of record-setting monthly temperatures exceeding statistical expectation by a factor of ten.

While there are a number of influences on the climate system, such as changing levels of solar radiation and the abundance of atmospheric aerosols, independent climate researchers almost universally conclude that this warming has been driven predominantly by increased levels of carbon dioxide in the atmosphere, a significant proportion of it emitted by human activities.

Now remember, that’s not me saying this – these are the expert opinions of the big beasts at the climate science waterhole with the expertise and experience to give their opinions real weight. These are the people we should be listening to.

In a recent essay on science communication (‘Three Monkeys, Ten Minutes: Scientists and the Importance of Communication Skills’ – WordPress, 18 October, 2012), I used the metaphor that taking sides in a scientific debate you don’t understand is like weighing in on an argument conducted in a language you don’t speak – a Frenchman and a German arguing in Spanish, let’s say – on the basis of liking one participant’s accent more than the other’s. In the field of climate change, the position of someone who would deny the reality of anthropogenic warming is even more tenuous, because as the debate stands it’s as if 97% of the Spanish speakers in the room (everyone except Pierre’s mother and the crazy old guy whose brother was killed in the Second World War and who hates all Germans with an unquenchable rage, let’s say) agree that our French friend is wrong-headed and it’s Heidi we should listen to – but still there are non-linguists willing to back up the Frenchman with an unthinking “yeah, what he said” against all comers.

To come full circle to our fragile boat alone on the stormy seas – although the consequences of putting out my fire if it wasn’t reducing my vessel’s seaworthiness might be unpleasant (I’d be wet and cold – and frightened, alone in the vast empty expanse of the ocean), the consequences of not taking action if my hypothesis ultimately proved correct would be much, much worse.

More importantly, although I would really like to know exactly where the water was coming from and which source was the most important (sorry, scientific curiosity has me in its thrall), my first reaction wouldn’t be to set up an interim enquiry and design some experiments. No. Me? I’d start bailing.

French String – Mathematics, Linguistics, and the Nature of Reality

“Daddy,” my eldest daughter asked me, some years ago now (at a time when Europe was but a short train ride away and a welcome escape from the grey winters of Surrey), “What’s the Spanish word for thank you?”

“Gracias,” I replied, pleased by her inquisitiveness. “And de nada means you’re welcome.”

Warming to the conversation, she went on “Oh. And what’s thank you in French?”

“Merci.”

“And what do they say for you’re welcome?”

I paused for a moment (but only, it must be said, a moment), reflecting on the fact that I had no idea how to express that concept – my ability with the French language extending little further than ordering coffee and croissants for breakfast – before telling her, with all my fatherly sincerity, “The French have no phrase for that.”

Now yes, I admit it was a cheap knee-bend to Francophone stereotypes, and a ‘Dad joke’ to cover my linguistic ignorance…and it was probably inappropriate for me to let an impressionable child go on believing this for as long as I subsequently did.

But it does introduce an interesting and important concept – our ability to describe something has no bearing on its reality. Even if my statement were true and the French had, through some curious artifact of linguistic heritage, failed to develop a phrase capable of expressing gratitude, it would not change the fact that such feelings could – and do – exist. Language describes reality. It does not – outside of the most extreme hardline views of social constructivism – define it.

Mathematics too is essentially a language – a language, moreover, that we can use to describe the physical reality of the universe. Most of the time. As with the example of spoken language above though, the critical caveat is that however well mathematics describes physical behaviour, again, it does not define it.

Sir Philip Bin, the fictional hero of Mark Evans’ radio comedy ‘Bleak Expectations’, muses wistfully on the days before Sir Isaac Newton ‘invented’ gravity, when people falling from great height would ‘simply drift gently and harmlessly to the ground’.

Such satirical diversions aside, Newtonian mechanics works pretty well in describing the interactions of macroscopic objects under the conditions of our everyday experience. But the gravitational attraction between two bodies doesn’t fall off in inverse proportion to the square of the distance between them because that’s the way the equation is written – rather, the equation seeks to empirically describe the behaviour that occurs.
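For reference, the empirical description in question is Newton’s familiar inverse-square law,

$$ F = \frac{G\,m_1 m_2}{r^2} $$

where $F$ is the attractive force between masses $m_1$ and $m_2$, $r$ is the distance between their centres, and $G$ is the universal gravitational constant. The equation summarises what falling apples and orbiting moons are observed to do – it doesn’t command them to do it.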

As Einstein recognised in his theories of special and general relativity, under certain circumstances – far removed from the world of everyday experience – objects behave in ways that are incompatible with Newtonian physics. In formulating expressions to account for this relativistic behaviour, Einstein did not change the nature of the universe – he simply gave us a new form of language by which to describe the poetry of our existence.

Similarly, the remarkable duality of electrons – whereby they can be shown through physical experiment to possess the characteristics of both a continuous wave function and a discrete physical particle – is only a paradox in the context of the ways in which we have come to describe these sub-atomic features. Fundamentally, the electron is what it is, and if theories are unable to fully account for its behaviour, it is a reflection of the inadequacy of our mathematical approximations for reality, not proof of some cosmic trick set up to titillate a Vegas audience on the quantum scale.

Perhaps the most interesting example of this concept in action, however, is the search for an ultimate physical ‘theory of everything’. The properties of electromagnetism, the strong and weak nuclear forces, and gravity – the fundamental forces that define and control interactions of matter and energy throughout the universe – converge at high energy, and it is theorised that all four derive from a common underlying property. But just what this is remains a point of hard debate, as none of the individual equations that are so successful in describing the behaviour of each of these forces at the macroscopic level of the everyday can adequately cope with the conditions of this theoretical point of convergence.

This does not mean that there are somehow four separate overlapping layers making up the Universe that don’t quite fit together perfectly where they join, like some kind of badly put together set of existential DIY shelves. Rather, the theory runs that there is one reality, where all aspects of the physical behaviour that we observe in the universe must somehow derive from the fundamental character of matter and energy. The failure lies in the mathematical language in our possession – it’s not just that the results are tricky to calculate; our standard mathematics is simply unable to describe reality under those conditions.

The ‘theory of everything’ that can account for the emergence and existence of these separate forces is one of the great challenges at the business end of modern physics where the big kids of theory get serious. Tackling this problem however requires not just a dab hand with a slide rule, but the creation – literally – of entirely new forms of mathematics, incorporating additional physical fields and interactions, and even extra dimensions of space.

For the record, I should confess that I’m not one of those big kids – a real physicist would have stolen my mathematical lunch money and sent me crying for home long before we even got to string theory – which I understand is regarded as one of the more accessible (and promising) of these approaches. As secret shames go, I can appreciate that this is not exactly stupendous, but I’ve been happily married for 16 years and don’t get out to as many wild parties as I used to.

The point is, I’m fine with that. I don’t need to understand the higher order branches of mathematics – the high linguistics of the Physicist’s hymnal – to appreciate the reality and significance of what they are trying to achieve in understanding the nature of reality. I wish them well, and look forward to the day that Google produces a Mathematics-English translator so I can appreciate the beauty of their work.

I’m sure even the French would be grateful for that.

Fairy Tales on Shaky Ground – Scientific understanding and the Italian court system

On the 6th of April, 2009, a 6.3 magnitude earthquake centred 9.46 km beneath the Abruzzo region in central Italy devastated the city of L’Aquila, ripping the historic heart out of the city and killing 309 people. While the physical scars from this tragedy are fading, cultural aftershocks are still rippling through the scientific community, and reached a peak that few expected this week with the conviction of six Italian scientists and a former government official for involuntary manslaughter.

The seven – members of the Italian National Commission for the Forecast and Prevention of Major Risks at the time of the earthquake – were sentenced to 6 years in prison, and ordered to pay court costs and damages amounting to some 7.8 million Euros.

Their conviction was based principally on a number of statements made six days before the damaging tremor, downplaying the likelihood of a major earthquake. Given the manifest impossibility of reliably predicting earthquake occurrence with our present understanding of seismic processes, the legal precedent here is presumably drawn from the Brothers Grimm, with imprisonment the prescribed punishment for failing to spin straw into gold.

Earthquakes are caused by the release of energy as fractures propagate through rocks. They are focused in particular regions where creeping deformation deeper in the Earth leads to the build-up of stress in the more brittle rocks near the surface. Once a fracture has occurred, the crack remains a discontinuity – a weakness – and tends to be a site of further failures in the future.

If you live near such an existing fracture – or fault – the likelihood of experiencing an earthquake increases dramatically. This much we can say with confidence. Certainly, in the long view, no-one can claim to be surprised by the damage wrought on L’Aquila, given the city is built on an ancient lake bed known to provide a geological framework that amplifies the local effects of seismic waves, and has been devastated by earthquakes on no fewer than seven historical occasions, in 1315, 1349, 1452, 1501, 1646, 1703, and 1706.

But when will the next earthquake happen? Now this – the 7.8 million Euro question – is the holy grail of seismic hazard research, and to date, there is no answer. Anybody who tells you differently is probably emailing from Nigeria to offer you ‘the investment opportunity of a lifetime’.

Despite all the work by teams of dedicated and sometimes brilliant researchers over the past century or so, all the collection of data and analyzing of patterns – even for the San Andreas Fault in California, probably the most intensely monitored fault zone in the world – no distinctive and reliable precursor patterns for major earthquakes have ever been recognised.

To hold someone responsible for failing to predict an earthquake on the basis of preceding activity makes all the statistical sense of having your first tip on Melbourne Cup day romp home, doubling down your house on the trifecta in the next race, and then suing your bookie when the horses fail to place.

So how can this miscarriage of justice have occurred? Why weren’t the charges thrown out at the earliest opportunity?

Fundamentally, as I wrote last week (Three Monkeys, Ten Minutes – Scientists and the Importance of Communication Skills – WordPress 18/10/2012):

“Society is complex, and people hold views for all manner of reasons – personal, cultural, logical, or religious, among others. We [as scientists] do not have to share those views, but we do need to appreciate and respect their reality”

When I wrote those words, I hadn’t expected to be confronted by such a glaring (and dark) example of this relationship at work quite so quickly. Scientifically, the question of earthquake prediction doesn’t even get off the ground, but to a broader population untutored in statistics and the language of scientific uncertainty – a population here stung by a great tragedy and searching for someone to blame, a sadly common human trait – the Commission’s statements painted a target on the scientists’ backs.

People are incredibly good at recognising patterns. This, as much as anything, is the key to our astonishing success as a species. Unfortunately, the flip side to this is that we look for – and expect to see – patterns even when they are not there. This makes us very bad at evaluating the true risk of rare events.

James ‘the Amazing’ Randi is conducting a long term demonstration of this phenomenon. Every morning he writes on a card “I, James Randi, will die today”, which he then dates, signs, and keeps in his pocket in the knowledge that it will one day (may it be far in the future) be a fitting final demonstration of how apparent correlation can be manipulated to lead our minds astray.

Richard Feynman related a similar story in his memoir “Surely You’re Joking, Mr. Feynman!” – where he writes of hearing the phone ring in his university dormitory and having a sudden premonition that his grandmother had died. She hadn’t. The phone wasn’t even for Feynman, and his grandmother continued in rude good health for some time to come. The point is, we have such thoughts and intuitions all the time – for the most part they don’t turn out to be correct, but occasionally the fates line up. When they do, the glorious pattern-seeking engines that are our brains get a kick of reinforcing dopamine to say ‘job well done’, and we forget about the 999 previous times it hasn’t worked and start to see a correlation.
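The arithmetic of such coincidences is sobering. Here is a minimal sketch in Python – with every rate an invented, purely illustrative assumption rather than a measured figure – of how many ‘premonition come true’ stories a population can generate by chance alone:

```python
# Base rates of coincidence: with enough people having occasional
# premonitions, some chance 'hits' are statistically guaranteed.
# All numbers below are illustrative assumptions, not measured rates.
P_PREMONITION = 0.01   # chance a given person has a premonition on a given day
P_EVENT = 0.0001       # chance the feared event actually occurs that same day
PEOPLE = 1_000_000     # the population we imagine sitting by the phone
DAYS = 365

expected_hits = PEOPLE * DAYS * P_PREMONITION * P_EVENT
print(f"Expected chance 'hits' per year: {expected_hits:.0f}")  # ~365
```

A million people, and chance alone delivers roughly one uncanny, dopamine-soaked prophecy story every day of the year – no clairvoyance required.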

If you live near a fault line, you will, inevitably, experience earthquakes. Sometimes big ones, often small ones. Sometimes a large one will be preceded by small ones. But usually not. The stochastic patterns – one earthquake here, two the next week, none for six months – have no significance.

There is a real tragedy at L’Aquila, and there are people who should be held to account. But they are not the scientists who gave an accurate representation of the processes at work beneath the town, and the statistical meaninglessness of looking for patterns in the tea leaves of local seismic activity. Rather, the guilty parties – those who should have known better – are the officials and engineers who built structures – schools, gymnasiums, dormitories – in the city that were not designed or constructed to withstand the well known and historically proven earthquake risk.

So, we find ourselves at the end of act 1 in our re-imagining of Rumpelstiltskin – the Miller’s daughter has proven unable to spin her straw into gold and the King is about to imprison her in the highest room of the tallest tower. Perhaps in act 2, Uri Geller will step up to the title role and offer to solve her problems by magic in exchange for first authorship on the resulting scientific paper.

Three monkeys, ten minutes – Scientists and the importance of communication skills

Science and technology have changed almost every aspect of the way we live our lives over the past 100 years, and are at the heart of many major challenges we face today.

Science, however, is nothing more, nor less, than a process by which we seek to understand the forces that shape and control the universe around us, and understanding is not the same as a need (or permission) to act. We can produce fresh water through desalination, or treat waste water (yes, sewage) to the point where it is potable. We can produce genetically modified crops and organisms resistant to disease. We can even engineer changes in the climate system. We know public health would be improved if we banned tobacco. But should we? Do we want to?

How we act on scientific understanding is in essence a social compact between scientists, policy makers, politicians, and the public. At the heart of this is the need for these groups to understand one another.

Now, scientists are not so irredeemably bad at the communication game as some stereotypes (Scott Adams’ acerbic Dilbert put-downs among them) would have us believe. A core part of the scientific process is the need to clearly and persuasively explain ideas to others, and to engage in and foster discussion, testing, and criticism of those ideas. This can take the form of conference and symposium presentations to our peers, tutorial sessions for students, or written papers – but however it takes place, an ability to communicate is a key pillar of the scientist’s skill set.

Stephen Hawking is acknowledged as one of the most significant and influential thinkers in late 20th and early 21st century Physics. His mind is capable of soaring on phenomenal flights of mathematical and scientific creativity beyond the realms of thought commonly occupied by many. Were it not for the technological aids that allow him to communicate electronically, however, many of those beautiful theoretical constructions would have remained locked away inside his progressively failing body. A poignant, albeit extreme, variation on George Berkeley’s philosophical construction “If a tree falls in a forest and no-one is around to hear it, does it still make a sound?”

All successful scientists, certainly, must either become good communicators, or develop symbiotic relationships with colleagues able to support them in that domain. Fundamentally, brilliant ideas are not enough – if you are not able to clearly and persuasively explain your idea, it will go unremarked – Science will not grow and prosper from your contribution.

Where scientists may sometimes fall down in the communication stakes is over-specialisation. We invest so much in developing an ability to discuss and exchange ideas at a high level with our peers that the communication of ideas to non-specialists may become neglected.

Unfortunately, nowhere outside of macroeconomic modelling equations do ‘the general population’ actually behave as perfect, rational beings. Society is complex, and people hold views for all manner of reasons – personal, cultural, logical, or religious, among others. We do not have to share those views, but we do need to appreciate and respect their reality if we are serious about influencing policy decisions. It’s not enough to expect the general populace to accept a paternalistic “trust me, I’m a scientist” as a reason for following your advice. It’s also not among the most successful of pick-up lines.

In the words of Jesse Shore, National President of the Australian Science Communicators:

“…few people base their decision making on just being presented with good science. The communicator’s message must have meaning, be useful and acknowledge the needs, aspirations and concerns of each intended audience.”

It is in this context that a scientist failing to represent their work to the general population becomes significant – a weak link in the nexus required for the hard-won scientific understanding of natural systems to inform the development of meaningful policy.

Ceding the communications role to the existing media system may not always be a helpful substitute either. Conventional reportage is built not around nuance and weighted discussion, but the manufacture and presentation of conflict and controversy – which is doubly harmful for complex issues. If you don’t understand the methods used or the calculations undertaken to reach a scientific conclusion, taking sides in the debate – or basing a serious policy decision on it – would be like listening to a Frenchman and a German arguing in a third language you yourself have no understanding of (those dashed Europeans can be so clever that way), and concluding “I agree with the French guy because his accent sounds sexier.”

This is why the scientific community should appreciate – even treasure – those scientists and writers able to genuinely translate our work, explaining complex ideas and arguments to others without diluting their meaning: Simon Singh, Robert Winston, the late Carl Sagan…maybe not Simon Winchester, who has a nice turn of phrase but, to my thinking, a tendency to undertake off-topic diversions that detract from the flow of thought (for anyone who may have read any of my previous posts, yes, I know – pot, kettle, black). These are our ambassadors – our public face.

This is also why we should welcome and encourage the incorporation of communication skills teaching into science degree programmes. This addition has recently become a core element of the new degree structure at the University of Western Australia where I work – not without some controversy among both staff and prospective students. Personally, I have never needed convincing in regards to the importance of training scientists in this area. Indeed, I would go so far as to say that no-one who has ever marked undergraduate essays from science students could ever query that suggestion.

Beyond my self-serving investment in the idea, however, is a more serious foundation. By training future scientists in the skills and strategies of communication – or at the very least making them aware of the significance of this area – we can work to close this gap and see a better informed discussion of scientific subjects in the broader public sphere.

Increasing the fundamental communication skills of our scientific graduate cohort has additional benefits too. This is about more than just making your ideas sound impressive. Learning to structure an essay, or mastering the rhetoric of a compelling argument, can in themselves make our students better scientists – providing a mental template for the robust logical interpretation of ideas. You can collect all the data you want, and allow your thoughts to roam as wide and soar as high as the limits of infinity, but like an inversion of Heisenberg’s uncertainty principle, it is the act of describing your findings precisely to others that ultimately crystallises them – pinning them to the page and making them real.

So my fellow scientists, let us value and applaud the communicators in our midst and work together towards a future of better informed, relevant debate of scientific ideas within the social landscape. To take the discussion full-circle then – we might not be able to touch Shakespeare, but at the very least, let’s all try to up our Monkey Quotient a few notches.

I wouldn’t start from here Gina: Historical contingency and the anthropic principle

In science we seek to explain and predict natural phenomena, and part of this process is the identification or construction of cyphers – representative examples or ideas through which we can encapsulate and illustrate our arguments. These fill a significant role in making otherwise irreducibly complex systems tractable, helping us get our heads around – and explain to others – multi-dimensional and abstract concepts. At the risk of getting a bit too George Lucas this early in the essay though, in this rhetorical power there is also a dark side – apply a cypher the wrong way and it changes from an illustrative device into a philosophical delusion that can make ridiculous conclusions seem reasonable. So we sometimes need to take a step back and check the fundamental concepts underlying our thinking.

Alongside her vigorous public advocacy for government deregulation and the lowering of taxes for the wealthy, Gina Rinehart (richest person in Australia and an excellent candidate for poster-child should the Federal government decide to move on from giving the tobacco industry a thorough and long overdue kicking and run a ‘money can’t buy you happiness’ campaign as a contribution to the nation’s health) showed her magnanimous side in August of 2012 by offering some free advice to the less fortunate working citizens of Australia struggling to make ends meet in the face of spiraling living costs:

“If you’re jealous of those with more money, don’t just sit there and complain. Do something to make more money yourself – spend less time drinking or smoking and socialising, and more time working.”

Now, whatever you take from that personally (and I’ve got to think there aren’t many people having that particular mantra tattooed across their chests in gothic script as a life changing rule to live by) the interesting thing to realize is that Rinehart’s suggestion probably made absolute sense to her. I say only ‘probably’ as it’s not impossible she’s perfectly aware how offensive her comments were to a large number of people and was just doing it to wind up Wayne Swan, the current Federal Treasurer. In fact, perhaps she and fellow mining tsar Clive Palmer have agreed a billionaire’s wager, much like brothers Randolph and Mortimer Duke in the 1983 movie ‘Trading Places’ and are taking it in turns in a colossal game of ‘top that’ to see who can make Swan rupture an aneurysm first. It would certainly explain a significant number of recent media statements. But I digress – Rinehart’s modern take on ‘let them eat cake’ probably made absolute sense to her because of the perspective she brings to the issue. Mrs Rinehart has, after all, worked hard and made many sacrifices to reach the stratospheric levels of wealth that she today enjoys, and presumably sees that path as the wellspring of her success. The same might not have been achieved, however – the same journey indeed, might have not even been possible – without the support, inspiration, and mentoring (not to mention mining leases) of her visionary and highly successful father, the prospector and – ultimately – iron ore magnate, Lang Hancock.

Rinehart might want to tell herself “I could have done it all on my own without the privilege and wealth I was born into” – and who knows, that might even be right. But the fact of the matter is that, outside of a wacky Disney movie (picture it – the ghost of Steve Jobs (played by Ashton Kutcher, obviously) visits Bill Gates (Kermit the Frog playing against type, to prove his great comeback opposite Jason Segel in last year’s Muppet franchise reboot wasn’t just a lucky break) and offers to create an iGod app to cure all childhood disease in the third world if Bill is prepared to go back in time to live his life again as the son of a struggling but decent single mother (Sandra Bullock). To win the bet Bill has to prove that he can still drop out of college and found the world’s most successful software company without the benefits of an elite private school education and the support and connections of a close-knit and loving family with a lineage of bank directors and successful lawyers stretching pretty much as far back as the Mayflower. I’d pay to see that), that reset will not – can not – happen. Rinehart will never know whether success was possible without the benefits of her upbringing and circumstances, any more than Rick Blaine will ever know if Ilsa and he might actually have lived happily ever after if she’d turned back at the boarding gate and ditched Victor Laszlo and the Czech resistance to stay with him at the end of Casablanca.

That’s the way historical contingency works. It’s not relevant to ask the question ‘how likely is that’, or ‘what else could have happened’, because as soon as something does happen, infinite possibility collapses down to a single reality.

How likely is it that you could pick the Lotto numbers for Tuesday’s draw? It’s a trivial calculation – a 6 in 40 chance that one of your numbers is first out of the barrel, then a 5 in 39 chance of the second number as well, and so on – for an overall probability of about 1 in 3.8 million. Too easy? Okay – how about something more challenging – what’s the likelihood of picking the lotto numbers every week for the next year? About 1 in 10³⁴². That’s a probability so vanishingly small that we have no meaningful way to even conceptualise it – about as close to a definition of ‘statistically impossible’ as you could hope for. To put it into some sort of context, there are calculated to be only about 10⁸⁰ atoms in the visible universe. And yet – if you look back at the historical record, there it is – 6 numbers have come up every week, and you can see that record there in black and white – for a year and more – as far back as you’d like, right back to the first live drawing in Australia in 1972. The impossible has become the certain. And that’s the point – it’s irrelevant to ask the question “how likely is it that this particular sequence could have happened?” – because the simple fact of record is that it did. Probability has no relevance to history.
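For anyone who wants to check my working, here’s a quick sanity check in Python – a sketch assuming a simple 6-from-40 draw and 52 independent weekly draws, and ignoring real-world wrinkles like supplementary numbers:

```python
from math import comb, log10

# Number of possible 6-number tickets in a 6-from-40 draw
single_week = comb(40, 6)                  # 3,838,380
print(f"One week: 1 in {single_week:,}")

# Matching every one of 52 independent weekly draws
exponent = 52 * log10(single_week)
print(f"Every week for a year: 1 in ~10^{exponent:.0f}")  # ~10^342

# For scale: roughly 10^80 atoms in the visible universe
```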

Now, by another seemingly astonishing coincidence (yes, it’s a narrative leap of epic proportions, but stick with me on this one), all aspects of the structure of the Universe and the laws of physics by which it operates fall within a narrow range of values consistent with the existence and operation of life as we know it. The Universe is old enough to have created enough ‘heavy’ elements (everything other than the hydrogen and helium created in the Big Bang) through stellar processes for us carbon-based life forms to exist on a nice rocky planet – but not so old that stars like our Sun have turned into dim white dwarfs incapable of bathing us in sustaining warmth. The cosmological constant is large enough to have stopped the Universe collapsing back in on itself, but not so large that runaway expansion stopped matter from coalescing to form stars…the list goes on.

Some people take this as a cornerstone of a theistic viewpoint (the universe could only be so perfectly attuned to the needs of life if it was designed by a creator). Others advocate a philosophical position known as the Strong Anthropic Principle – which to me has always sounded rather uncomfortably close to a re-hash of the religious approach without the fancy clothes and awe-inspiring architecture – holding that the Universe is in some sense ‘compelled’ to eventually bring forth conscious life.

The obvious fallacy in these ideas though is that, like the ‘impossible’ prediction of an entire year’s lotto numbers, it implies that there is an alternative to the historical record. How likely is it that the universe we live in could be so perfectly suited to the evolution and function of biological life without some guiding hand and/or Universal compulsion directing it? The thing is, of course, that’s not even a valid philosophical construct – because the very fact you’re here to ask that question establishes pretty effectively that the Universe is, indeed, finely suited to your existence. To ask how it could have evolved differently is like the old joke about asking for directions in Ireland/the Countryside/Tasmania/insert-name-of-region-you-wish-to-denigrate-here – the punchline being “Well first of all, I wouldn’t start from here…”

What this drives us towards then is an acceptance of the less extreme Weak Anthropic Principle – which runs along the lines of ‘because intelligent life could only exist under the time and conditions we see around us in the Universe, it shouldn’t really surprise us that, as (nominally) intelligent life forms, those are the conditions we see’. It may be nothing more than a simple logical construction that doesn’t help us to explain or make useful predictions about anything else (and thus cannot be called a theory in any meaningful sense), but at least it has the advantage from my perspective that it doesn’t require us to start from an egocentric belief in the miraculous or predestination.

So thanks, Mrs Rinehart, for helping me to embrace my inner doubts and see that historical contingency means we don’t need to believe in our own specialness. The moral of the story I suppose, should one be desired, is that maybe instead of worrying about our position in the Universe and what it means, we should all be glad – and grateful – for what we’ve inherited.

But what’s it for? An alternative to the monetarist valuation of University education

When an academic starts talking about the University system, you know straight away that there is a risk of the discussion becoming so self-referential a reader would need an endoscope to appreciate the arguments. Having last week laid out a critique of the statistical underpinning of the recent Grattan Institute report on University funding (More Pennies for your thoughts – 28/09/12), however, I find myself compelled to do just that, in the spirit of addressing the essayist’s dilemma of “well, it’s all very well to criticise, but what’s your alternative?”

So if I would pull down the monetarist temple that Andrew Norton, higher education programme director at the Grattan Institute, wants to build on the foundations of our Universities, what would I have us put in its place, and why?

Norton argues that fee increases would not impact on the decisions of individuals to attend University because:

“…a financially-based motivation cannot explain why so many students with good ATARs [scores in Australia’s University entrance examinations] choose humanities and performing arts, which have relatively poor employment and income outcomes.”

I suspect the inference drawn here is correct – University students in Australia, including the more intellectually gifted among them, often choose courses on the basis of personal interest or intellectual curiosity. Personally I hope this long continues…but I don’t think Norton sees things the same way. Rather, he seems to regard it as anathema – clever students with the potential to make money for themselves choosing to do something without an obvious net cash benefit? Like a Vulcan anthropologist on Star Trek, unable to understand this strange propensity of humans to undertake study that is not in some way to their immediate and personal financial advantage, Norton effectively proposes an experiment whereby we massively increase the economic cost to the individual, and then see how many people still behave ‘inefficiently’ in choosing to study on the basis of silly little things like social conscience or a desire to broaden themselves intellectually.

I have a number of intelligent and talented friends who have chosen to attend University later in life for reasons other than economic enrichment. One, after a youth spent traveling and teaching in Asia, chose to study for an education degree so he could apply his acquired wisdom to the inspiration and development of Australian school students. Another, a recent arrival in this country herself, is completing a degree in social work because she wants to help the disadvantaged and disenfranchised in our society. Would we really be better off as a nation if these people were to take courses in business management instead? Or give up their dreams of study to work waiting tables in a cafe?

Entangled with this question of why we undertake University education is the issue of exactly what the benefit is that we take from the experience. Norton argues that the value derived from a University degree arises from the combination of training – extra capacities graduates gain at university that they could not otherwise obtain – and signaling – in the form of a credential that distinguishes them in the labour market.

The concept here of being ‘distinguished’ in the labour market is particularly interesting in the context of drives to increase the uptake of tertiary education in our already relatively educated and well-trained population. It may have been true that a university degree was a significant badge of merit – a ticket, perhaps, to the inner circle of a stratified meritocracy – at the time of Australian Federation, when the 2,652 University students in the country represented around 0.2% of the young adult population. Or even on the eve of World War 2, when 14,000 students accounted for around 1% of the equivalent group. That badge begins to lose its lustre though as the University participation figure nudges 30% in the first decade of the 21st century (on the back of 757,000 students), and the Australian government is targeting 40% degree qualification in the adult workforce by 2015.

If you’re at the bottom end of that 30-40%, it’s unlikely you’re going to see much of a premium on your employment worth from that particular line in your CV. In the comically evil words of Buddy Pine/Syndrome in the Pixar movie ‘The Incredibles’ – “When everyone’s super – no-one will be”.

So what of training? Does university make people smarter? I would say no – it certainly gives an opportunity to apply intellect, but I doubt a graduate is any more gifted in terms of naked intelligence than they were the day they matriculated. Does it provide skills that will be useful in the workplace? Sure – but wouldn’t working for three to five years do that too? Or job-specific training? Seriously, if your primary interest is a desire to develop work skills and business acumen – and I can’t stress this enough – the best path I could recommend would be to get a job, apply yourself with rigor and passion, and seek guidance and advice from figures in the industry who you respect and admire.

Outside of the handful of specific professional University degrees (we’re talking law, medicine and their ilk, and perhaps technical science fields to some extent), I seriously doubt there are many learning experiences at University that could not be replicated – even bettered – through professional experience and mentoring. Major companies routinely employ top graduates as the core talent of their future workforce – and then what? They don’t say “right graduate boy (or girl), off you go to work”. No, they induct their new hires, retrain them, and teach them how to do a job – sometimes over a course of years.

So what am I saying here – is University a waste of time and money we should be steering kids away from? No – on the contrary, I see university education as a huge and life enhancing benefit to the individual that everyone should aspire to undertake at some point in their lives. With my heart on my sleeve I say that this experience should be freely available to anyone on the basis of their intellectual and emotional readiness to benefit from it – but with an economic realist hat on (albeit one tilted at a deliberately subversive angle), if society cannot afford the luxury of free education, the individual should pay, and should value and appreciate the opportunity their money is buying. So far I suspect Norton is nodding his head in agreement and mumbling “that’s what I said” – but where I think our perspectives really start to diverge is that I am not convinced that the benefits derived from University education are financial, and I think to sell it as such is dishonest and degrades the university experience.

“But…but…” stutters my inner Norton, his eyes widening in incredulous disbelief, “If you discount the economic benefits of a degree, what are you left with? Three (or four, or five) years of lost earnings and opportunity for career progression. Why on Earth would anybody choose to do that?”

University education is a sensational opportunity if – and this is the critical caveat – if you want to learn and to invest of yourself (your time, your energy, your intellectual capital) in doing that. It’s an opportunity to interact with and benefit from the wisdom and experience of brilliant and creative people, a chance to get exposed to new and challenging ideas. A chance to think deeply about who you are, who you can be, and how our society operates – what our values are, and what they should be. Yes, I’m approaching this from a biased perspective in that I would happily admit to valuing philosophical novelty and intellectual discourse above a newer car and a designer watch. But as long as I’m open about that perspective – and don’t try to pretend my views are some kind of universal truth you’d be crazy not to appreciate – you can judge for yourself the validity and relevance of my comments.

When it comes down to it, how many graduates in my own field of geology does society ‘need’? The current situation in Australia is somewhat anomalous – with a rampant resource sector almost desperate enough to hire anyone able to differentiate a rock from a hard place – but any reader with the barest grasp of history should appreciate that this white-hot demand is a temporary aberration. Here’s the thing though: I still think more people – as many as possible – should study geology. Fundamentally, I’m not training people to log drillcore, or even to design mineral exploration programmes – although my teaching will place you in an intellectual space where both of these things will be supported – I’m trying to engage people in understanding the processes that shape the Earth.

At the recent quadrennial International Geological Congress meeting in Brisbane, I attended a public forum in which mining and resource ministers from several countries – including our own – were addressing the future resource needs of a growing global population. We shouldn’t lose sight of the fact that decisions – important, far-reaching decisions – are being made by governments all around the world today about this sort of issue – how to balance economic and social needs against environmental quality, traditional land use, and other amenity values. These decisions will affect you, me, our kids – all of us – long into the future. Do you really want to base your views on these issues on what Alan Jones says? Or Tony Abbott? Or Bob Brown for that matter? Seriously, if you want to engage in these debates and hold the decision makers to account, understanding the function of Earth systems should concern you deeply. I want to see an educated population able to, in that finest of Australian political traditions, “keep the bastards honest”.

Therein lies the true benefit of University education to an open and democratic society – producing an intellectually engaged, thoughtful populace ready and able to debate values and issues of significance from myriad perspectives, and contribute the societal wisdom needed to steer our nation, and indeed our planet, through the challenges of the future.