A principal benefit conferred by higher education on power and other engineers, it has been argued, is the ability to work from fundamental principles and to escape the sins of empiricism. However, those who reason from fundamentals can sometimes be as misguided as those who follow rules of thumb.
Take Thomas Malthus. He famously predicted in 1798 that famine and disease would soon prevail unless population growth were curbed (he suggested that people should wed later in their lives to reduce fertility). Malthus made his case on fundamental grounds. He reasoned that population, if uncontrolled, grows exponentially. The means of subsistence, on the other hand, can be increased only arithmetically.
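For readers who like the arithmetic spelt out, a minimal sketch of that claim might run as follows (the symbols P_0, S_0, r and k are my own shorthand for the starting population, the starting food supply and their two growth rates, not figures Malthus supplied):

\[
P(t) = P_0\,(1 + r)^{t}, \qquad S(t) = S_0 + k\,t .
\]

For any positive r, the exponential P(t) must eventually outrun the straight line S(t), however generous S_0 and k may be – which was the whole of his fundamental case.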
Move on a century. A hundred years ago the pioneers of aviation put engine power into kites and gliders, and the resulting aeroplanes became potent weapons in the Great War of 1914–1918. Fundamentalists among their designers argued, though, that these one- or two-seat heavier-than-air craft could not be the precursors of truly large flying machines, because structural weight would rise as the cube of their linear dimensions, whereas the lift available from wings and other surfaces could increase only as the square.
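A back-of-envelope version of that square-cube argument, with L standing for a characteristic linear dimension of the aircraft (my notation, and a deliberately crude model):

\[
W \;\propto\; L^{3}, \qquad \text{Lift} \;\propto\; L^{2}, \qquad \Rightarrow \qquad \frac{W}{\text{Lift}} \;\propto\; L ,
\]

so each scaling-up appeared to load every square metre of wing more heavily – a difficulty that better engines, lighter and stronger structures and higher flight speeds have in practice pushed far beyond anything those critics foresaw.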
A mere decade and a half ago the seers of semiconductor chipmaking technology thought that a fundamental fact – the wavelength of visible light – dictated the absolute limit of smallness to which their lithographic equipment could be made to go. But they have been confounded by manufacturers’ recourse to deep ultraviolet light and advanced optics. The law that the fundamentalists had perhaps most neglected was an empirical one. In 1965 Gordon Moore (a co-founder of the integrated circuit maker, Intel) observed that the complexity of semiconductor components had doubled annually for six years, and he dared to predict that this would remain the doubling time for at least another ten years.
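Stated as a formula (my own phrasing of it, with N_0 the component count at some starting date and T_d the doubling time), Moore's observation is simply

\[
N(t) = N_0 \cdot 2^{\,t/T_d},
\]

with T_d of about one year in that 1965 projection. The empirical content lies entirely in T_d: nothing fundamental fixes it.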
With only a tiny droop, Moore's Law has remained valid to this day (the wavelength of visible light was merely one of the supposedly insurmountable physical barriers overcome in practice), and Intel Corporation is using a slightly modified form of the law to predict the rise of integrated circuit complexity into the next decade.
I guess that most doomsters' predictions about energy and power, however fundamentally rationalised, will prove mistaken. In saying so I may be committing the sin of empiricism, but even the small selection of instances I have given here may help you (and any academic readers I may conceivably have) to forgive me.
The wheel was great, too
As I have boasted already on this page, one of the top directors of a prominent automation and process control company once assured me that the western world’s greatest contribution to civilisation was the invention of double-entry book-keeping. I did not expect my disclosure to be followed by an avalanche of competitive bids from readers, and it was not. But one kind soul has sent me a pertinent clipping from the weekly science section of The Guardian, a British newspaper claiming that its commitment to science dwarfs any other daily’s in the UK.
The clipping carries an interesting assertion by columnist Dr Ben Goldacre. ‘Science’, he writes, ‘is the optimum belief system: because we have the error bar, the greatest invention of mankind, a pictorial representation of the glorious undogmatic uncertainty in our results, which science is happy to confront and work with. Show me a politician’s speech, or a religious text, or a news article, with an error bar next to it?’
Or, Dr Goldacre might have added, an accountant's double-entry books – so says my reader.
But could error bars have stopped scientists – and their cousins, engineers, and their sometime sponsors, politicians – from collusively launching lunar expeditions, Concorde development projects and fast breeder reactor programmes? And would error-bar embargoes have been right and proper anyway? What has been learned from those great ventures, despite their disappointing commercial outcomes so far, might yet prove hugely beneficial in the future, vindicating their proponents at last.
I guess humankind’s greatest invention has not been the error bar, any more than has double-entry book-keeping. Perhaps, after all, that brainwave was sliced bread.*
* A once common anglophone irony, in communities whose shops sold machine-cut loaves, was to call a product ‘the greatest invention since sliced bread’. – Editor.