Press, Principia, Steam, Germs, Quanta
Five Breakthroughs That Changed Humanity This Millennium
A good example of a simple technology with profound historical consequences is hay.
- Freeman Dyson
by Laurent Belsie
Boston - If one could travel back to AD 1000 and ask people to predict the technologies that would change the world, they might have guessed a few of them: better crops, a machine to replace horses, a means to fly. But they would almost certainly not have foreseen how thoroughly science and technology would transform this millennium. The age runs so thick with breakthroughs that even with hindsight, picking the top scientific and technological discoveries remains daunting. This period built and rebuilt itself with inventions like the clock and the computer, the car and the mouldboard plough. Its scientists discovered evolution and DNA. How does one choose?
Nevertheless, certain events and people stand out, science historians say, because they cleared away the fog and transformed the way societies work and think. Here is our list of five breakthroughs that changed the world.
1455 - Germany: First Printing Press Using Movable Type
Johannes Gutenberg's invention spawned the first mass media, boosted the importance of literacy, inspired the Reformation of Martin Luther and John Calvin, and made ideas far more accessible. Within a half century, some 35,000 separate editions of books - perhaps 15 million copies - had been published.
"Without a Gutenberg, a Jefferson would have never read a John Locke," says Richard Berendzen, a science historian at American University in Washington, DC. "And if Jefferson had never read John Locke and other philosophers, would he have written the great document that he did?"
Today, we take the spread of books and ideas for granted; back then, it was astonishing.
1687 - Britain: Publication of Newton's Principia
Arguably the single most important scientific breakthrough of the millennium, Isaac Newton's masterwork explained the world as it had never been explained before. Instead of divine fiat, natural laws ran the world. This was the scientific revolution. Copernicus, who theorised that planets moved around the sun, inaugurated it; Galileo ran the first real scientific experiments (sometimes incorrectly) to prove it. What Newton did was synthesise their ideas into simple laws that were fully testable.
"He was able to put the pieces together so neatly that it proved to people that this new science worked," says Pamela Mack, a professor of history at Clemson University in Clemson, SC. "Suddenly, science got a whole lot more respect." (However, she ranks Galileo above Newton because he created the scientific experimental method.)
Either way, the scientific revolution changed the way humans looked at the universe and themselves. And authorities, such as the church, that tried to dictate scientific rules lost credibility.
1712 - Britain: Invention of the Steam Engine
Thomas Newcomen designed the first practical steam engine, the machine that would power the Industrial Revolution. It drew people from farmland to factories and pulled the economy from its agrarian roots into a whole new world. For the first time, power didn't depend on a river; it could be taken anywhere. Although Newcomen's engine was far less powerful than James Watt's version 50 years later, it was used successfully to pump water out of mines, and it presaged the arrival of the steam locomotive, the riverboat, and, ultimately, electrical and internal-combustion engines. As a result of the Industrial Revolution, populations grew; so did material wealth. The West became more recognisably modern. Pollution increased.
Mid-19th century - France: Development of the Germ Theory of Disease
By studying fermentation, Louis Pasteur identified microorganisms in the air as the culprit behind food spoilage and some diseases. Through his experiments he saved the French wine and silk industries, developed pasteurisation, and originated the use of vaccines to prevent disease. Many historians consider him the father of modern medicine.
"It was through the work of Pasteur that more people's lives were saved than by any single other person," says Berendzen. His work represented the first steps in preventive medicine, in which doctors act to keep patients from getting sick rather than treating illness after it appears. "Up until that time, it was largely a matter of, 'We've got a crisis, let's deal with it.'"
1905-1916 - Germany: Theories of Relativity and the Quantum Properties of Light
When Albert Einstein published his theories of relativity and the quantum properties of light, he revolutionised the world's view of itself. General relativity demonstrated that the philosopher's standbys - space and time - were not as fixed as Newton's theories suggested. Space could be curved, and time could differ from point to point. The mass-energy relation of Einstein's special theory of relativity (E=mc²) suggested that a small amount of matter could yield tremendous energy - a finding other scientists used to design the atomic bomb.
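To give a sense of the scale the relation implies, here is a back-of-the-envelope illustration (the numbers are ours, not from the original article). Converting a single gram of matter entirely into energy gives:

```latex
E = mc^2
  = (10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^2
  = 9 \times 10^{13}\,\mathrm{J}
```

That is roughly 21 kilotons of TNT, on the order of an early atomic bomb, from a mass smaller than a paperclip.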
Einstein's early work on how light moves in little bundles (quanta) anticipated other scientists' work on quantum mechanics. These principles have led to the development of lasers, the transistor, and the semiconductor, crucial building blocks for the modern electronic age. Einstein's theories predicted the presence of black holes and neutron stars, and suggested ways that astronomers could look for them.
Nuclear weapons have had the greatest impact. For 40 years, they kept the superpowers from launching an all-out war. But their spread into more national arsenals gives mankind pause because, for the first time, man-made weapons could annihilate life on earth. It's a powerful reminder that breakthroughs in science and technology don't always have positive results.
The rise of science parallels the rise of Western civilisation. And in ways that historians still don't completely understand, the two have been inextricably linked.
Source: NandoTimes 28 Oct 1999 © Nando Media and Christian Science Monitor Service
Another Person's View
by Steven Denbeste
In my opinion, the four most important inventions in human history are spoken language, writing, movable type printing and digital electronic information processing (computers and networks). Each represented a massive improvement in our ability to distribute information and to preserve it for later use, and this is the foundation of all other human knowledge activities. There are many other inventions which can be cited as being important (agriculture, boats, metal, money, ceramic pottery, postmodernist literary theory) but those have had less pervasive overall effects.
It's arguable that the single most important specific invention in history was paper, and that its lone inventor (by tradition, Ts'ai Lun in AD 105) may have had more direct effect on the course of human history than any other single person. It came from China and appears to have been invented only once. Everyone else learned how to make paper from that original source, and in a real sense our modern society is built out of paper. Movable type printing could never have become important without it; other substrates for books were too expensive and couldn't be produced in sufficient quantity.
Before the era of rapid and easy world communication, many major inventions happened multiple times. It's difficult to say whether spoken language developed more than once, since it emerged so far back that there's no way to know; such evidence as there is suggests at least three origins of major language groups, though that could represent an extremely early branching rather than true independent invention.
However, writing was developed independently between three and five times (in China, Central America, and one to three times in the area of south-central Asia and north-east Africa, a number subject to considerable debate). Canines may have been domesticated independently as many as five times, but certainly more than once. (At the very least, it's clear that African Wild Dogs and Wolves were both domesticated independently.) Agriculture and the development of domesticated plants seem to have arisen independently all over the world wherever conditions permitted.
Printing as such was invented several times (movable type printing was the most important, but wood-block printing was used in China and in Europe much earlier). The technology of refining metals from ores definitely arose many times. That one seems to be "easy", and copper was usually the first. Stones with high copper content are easily identified by their intense green or blue colour, and if you put such stones in a fire it makes the flames change colour too (especially if you've ground the stones up first, which is easy to do because they're soft), so it may have been something that early shamans did to impress the crowd. But when the fire went out, eventually someone would notice the pretty shiny blob that had formed in the fire pit, and it didn't take long to figure out where it came from. Since copper is ductile, it could be cold-worked into useful shapes, and unlike gold it's sufficiently hard to be useful. It's also easily melted over wood fires. (Gold could often be found in metallic form, but though it was pretty and considered valuable, it was too soft to be used for tools.)
The bow and spear were invented many times. The idea of using wooden vessels to cross water has certainly occurred in many places. But paper happened only once, and for an invention as important and influential as paper, appearing as early as it did, that's quite remarkable.
With regard to my big four, what we see from history very clearly is that every major advance which makes transportation and communication easier, cheaper, and more efficient has always had an explosive effect on cultural progress and change, which makes those kinds of basic advances far more influential than any others. That's why such things as the Roman roads, "modern" sailing ships (that is, at least 100 tons, with lateen sails, sternpost rudders and compasses, capable of sailing for extended periods out of sight of land), transportation canals in Europe, the steamship, the railroad, the electric telegraph, the camera, the telephone, radio and aircraft were also valuable, but their effects are nothing like as great as those of the big four. Speech, writing and printing each ended up explosively revolutionising human life and drastically affected everyone in every place they've been used. That's because each one radically improved our ability to spread useful ideas in space and time. The body of human knowledge is the result of a huge collaboration, and each of these four has represented a substantial stairstep in improving that process of collaboration. Communication is the foundation of everything else we do.
It's not completely clear that it's correct to refer to speech as an "invention", given that it appears to be in part the result of evolution and may well have taken tens of thousands of years to truly develop. We have sections of our brains adapted to deal with speech, and there are other adaptations in our throats and tongues which may have been necessary to make speech easy and flexible.
But I'm erring on the side of caution here, so I'll include speech in the list of great inventions, because in every other way it is the same as the other three. Absent the ability to speak, our ability as a species to organise ourselves and to work collectively would have been greatly diminished, and given how small and weak our ancestors were, their greatest strength was numbers. It's true that some levels of cooperation do not require language (such as the way lionesses hunt) but with language, human tribes became a form of super-organism that could do things no other species had ever done to our knowledge. Speech allows close cooperation to accomplish very complex goals, and the better able we are to express ourselves verbally the more complex goals we can achieve. Speech was the ultimate "force multiplier"; it's the foundation of our civilisation and what has made us successful.
That may even have been one of the advantages the Cro-Magnons had over the Neanderthals when the Cro-Magnons boiled out of Africa between fifty and a hundred thousand years ago. There's reason to believe that the Neanderthals didn't have the same ability to speak (through control over the organs involved) as the Cro-Magnons (that is, as modern humans, for we are essentially unchanged from those who left Africa in that second hominid invasion of the world, differing only in minor details), which would mean that though the Neanderthal were individually larger and stronger, they may not have had as great an ability to organise and thus collectively would not have been able to compete.
Speech gave us the ability to communicate efficiently in small groups, a dramatic advantage. But before writing, the only way to pass information down from generation to generation was as oral tradition, and that's both prone to cumulative distortion and very limited in the total amount of information which can be preserved. The largest and richest known oral traditions could easily fit on one bookshelf or one CD. Writing broke that wide open, both because the amount of information which could be preserved was far less limited and because the transmission process down through time was much less subject to distortion. Writing expanded the time scope of information transmission.
But in terms of spreading it around in a given time and place, it was still severely limited by the fact that each copy had to be produced, painstakingly, by hand on stone or clay tablets or parchment (processed animal skins) or papyrus. When the Romans "published" a book, they went to a place where dozens or hundreds of literate slaves would create copies one at a time with pens. Movable type printing was a huge improvement in the ability to make and distribute vast numbers of copies of any piece of information, and it drastically improved the ability to spread information vast distances rapidly.
It also was a critical crossover point. For the first time in history, it actually required less work to produce and distribute a copy of something than to find and destroy it. Movable type printing made widespread dissent practical.
On a technological level, it gave the edge to spreaders of information over those who tried to censor it. The spread of "dangerous" information was slow and inefficient when it happened either orally or via hand-produced written copies, and the process of suppression could largely keep up. But when a small shop with a handful of employees could produce hundreds of copies of something in a week, and any printer elsewhere who got one copy could produce further hundreds, it became impossible to prevent the broad spread of political and religious dissent. Luther was a child of the printing press; he would be no more than a footnote in history were it not for the fact that his ideas spread through Europe like wildfire, in the form of thousands of copies produced by printers everywhere. The great intellectual revolution in Europe of the 15th to 17th centuries was solidly built on top of millions of books and pamphlets. And ever since, "dangerous" knowledge has become increasingly difficult for those most threatened by it to suppress totally, even when they were in power.
And we're at the beginning of the fourth explosion which computers and networks will bring about. When knowledge could only spread by speech, it might take a thousand years for a good idea to cross the planet and begin to make a difference. With writing it could take a couple of centuries. With printing it could happen in 50 years. With computer networks, it can happen in a week if not less. After I've posted this article to a server in San Diego, it will be read by someone on the far side of a major ocean within minutes. That's a radical change in capability; a sufficient difference in degree to represent a difference in kind. It means that people all over the world can participate in debate about critical subjects with each other in real time.
With speech, the collaborative process of creating knowledge expanded from the person to the tribe. With writing, it spread to the level of city-states. With printing, it encompassed nations and even continents. With computer networking, everyone in the world is involved whether they like it or not. There's nowhere left to hide.
We're already seeing some of the political, technological and cultural effects of the Internet, and this is just a start. What this means is that drastic cultural shakeout cannot be avoided. The next fifty years are going to be a very interesting time as the Internet truly creates the Global Village.
Source: denbeste.nu This weblog is called USS Clueless - it is anything but clueless - a very erudite site. If you've not visited, you should...
An article in the journal Nature suggested that fertiliser may have been the greatest breakthrough of all for mankind. (I'd always heard that if life hands you a pile of shit, you should make fertiliser. I didn't know I was being given the biggest pearl of wisdom on philosophy's necklace...)
The Next Three Suggestions Are from Participants of the Third Culture (The Edge)
The most important invention in the past two thousand years is anaesthesia.
Have you ever had surgery? If so, either a) part of your body was temporarily "deadened" by "local" anaesthesia, or b) you "went to sleep" under general anaesthesia. Can you imagine having surgery, or needing surgery, or even possibly needing surgery without the prospect of anaesthesia? And beyond the agony-sparing factor is an added benefit - understanding the mechanism of anaesthesia is our best path to understanding consciousness.
Anaesthesia grew from humble beginnings. Inca shamans performing trephinations (drilling holes in patients' skulls to let out evil humours) chewed coca leaves and spat into the wound, effecting local anaesthesia. The systemic effects of cocaine were studied by Sigmund Freud, but cocaine's use as a local anaesthetic in surgery is credited to the Austrian ophthalmologist Karl Koller, who in 1884 used liquid cocaine to temporarily numb the eye. Since then dozens of local anaesthetic compounds have been developed and utilised in liquid solution to temporarily block nerve conduction in peripheral nerves and/or the spinal cord. The local anaesthetic molecules bind specifically to sodium channel proteins in the axonal membranes of neurons near the injection site, with essentially no effect on the brain. General anaesthetic molecules, on the other hand, are gases which do act on the brain, in a remarkable fashion - the phenomenon of consciousness is erased completely while other brain activities continue.
General anaesthesia by inhalation developed in the 1840s, involving two gases used previously as intoxicants. The soporific effects of diethyl ether ("sweet vitriol") had been known since the 14th century, and nitrous oxide ("laughing gas") was synthesised by Joseph Priestley in 1772. In 1842 Crawford Long, a Georgia physician with apparent personal knowledge of "ether frolics", successfully administered diethyl ether to James M Venable for the removal of a neck tumour. However, Long's success was not widely recognised, and it fell to the dentist Horace Wells to publicly demonstrate the use of inhaled nitrous oxide for tooth extraction at the Massachusetts General Hospital in 1844. Although Wells had apparently used the technique previously with complete success, during the public demonstration the gas-containing bag was removed too soon and the patient cried out in pain. Wells was denounced as a fake; however, two years later, in 1846, another dentist, William T G Morton, returned to the "Mass General" and successfully used diethyl ether on the patient Edward Gilbert Abbott. Morton used the term "letheon" for his then-secret gas, but was persuaded by the Boston physician and anatomist Oliver Wendell Holmes (father of the Supreme Court Justice) to use the term anaesthesia.
Although its use became increasingly popular, general anaesthesia remained an inexact art, with frequent deaths due to overdose and effects on breathing, until after World War II. Hard lessons were learned following the attack on Pearl Harbor - anaesthetic doses easily tolerated by healthy patients had tragic consequences for those in shock from blood loss. The advent of the endotracheal tube (allowing easy inhalation/exhalation and protecting the lungs from stomach contents), anaesthesia gas machines, safer anaesthetic drugs and direct monitoring of the heart, lungs, kidneys and other organ systems has made modern anaesthesia extremely safe. However, one mystery remains: exactly how do anaesthetic gases work? The answer may well illuminate the grand mystery of consciousness.
Inhaled anaesthetic gas molecules travel through the lungs and blood to the brain. Barely soluble in water/blood, anaesthetics are highly soluble in a particular lipid-like environment akin to olive oil. It turns out the brain is loaded with such stuff, both in lipid membranes and in tiny water-free ("hydrophobic") lipid-like pockets within certain brain proteins. To make a long story short, Nicholas Franks and William Lieb at Imperial College in London showed in a series of articles in the 1980s that anaesthetics act primarily in these tiny hydrophobic pockets in several types of brain proteins. The anaesthetic binding is extremely weak and the pockets are only 1/50 of each protein's volume, so it's unclear why such seemingly minimal interactions should have significant effects. Franks and Lieb suggested that the mere presence of one anaesthetic molecule per pocket prevents the protein from changing shape to do its job. However, subsequent evidence showed that certain other gas molecules could occupy the same pockets and not cause anaesthesia (and in fact cause excitation or convulsions). Anaesthetic molecules just "being there" can't account for anaesthesia. Some natural process critical to consciousness, and perturbed by anaesthetics, must be happening in the pockets. What could that be?
Anaesthetic gases dissolve in hydrophobic pockets by extremely weak quantum mechanical forces known as London dispersion forces. The weak binding accounts for easy reversibility - as the anaesthetic gas flow is turned off, concentrations drop in the breathing circuit and blood, anaesthetic molecules are gently sucked out of the pockets and the patient wakes up. Weak but influential quantum London forces also occur in the hydrophobic pockets in the absence of anaesthetics and govern normal protein movement and shape. A logical conclusion is that anaesthetics perturb normally occurring quantum effects in the hydrophobic pockets of brain proteins.
The quantum nature of the critical effects of anaesthesia may be a significant clue. Several current theories of consciousness propose systemic quantum states in the brain, and as consciousness has historically been described in terms of each era's most advanced form of information processing, the advent of quantum computers will inevitably cast the mind as a quantum process. The mechanism of anaesthesia suggests such a comparison will be more than mere metaphor.
Stuart Hameroff MD is Professor in the Departments of Anaesthesiology and Psychology, University of Arizona. He is co-editor of Toward a Science of Consciousness: The First Tucson Discussions and Debates and Toward a Science of Consciousness II: The Second Tucson Discussions and Debates.
The telescope resolves light from very far away. The spectroscope analyses and diagnoses it. It is through spectroscopy that we know what the stars are made of. The spectroscope shows us that the universe is expanding and the galaxies receding; that time had a beginning, and when; that other stars are like the sun in having planets where life might evolve.
In 1835, Auguste Comte, the French philosopher and founder of sociology, declared that humanity could never hope to learn what the stars are made of.
Even as he wrote, the Fraunhofer lines had been discovered: those exquisitely fine barcodes precisely positioned across the spectrum; those telltale fingerprints of the elements. The spectroscopic barcodes enable us to do a chemical analysis of a distant star when, paradoxically (because it is so much closer), we cannot do the same for the moon — its light is all reflected sunlight and its barcodes those of the sun. The Hubble red shift, majestic signature of the expanding universe and the hot birth of time, is calibrated by the same Fraunhofer barcodes. Rhythmic recedings and approachings by stars, which betray the presence of planets, are detected by the spectroscope as oscillating red and blue shifts. The spectroscopic discovery that other stars have planets makes it much more likely that there is life elsewhere in the universe.
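The velocity measurements behind those oscillating red and blue shifts rest on the Doppler relation for spectral lines. As a standard textbook illustration (the numbers below are hypothetical, not from the article), a line with rest wavelength λ₀ observed at wavelength λ gives, for speeds well below that of light,

```latex
z = \frac{\lambda - \lambda_0}{\lambda_0} \approx \frac{v}{c}
```

So a hydrogen line with λ₀ = 656.3 nm observed at 656.4 nm gives z ≈ 1.5 × 10⁻⁴, a recession speed of about 46 km/s; a star wobbling under a planet's pull shows that shift oscillating with the orbital period.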
For me, the spectroscope has a poetic significance. Romantic poets saw the rainbow as a symbol of pure beauty, which could only be spoiled by scientific understanding. This thought famously prompted Keats in 1817 to toast "Newton's health and confusion to mathematics", and in 1820 inspired the well-known lines of Lamia in which philosophy is accused of unweaving the rainbow.
Humanity's eyes have now been widened to see that the rainbow of visible light is only an infinitesimal slice of the full electromagnetic spectrum. Spectroscopy is unweaving the rainbow on a grand scale. If Keats had known what Newton's unweaving would lead to — the expansion of our human vision, inspired by the expanding universe — he could not have drunk that toast.
Richard Dawkins is an evolutionary biologist and the Charles Simonyi Professor For The Understanding Of Science at Oxford University; Fellow of New College; author of The Selfish Gene, The Extended Phenotype, The Blind Watchmaker, River out of Eden (Science Masters Series), Climbing Mount Improbable, and the recently published Unweaving the Rainbow.
My vote for "The Most Important Invention In the Past Two Thousand Years" is Gödel's Incompleteness Theorem. This single piece of mathematical jujitsu, proving unprovability, formally ended the strain of Western thought begun by Socrates and first fully fleshed out by Aristotle. The ancillary effects of that theorem — a rejection of master narratives, an understanding that we will never know all the answers, an acceptance of contradiction, and an embrace of complexity — are just now making themselves felt in the dawn of the post-complete world.
Clay Shirky is Professor of New Media, Department of Film & Media, Hunter College.
Too Hot to Handle
In April 1993, the defence magazine Jane's International Defence Review announced the discovery, by the British amateur inventor Maurice Ward, of a thin plastic coating able to withstand temperatures of 2,700°C. The reason a defence magazine was the first to publish news of this revolutionary invention is that the coating is so resistant to heat that it can make tanks, ships and aircraft impervious to the effects of nuclear weapons at quite close range - and hence it is of great interest to the military mind.
A little later that year, the whole nation had an opportunity to see the effectiveness of Maurice Ward's new paint for themselves when it was featured on BBC TV's Tomorrow's World. Presenter Michael Rodd showed viewers an ordinary chicken's egg that had been painted with the new coating; the paint was so thin it was not visible. Rodd then dramatically donned a welder's visor and gauntlets, lit an oxyacetylene torch, and played the flame directly onto the egg for several minutes. When he removed the flame and cracked the egg on the table top, viewers could see that the coating was so heat resistant that the egg was still raw and had not even begun to cook.
This invention, a simple paint that can render anything impervious to very high temperatures, has been the holy grail of chemical research for more than 50 years. Teams of scientists in the world's greatest industrial and defence laboratories have poured billions of pounds and hundreds of man-years into the search for such a substance - a quest which made Ward's discovery even more extraordinary. Ward's invention is remarkable enough, but the story of how he came to make it, and the resistance he encountered in getting anyone to believe him, is even more remarkable.
Maurice Ward comes from Blackburn and has no professional scientific background. The closest he has come to the chemical industry was when, as a young man, he drove a fork lift truck in the warehouse of ICI. For the past two decades, he has earned a living as a ladies' hairdresser. Part of his income was derived from selling his customers hair preparations such as shampoo, conditioner and hairspray. To maximise his income he rented a small workshop, bought standard chemicals and mixed and bottled his own brand hair products.
In the best traditions of Ealing comedy, it was while playing around mixing chemicals in his "skunk works" that Ward stumbled on the formula that had eluded the finest minds in chemical research. Realising at once the value of his invention, Ward wrote to Britain's major chemical companies, offering to demonstrate his material to them. Every one sent him the standard brush-off letter they send to cranks and crackpots. After the Tomorrow's World demonstration, Ward stopped getting the brush-off and started getting offers instead.
One consequence of his contacts with chemical companies was that the head of research at ICI's paint laboratory left the firm and went into partnership with Ward to exploit the discovery commercially. Another interesting consequence is that the large corporations that had rejected his initial approaches in such a knee-jerk fashion conducted internal inquests to find out what had gone wrong, both with their own research and with their dealings with the outside world. On the face of it, it was perfectly understandable that Ward's claims should be ignored, since he was merely an amateur with no scientific training and no track record in research. But ICI's own paints laboratory held an internal audit, and what it found puts that reasoning in an entirely different light. The audit showed that the most scientifically qualified of its research chemists had contributed to the fewest patents, and the fewer scientific qualifications the staff possessed, the greater the number of patents they had contributed to. In the most striking case of all, the person who had contributed to the most of ICI's patents had no scientific qualifications at all.
It seems that Maurice Ward's greatest strength as a researcher was that he had not been taught how to think.
Tech '54, Where Are You?
by Larry Smith
The first commercial microwave oven hit the market in 1947, but like many early versions of now-ubiquitous items, it was gigantic (5½ feet tall, 750 pounds) and expensive (up to $5,000). In the late '60s, prices dropped, and by 1976 microwaves were more popular than dishwashers.
Bradbury typed one of his best-known works, Fahrenheit 451 (published in 1953), in the basement of UCLA's library, using a pocketful of dimes to power the library's pay-as-you-go typewriters.
Kerouac wrote On the Road in 21 days on a continuous scroll of typewritten pages he taped together. He and his cohorts thrived in an underground artists' movement during the height of 1950s consumerism. Kerouac labelled the group, which included Allen Ginsberg and Lawrence Ferlinghetti, the Beat Generation - a reference to their goal of achieving beatitude, a state of utmost bliss.
Source: popsci.com June 2004
So Much More to Know …
From the nature of the cosmos to the nature of societies, the following 50 (they had 100, but I cut it down to the half I liked the best) questions span the sciences.