Thursday, 11 July 2013

12 Most Important Trends in Science Over the Past 30 Years


DISCOVER was founded just over three decades ago, with the goal of bringing science's amazing discoveries to any reader curious enough to want to find out about them. Since then, every field of science has taken big strides. Here we take a look back at the most significant new ideas, discoveries, and inventions of DISCOVER's three decades. (Also see the pieces on where science and technology will go over the next 30 years and 30 ways the world could end.)
Below is a list of the 12 biggest trends in science, with explanations written by some of the magazine's best contributors.
Worlds Unveiled
Titan exposed: Three infrared snapshots from the Cassini probe show strange surface markings, including possible ice volcanoes.

NASA/JPL/University of Arizona
Back when the first issue of DISCOVER hit the newsstands, the solar system was a sleepy place. We had nine planets and dozens of moons, but they seemed like inert, dead places—to the extent that we knew them at all. No action, no change.
Today the picture couldn’t be more different. NASA probes have found evidence of geological activity on at least three planetary bodies: Jupiter’s hellish moon Io, Saturn’s Enceladus, and Neptune’s Triton. The Mars Reconnaissance Orbiter has documented landslides and dust devils on the Martian surface. Observers around the world saw Jupiter whacked by impacts on three occasions, including the dramatic multiple beating it took in 1994 by the comet Shoemaker-Levy 9.
And, perhaps most amazingly, scientists are seriously considering the presence of life elsewhere in the solar system. Mars still ranks high on the list of places to look. But another of Jupiter’s satellites, icy Europa, is a contender, as are Enceladus and the largest of Saturn’s moons, Titan. The last of these even has liquid methane lakes that expand and shrink with the seasons.
Many people think that Apollo represented the glory days of America’s space program, but you’d have to qualify that with the adjective manned. The unmanned program is having its heyday right now. NASA has spacecraft orbiting Saturn, Mars, the moon, and the sun and will soon have one around Mercury; the European Space Agency (ESA) has probes around the sun and Venus. ESA has a comet in its sights as well: The Rosetta mission will touch down on one in 2014. In counterpoint, NASA’s Dawn spacecraft will orbit the asteroids Ceres and Vesta later this decade. All eight planets have now been seen up close. If you’re a Pluto fan, you’ll have to wait just a few more years for New Horizons’ 2015 flyby.
Whole libraries could be filled with the wonders gleaned from the robotic exploration of the solar system, and we’ve only scratched the surface. When humans set foot once again on the moon—or for the first time on Mars—what they find will surely be enriched by all that our robotic messengers have already told us.
Dinosaurs Live On
Thirty years ago, every schoolchild knew exactly what dinosaurs were like: tail-dragging, cold-blooded, lumbering lizards, dumb as a box of sedimentary rocks. The handful of new species discovered each year were mere scientific curiosities, more fodder for the kids’ books.
The last 30 years have turned all that on its head. Dinosaurs were hardly destined for extinction, it seems, and in a real sense they never went extinct at all. Each new dinosaur discovery seems to tell us that the megabeasts of yestermillennia are more closely related to the partridge than to the Komodo dragon. Like birds, they ran in groups, cared for their young, and were probably warm-blooded. And their tails? Up like tail feathers, mostly horizontal and off the ground.

The idea that dinosaurs still walk among us—fly, really—was put forward by John Ostrom of Yale University in the 1970s, but it stayed an argument until the ever-growing mountain of evidence buried almost all controversy. In 1993 in the Gobi desert, remains of a Citipati (an ostrichlike dinosaur) were found brooding a nest of eggs. Skeletal studies revealed hundreds of similarities between modern birds and even the largest of dinosaurs: three-fingered hands, ankles up off the ground, holes in hip sockets. Then in 1995 diggers started uncovering bits of Sinosauropteryx and other feathered dinosaurs from the remarkable Liaoning province of China, culminating in 2000 with the discovery of an intact feather-covered dromaeosaur, a dinosaur with many of the physical traits of a bird. “It’s nailed shut,” says Jack Horner, curator of paleontology at Montana State University’s Museum of the Rockies. “Someone might say, ‘That’s wacky—look at the difference between a robin and a T. rex.’ There are lots of differences, yes, but you can only use similarities to determine relatedness. That’s a paradigm that shifted.”
As for the disappearance of the terrestrial ancestors of our birds, that part of the puzzle remains unsolved. Almost simultaneously with the birth of DISCOVER, physicist and Nobel laureate Luis Alvarez (along with his son, Walter Alvarez of the University of California, Berkeley) proposed that a giant asteroid impact killed off the dinosaurs by causing global firestorms and dust clouds that blotted out the sun. “Previously, hypotheses of extinction had centered on slow declines, competition from mammals, plant chemistry, and so on,” says Darren Naish, a paleontologist at the University of Portsmouth in England. Thirty years later everyone agrees that a large asteroid did indeed hit the Yucatán peninsula 65 million years ago, just around the time of the dinosaurs’ demise, but the consequences are not as clear-cut as many scientists assumed a decade ago. Perhaps it merely finished off the last of the earthbound dinos after a long decline in diversity. Says Horner: “I would say that it certainly might have.”
Cells Become Healing Tools
Medicine has seen its equivalent of the splitting of the atom: the ability to take apart, reassemble, and release the incredible power locked in genes and cells. The breakthrough has resulted from the merger of interlocking fields—gene therapy and cell therapy—which are now spawning near-miraculous treatments and cures. But where atomic research had a single, explosive debut, gene- and cell-based treatments have emerged in fits, after many false starts.
One huge step forward came in 1985, when researchers began shuttling genes into mammalian cells by first transferring them to a virus. The genes hitched a ride inside the virus, ultimately entering the cell nucleus and working alongside native genes already in place. Then things went wrong. Not only did gene therapy fail to cure disease, but Jesse Gelsinger, an 18-year-old with a rare metabolic disorder, died in a clinical trial in 1999.
Only recently have cell and gene therapy begun to triumph, by borrowing from and blending into each other’s approaches. One stunning proof of principle occurred in 2007, when German researchers treated a 40-year-old patient for HIV and leukemia with stem cells lacking an HIV receptor, making them resistant to the virus. The patient was cured and, three years later, remains well. In another landmark success, scientists in Italy and the United States cured “bubble” babies who have a malfunctioning gene for the enzyme adenosine deaminase, which causes a buildup of toxic products that destroy immune cells. Doctors gave the patients stem cells containing copies of a properly functioning gene for the enzyme; the babies’ immune systems were then able to reconstitute themselves.
The combined therapy is assuming even greater power as scientists manipulate genes to wind ordinary cells back to an embryonic state. Embryonic cells could restore brain and immune function and regenerate organs. “The time is coming when we’ll repair heart tissue after a heart attack and restore blood flow to limbs that would otherwise be amputated,” says stem cell researcher Robert Lanza of Advanced Cell Technology in Massachusetts. “We’ll look back and say, ‘Can you believe how people used to suffer?’”
New Hope in the Search for Alien Life
For the first 15 years of DISCOVER’s existence, if you wanted to hear about planets orbiting other stars, you had your choice of sources: Star Wars and Star Trek. That all changed in 1995 with the discovery of a planet orbiting 51 Pegasi, a near-twin of the sun located about 50 light-years away. Swiss astronomers Michel Mayor and Didier Queloz found the distant world by watching its gravity tug its parent star. Over the following year, American scientists Geoffrey Marcy and Paul Butler confirmed the observation and soon found several more planets on their own.
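To get a feel for the size of that gravitational tug, here is a minimal back-of-the-envelope sketch (an illustration added for this post, not the discoverers' analysis): it applies the standard radial-velocity formula for a circular, edge-on orbit to approximate published values for 51 Pegasi b, namely a 4.23-day period, about 0.47 Jupiter masses, and a host star of roughly 1.1 solar masses. The helper function below is hypothetical code written for this example.

import math

# Illustrative sketch only: the reflex ("wobble") speed a planet induces on its star,
# K = (2*pi*G/P)^(1/3) * m_planet / (M_star + m_planet)^(2/3),
# assuming a circular, edge-on orbit. Input values are approximate.

G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
M_JUP = 1.898e27     # Jupiter mass, kg

def rv_semi_amplitude(period_days, planet_mass_kg, star_mass_kg):
    """Stellar wobble speed (m/s) for a circular, edge-on orbit."""
    period_s = period_days * 86400.0
    return ((2 * math.pi * G / period_s) ** (1.0 / 3.0)
            * planet_mass_kg / (star_mass_kg + planet_mass_kg) ** (2.0 / 3.0))

wobble = rv_semi_amplitude(4.23, 0.47 * M_JUP, 1.1 * M_SUN)
print(f"Stellar wobble: about {wobble:.0f} meters per second")   # roughly 55 m/s

A wobble of some 55 meters per second is large as these things go, which is part of why this first detection was possible at all; an Earth twin would move its star by only about a tenth of a meter per second.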
The first exoplanet discoveries upended the world of astronomy. Scientists had always assumed that other solar systems (if they existed) would look pretty much like ours. Not these: The new planets were giants, like Jupiter, but more than half lay closer to their stars than Mercury does to our sun, whipping around in just a few days and baking at temperatures close to 2,000 degrees Fahrenheit. Our theories about how planets form and evolve had to be ripped up and rewritten. Since then, planet searchers have found more than 400 new worlds, and it has been one surprise after another. Some planets follow wildly oval orbits. Others orbit their stars backwards, have strange and unexpected compositions, are puffed up like marshmallows, or shed tails like comets. Collectively, they prove that the universe is far stranger and more creative than anyone imagined.
By next spring, the planet-hunting space telescope known as Kepler—rejected by NASA three times but then approved after those initial detections of exoplanets in the 1990s—will most likely report the discovery of the first known Earth-like planet in an Earth-like orbit. This milestone will inevitably spark another revolution, as observers redouble their efforts to see Earth-like worlds directly and probe their atmospheres for the telltale chemical signatures of life. It’s a good bet that these planets, too, will not be what we expected.
Simply studying the light from Earth-like planets—much less getting direct pictures of them—will be wildly difficult. But then, 15 years ago nobody thought we could find any exoplanets at all. (The James Webb Space Telescope, launching in 2014, might be able to find hints of biology on an alien world.) Even if the next space telescope only comes close, the one after that will very likely do the trick. So here is a bold prediction: By the time DISCOVER celebrates its 50th anniversary, the mystery of whether life exists elsewhere in the universe will finally be solved.
Building With Atoms
“Nanotechnology”—anything constructed on the scale of a nanometer, just a few times the size of an atom—has a quintessential science fiction ring. Yet you almost surely have some nanotechnology sitting around you. Try doing a Google search on the word. There: You just used it.
The revolution began quietly in 1981, when Gerd Binnig and Heinrich Rohrer at IBM in Zurich invented the scanning tunneling microscope (STM), which could read a surface atom by atom. Over the last decade, researchers adapted STMs to probe organic molecules and to build simple devices using atoms like Lego blocks. At the same time, electronics engineers were working their way toward the nanoscale from the top down, cramming ever more (and faster) transistors onto silicon chips. That effort allowed the speed of computer processors to keep doubling every 18 months or so, an advance known as Moore’s law. By the early 2000s, transistor size had dipped below a ten-millionth of a meter, bringing computers and cell phones into the nano realm. And the progress goes on: Late last year, researchers in Finland and Australia built an experimental transistor out of a single atom of phosphorus.
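The compounding implied by that "doubling every 18 months" is easy to underestimate. The toy calculation below is illustrative arithmetic only, using idealized assumed figures rather than measured chip data.

# Toy Moore's-law arithmetic: one doubling every 18 months, compounded over
# 30 years. Idealized; real chips follow the trend only approximately.

DOUBLING_PERIOD_MONTHS = 18
YEARS = 30

doublings = YEARS * 12 / DOUBLING_PERIOD_MONTHS    # 20 doublings
growth_factor = 2 ** doublings                     # about 1,048,576

print(f"{doublings:.0f} doublings in {YEARS} years -> "
      f"roughly a {growth_factor:,.0f}-fold increase in transistor count")

Twenty doublings works out to roughly a millionfold gain, which is why this steady compounding, rather than any single breakthrough, carried computing into the nano realm.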
The next stage of atomic technology may involve replacing silicon with other substances optimized for the nanoscale. “Materials change properties when you enter into this new world,” says James Yardley, director of the Nanoscale Science and Engineering Center at Columbia University. He and his colleagues were among the first to discover one-atom-thick sheets of carbon called graphene. In theory, electrons should move through these sheets with essentially no resistance. “If you could do that, you could transmit electricity across the country with no loss,” Yardley says. Last February, IBM scientists created a graphene transistor that can switch on and off 100 billion times a second, more than twice as fast as its silicon counterpart.
Real-world nanotechnology has implications far beyond computing—implications that could finally give substance to the field’s old science fiction visions. Graphene is so strong that it has some scientists scheming ways to build elevators into space. In the meantime, nanoscale carbon structures are being developed for fast-charging batteries, efficient solar cells, and implantable drug-delivery capsules. “We have the knowledge and tools to make all that happen,” Yardley says.
Human Family Tree Gets Bushy, Grows Roots
The study of human origins has been marked by lively—sometimes vicious—sparring over the identity of the original human ancestor. The battle royal is best symbolized by world-famous Lucy, a 3.2 million-year-old fossil Australopithecus afarensis originally unearthed in 1974 and put forth as the original biped leading to us. The furor over Lucy’s pedigree embroiled researchers of the 1980s and remains unresolved, but proof could be beside the point. “I frankly do not care,” says Stony Brook paleoanthropologist William Jungers. “She allows us to understand what our precursors looked like: sexually dimorphic, small-brained bipeds retaining the ability to climb trees.”
Recent discoveries offer a deeper and broader view of human ancestry. One stunner: Early humans mated with Neanderthals, according to evolutionary geneticist Svante Pääbo and colleagues at the Max Planck Institute for Evolutionary Anthropology in Germany. Through an analysis of DNA fragments from Neanderthal bones, Pääbo traced the interbreeding back 60,000 years to the Middle East. Today 1 to 4 percent of the human genome outside Africa is Neanderthal.

Another shock came last year when Tim White of the University of California, Berkeley, and the Middle Awash Team unveiled Ardipithecus ramidus (“Ardi”), a 4.4-million-year-old fossil female hominid. Bipedal on the ground but efficient at moving through trees, Ardi suggests the common ancestor we share with chimpanzees was an ape with monkeylike traits.

Finally, in 2004, in a cave on the island of Flores in Indonesia, bones of a human relative no larger than a modern-day 4-year-old were discovered by archaeologist Michael Morwood of the University of Wollongong in Australia and his team. The bones are 14,000 years old, but tools nearby date back as much as a million years. After furious debate, most paleoanthropologists now agree that Homo floresiensis, nicknamed the hobbit, is a genuine ancient human with a teensy brain folded in ways that increased its complexity—enough for hobbits to hunt cooperatively, knap their own tools, and thrive on an island for more than a million years.
“We’re the last hominid standing,” Jungers says. But it is no longer clear whether we are the crown of creation or just one branch of a diverse evolutionary bush.
DNA Decoded and Reprogrammed
In 1990 biologists embarked on one of science’s most ambitious journeys of self-discovery: sequencing every base pair in our genetic code. A decade later, in February 2001, the publicly funded Human Genome Project (HGP) and privately funded Celera Corporation, led by J. Craig Venter, separately published their drafts of the human genome. In 2003 the HGP released a full map. Then...nothing. Finding connections between the genome and disease proved far more complicated than biologists had hoped (or feared). Common diseases turn out to be caused by intricate gene interactions, and genes respond to environmental signals in confusing ways. But just when it seemed as if the Human Genome Project would take us nowhere fast, the burgeoning field of bioinformatics—treating DNA as data—came of age.
Studies of genetic markers had already proved invaluable for evolutionary biology and forensic science, aided by chemist Kary Mullis’s 1983 invention of PCR, an efficient way to amplify minute fragments of DNA. Uploading genomes onto a computer opened rich new possibilities. Parsed by computer, digital DNA began revolutionizing the study of human ancestry. Today molecular biologist Leroy Hood, president of the Institute for Systems Biology in Seattle, is trying to use the tools of bioinformatics to create a kind of medicine he calls P4: predictive, personalized, preventive, and participatory. By analyzing genomes within a nuclear family, he has discovered the gene linked to Miller’s syndrome, a craniofacial defect. Next he aspires to tackle more common but genetically and environmentally complex conditions—cardiovascular, neurodegenerative, and autoimmune disease. “In 5 to 10 years, each individual patient will be surrounded by a virtual cloud of billions of data points,” he predicts. “We’ll be able to mine that information and gain deep insights into health and disease.”
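A quick way to appreciate why Mullis's invention mattered is the arithmetic of repeated doubling: each PCR cycle, ideally, copies every target strand present. The snippet below is an idealized sketch added for illustration (perfect efficiency assumed, function name invented for this example), not a model of a real reaction.

# Idealized PCR arithmetic: every thermal cycle doubles the number of copies
# of the target DNA fragment. Real reactions fall short of perfect doubling,
# but the exponential growth is what makes trace samples usable.

def pcr_copies(starting_copies: int, cycles: int) -> int:
    """Copy count after a given number of perfectly efficient PCR cycles."""
    return starting_copies * 2 ** cycles

print(pcr_copies(1, 30))   # one fragment -> 1,073,741,824 copies (about a billion)

That exponential blowup is what lets a trace of DNA from a crime scene or an ancient bone yield enough material to analyze.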
Treated as information, DNA can also be manipulated to create designer organisms. Molecular biologist Venter recently used this approach to create what he calls the first synthetic organism. He sees a day coming soon when DNA can be written like software and custom microbes can be “programmed” to generate inexpensive energy, fertilizer, drugs, or food. One potential goal: turning human waste into clean water, electricity, or both. Another: fighting global warming by sucking carbon dioxide from the air. “Could we make artificial steaks?” Venter asks. “We’re limited only by our imagination.”
The Web Takes Over
Just think of all the ways the World Wide Web has improved our lives since its invention in 1989.
It is making us healthier: In 2007, some 160 million Americans looked for medical information online, according to a Harris Interactive poll. We can get advice directly from the Mayo Clinic’s Web site or ask a forum and tap the wisdom of the crowd.
It facilitates news gathering: When Iran cracked down on protesters last year, startling videos of the chaos were careening around the world’s computers and cell phones within minutes. We have access to a billion frontline reporters 24 hours a day.
It is an economic turbocharger: Today anyone can start a global business for almost nothing while sitting in his or her living room. Engineers, executives, salespeople, and creative teams can collaborate with each other and with customers and suppliers virtually around the world. Online retail sales are approaching $150 billion a year in the United States alone.
It is revitalizing politics: Barack Obama’s presidential campaign raised half a billion dollars—the bulk of the total money it raised—from more than 3 million people via the Web. Today, regular White House video postings have pundits dubbing this the “YouTube presidency.”
It is connecting us: We are in closer touch with more people for longer periods of time. Facebook has 500 million active users spending more than 700 billion minutes per month on the site.
Then again, the Web is also making us sicker, or at least making us feel that way: A 2008 study by Microsoft Research showed that Internet searchers tend to focus on only the top few results, typically highlighting rarer, more serious diagnoses of common ailments such as headache (brain tumor!) and chest discomfort (heart attack!).
It impairs news gathering: Some 30 daily newspapers have been shuttered in the past three years, and virtually all the rest have been rocked by layoffs, as advertisers and readers flee to blogs and other free, Web-based news sources where the reporting is often slapdash or recycled.
It is an economic saboteur: Large segments of the publishing and entertainment industries have been devastated by the cornucopia of free (often pirated) online publications, music, and video.
It is poisoning politics: Studies show that people tend to read Web sites that reinforce their biases and beliefs. Among the biggest winners politically on the Web have been hate and terrorist groups, many of which have mastered online fund-raising and recruiting.
It is alienating us: Social scientists caution that Facebook and other online playgrounds are keeping children from getting out and getting together, limiting their social skills and encouraging obesity.
What’s indisputable is that the Web is changing our behavior.
It may be changing the way our brains are wired, too. But by the time we figure out what these changes are, it won’t matter, because everyone will have grown up in a world where the Web is inseparable from everything we do. Celebrate the change or not. It isn’t waiting for your approval.

Uncovered: How a Brain Creates a Mind
Before neuroscience could tackle its biggest question—how the brain transforms chemical reactions and electrical pulses into cognition—it had to wrestle with the tiny.
In the early 1980s, new tools made it possible to map out the machinery inside an individual brain cell. Using minuscule glass electrodes that are able to measure picoampere currents, researchers could observe a single neuronal pore popping open and slamming shut to transmit a signal. At the same time, the techniques of genetics and molecular biology began to reveal the intricate biochemical signals that synapses—the portals of nerve cells—deploy during communication.
Probably the most powerful idea to emerge from these innovations was the realization that neuroplasticity, the brain’s capacity to change and adapt, persists into adulthood. As Columbia University neuroscientist Eric Kandel and others have shown, neurons respond to stimulation by altering their activity and connectivity, remodeling the architecture of the brain. If the adult brain is flexible, the implication is that damage or injury might be reversed.
Other researchers, meanwhile, were using neuroimaging to go global, observing the whole human brain in action without opening the skull. In the 1980s, positron emission tomography (PET) scans, which detect neural metabolism with radioactive tracers, snapped the first pictures of the brain in midthought. In 1990 functional magnetic resonance imaging (fMRI) provided a safer way to track sensations and emotions as they occurred.
An even more holistic vision of the brain is emerging from connectomics, which seeks to create a wiring diagram of the entire human brain. Most recently, a technology called optogenetics has allowed experimenters to toggle entire brain circuits “on” or “off” by shining laser light on genetically modified synapses. The implication: A deeper understanding of thought and feeling, not to mention new therapeutics, may be closer than we think.
In the new view, the brain looks like “a fluctuating mosaic of areas in a state of dynamic equilibrium,” says V. S. Ramachandran, a neuroscientist at the University of California, San Diego. It is not a computer so much as a constantly unfolding dance.
Universe on a Scale
Physicist Saul Perlmutter vividly remembers what cosmology was like during his grad student days 25 years ago. “It was a standing joke that if you were within a factor of 10 with your measurements, you were doing well,” he says. Estimates of the universe’s age ranged from 7 billion to 20 billion years. It wasn’t a discipline renowned for exactitude.
Cosmology was reborn on November 18, 1989, with the launch of the Cosmic Background Explorer (COBE) satellite. COBE made the first precise measurements of the faint radiation left over from the Big Bang. According to theory, tiny quantum fluctuations were blown up to cosmic proportions within the first fraction of a second after the birth of the universe, creating lumps that seeded today’s galaxies and galaxy clusters. COBE found the imprints of those fluctuations, strong evidence for the standard model of the Big Bang. A second, even more shocking development came in 1998 from two teams of astronomers (one led by Perlmutter) studying supernovas in distant galaxies. They discovered that the expansion of the universe is accelerating, driven apart by an enigmatic, all-pervasive property of space now called dark energy. “Will dark energy keep the universe accelerating faster and faster?” Perlmutter asks. “Or could it decay or even change and make the universe collapse?”
In 2001 COBE’s successor, the Wilkinson Microwave Anisotropy Probe (WMAP), brought even more precision to cosmology. WMAP finally revealed the exact age of the universe: 13.7 billion years. It also showed that ordinary matter—the atoms that make up galaxies, planets, and people—accounts for a paltry 4 percent of the universe’s contents. Dark matter, invisible except for its gravitational influence, makes up about 23 percent; dark energy accounts for the rest. Next up, the Joint Dark Energy Mission—tentatively scheduled to launch in 2016—will fill in more details. This space observatory will be able to study supernovas that exploded as far back as 10 billion years to analyze the shifting relationship between the pull of mass and the push of dark energy. “To make any predictions,” Perlmutter says, “we need measurements about 20 times more accurate than we have now.” Those measurements—and the next era of precision cosmology—are on the way.

Physics Seeks The One
A new era in physics should have started 11 years ago in Waxahachie, Texas. That’s where the Superconducting Supercollider, a 54-mile-long underground circular particle accelerator, was supposed to smash protons together and glean vital clues from the subatomic wreckage. Cost overruns led Congress to cancel the SSC in 1993. If the project had gone on as planned, “by now we’d be asking a new generation of questions and refining them,” says theoretical physicist Frank Wilczek, a Nobel laureate at MIT. “But I’m on record as saying we’re about to enter a new golden age.”
The reason for his sunny mood: the Large Hadron Collider, an almost-as-powerful accelerator near Geneva that began firing protons last spring. It could provide the best evidence yet that the four natural forces shaping our world—gravity, electromagnetism, the strong force, and the weak force—are manifestations of a single underlying law. Through a decades-long effort, physicists have managed to incorporate all of the forces save gravity into a theory called the standard model. The LHC is designed to find the hypothetical particles (most notably the Higgs boson, believed to endow other particles with mass) that would back up that theory.
The LHC may also lead physicists toward a unifying framework that goes even beyond the standard model. String theory—which holds that all particles and forces ultimately consist of unimaginably small, vibrating objects called strings—has dominated theoretical physics for most of the past 30 years, yet it remains controversial. Many in the field feel the theory is valid, but it cannot be falsified by experiment, the standard by which scientific concepts are judged.
Although the LHC will not come anywhere close to detecting strings, it may confirm a precursor theory called supersymmetry, in which every known type of particle has a “superpartner”: an unstable, heavier twin. On the other hand, failure to detect supersymmetric particles at the LHC would be a blow to string theory’s credibility. “String theory has been impressive mathematically,” Wilczek says, “but disappointing in describing physical reality.”
The Heat Is On
The science revealing rising risks of disruptive human-driven climate change has accumulated like dots added to a pointillist painting, but the resulting image still lacks clarity. The result is one of the great paradoxes of the early 21st century: a potential planet-scale threat that perpetually hides in plain sight. Parts of the climate picture are visible now in high resolution. There is no longer any reasonable way to explain recent changes in atmospheric and ocean temperatures without a substantial contribution from accumulating human-generated greenhouse gases. Arctic sea ice in summer is dwindling. Tropical climate conditions are expanding. The stratosphere is cooling, as predicted, while lower atmospheric layers warm.
For many of the most consequential climate impacts, though, the picture remains fuzzy. Rising sea levels are certain in a warming world, but there is still substantial uncertainty about the extent of the increase in this century, mainly because the dynamics that could erode the ice sheets of Greenland and Antarctica remain poorly understood. Other worst-case outcomes also remain primarily in the realm of the plausible, as opposed to the probable. That may be one reason why work toward a meaningful climate treaty and national climate legislation has sputtered. But there are others.
The disconnect between information and action is less surprising when you examine two other trends of the past 30 years. One is research in behavioral science, illuminating our tendency to sift facts using emotion-based filters and to deeply favor short-term payoffs. The second is the sustained disinvestment in basic energy-related R&D that began with the election of Ronald Reagan in 1980 and has continued until now, with bipartisan support.
Some analysts call for a focus on adaptation and innovation: helping vulnerable communities develop ways to deal with climate extremes and reviving research budgets to raise the odds of energy breakthroughs. Even pessimists point to climate countermeasures, dubbed geoengineering, as a vital insurance policy. Still, the core challenge remains as Ralph Cicerone, president of the National Academy of Sciences, described it to me in 2007: “Does it take a crisis to get people to go along a new path, or can they respond to a series of rational, incremental gains in knowledge?” Given the persistent gap between climate data and behavior, we may not like the answer.
Source: http://discovermagazine.com
