Many of us stared at the star-besotted sky as children and wondered how long it would take to count all those pinpricks of light. The answer is 10 years, once the Large Synoptic Survey Telescope begins its count in 2022. Construction began in October.
Everywhere we look, our mastery grows by leaps and bounds as we push to the very edges of what's possible. We've begun cataloging not only the universe's visible light but also that which we can't see.
Arizona State University's Lawrence Krauss was one of the first physicists to suggest that most of the universe's energy resides in empty space, as so-called dark energy, alongside invisible dark matter. Though dark energy remains well beyond our ability to measure directly, painstaking projects have begun to map dark matter.
"We are like the early mapmakers in that we're just coming to understand the world on its largest scale," Krauss says. "We're trying to probe the literal limits of what we can see or what can be detected."
Texas A&M astrophysicist Nicholas Suntzeff, part of a team building the LSST on a remote mountain in Chile, likens it to Magellan's voyage around the globe 500 years ago, the moment we first truly appreciated our planet's finite nature.
"Once that boat went around the world, they proved there's a technology to explore everywhere there is on the surface of the Earth," says Suntzeff, who in 1998 helped make the seminal discovery that the universe was not only expanding but accelerating.
"The same thing has happened in astronomy. If I have a big telescope and take a deep enough image, I can see all the way across the universe, and there is an edge to the universe I can't see farther than," he says. "In the next 20 to 30 years, we'll map out every galaxy — except the small ones — in the universe. You'll be able to go to Google Universe and look at any one of the 300 to 500 billion galaxies on your iPad."
This revolution goes far beyond what we can see, or even what anyone could foresee as reasonable applications, much like the transistor. Though AT&T invented it in 1947, the company shared the patent liberally four years later because it had no idea how to monetize the transistor or that it would spark an incipient digital takeover. (Otherwise, the company would have guarded it more zealously.)
"We're on the edge of what could be a revolution in new technology because of quantum engineering," Krauss says. "We're going to manipulate matter on small scales exquisitely and create new two-dimensional systems, like graphene and buckyballs, that can exploit the quantum mechanical properties of matter to potentially produce extremely new materials with exciting new applications in electronics and building."
Everyone talks about 3-D printers, but they're nothing compared to advances that have enabled scientists to "print" atom by atom. This unprecedented ability to create substances a single atom thick has forged material with very special properties, none more so than graphene, the constituent of buckyballs (graphene wrapped into a sphere) and carbon nanotubes (graphene wrapped into a straw).
This one-atom-thick, two-dimensional carbon structure is many times stronger and more flexible than steel, a better conductor than copper, with much greater electrical capacity — but at almost half the density of aluminum. Not only that, but unlike these metals, it won't expand, contract, or corrode.
The challenge of the next 15 years will be finding the right applications, which is more difficult in some ways than making the groundbreaking discovery.
"The more innovative — the more breaking-the-mold — the innovation is, the less likely we are to figure out what it is really going to be used for," said University of Maryland technology historian Robert Friedel in a New Yorker article on graphene last year.
Krauss concurs, but he can't escape the feeling that we're staring down a scientific tsunami: "I don't know where it's heading, but it's going to have a huge impact.
"Quantum mechanics was first developed 100 years ago, and now it rules our lives," Krauss continues. "We wouldn't be able to have this conversation without quantum mechanics, which shows what at one stage might have seemed esoteric but at the next becomes the basis not only for new technology but the gross national product."
The pace of change since the millennium is awe-inspiring, and it's only accelerating. In just 15 years, the world has gained mobile computing and communications, effective voice recognition, and widely distributed digital media.
We've invaded planets with small armies of robotic rovers, used the Large Hadron Collider to peer at matter harking back to the first moments of the universe, and, last year, even transmitted a thought from the mind of a researcher in India to the mind of another in France.
It's not The Jetsons (there's no flying Buick in the garage), but as our car mirrors warn, things are closer than they appear. There's an inexorable logic to it, rooted in the way computing power has doubled roughly every two years over the past 50 years (a.k.a. Moore's Law). Science is making exponential leaps, bringing barely imaginable futures closer, quicker than ever before.
Not to suggest this is novel.
"That trend goes back thousands of years. It used to be you would do what your great-grandfather did," says Peter Stone, a University of Texas professor who's worked primarily with robotics and autonomous artificial-intelligence systems. "Nowadays, people are expected to reinvent themselves every few years, not just once in their lifetime. The rate of change is accelerating, but it has been for many, many years."
Yet this doesn't change the calculus. What happened over a generation now occurs in a decade and a half. It took smartphones half as long to replace cell phones as it took broadband to replace dial-up. Will cable TV even exist in 15 years, or might all our media be on demand? From wellness, yoga, and low carbs to Kickstarter, Spotify, and Snapchat, our world is behaving like a flash mob as trends penetrate deeper and faster.
Going back just to the millennium takes us past the birth of Twitter and Facebook and the death of Enron and Arthur Andersen. Ten years ago, Blockbuster was at its peak, with 60,000 employees at 9,000 stores, and people were worried about Clear Channel's domination of radio. You know, the thing with the dials on which grandpa always listened to talk radio?
We live in a world where a 25-year-old video-game-playing Swede known as PewDiePie has a bigger YouTube presence (33.6 million subscribers) than Beyoncé, Rihanna, and Justin Bieber combined, earning as much as $1.4 million a month.
To quote storied former U.S. Attorney General Robert F. Kennedy's line: "There are those who look at things the way they are and ask why . . . I dream of things that never were and ask why not?"
The future's coming fast, regardless of whether we're able to imagine it.
It took four billion years for our genetic code to achieve its complexity but only a dozen years of concerted effort to finally unravel it in 2003, thanks to the $2.7 billion Human Genome Project. That breakthrough sparked a technological revolution in DNA sequencing that has even outpaced Moore's law.
In the past seven years, the cost to sequence your genome has dipped from $10 million to $1,000, with a commensurate increase in speed. Before long, you'll squeeze a drop of blood into a kiosk receptacle and not only have your flu, cold, or viral symptoms diagnosed by text within the hour but also receive the best possible treatment based upon your individualized genetic profile.
Advances in DNA sequencing have plant applications, as well. Northern Arizona University ecology Professor Thomas Whitham is planting and replanting different species in different climatic regions to measure changes in the DNA of what are called "foundation species."
Losing a foundation species causes a cascade of effects all the way down the ecological food chain. One example is the Fremont cottonwood, which can be found in the desert around Yuma.
"It has about an 11-month growing season, and Yuma is hotter than hell, but it's genetically adapted to it," Whitham says. "Take the same species of cottonwood that lives on the Colorado plateau in a much cooler environment. Its growing season might effectively be five months. You do reciprocal transplant experiments where you take [the two cottonwoods] and plant them both in low and high sites, and see how genetically differentiated they are — it's a lot."
Whitham has a grant from the National Science Foundation to plant these garden arrays and measure how the genes respond to different environmental conditions.
"We're able to make some pretty specific recommendations on what populations and species should be planted based upon their underlying genetic structure," he says. "A genetics approach has only rarely been applied in a wildland system, but it's the backbone of agriculture and actually of a lot of forestry."
There are few better examples of what's possible in this brave new world than the living human lung that scientist Joan Nichols keeps in a bottle at the University of Texas Medical Branch in Galveston. It's a feat of bioengineering, accomplished without any grants, that no one thought could be done.
Nichols is associate director of research at UTMB, where she leads a 15-person team experimenting on a living lung created from human tissue. The team used lungs, unsuitable for transplant, from a pair of children who died of trauma. Nichols' team stripped one lung down to its cell-less scaffolding of collagen and elastin, reseeded it with salvaged cells from the other lung, and then immersed it in a nutrient solution.
It was a particularly painstaking effort until medical student Michael Riddle pieced together the first apparatus using a pet-store fish tank. (The things you do when you don't have large national grants.) His rig ultimately cut the stripping process from months to days.
The process soon will be used to make pig lungs for attempted transplant. If this experimentation is successful, it could lead to an answer for the 1,600 people awaiting transplants while suffering from incurable lung disorders, such as cystic fibrosis or chronic obstructive pulmonary disease. More than that, the bio-engineered lung offers a living human template on which to experiment rather than relying on proxy animals like mice, hoping the insights translate.
"People ask about modeling other organ systems," Nichols says, chuckling. "I say it's taken me and my research partner 15 years to do the lung and immune system and get this to work, and when first I started talking to people about creating a human model using human cells . . . they didn't accept it very well."
Fueled by ever-increasing computing power, the sciences have been brought together like individuals who've been linked by social media. Researchers who once roamed library stacks for hours seeking scholarly articles now acquire them in seconds.
"This ability to communicate has dramatically changed scientists' ability to collaborate across continents. In fact, that's why the Internet was first created," says ASU's Krauss, noting that the World Wide Web began as a way to share information among scientists at the European Organization for Nuclear Research (CERN). "It makes access to information and communication between scientists much easier, and therefore international collaborations that would've been impossible two decades ago are now the norm."
Science has grown more multi-disciplinary hand-in-hand with increasingly precise tools. This has required ever more expertise from researchers even as it has deepened and widened fields.
"Most people have a more systems-based thinking now simply because it's more possible than it used to be," Nichols says. "Fifteen years ago, you could be a scientist and you worked on your project by yourself with the students in your lab, and it was a very small world, and you were very introverted."
This is simply no longer possible as science grows more complicated and requires more people to navigate. Nichols' 15-person team for the lung project includes a cardiothoracic surgeon, a nanoparticle specialist, and chemical engineers.
"The bigger your project, the more people you need to be a part of your team, and the more specialized they are," Nichols says. "You have to be able to talk a little bit of physics, a little bit of chemistry to get everybody on the same page and move it along."
The challenge goes beyond cultivating interests in related disciplines and staying abreast of advances in as many as 10 other fields. There are communication and people skills required to reach across disciplines.
"It's not bad enough that you're doing cutting-edge science and you've got a great team, but you have to have the skills to keep [the team] functioning," she says. "Those are social components they don't always teach scientists . . . I'm happy staring into my microscope all day long, but I can't do that."
The changing mindset is reflected in a multi-disciplinary approach that understands a wide variety of phenomena, from biology and climate to sociology and politics, as parts of idiosyncratic ecosystems. It's fueled by the computer-aided ability to see ever-smaller particles and parse ever-larger data sets.
"We're witnessing a change in the scientific paradigm," says author John Vanston. A former Army lieutenant colonel who taught nuclear engineering before becoming a futurist/consultant in the late 1970s, he's written a dozen books, highlighted by 2011's Minitrends: How Innovators & Entrepreneurs Discover & Profit from Business & Technology Trends.
"Through the centuries, people came to scientific conclusions through observation, then said, 'Let's make an experiment, and we'll see how these things work,'" he says. "Then you began to move into computer simulation, and now we're beginning the fourth paradigm, which is data-intensive discovery. We've been overwhelmed by information, and the key now is to figure out how to use that information intelligently."
Obviously, "Big Data" isn't solely the province of science. Between natural speech recognition, integrated Internet-enabled home appliances, and the next wave of smartphone apps and wearable media, we're on the verge of the ability to auto-correct the most mundane aspects of our life, handing off our chores to a personal concierge service no further away than our voices. This could anticipate needs from ordering groceries to paying bills to balancing checkbooks to scheduling appointments.
But the changes Big Data promises are every bit as big for medicine. Computing advances have yielded not only breakthroughs in genetic sequencing but also machines that model and test thousands of proteins at a time. They call it the "high-throughput era" because new technology has so dramatically cut experimentation time that thousands of biochemical interactions that would've taken years now complete in hours.
As with astrophysics, we're entering a period in which the universe is laid out before us. We may not know what everything does, but we can test it, and with the whole picture in hand, things are beginning to make more sense. Once you've created the taxonomy, it's time to sort and organize all these various "known unknowns."
This and other advances in diagnostics, modeling, and medicine's growing interdisciplinary understanding make UTMB's Nichols optimistic that we're on the brink of so much.
"We're coming to the point where I expect, in the next 10 years, we're going to make some big changes in how we look at treatment," she says. "As we understand more, it gives us a chance to make better therapeutics, better vaccines, and better ways to help people overcome any kind of injuries."
One way this is manifesting is in the growth of personalized medicine. All this data potentially can offer insight into why some people respond to treatment while others don't. It's believed to be related to individual differences in DNA and the distinct bacteria that have evolved with us, known as our microbiome.
Our bodies have 10 times more microbial cells than human ones, and our bodies' bacteria have a hundred times our genetic matter. Only since the explosion of DNA sequencing have we begun to realize how important these bacteria are in regulating health, even beyond digestion, including regulating the expression of our genes and inflammation response.
A fuller picture of what's going on promises better therapies, even personalized ones. In January, President Obama proposed allocating hundreds of millions of dollars to this field and heard rare avowals of bipartisan support.
"Basic knowledge of how the immune system works has advanced tremendously over the last 20 years," says Dr. William Decker, a cancer researcher at Baylor College of Medicine in Texas. "When you look back and see what people were doing, I don't want to say it [was] laughable because it [was] admirable, but . . ."
Decker works on teaching the immune system to seek out and destroy cancer cells. This would replace the current chemo-based method of zapping the body with toxic chemicals in the hope that the body bounces back. Immunotherapy takes a focused approach, targeting particular chemical pathways in a manner less damaging to the body as a whole.
"I'm positive that in the next 15 to 25 years, immune therapy will be working really, really well," he says. "These new technologies enable us to think better [about] how things might work and [to] test new hypotheses faster.
"I'm not saying I personally have the answer," he continues. "But there are so many people working on this, and we've learned so much more about the immune system, that one or more of the emerging technologies is certainly going to work and work well. It isn't [in] some locked black box; we can understand how it functions."
While the evolution of medicine and science occurs outside the spotlight, our culture changes before our eyes. Revolutionary changes percolate on somebody's blackboard long before the phenomenal application is discovered or crucial obstacles are surmounted.
Some advances never materialize, like the flying car. Others just take time, such as artificial intelligence beating a world champion at chess, a feat AI chased for 40 years before computers began regularly besting grandmasters in the middle of the last decade.
Such missed marks humbled computer scientists like Peter Stone, leaving them hesitant to get too grandiose. Even things that are near at hand can seem far away. He remembers that when he published his first paper on autonomous cars 11 years ago, even his colleagues in computer science "would look at me like I'm crazy.
"There were demonstrations of cars going long distances under completely autonomous control in the early '90s," says Stone, a former Fulbright Scholar and Guggenheim Fellow.
"If you went back even three or four years and started to talk about autonomous cars becoming inevitable, most people would look at you like you're crazy, whereas I think nowadays most people would just nod and say, 'What else is new?' In a short period of time, public perception of possibility and capability can change dramatically."
That's how Stone feels about the autonomous car. He knows ground-breaking applications when he sees them. Some close colleagues helped found Kiva Systems, whose breakthroughs in robotic order fulfillment have powered Amazon (which eventually bought the firm), the Gap, and Office Depot.
"People envision an autonomous car as a car in which you take exactly the same trips you take today when really it's going to change everything," Stone says. "It's going to change the value of real estate, where people tend to live, the types of trips people take, the need for parking in urban settings, and the value of owning versus sharing cars.
"I compare it to the microwave," he says. "People said I'll just cook the same food faster. In reality, there are whole aisles of grocery stores devoted to microwaveable food. It's changed lifestyles. Same with the cell phone."
Another field where revolutionary innovation mingles with dramatically changing attitudes is renewable power. The longtime pipe dream of environmentalists is becoming more competitive with fossil fuels, thanks to accelerating technological innovation.
"We're seeing advances in basic [energy] generation that are analogous to what happened in the semiconductor industry for many years," says Bruce Wright, Tech Parks Arizona's associate vice president. "[There's been] rapid change and [an] increase in both the efficiency of photovoltaic cells and the different ways of generating solar energy."
It's bolstered by broad government intervention that has begun to see results. In China, the government's heavy investment in solar helped the industry achieve economies of scale that drove down prices. Recently, France mandated that roofs on new buildings in commercial zones be partially covered with solar panels or greenery.
In America, tax rebates for residential solar inspired many to install. Beyond that, huge tech companies such as Microsoft, Google, and Apple have invested in renewable power for their server farms. Google also has teamed with SolarCity to invest three-quarters of a billion dollars in a residential solar initiative.
"Companies are embracing renewable energy, and it's not because they're directed or mandated by the government but because they bought into the environmental ethic," Wright says. "They're trying to sell the environmental sensitivity of their operations by associating with green energy."
The University of Arizona research park works on defense and security technology, intelligent transportation systems and smart vehicles, water tech, biomedical devices and diagnostics, and mining technology, analytics, and chemistry. It's only in the past four years that the U of A has gotten big into solar.
It began when a large energy company approached the university's tech park about its more than 1,000 acres of undeveloped land in an area of high solar radiance, with the intent of turning it into a solar farm. The park declined, but its scientists were inspired to start a solar-tech-research platform, "where all these new technologies can be demonstrated, tested, and evaluated side-by-side, and you compare one system to another," Wright says.
The park completed the first phase of planned expansion, which includes nine companies testing 14 projects on 165 acres. The next phase is about to begin on 30 acres with seven leading-edge projects marrying solar generation with storage. It requires grid-level output (350 to 750 kilowatts) and features projects from smaller concerns, like CoGenra and Arzon, as well as from larger ones, such as Duke Energy, Washington Gas, and German utility E.ON SE (the foreign utility's first construction in the United States).
"Storage becomes a very big factor, particularly as you try to deal with issues of weather and off-peak usage," Wright says. "There are all kinds of different approaches to this, and that's where we're really excited that our partner, Tucson Electric Power, is getting ready to release requests for proposals to look at different storage approaches."
Coming fast on the horizon are embedded solar systems, already in testing.
They include thin skins that can coat a building, transparent cells for use in windows, floating platforms on reservoirs that also prevent evaporation, and flexible rollout solar mats for soldiers camping in desolate areas.
It's commercial applications that will drive future industry growth, Wright says. The success of the Tesla electric-powered car is just a small indication of what's possible.
"Government can play a catalytic role, which I think it has at the federal level. In Tucson, at the city and county levels, local governments have tried to [provide incentives] for solar usage," he says. "But ultimately it's the consumer who I think is going to demand and drive the market. The public already is embracing it in ways government didn't fully anticipate."
Most of the marvels discussed in this article were sown by advances in computers, miniaturization, and robotics that go back to the space race. Government funds for the Apollo space program still are paying dividends decades after the break-even point.
"So many technologies brought to bear or invented for that program found their way into everyday lives," says ASU's Dr. Scott Parazynski, a physician and veteran of five space shuttle missions and seven spacewalks. "The computer revolution, the miniaturization of sensors, and medical monitor devices, so commonplace in our operating rooms, all have a pedigree in the space program, just like satellite communications."
For ASU's Krauss, the Apollo example demonstrates the folly of the recent flow of federal money away from fundamental science and toward more applied research. While Congress may hate funding scientists without knowing what might result, that's generally how advances are made. Certainly, space travel wouldn't have been possible without years of investment in basic research.
"Curiosity-driven research has been the basis of our modern way of life, not just modern technology but really the economic basis of our well-being," Krauss says. "Efforts to focus on applied research [are] a mistake because you don't know where the fundamental research is going to take you."
Parazynski, of ASU's engineering and School of Earth and Space Exploration faculties, shares Krauss' concern about the feds — in his case, the National Institutes of Health — diverting funds from edge-pushing research.
"Tempering my enthusiasm for medical revolution and even space revolution, you have to think about the dollars," he says. "One of my concerns for medical innovation is the NIH has a reasonably large budget, but [it ends] up funding established research that has been funded year after year. And it's incremental research, as opposed to really revolutionary, potentially game-changing research."
Parazynski is part of the National Science Foundation's U.S. Antarctic Program, which seeks to make surgery possible even in extreme environs such as Mars. The program's findings may have applications for remote rural areas in the United States, far-flung locales in Africa, and perhaps one day mobile centers sent to crisis zones.
"It leads to remote surgeries anywhere a person needs [them], even without a surgeon present," says robotics guru Stone. "I also have colleagues working on rehabilitation robotics that help people doing physical therapy through robotic-exoskeleton-type things."
Advances in brain/machine interfaces also have produced surprising prosthetic functionality, Stone notes.
There are those who fear that robots with artificial intelligence could threaten mankind, but Stone laughs at the idea.
"Go into any robotics lab and see the capabilities of robots at the moment, and your fears will be alleviated right away," Stone says, noting that despite recent advances, the sophisticated robotics seen in movies is well beyond reach.
But this doesn't mean we shouldn't consider the repercussions of technology, many experts believe. Before problems emerge is the best time to consider them, according to ASU political scientist Dave Guston, director of the university's Center for Nanotechnology in Society.
"We've seen lots of top Silicon Valley-type figures, like Elon Musk and Bill Gates, talk about potential threats to human existence [from] artificial intelligence and robotics," Guston says. "We're talking about the creation of a set of intelligence and capabilities that are intended to exceed human performance. We would be dumb if we did not take some of these warnings seriously."
Our ability to command information about our environment is increasing, as well as our ability to communicate with accelerating ease and speed. Pretty soon, we won't have to command information; it will avail itself to us, self-filtering salient facts based on our established preferences or patterns.
"We have innate curiosity," says astrophysicist Suntzeff. "It's all part of the same need humans have to figure out the world around them."