
Who cares about science?

Too bad.

One of the most self-damning flaws of scientific research is that, except in the rarest of cases, you don’t get to see the true impact of your current work until much, much later in life. Still. How awesome would it be to be Tim Berners-Lee right now? “I invented the Internet.” Or Thomas Edison: “I invented the light bulb.” Or Freud: “I invented sexy thoughts.”

Take Berners-Lee and the World Wide Web. Back in 1989, computers were clunky, command-line machines that couldn’t talk to each other, at least not in any significant way. Sir Tim Berners-Lee changed that. As a young scientist at CERN, he saw an opportunity to combine existing computer networking protocols—the Internet—with the newfangled concept of hypertext, and out popped the World Wide Web—the “Internet.” Now, at age 56, Berners-Lee gets industry awards, a knighthood, honorary doctorates left and right—but none of it is enough to inspire the masses. The young rarely look that far ahead, and the promise of unlimited speaking engagements at universities and conferences 20 years down the line won’t push today’s teenagers from TVs to test tubes. Maybe the prospect of being knighted will do the trick. But I doubt it. If professions were subject to natural selection, researchers would be extinct, for an ironic lack of reproducibility.

Other technical people are generally a bit quicker than the public to catch on to the significance of a scientific breakthrough—surprise—but even so, it’s only within the scientific community—a tiny fraction of it at that—that any such recognition resonates. Maybe that’s why relatively few young Americans today are excited about research. They don’t care about recognition from the scientific community—why should they? From the outside looking in, the community is small, quirky, and rarely produces a viral YouTube video or Top 10 hit.

While science brings only forever-delayed gratification, working at Apple, Google, or Intel lets you point to an iPhone or new search feature or computer and say “I created that”—sure, with 100 other people, but what of it? Siri’s still pretty damn cool. Our current Internet-dominated era has that advantage, twofold: Anyone can learn to program and create an iPhone app or website—low barrier to creation—and anyone can find their work going viral via YouTube or Reddit—low barrier to recognition. It’s a simple feedback cycle—create, be recognized for creation—and few can resist its temptations. It’s hard to overstate how good it feels to be able to say “I created that”—for many people, it makes all the hard work worth it. That’s what drives them to work late nights and weekends. That’s what makes them say, “I love my job!” and truly mean it.

But imagine going to Google to work on Android, then finding out after a year on the job that it won’t be released for 20 more years, and even then with only 10% probability. You’ll have to wait two decades before anyone knows what the hell you’re talking about: “Hold on… You make androids? Is that ethical?” Until then, it’s all blank stares and polite smiles and changed subjects. I mean, it sure does look promising, but can I get it on Amazon?

Read any popular science article: “Scientists warn, ‘This is an extremely promising breakthrough, but it’s at least 5-10 years away from commercial deployment.’” (See ScienceDaily or MIT’s Technology Review for more egregious real-world examples.) And while it may be honest science journalism, Teenage Me hears that and thinks, “10 years—that’s half my life! Where did I put that Google offer letter?” That’s the burden of the scientific profession, the psychological barrier to entry that pushes many away from research careers, perhaps after a first unfulfilling undergrad research experience where feedback was lacking and progress was uncertain.

So what can we do about it?

Universities can encourage faculty and graduate students to take extended leave from their home institutions to work in the private sector, to start companies, to get involved in public policy. Research institutions can raise salaries for research scientists and other technical staff. Researchers can eradicate the academic superiority complex.

Government can fund more research, more education, more graduate and postdoctoral fellowships. Forward-thinking politicians can create more research jobs that don’t require a PhD.

The rest of us can learn some science—not Alka-Seltzer volcanoes and Coke-and-Mentos science, but real-world stuff: climate change, battery technology, the power grid, the Internet, DNA, neuroscience, medical imaging, computer hardware, energy conversion, programming, electric cars, wireless communications. We can figure out how the world around us works. It’s not magic, and when more than just technology creators understand how stuff works—when technology users get it too—innovative ideas emerge organically.

Science seems to be content with enabling, not creating, future technology. And that’s OK—the future is built on scientific progress. But the engineer in me can’t accept that. As a researcher in semiconductor devices, I straddle physics and chemistry and materials science and electrical engineering, and I can’t possibly divorce the science from the applications and still stay motivated enough to keep working on it. The thought of spending my life working on something that will never see the light of day—literally—terrifies me, and not a single day passes in which I don’t think about how I can best contribute—not just to my field, as is the nominal goal of the PhD, but to our daily lives.

Although the ivy has receded, particularly at startup-friendly institutions like Stanford and Berkeley and MIT, there’s still an unacceptably large divide between academic research and industry, between basic science and applied technology. We need researchers who are as comfortable talking to politicians and electricians and farmers as to colleagues and science reporters and the ever-vague, ill-defined “general public.” We need researchers who can and will bring to market the incredible world-changing potential that every journal paper promises. And we need non-researchers—entrepreneurs, teachers, politicians—who innovate like researchers: logically, relentlessly, radically.

That could be you.

When you think about who you want to be when you grow up, imagine telling your kids in 30 years: “I made you AND the world you live in.” Take that, Freud.
