Writing a novel is like driving a car at night. You can see only as far as your headlights, but you can make the whole trip that way.
I think grad school works the same way.
A couple weeks ago, I ran the Reach the Beach Relay, a 200-mile footrace across Massachusetts. I’d never met anyone on my team before 9AM that Friday morning, when we gathered across from the Stata Center on Vassar St. at MIT to pile into two Dodge Grand Caravans and make our way into the wilderness beyond Greater Boston. A trunk already packed high with Nutri-Grain bars, fruit snacks, and Gatorade was further burdened by my duffel bag of running shoes and bananas.
I was enlisted for this drama by my labmate Christina, who knew a team of MIT chemists (“12 Angry Scientists”) looking for a happy engineer to fill out their roster. I had no idea what I was getting into, but signed up on a whim months ago, then promptly forgot about it in favor of working on my master’s thesis. The morning of the race, I turned in my completed SM thesis to the EECS department and clambered into Van 2 carefree.
8th in the Angry Scientist rotation, I woke up to grab the baton from #7 Jen and run my first leg, 7.53 miles from Worcester to Boylston. I knew I went out way too fast but couldn’t help it—it was one of the first hot days of the year, I’d been sitting in front of a computer writing a thesis for the past month, and a mile in I was panting like an overexcited puppy. Smooth. Luckily there was no one around to see me self-destruct—my teammates helped with some drive-by dance music—and after 52 minutes of contemplative misery, I rolled into transition area 9 under my own power.
Apparently my preparation was lacking; I’ve run a few marathons (26.2 miles) and half-marathons (13.1) over the years, so I was ready for a calm 7.5- or 8-minute-per-mile pace. Although my total distance here (~22 miles) was similar to a marathon, it was split up into three frantic 6-8 mile races, so I felt compelled to run hard from the get-go rather than pacing comfortably and running to finish. Getting used to the pace was the second-hardest part of this race. The hardest was timing: When to eat, when to drink, when to sleep, when to stretch, when to get warmed up, and—most importantly—when to take a seat on a nearby toilet. No joke. Coordinating alimentary intake and inevitable emission over 24 hours of running is an engineering task far beyond my abilities. Our team captain Kit solved the problem in finest MIT fashion: He simply contracted food poisoning, so that everything he ate came right back up—no need to digest. Brilliant.
Our van pulled into Horseneck Beach in southern Massachusetts around 1:30PM on Saturday, with our intrepid driver and final runner Yifeng in hot pursuit. 12 Angry Scientists joined in for the last 100 meters of the race, crossing the finish line after more than 25 hours in transit. We were met with medals and Boloco burritos, and drove back to Boston in delirium.
Reach the Beach 2013 exceeded all my expectations for a relay race. Running for and with a team is infinitely more fun than running alone: you can cover a lot more distance, you get to travel with a built-in fan base, and there’s always someone around to feed you. This particular race was an incredible opportunity to get out of Boston and explore the New England countryside in all its glory: I’ve now used Porta-Potties all across Massachusetts. Who’s in for next year?
The photos in this post were taken by me, Andrew, Jen, and Monica. Thanks!
I went to a talk earlier this week by Robert Jaffe, an MIT physics professor. Jaffe does research on particle physics and quantum field theory, but this talk was on energy-critical elements (ECEs)—chemical elements that (a) are critical for one or more new energy-related technologies and (b) don’t currently have established markets and hence might not be available in large quantities in the future.
For renewable energy technologies, materials availability is particularly crucial because renewable resources tend to be diffuse. Consider solar energy: In a typically sunny place like the Bay Area, light from the sun reaches the Earth with an average power density of ~200 W per square meter. With a record-efficiency 1-square-meter solar panel (and high-efficiency storage), you might be able to power one incandescent light bulb around the clock. Not exactly awe-inspiring. (I recommend LED lighting anyway.) The amount of power we can get from a renewable resource is proportional to the area we can cover with solar panels or wind turbines or tidal energy harvesters, which in turn is proportional to the amount of material we need. To scale up renewable generation is to scale up production of the materials used in renewable generation technology—hence the importance of energy-critical elements.
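As a back-of-the-envelope check on that solar example: the ~40% cell efficiency and 90% storage round-trip efficiency below are my own optimistic assumptions for illustration, not figures from the talk.

```python
# Continuous power from 1 m^2 of a record-efficiency solar panel plus storage.
# All inputs are rough, optimistic assumptions for illustration only.
avg_insolation = 200.0     # W/m^2, time-averaged for a sunny locale
panel_area = 1.0           # m^2
cell_efficiency = 0.40     # assumed record multi-junction cell efficiency
storage_efficiency = 0.90  # assumed round-trip storage efficiency

continuous_watts = avg_insolation * panel_area * cell_efficiency * storage_efficiency
print(f"Continuous power: {continuous_watts:.0f} W")  # roughly one incandescent bulb
```

Even with these generous numbers, a square meter of panel buys you about one light bulb’s worth of round-the-clock power.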
Here are a few examples of current energy-critical elements and why we care about them:
(1) Tellurium (Te) – Used in thin-film solar cells (e.g., CdTe)
(2) Neodymium (Nd) and praseodymium (Pr) – Used in wind turbines
(3) Terbium (Tb) and europium (Eu) – Used for lighting and displays
(4) Rhenium (Re) – Used in advanced high-performance gas turbines
(5) Helium (He) – Used in cryogenics and many research applications
(6) Indium (In) – Used as a transparent conductor in touchscreens, TV displays, and modern thin-film solar cells
(7) Lithium (Li) and lanthanum (La) – Used in high-performance batteries
(8) Platinum (Pt), palladium (Pd), and other platinum group elements – Used as catalysts in fuel cells for transportation
Keep in mind that the “energy-critical” label doesn’t reflect any fundamental difference between elements inside and outside this classification, and the subset of elements designated as ECEs will change as energy technologies and our knowledge of materials availability evolve.
Side note: Many “rare-earth elements” are energy-critical elements as well, but are actually more common in the Earth’s crust than most of the above elements; REEs are “rare” simply because they don’t exist as free elements or in concentrated ores that are easily accessible.
Side note 2: Some unlisted elements (e.g., Cu, Al, Si) are also critical to energy technologies, but they have developed markets, exist around the world—i.e., geopolitical issues have less impact on their supply—and are used in many other applications, such that substitutes could be found for those applications and additional supply made available for energy applications if necessary.
Energy-critical elements might not be available because they just aren’t very abundant in the Earth’s crust—the only part of our planet we can reach right now. The crust is mostly made of oxygen, silicon, and aluminum; all other elements exist in very low concentrations and are often hard to isolate and extract. That said, the absolute availability of elemental resources probably shouldn’t be our primary concern. A more insidious barrier to the development of new energy technologies is short-term disruption in the supply of ECEs. Supply volatility causes prices to fluctuate, which in turn disrupts long-term extraction efforts and hinders large-scale deployment of technologies that depend on ECEs.
What constraints could disrupt ECE supply?
One primary peril is geopolitics. When ECE production is concentrated in only a few places in the world, international politics and trade restrictions can dictate the market, leaving supply vulnerable to disruption. Take platinum: The vast majority of global reserves of platinum-group metals are concentrated in the Bushveld Complex of South Africa. Technical, social, and political instability in South Africa could thus disrupt the availability of platinum, palladium, and other critical elements. Another example is China’s 2010 decision to restrict exports of rare-earth elements, a market in which China enjoys a near-monopoly thanks to its geological resources and low-cost production. Rare-earth element prices spiked briefly before the market readjusted to the current and future possibility of limited supply.
But despite all the political talk about energy independence, keep in mind that the US currently imports over 90% of the energy-critical elements it consumes, and that’s not a bad thing. Different regions have different comparative advantages in the production of ECEs, and only through trade are efficient markets achieved. Complete ECE independence is simply not possible—e.g., we have no viable source of platinum—and even partial independence is not possible without sacrificing many modern technologies. Consider food markets: Can you imagine life in a food-independent US? We can’t grow nearly enough bananas, mangoes, cashews, coffee, or cacao to satisfy our massive national appetite, nor would we want to do without them. Why then do we expect full ECE independence?
Another potential risk for disruption lies in the joint production of energy-critical elements—particularly In, Ga, and Te—with conventional ores. Nearly all ECEs are extracted as byproducts of the mining and refining of major metals (e.g., Ni, Fe, Cu) with much higher production volumes and more established markets. The problem then is that the demand for ECEs does not drive production: Their availability is thus constrained by how much of the ECE is contained in the ore of the primary product, and supply is dictated by economic decisions based on the primary product rather than the ECE. This lack of market control renders ECE prices subject to the whims and fancies of major metal markets.
Adding to the uncertainty in ECE availability are the artificially low prices made possible by joint production: Since ECEs piggyback on the mining infrastructure already in place for major metal production, their prices don’t reflect many of the fixed costs of mining and refining. ECE prices will remain artificially low until by-production saturates—i.e., when demand grows large enough that byproduct supply can’t keep up, making independent mining of the ECE profitable. At that inflection point, however, new energy technologies developed and assessed at current ECE prices may not be able to afford the much higher true price and will thus fail. One example is tellurium, which costs around $150 per kilogram and exists at ~1 ppb abundance in the crust. For comparison, consider platinum, which costs around $50,000 per kilogram despite its ~4 ppb crustal abundance—tellurium is rarer than platinum, yet orders of magnitude cheaper. Why is tellurium so cheap? It turns out tellurium is a byproduct of the electrolytic refining of copper, and the large market for copper keeps the supply of the tellurium byproduct sufficient to meet current global demand.
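To make the Te/Pt anomaly concrete, here’s a quick sanity check; the prices and abundances are the rough order-of-magnitude figures from the paragraph above, not live market data.

```python
# Price vs. crustal abundance for tellurium and platinum.
# Rough, order-of-magnitude figures for illustration only.
te_price, te_ppb = 150, 1      # $/kg, crustal abundance in ppb
pt_price, pt_ppb = 50_000, 4   # $/kg, ppb

price_ratio = pt_price / te_price   # Pt costs ~300x as much as Te...
abundance_ratio = pt_ppb / te_ppb   # ...yet Pt is ~4x MORE abundant
print(f"Pt/Te price ratio: {price_ratio:.0f}x, abundance ratio: {abundance_ratio:.0f}x")
```

If scarcity alone set prices, the rarer element would be the pricier one; byproduct economics inverts that entirely.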
Other risk factors for ECE availability include environmental and social concerns—the refinement of ECEs (e.g., rare-earths) is often a highly destructive process involving unpleasant chemicals, which could make ECE availability subject to environmental policy—and long response times for extraction—it typically takes 5-15 years to bring new mines online, which may be too slow to keep up with the deployment of novel energy technologies.
So what can we do about it?
Large-scale coordination by the government is needed to attack so complex a problem as energy-critical element availability. Providing reliable and up-to-date information on the availability of ECEs to researchers and investors will go a long way toward improving the current situation: With sufficient information, we can shift research efforts toward energy technologies with ECE needs that coincide with ECE availability.
Another potential response is to increase efforts to recycle ECEs. Recycling all ECE-containing products could reduce our dependence on new resources. Consider cell phones: Modern mobile devices contain 40 or more chemical elements—a majority of the stable elements in the periodic table—and most end up in the back of desk drawers at the end of each 2-year contract cycle. But recycling isn’t a feasible option when the market is growing at all, much less exponentially. Assuming the same efficiency of use over time—e.g., the same amount of Te will be needed to produce a CdTe solar cell with a fixed power output now and 20 years from now—recycling can never keep up with increasing material demands, even with 100% recycling efficiency: the material recovered from retiring products can only supply demand at yesterday’s smaller scale.
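That recycling argument can be sketched in a few lines. This is a minimal model under assumed numbers (10% annual demand growth, 20-year product lifetime): material recovered from products made L years ago can at best cover the demand level of L years ago.

```python
# Minimal sketch: with demand growing g per year and a product lifetime
# of L years, perfect recycling of retiring products covers at most
# demand(t - L) / demand(t) = 1 / (1 + g)^L of current demand.
growth = 0.10    # assumed 10% annual growth in deployment
lifetime = 20    # assumed 20-year product lifetime (e.g., a solar panel)

recycled_fraction = 1.0 / (1.0 + growth) ** lifetime
print(f"Perfect recycling covers at most {recycled_fraction:.0%} of current demand")
```

Under these assumptions, even 100%-efficient recycling supplies well under a fifth of current demand; the rest must come from new extraction.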
The take-home message: Many new energy technologies rely heavily on a subset of chemical elements (e.g., He, Li, Te, rare-earths). These “energy-critical elements” (ECEs) are not currently produced in large quantities, and thus their future availability is highly unpredictable and dependent on complex economic, environmental, and geopolitical factors. A shortage of these elements could inhibit the large-scale deployment of promising solutions to the world’s energy needs. We need more people and more money dedicated to identifying potential substitutes, informing researchers and the public about ECE issues, and improving the efficiency with which we extract, use, and reclaim these elements.
Check out the full APS/MRS report if you’re interested in finding out more!
The WordPress.com stats helper monkeys prepared a 2012 annual report for this blog.
Here’s an excerpt:
4,329 films were submitted to the 2012 Cannes Film Festival. This blog had 16,000 views in 2012. If each view were a film, this blog would power 4 Film Festivals
It’s been a while since I’ve done any writing around these parts.
For those who haven’t been privy to my big, dirty secrets, I’ve been neck-deep in research and relearning how to learn as a first year grad student in EE at MIT. Anyway, I promised myself that I would start writing for fun (i.e., here) again after finishing a manuscript I’ve been working on for the last few months. It’s not quite there yet, but if I’m cheating, it’s for a damn good cause…
A fellow Stanford alum and friend, Emanuel Pleitez, is running for mayor of LA this year. Emanuel’s an awesome guy. He knows the city—grew up there, went to high school there, came back there to serve. Then again, I don’t know if I could vote for him as a politician—he’s too young, too empathetic, too hard-working, too efficient, too passionate about what he does. Oh wait…
You can learn more about Emanuel or read up on the issues, but in my mind, all you need to know is that he’s as untraditional a political candidate as you’ll find anywhere. I spoke to him on the phone earlier this week, and he’s running his campaign the way I—or any other young, optimistic, tech-savvy citizen—would run a campaign. And that’s a good thing. He’s a real guy, running for a real office, hoping to do real things. I don’t have a vote in LA right now, but if you do, do yourself and your city a favor and vote for Emanuel.
Feel free to contribute something to his campaign here—I just did. The amount doesn’t matter; it’s your name, your voice, that makes a difference.
P.S. If you’re interested in helping lead Emanuel’s campaign on the ground, check out this cool fellowship opportunity.
This thought experiment is from XKCD’s new “What If” blog, which considers the consequences of various unlikely (read: impossible) scenarios.
What would happen if you tried to hit a baseball pitched at 90% the speed of light?
– Ellen McManis
Let’s set aside the question of how we got the baseball moving that fast. We’ll suppose it’s a normal pitch, except in the instant the pitcher releases the ball, it magically accelerates to 0.9c. From that point onward, everything proceeds according to normal physics:
The answer turns out to be “a lot of things”, and they all happen very quickly, and it doesn’t end well for the batter (or the pitcher). I sat down with some physics books, a Nolan Ryan action figure, and a bunch of videotapes of nuclear tests and tried to sort it all out. What follows is my best guess at a nanosecond-by-nanosecond portrait:
The ball is going so fast that everything else is practically stationary. Even the molecules in the air are stationary. Air molecules vibrate back and forth at a few hundred miles per hour, but the ball is moving through them at 600 million miles per hour. This means that as far as the ball is concerned, they’re just hanging there, frozen.
The ideas of aerodynamics don’t apply here. Normally, air would flow around anything moving through it. But the air molecules in front of this ball don’t have time to be jostled out of the way. The ball smacks into them so hard that the atoms in the air molecules actually fuse with the atoms in the ball’s surface. Each collision releases a burst of gamma rays and scattered particles.
These gamma rays and debris expand outward in a bubble centered on the pitcher’s mound. They start to tear apart the molecules in the air, ripping the electrons from the nuclei and turning the air in the stadium into an expanding bubble of incandescent plasma. The wall of this bubble approaches the batter at about the speed of light—only slightly ahead of the ball itself.
The constant fusion at the front of the ball pushes back on it, slowing it down, as if the ball were a rocket flying tail-first while firing its engines. Unfortunately, the ball is going so fast that even the tremendous force from this ongoing thermonuclear explosion barely slows it down at all. It does, however, start to eat away at the surface, blasting tiny particulate fragments of the ball in all directions. These fragments are going so fast that when they hit air molecules, they trigger two or three more rounds of fusion.
After about 70 nanoseconds the ball arrives at home plate. The batter hasn’t even seen the pitcher let go of the ball, since the light carrying that information arrives at about the same time the ball does. Collisions with the air have eaten the ball away almost completely, and it is now a bullet-shaped cloud of expanding plasma (mainly carbon, oxygen, hydrogen, and nitrogen) ramming into the air and triggering more fusion as it goes. The shell of x-rays hits the batter first, and a handful of nanoseconds later the debris cloud hits.
When it reaches the batter, the center of the cloud is still moving at an appreciable fraction of the speed of light. It hits the bat first, but then the batter, plate, and catcher are all scooped up and carried backward through the backstop as they disintegrate. The shell of x-rays and superheated plasma expands outward and upward, swallowing the backstop, both teams, the stands, and the surrounding neighborhood—all in the first microsecond.
Suppose you’re watching from a hilltop outside the city. The first thing you see is a blinding light, far outshining the sun. This gradually fades over the course of a few seconds, and a growing fireball rises into a mushroom cloud. Then, with a great roar, the blast wave arrives, tearing up trees and shredding houses.
Everything within roughly a mile of the park is leveled, and a firestorm engulfs the surrounding city. The baseball diamond is now a sizable crater, centered a few hundred feet behind the former location of the backstop.
A careful reading of official Major League Baseball Rule 6.08(b) suggests that in this situation, the batter would be considered “hit by pitch”, and would be eligible to advance to first base.
I ran across these “MIT Burnout Prevention and Recovery Tips” the other day:
1) AVOID ISOLATION. Don’t do everything alone! Develop or renew intimacies with friends and loved ones. Closeness not only brings new insights, but also is anathema to agitation and depression.
2) CHANGE YOUR CIRCUMSTANCES. If your job, your relationship, a situation, or a person is dragging you under, try to alter your circumstance, or if necessary, leave.
3) DIMINISH INTENSITY IN YOUR LIFE. Pinpoint those areas or aspects which summon up the most concentrated intensity and work toward alleviating that pressure.
4) STOP OVERNURTURING. If you routinely take on other people’s problems and responsibilities, learn to gracefully disengage. Try to get some nurturing for yourself.
5) LEARN TO SAY “NO”. You’ll help diminish intensity by speaking up for yourself. This means refusing additional requests or demands on your time or emotions.
6) BEGIN TO BACK OFF AND DETACH. Learn to delegate, not only at work, but also at home and with friends. In this case, detachment means rescuing yourself for yourself.
7) REASSESS YOUR VALUES. Try to sort out the meaningful values from the temporary and fleeting, the essential from the nonessential. You’ll conserve energy and time, and begin to feel more centered.
8) LEARN TO PACE YOURSELF. Try to take life in moderation. You only have so much energy available. Ascertain what is wanted and needed in your life, then begin to balance work with love, pleasure, and relaxation.
9) TAKE CARE OF YOUR BODY. Don’t skip meals, abuse yourself with rigid diets, disregard your need for sleep, or break doctor appointments. Take care of yourself nutritionally.
10) DIMINISH WORRY AND ANXIETY. Try to keep superstitious worrying to a minimum – it changes nothing. You’ll have a better grip on your situation if you spend less time worrying and more time taking care of your real needs.
11) KEEP YOUR SENSE OF HUMOR. Begin to bring joy and happy moments into your life. Very few people suffer burnout when they’re having fun.
One of the most self-damning flaws of scientific research is that, except in the rarest of cases, you don’t get to see the true impact of your current work until much, much later in life. Still. How awesome would it be to be Tim Berners-Lee right now? “I invented the Internet.” Or Thomas Edison: “I invented the light bulb.” Or Freud: “I invented sexy thoughts.”
Take Berners-Lee and the World Wide Web. Back in 1989, computers were clunky, command-line machines that couldn’t talk to each other, at least not in any significant way. Sir Tim Berners-Lee changed that. As a young scientist at CERN, he saw an opportunity to combine existing computer networking protocols—the Internet—with the newfangled concept of hypertext, and out popped the World Wide Web—the “Internet.” Now, at age 56, Berners-Lee gets industry awards, a knighthood, honorary doctorates left and right—but not enough to inspire the masses. The young rarely recognize what they’re missing, and the promise of unlimited speaking engagements at universities and conferences 20 years down the line won’t push today’s teenagers from TVs to test tubes. Maybe the prospect of being knighted will do the trick. But I doubt it. If professions were subject to natural selection, researchers would be extinct, for an ironic lack of reproducibility.
Other technical people are generally a bit quicker than the public to catch on to the significance of a scientific breakthrough—surprise—but even so, it’s only within the scientific community—a tiny fraction of it at that—that any such recognition resonates. Maybe that’s why relatively few young Americans today are excited about research. They don’t care about recognition from the scientific community—why should they? From the outside looking in, the community is small, quirky, and rarely produces a viral YouTube video or Top 10 hit.
While science brings only forever-delayed gratification, working at Apple, Google, or Intel lets you point to an iPhone or new search feature or computer and say “I created that”—sure, with 100 other people, but what of it? Siri’s still pretty damn cool. Our current Internet-dominated era has that advantage, twofold: Anyone can learn to program and create an iPhone app or website—low barrier to creation—and anyone can find their work going viral via YouTube or Reddit—low barrier to recognition. It’s a simple feedback cycle—create, be recognized for creation—and few can resist its temptations. It’s hard to overstate how good it feels to be able to say “I created that”—for many people, it makes all the hard work worth it. That’s what drives them to work late nights and weekends. That’s what makes them say, “I love my job!” and truly mean it.
But imagine going to Google to work on Android, then finding out after a year on the job that it won’t be released for 20 more years, and even then with only 10% probability. You’ll have to wait two decades before anyone knows what the hell you’re talking about: “Hold on… You make androids? Is that ethical?” Until then, it’s all blank stares and polite smiles and changed subjects. I mean, it sure does look promising, but can I get it on Amazon?
Read any popular science article: “Scientists warned, ‘This is an extremely promising breakthrough, but it’s at least 5-10 years away from commercial deployment'” (see ScienceDaily or MIT’s Technology Review for more egregious real-world examples). And while it may be honest science journalism, Teenage Me hears that and thinks, “10 years—that’s half my life! Where did I put that Google offer letter?” That’s the burden of the scientific profession, the psychological barrier to entry that pushes many away from research careers, perhaps after a first unfulfilling undergrad research experience where feedback was lacking and progress was uncertain.
So what can we do about it?
Universities can encourage faculty and graduate students to take extended leave from their home institutions to work in the private sector, to start companies, to get involved in public policy. Research institutions can raise salaries for research scientists and other technical staff. Researchers can eradicate the academic superiority complex.
Government can fund more research, more education, more graduate and postdoctoral fellowships. Forward-thinking politicians can create more research jobs that don’t require a PhD.
The rest of us can learn some science—not Alka-Seltzer volcanoes and Coke-and-Mentos science, but real-world stuff: climate change, battery technology, the power grid, the Internet, DNA, neuroscience, medical imaging, computer hardware, energy conversion, programming, electric cars, wireless communications. We can figure out how the world around us works. It’s not magic, and when more than just technology creators understand how stuff works—when technology users get it too—innovative ideas emerge organically.
Science seems to be content with enabling, not creating, future technology. And that’s OK—the future is built on scientific progress. But the engineer in me can’t accept that. As a researcher in semiconductor devices, I straddle physics and chemistry and materials science and electrical engineering, and I can’t possibly divorce the science from the applications and still stay motivated enough to keep working on it. The thought of spending my life working on something that will never see the light of day—literally—terrifies me, and not a single day passes in which I don’t think about how I can best contribute—not just to my field, as is the nominal goal of the PhD, but to our daily lives.
Although the ivy has receded, particularly at startup-friendly institutions like Stanford and Berkeley and MIT, there’s still an unacceptably large divide between academic research and industry, between basic science and applied technology. We need researchers who are as comfortable talking to politicians and electricians and farmers as to colleagues and science reporters and the ever vague and ill-defined “general public.” We need researchers who can and will bring to market the incredible world-changing potential that every journal paper promises. And we need non-researchers—entrepreneurs, teachers, politicians—who innovate like researchers: logically, relentlessly, radically.
That could be you.
When you think about who you want to be when you grow up, imagine telling your kids in 30 years: “I made you AND the world you live in.” Take that, Freud.
The WordPress.com stats helper monkeys prepared a 2011 annual report for this blog.
Here’s an excerpt:
The concert hall at the Sydney Opera House holds 2,700 people. This blog was viewed about 9,800 times in 2011. If it were a concert at Sydney Opera House, it would take about 4 sold-out performances for that many people to see it.
Take a second to sign this letter to Congress in support of continued funding for scientific research. It’s worth it.
To: The United States Congress Joint Select Committee on Deficit Reduction
America’s science and engineering graduate students need your help. Our country is on the precipice: with US finances in a desperate position, upcoming decisions will determine the shape of our nation for decades to come. We urge you to seek common ground in Congress to preserve the indispensable investments in science and engineering research that will drive our nation’s prosperity for generations. We urge you to avoid any cuts in federally funded research.
We could reiterate that scientific progress and technological innovation have kept the US at the head of the global economy for over half a century. We could remind you that rapid changes in health technology, information security, globalization, communications, artificial intelligence, and advanced materials make scientific and technological progress more critical than ever. We could warn you that our global competitors are ramping up investments in research and development, inspired by our own rise to economic superpower. But all this is well established. Instead, we’d like to discuss a crucial element of research funding that is often overlooked: human capital.
Over half a million graduate students and postdoctoral associates study science and engineering in the US. These researchers form the bedrock labor force of the world’s best university R&D community. The value of these graduate students is not limited to the experiments they run and the papers they publish. Researchers in science and engineering learn to develop and implement long-term strategies, monitor progress, adapt to unexpected findings, evaluate their work and others’, collaborate across disciplines, acquire new skills, and communicate to a wide audience. Scientists and engineers don’t just get good jobs; they create good jobs, enabling their employers to produce the innovative products and services that drive our economic growth. Every science and engineering graduate represents a high-return investment in human capital, one impossible without federal support.
Federal research funding is essential to graduate education because research is our education. Over 60% of university research is federally funded; private industry, although it dominates the development stage, accounts for only 6% of university research. America must remain competitive in the global economy, and we cannot hope to do that by paying the lowest wages. We will never win a race to the bottom. Instead, we must innovate, and train the next generation of innovators. Innovation drives 60% of US growth. Economists estimate that if our economy grew just half a percent faster than forecast for 20 years, the country would face half the deficit cutting it faces today.
Does federal research funding promote innovative technology and groundbreaking scientific progress? Absolutely. It also provides our economy with the most versatile, skilled, motivated, and creative workers in the world. We graduate students understand the severity of the fiscal crisis facing our country. Our sleeves are rolled up; we’re ready to be part of the solution. But we need your help. Congress’s goal in controlling our deficit is to protect America’s future prosperity; healthy federal research funding is essential to that prosperity. In the difficult months ahead, we ask you to look to the future and protect our crucial investments in R&D.
America’s Science and Engineering Graduate Students
- National Academy of Sciences, National Academy of Engineering, and Institute of Medicine: Rising Above the Gathering Storm. http://www.nap.edu/catalog.php?record_id=11463
- National Academy of Sciences, National Academy of Engineering, and Institute of Medicine: Rising Above the Gathering Storm, Revisited: Rapidly Approaching Category 5. http://www.nap.edu/catalog.php?record_id=12999
- National Science Board: Science and Engineering Indicators 2010. http://www.nsf.gov/nsb/sei/
- American Association for the Advancement of Science: The US Research and Development Investment. http://www.aaas.org/spp/rd/presentations/
- National Science Foundation: Science and Engineering Indicators 2010. http://www.nsf.gov/statistics/seind10/
- American Association for the Advancement of Science et al.: Letter to the Joint Select Committee on Deficit Reduction. http://www.aau.edu/WorkArea/DownloadAsset.aspx?id=12780
- National Science Foundation: Graduate Students and Postdoctorates in Science and Engineering. http://www.nsf.gov/statistics/nsf11311/
- National Science Foundation: Science and Engineering Indicators 2010, page 5-14. http://www.nsf.gov/statistics/seind10/
- Robert M. Solow (Prof. of Economics, MIT), Growth Theory: An Exposition (Oxford Univ. Press, New York, Oxford, 2nd edition 2000), pp. ix-xxvi (Nobel Prize Lecture, Dec. 8, 1987)
- David Leonhardt, “One Way to Trim the Debt: Cultivate Growth”, NY Times, Nov. 10, 2010 (see also work by economists Alan Auerbach and William Gale)