Manufacturing in the United States is in trouble.
That’s bad news not just for the country’s economy but for the future of innovation.
In a hangarlike building where General Electric once assembled steam turbines, a $100 million battery manufacturing facility is being constructed to make products using a chemistry never before commercialized on such a large scale. The sodium–metal halide batteries it will produce have been tested and optimized over the last few years by a team of materials scientists and engineers at GE’s sprawling research center just a few miles away. Now some of the same researchers are responsible for reproducing those results in a production facility large enough to hold three and a half football fields.
The engineers have moved from the bucolic research center, which sits on a hill overlooking the Mohawk River, down to the manufacturing site, which abuts the river at the edge of Schenectady, New York, a working-class town known in its heyday as Electric City. There, they supervise the installation and testing of robotics, high-temperature kilns, and analytic equipment that will monitor the production process. The new batteries use an advanced ceramic as an electrolyte inside a sealed metal case containing nickel chloride and sodium; the technology promises to store three times as much energy as the lead-acid batteries used in data centers, in heavy-duty electric vehicles, and for backup power. But almost anything can go wrong. If, say, the particles that make up the ceramic are uneven in size or haven’t been properly dried, battery performance could fall short. That means the conditions in the huge factory must be tightly controlled, and multi-ton devices must be able to match the exactness of lab equipment. “It’s not for the weak of heart,” says Michael Idelchik, GE’s vice president of advanced technologies.
The GE plant is one of a number of facilities around the country producing new technologies for rapidly growing markets in advanced batteries, electric vehicles, and solar power—but those efforts cannot counter the reality that the U.S. manufacturing sector is in trouble. After decades of outsourcing production in an effort to lower costs, many large companies have lost the expertise for the complex engineering and design tasks necessary to scale up and produce today’s most innovative new technologies, not to mention the appetite for the risks involved.
If you believe Thomas Friedman’s assertion that “the world is flat,” and that moving manufacturing to places where production is cheap makes companies more competitive, such a shift might not matter beyond its implications for the U.S. economy and its workers. But the United States remains the world’s most prolific source of new technologies, particularly materials-based ones, and evidence is growing that its diminished manufacturing capabilities could severely cripple global innovation. There are ample reasons to believe that the model of the U.S. computer industry—which has successfully outsourced much of its production in the last few decades and made design, not manufacturing, its priority—will not work effectively for companies trying to commercialize innovations in energy, advanced materials, and other emerging sectors.
Academic researchers have begun documenting the complex connections between innovation and manufacturing with an eye to clarifying how the loss of U.S. manufacturing could affect the emergence of new technologies. Willy Shih, a professor of management at Harvard Business School, has created a list of basic technologies in which the United States has squandered its lead in manufacturing in recent years. They include crystalline silicon wafers, LCDs, power semiconductors for solar cells, and many types of advanced batteries. And he has detailed how losing the “industrial commons”—the research know-how, engineering skills, and manufacturing expertise needed to make a specific technology—can often mean losing the knowledge and incentives to create advances in related technologies. For example, as silicon semiconductor production and associated supply chains have shifted to Asia, the development of new silicon-based solar cells has been hampered in the United States.
It turns out it’s not necessarily true that innovative technologies will simply be manufactured elsewhere if they aren’t made in the United States. According to research by Erica Fuchs, an assistant professor at Carnegie Mellon University, the development of integrated photonics, in which lasers and modulators are squeezed onto a single chip, has been largely abandoned by optoelectronic manufacturers as they have moved production away from the United States. Many telecom firms were forced to seek lower-cost production in East Asia after the industry’s collapse in the early 2000s, and differences in manufacturing practices meant that producing integrated photonic chips was not economically viable in those countries. Thus a technology that once appeared to be just a few years away from revolutionizing computers and even biosensors was forsaken. Economists might argue that we don’t care where something is produced, says Fuchs, but location can profoundly affect “the products that you choose to make and the technology trajectory itself.”
For many people in industry, the connections between innovation and manufacturing are a given—and a reason to worry.
via Technology Review, by David Rotman