Why tech didn’t save us from covid-19 – MIT Technology Review

In the US, manufacturing jobs dropped by almost a third between 2000 and 2010 and have barely recovered since. Manufacturing productivity has been particularly poor in recent years (chart 5). What has been lost is not only jobs but also the knowledge embedded in a strong manufacturing base, and with it the ability to create new products and find advanced and flexible ways of making them. Over the years, the country ceded to China and other countries the expertise in competitively making many things, including solar panels and advanced batteries—and, it now turns out, swabs and diagnostic tests too. 

No country should aim to make everything, says Fuchs, but “the US needs to develop the capacity to identify the technologies—as well as the physical and human resources—that are critical for national, economic, and health security, and to invest strategically in those technologies and assets.”  

Regardless of where products are made, Fuchs says, manufacturers need more coordination and flexibility in global supply chains, in part so they aren’t tied to a few sources of production. That quickly became evident in the pandemic; for example, US mask makers scrambled to procure the limited supply of melt-blown fiber required to make the N95 masks that protect against the virus. 

The problem was made worse because manufacturers keep inventories razor-thin to save money, often relying on timely shipments from a sole provider. “The great lesson from the pandemic,” says Suzanne Berger, a political scientist at MIT and an expert on advanced manufacturing, is “how we traded resilience for low-cost and just-in-time production.” 

Berger says the government should encourage a more flexible manufacturing sector and support domestic production by investing in workforce training, basic and applied research, and facilities like the advanced manufacturing institutes that were created in the early 2010s to provide companies with access to the latest production technologies. “We need to support manufacturing not only [to make] critical products like masks and respirators but to recognize that the connection between manufacturing and innovation is critical for productivity growth and, out of increases in productivity, for economic growth,” she says.

The good news is that the US has had this discussion during previous crises. The playbook exists.

Declaring war on the virus

In June 1940, Vannevar Bush, then the director of the Carnegie Institution for Science in Washington, DC, went to the White House to meet President Franklin D. Roosevelt. The war was under way in Europe, and Roosevelt knew the US would soon be drawn into it. As Simon Johnson and Jonathan Gruber, both economists at MIT, write in their recent book Jump-Starting America, the country was woefully unprepared, barely able to make a tank. 

Bush presented the president with a plan to gear up the war effort, led by scientists and engineers. That gave rise to the National Defense Research Committee (NDRC); during the war, Bush directed some 30,000 people, including 6,000 scientists, to steer the country’s technological development. 

The inventions that resulted are well known, from radar to the atomic bomb. But as Johnson and Gruber write, the investment in science and engineering continued well after the war ended. “The major—and now mostly forgotten—lesson of the post-1945 period is that modern private enterprise proves much more effective when government provides strong underlying support for basic and applied science and for the commercialization of the resulting innovations,” they write. 

A similar push to ramp up government investment in science and technology “is clearly what we need now,” says Johnson. It could have immediate payoffs both in technologies crucial to handling the current crisis, such as tests and vaccines, and in new jobs and economic revival. Many of the jobs created will be for scientists, Johnson acknowledges, but many will also go to trained technicians and others whose work is needed to build and maintain an enlarged scientific infrastructure.

This matters especially, he says, because with an administration that is pulling back from globalization and with consumer spending weak, innovation will be one of the few options for driving economic growth. “Scientific investment needs to be a strategic priority again,” says Johnson. “We’ve lost that. It has become a residual. That’s got to stop.” 

Johnson is not alone. In the middle of May, a bipartisan group of congressmen proposed what they called the Endless Frontier Act to expand funding for “the discovery, creation, and commercialization of technology fields of the future.” They argued that the US was “inadequately prepared” for covid-19 and that the pandemic “exposed the consequences of a long-term failure” to invest in scientific research. The legislators called for $100 billion over five years to support a “technology directorate” that would fund AI, robotics, automation, advanced manufacturing, and other critical technologies.

Around the same time, a pair of economists, Northwestern’s Ben Jones and MIT’s Pierre Azoulay, published an article in Science calling for a massive government-led “Pandemic R&D Program” to fund and coordinate work in everything from vaccines to materials science. The potential economic and health benefits are so large, Jones argues, that even huge investments to accelerate vaccine development and other technologies will pay for themselves.

Vannevar Bush’s approach during the war tells us it’s possible, though the funding needs to be substantial, says Jones. But increased funding is just part of what is required, he says. The initiative will need a central authority like Bush’s NDRC to identify a varied portfolio of new technologies to support—a function that is missing from current efforts to tackle covid-19. 

The thing to note about all these proposals is that they are aimed at both short- and long-term problems: they are calling for an immediate ramp-up of public investment in technology, but also for a bigger government role in guiding the direction of technologists’ work. The key will be to spend at least some of the cash in the gigantic US fiscal stimulus bills not just on juicing the economy but on reviving innovation in neglected sectors like advanced manufacturing and boosting the development of promising areas like AI. “We’re going to be spending a great deal of money, so can we use this in a productive way? Without diminishing the enormous suffering that has happened, can we use this as a wake-up call?” asks Harvard’s Henderson.

“Historically, it has been done a bunch of times,” she says. Besides the World War II effort, examples include Sematech, the 1980s consortium that, by sharing technological innovations and boosting investment in the sector, revived the ailing US semiconductor industry in the face of Japan’s increasing dominance.

Can we do it again? Henderson says she is “hopeful, though not necessarily optimistic.”

The test of the country’s innovation system will be whether over the coming months it can invent vaccines, treatments, and tests, and then produce them at the massive scale needed to defeat covid-19. “The problem hasn’t gone away,” says CMU’s Fuchs. “The global pandemic will be a fact of life—the next 15 months, 30 months—and offers an incredible opportunity for us to rethink the resiliency of our supply chains, our domestic manufacturing capacity, and the innovation around it.” 

It will also take some rethinking of how the US uses AI and other new technologies to address urgent problems. But for that to happen, the government has to take on a leading role in directing innovation to meet the public’s most pressing needs. That doesn’t sound like the government the US has now. 
