Nine members of the US Congress have sent a letter to Google asking it to clarify the circumstances around its former ethical AI co-lead Timnit Gebru’s forced departure. Led by Representative Yvette Clarke and Senator Ron Wyden, and co-signed by Senators Elizabeth Warren and Cory Booker, the letter sends an important signal about how Congress is scrutinizing tech giants and thinking about forthcoming regulation.

Gebru, a leading voice in AI ethics and one of a small handful of Black women at Google, was unceremoniously dismissed two weeks ago, after a protracted disagreement over a research paper. The paper detailed the risks of large AI language models trained on enormous amounts of text data, which are a core line of Google’s research, powering various products including its lucrative Google Search. 

Citing MIT Technology Review’s coverage, the letter raises three issues: the potential for bias in large language models, the growing corporate influence over AI research, and Google’s lack of diversity. It asks Google CEO Sundar Pichai for a concrete plan on how the company will address each of these, as well as for its current policy on reviewing research and details on its ongoing investigation into Gebru’s exit (Pichai committed to this investigation in an internal memo, first published by Axios). “As Members of Congress actively seeking to enhance AI research, accountability, and diversity through legislation and oversight, we respectfully request your response to the following inquiries,” the letter states.

In April 2019, Clarke and Wyden introduced a bill, the Algorithmic Accountability Act, that would require big companies to audit their machine-learning systems for bias and take corrective action in a timely manner if such issues were identified. It would also require those companies to audit all processes involving sensitive data—including personally identifiable, biometric, and genetic information—for privacy and security risks. At the time, many legal and technology experts praised the bill for its nuanced understanding of AI and data-driven technologies. “Great first step,” wrote Andrew Selbst, an assistant professor at the University of California, Los Angeles School of Law, on Twitter. “Would require documentation, assessment, and attempts to address foreseen impacts. That’s new, exciting & incredibly necessary.”

The latest letter doesn’t tie directly to the Algorithmic Accountability Act, but it is part of the same move by certain congressional members to craft legislation that would mitigate AI bias and the other harms of data-driven, automated systems. Notably, it comes amid mounting pressure for antitrust regulation. Earlier this month, the US Federal Trade Commission filed an antitrust lawsuit against Facebook for its “anticompetitive conduct and unfair methods of competition.” Over the summer, House Democrats published a 449-page report on Big Tech’s monopolistic practices.

The letter also comes in the context of rising geopolitical tensions. As US-China relations have reached an all-time low during the pandemic, US officials have underscored the strategic importance of emerging technologies like AI and 5G. The letter also raises this dimension, acknowledging Google’s leadership in AI and its role in maintaining US leadership. But it makes clear that this should not undercut regulatory action, a line of argument popularized by Facebook CEO Mark Zuckerberg. “To ensure America wins the AI race,” the letter says, “American technology companies must not only lead the world in innovation; they must also ensure such innovation reflects our nation’s values.”

“Our letter should put everyone in the technology sector, not just Google, on notice that we are paying attention,” said Clarke in a statement to MIT Technology Review. “Ethical AI is the battleground for the future of civil rights. Our concerns about recent developments aren’t just about one person; they are about what the 21st century will look like if academic freedom and inclusion take a back seat to other priorities. We can’t mitigate algorithmic bias if we impede those who seek to research and study it.”

Cloud computing is at a critical juncture. Millions of companies now use it to store data and run applications and services remotely. This has reduced costs and sped operations. But a new trend threatens the benefits that cloud computing has unlocked.

“Digital sovereignty” describes the many ways governments try to assert more control over the computing environments on which their nations rely. It has long been a concern in supply chains, affecting the kinds of hardware and software available in a given market. Now it’s coming for the cloud.

Governments around the world are passing measures that require companies to host infrastructure and store certain kinds of data in local jurisdictions. Some also require companies that operate within their borders to provide the government with access to data and code stored in the cloud.

This trend, especially when applied unilaterally, erodes the fundamental model of cloud computing, which relies on free movement of data across borders. A cloud user or provider should be able to deploy any application or data set to the cloud at any time or place. And customers should be able to select the provider that can best meet their needs.

If we allow the principle of digital sovereignty to encroach further, cloud service providers will be bound by national interests, and consumers will bear significant costs. Power will be further concentrated in the hands of a few large players. And fragmentation along national lines will make it harder for anyone to solve global problems that rely on interoperable technology.

Pay to play

While the cloud and cloud-based services are theoretically available to any company in the world with internet access, digital sovereignty makes it increasingly difficult for companies in many countries to harness this powerful technology.

In Europe, concern about the dominance of US and Chinese cloud service providers has sparked efforts to create a European cloud. The GAIA-X project, for example, aims to direct European companies toward domestic cloud providers. Moreover, measures like GDPR, with its focus on data governance, give an advantage to European providers that might not otherwise be competitive.

China has long required that cloud infrastructure be hosted in China by local companies. In fact, China’s Cybersecurity Law mandates that certain data be stored on local servers or undergo a security assessment before it’s exported. A Personal Information Protection law, which is still in draft form, goes a step further by stating that China’s data rules can be enforced anywhere in the world if the data at issue describes Chinese citizens. This law would also create a blacklist prohibiting foreign entities from receiving personal data from China.

Now the United States is beginning to advance its own version of digital sovereignty. Secretary of State Mike Pompeo’s Clean Network Initiative would prohibit Chinese cloud companies from storing and processing data on US citizens and businesses. And while the Biden administration will likely roll back many actions taken under President Trump, the prospect of compelling ByteDance to sell TikTok to Oracle or run its US operations through a local partner remains on the table. This could set a dangerous precedent: the US government would be mirroring and legitimizing China’s cloud regulations, which require foreign providers to enter the market only through joint ventures with Chinese companies that own majority shares.

And in South Africa, a 2018 guideline from the South African Reserve Bank set up an approval mechanism for institutions seeking to use cloud computing, indicating that bank supervisors would “not be agreeable” if data were stored in a way that might inhibit their access to it.

If some variation of the TikTok/Oracle deal becomes the norm, it will set the stage for more governments to demand that technology providers sell a stake to a local entity, or operate through one, in exchange for market access.

Advocates of this approach argue that some degree of data sovereignty is inevitable. They say that the global internet still functions in the face of these rules, and companies continue to profit and innovate. But the fact that some companies continue to prosper under these conditions is not a persuasive argument for imposing them in the first place.

A global cloud

The trend toward digital sovereignty has unleashed a digital arms race that slows down innovation and offers no meaningful benefit to customers.

Companies like Amazon and Microsoft may well be able to afford to keep expanding their cloud computing platforms into new countries, but they are the exception. Thousands of smaller companies that provide cloud services on top of these platforms don’t have the financial or technological wherewithal to make their products available in every data center.

In Europe, for example, the GAIA-X project may only strengthen the large incumbents. And in China, the vast majority of foreign software providers have decided not to make their cloud services available there because the hurdles are too formidable. This does both Chinese customers and foreign technology providers a disservice. It also unwinds all the economic and security advantages of a global cloud.

What’s needed is for different countries to collaborate on common standards, agreeing to a set of core principles for the cloud and norms for government access to data stored there.

The OECD, for example, could do this by building on its existing privacy guidelines. The OECD’s Global Partnership on AI is one example of an initiative in a related technology area that brings together many stakeholders to develop policy.

As a starting point, the coalition could focus on a narrow subset of commercial data flows and corresponding use cases (such as those involving internal company personnel information, or cross-border contracts). Recognizing the concerns behind the drive for digital sovereignty—which may include political security, national security, and economic competitiveness—could help lay the groundwork for such an agreement. One approach might be to offer incentives for those companies that participate in such a coalition, but without blocking data flows to those that do not.

Finally, organizations such as the Cloud Security Alliance and the Cloud Native Computing Foundation can help find ways for the private sector to use cloud computing globally without being stymied by the whims of digital sovereignty.

The rules we establish today for governing cloud computing will shape the internet for years to come. To keep the benefits of this powerful technology widely available, let’s stop digital sovereignty from encroaching further still.

Michael Rawding is the founding partner of GeoFusion and the former president of Microsoft Asia. Samm Sacks is a cyber policy fellow at New America and a senior fellow at Yale Law School’s Paul Tsai China Center.

Disclosures: This article references Microsoft, which funds work at New America but did not directly support the research or writing of this article. Microsoft is a client of GeoFusion.

Nobel Prizes are rarely awarded without controversy. The prestige usually hatches a viperous nest of critics who deride the credentials of the winner, complain about the unmentioned collaborators who’ll be sidelined by history, or point to the more deserving recipients who’ve been unfairly snubbed.

So when the Norwegian committee decided to award the 2020 Nobel Peace Prize to the World Food Program, the United Nations’ food assistance agency, it was no surprise that the news was greeted with more than a few smirks and eye-rolls. 

In this case, the committee said, the prize was given because “in the face of the pandemic, the World Food Program has demonstrated an impressive ability to intensify its efforts.” Who could argue with that?

Plenty of people, it turns out. When UN bodies win the peace prize, “we are right at the edge of giving it to ‘the idea of org charts,’” quipped the Atlantic’s Robinson Meyer. “It’s a bizarre choice, and it’s a complete waste of the prize,” said Mukesh Kapila, a professor of global health at the University of Manchester. They have a point. The WFP, which provides food assistance to people in need, is the largest agency in the UN and has 14,500 employees worldwide. It won the prize for simply doing its job, argued Kapila.

And an extremely narrow interpretation of its job, at that. After all, the UN didn’t create the WFP to tackle immediate threats during an acute time of stress; its mission is to “eradicate hunger and malnutrition.” After nearly 60 years of trying to end hunger, the WFP is larger and busier today than ever before. The world’s farmers produce more than enough to feed the world, and yet people still starve. Why?

An actual mouth to feed

Hunger around the globe is getting worse, not better. It’s true that the proportion of people who regularly fail to get enough calories to live has been declining, dropping from 15% in 2000 to 8.6% in 2014. Nevertheless, that proportion has since held fairly steady, and the absolute number of undernourished people has been rising. Last year, according to the UN, 688 million people went hungry on a regular basis, up from 628.9 million in 2014. The curve is not sharp, but if current trends continue, more than 840 million people may be undernourished by 2030.
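Those figures already imply the trajectory. As a rough check, the short Python sketch below extrapolates the 2014-to-2019 increase in a straight line out to 2030, using only the numbers quoted above; the linear assumption is ours for illustration, not the UN’s actual projection method, but it lands in the same range as the 840 million figure.

```python
# Back-of-the-envelope check of the 2030 projection, using only the figures
# cited above: 628.9 million undernourished in 2014, 688 million in 2019.
# A straight-line extrapolation for illustration; not the UN's methodology.

undernourished_2014 = 628.9e6
undernourished_2019 = 688.0e6

increase_per_year = (undernourished_2019 - undernourished_2014) / (2019 - 2014)
projected_2030 = undernourished_2019 + increase_per_year * (2030 - 2019)

print(f"Average increase: {increase_per_year / 1e6:.1f} million per year")  # ~11.8
print(f"Straight-line 2030 estimate: {projected_2030 / 1e6:.0f} million")   # ~818
```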

The statistics seem abstract, but each one of these millions is an actual mouth to feed, and the hardships they undergo are very real. In his 2019 book Food or War, the Australian journalist and author Julian Cribb describes the physical process of starvation in excruciating detail. The body, he explains, devours itself in the hunt for sustenance, depleting energy levels and producing side effects like anemia, fluid build-up, and chronic diarrhea. Then “the muscles begin to waste,” he writes. “The victim becomes increasingly weak.”

“In adults, total starvation brings death within eight to twelve weeks … in children, prolonged starvation retards growth and mental development in ways from which they may never recover, even if sound nutrition is restored. In short, starvation is one of the most agonizing ways to die, both physically and mentally—far worse, indeed, than most tortures invented by cruel people, because it takes so long and involves the destruction of virtually every system in the human body.”

Today, the global antipoverty nonprofit Oxfam identifies 10 “extreme hunger hot spots” worldwide where millions of people face this abominable torture. Some are theaters of conflict—including Afghanistan, home to the longest war America has been involved in, and Yemen, where a civil war fueled by neighboring Saudi Arabia has left 80% of the country’s 24 million citizens in need of humanitarian assistance. But there are other circumstances that can bring starvation too: Venezuela’s cratering economy; South Africa’s high unemployment rates; Brazil’s years of austerity. 

And even in high-functioning industrialized countries, the threat of hunger—not just poor nutrition, but actual hunger—has been rising as a result of economic inequality. In the UK, the use of food banks has more than doubled since 2013. In the US, food insecurity is widespread, and the hardest hit are children, elders, and the poor. In Mississippi, the country’s hungriest state, one child in four is unable to consistently get enough to eat. What’s happening?

A futuristic marvel

It’s hard to comprehend, in part because the food system has been one of the greatest technological success stories of the modern world. What we eat, how it is produced, and where it comes from—all have changed dramatically in the industrial age. We have found a way to apply almost every kind of technology to food, from mechanization and computerization to biochemistry and genetic modification. These technological leaps have dramatically increased productivity and made food more reliably and widely available to billions of people.

Farming itself has become many times more efficient and more productive. In the early 1900s, the Haber-Bosch process was harnessed to capture nitrogen from the air and turn it into fertilizer at an unprecedented scale. Mechanization came quickly: in the 1930s, around one in seven farms in the US had a tractor; within 20 years, they were used by the majority of farms. This was matched by an increasing ability to redirect water supplies and tap into aquifers, helping turn some arid regions into fertile arable land. Swaths of China, Central Asia, the Middle East, and the US were transformed by huge water projects, dams, and irrigation systems. Then, in the 1960s, the American agronomist Norman Borlaug bred new strains of wheat that were more resistant to disease, ushering in the “Green Revolution” in countries like India and Brazil—a development that led Borlaug himself to win the Nobel Peace Prize in 1970.

All of this means that industrialized farmers now operate at almost superhuman levels of output compared with their predecessors. In 1920, more than 31 million Americans worked in agriculture, and the average farm was just under 150 acres. A century later, the total acreage of farmland in the US has fallen by 9%, but just one-tenth of that workforce, 3.2 million people, is employed to tend it. (There are also far fewer farms now, but they are three times larger on average.)
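The scale of that shift is easy to see from the ratios in the paragraph above. The sketch below is an illustrative calculation under those stated figures (a roughly 9% fall in total acreage against a workforce that shrank from about 31 million to 3.2 million); the absolute acreage cancels out of the ratio, so no further data is needed.

```python
# How much land each US farm worker tends today, relative to 1920, using only
# the figures above: the workforce fell from ~31 million to ~3.2 million,
# while total farmland shrank by only ~9%. Illustrative arithmetic only.

workers_1920 = 31e6
workers_2020 = 3.2e6
farmland_remaining = 0.91  # total acreage is ~91% of its 1920 extent

# Land per worker now, as a multiple of land per worker in 1920.
# (The absolute 1920 acreage cancels out of the ratio.)
land_per_worker_multiple = farmland_remaining * (workers_1920 / workers_2020)

print(f"Each farm worker tends roughly {land_per_worker_multiple:.0f}x "
      f"as much land as a century ago")  # ~9x
```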

The supply chain, too, is a futuristic marvel. You can walk into a store in most countries and buy fresh goods from all over the world. These supply chains even proved somewhat resistant to the chaos caused by the pandemic: while covid-19 lockdowns did lead to food shortages in some places, most of the empty shelves were the ones meant to hold toilet paper and cleaning products. Food supplies were more resilient than many expected.

But the mass industrialization of food and our ability to buy it has created an avalanche of unintended consequences. Cheap, bad calories have led to an obesity crisis that disproportionately affects the poor and disadvantaged. Intensive animal farming has increased greenhouse-gas emissions, since meat has a much larger carbon footprint than beans or grains.

The environment has taken a beating, too. Booms in fertilizer and pesticide use have polluted land and waterways, and the easy availability of water has led some dry parts of the world to use up their resources.

In Perilous Bounty, the journalist Tom Philpott explores California’s agricultural future. The massive water projects drawing supplies into the Central Valley, for example, have helped it become one of the world’s most productive farming regions over the past 90 years, providing around a quarter of America’s food. But those natural aquifers are now under acute pressure, overused and running dry in the face of drought and climate change. Philpott, a reporter for Mother Jones, points to the nearby Imperial Valley in Southern California as an example of this folly. This “bone-dry chunk of the Sonoran desert” is responsible for producing more than half of America’s winter vegetables, and yet “in terms of native aquatic resources, the Imperial Valley makes the Central Valley look like Waterworld.” The valley is home to California’s largest lake, the 15-mile-long Salton Sea—famously so loaded with pollutants and salt that nearly everything in it has been killed off.

This isn’t going to get better anytime soon: what is happening in California is happening elsewhere. Cribb shows in Food or War exactly how the trend lines are pointing the wrong way. Today, he says, food production is already competing for water with urban and industrial uses. More people are moving to urban areas, accelerating the trend. If this continues, he says, the proportion of the world’s fresh water supply available for growing food will drop from 70% to 40%. “This in turn would reduce world food production by as much as one-third by the 2050s—when there will be over 9 billion mouths to feed—instead of increasing it by 60% to meet their demand.”

These are all bleak predictions of future hunger, but they don’t really explain starvation today. For that, we can look at a different unexpected aspect of the 20th-century farming revolution: the fact that it didn’t happen everywhere.

Just as healthy calories are hard to come by for those who are poor, the industrialization of farming is unevenly distributed. First Western farmers were catapulted into hyper-productivity, then the nations touched by the Green Revolution. But progress stopped there. Today, a hectare of farmland in sub-Saharan Africa produces just 1.2 metric tons of grain each year; in the US and Europe the equivalent land yields up to eight metric tons. This is not because farmers in poorer regions lack the natural resources, necessarily (West Africa has long been a producer of cotton), but because they are locked into a cycle of subsistence. They haven’t industrialized, so they don’t grow much food, which means they can’t make much money, so they can’t invest in equipment, which means they can’t grow much food. The cycle continues.

This problem is exacerbated in places where the population is growing faster than the food supply (nine of the world’s 10 fastest-growing countries are in sub-Saharan Africa). And it can be compounded by sudden poverty, economic collapse, or conflict, as in Oxfam’s hot spots. These are the places where the World Food Program steps in to alleviate immediate pain, but even then it doesn’t solve the underlying problem. And that economic plight is not an accident.

A disaster for farmers worldwide

In September 2003, a South Korean farmer named Lee Kyung Hae attended protests against the World Trade Organization, which was meeting in Mexico. Lee was a former union leader whose own experimental farm had been foreclosed in the late 1990s. In an essay in the collection Bite Back (2020), Raj Patel and Maywa Montenegro de Wit recount what happened next. 

As demonstrators clashed with police, they explain, Lee climbed the barricades with a sign reading “WTO! Kills. FARMERS” hanging around his neck. On top of the fence, “he flipped open a rusty Swiss Army knife, stabbed himself in the heart, and died minutes later.”

Lee was protesting the effects of free trade, which has been a disaster for many farmers worldwide. The reason farmers in less industrialized nations can’t make much money isn’t just that they have low crop yields. It’s also that their markets are flooded with cheaper competition from overseas. 

Take sugar. After the Second World War, Europe’s sugar-beet growers were subsidized by their national governments to help ravaged countries get back on their feet. That worked, but once industrialization kicked in and production levels reached the stratosphere, they had a surplus. The answer was to export that food, but the subsidies had the effect of artificially lowering prices: British sugar farmers could sell their goods in global markets and undercut the competition. This was good news for Europeans, but terrible news for sugar producers like Zambia. Farmers in countries like Zambia were locked into subsistence, or turned away from the crops they were naturally able to produce in favor of other products.

Powerful nations continue to subsidize their farmers and distort global markets even as the WTO has forced weaker countries to drop protections. In 2020, the US spent $37 billion on such subsidies, a number that has ballooned under the last two years of the Trump administration. Europe, meanwhile, spends $65 billion each year.

Patel and Montenegro point out that much of the populist political chaos of recent years has been a result of the trade turmoil—industrial jobs lost to outsourcing, and rural protests in the US and Europe by people angry at the prospect of rebalancing a deck that has been stacked in their favor for decades.

Donald Trump, they write, “was never honest about ditching free trade,” but “the social power he stirred up in the Heartland was real. Invoking the abominations of outsourced jobs, rural depression, and lost wages, he tapped in to neoliberal dysfunction and hitched the outrage to authoritarian rule.”

All this leaves us with a bleak picture of what’s next. We have built systems that don’t just widen the gap between rich and poor but make the distance unassailable. Climate change, competition for resources, and urbanization will produce more conflict. And economic inequality, both at home and abroad, means the numbers of hungry people are more likely to rise than fall. 

A golden age, but not for everyone

So are there any answers? Can starvation ever be ended? Can we head off the approaching food and water wars?

The countless books about the food system over the past few years make it clear: solutions are easy to lay out and extraordinarily complicated to enact. 

First steps might include helping farmers in poor countries out of the trap they are in by enabling them to grow more food and sell it at competitive prices. Such a strategy would mean not only providing the tools to modernize—such as better equipment, seed, or stock—but also reducing the tariffs and subsidies that make their hard work so unsustainable (the WTO has attempted to make progress on this front). The World Food Program, for all its plaudits, needs to be part of that kind of answer—not just an org chart plugging hungry mouths with emergency rations, but a force that helps rebalance this off-kilter system. 

And food itself needs to be more environmentally sound, employing fewer tricks that increase yields at the expense of the wider ecology. No more farming oases set up in bone-dry deserts; no more Salton Seas. This is difficult, but climate change may force us to do some of it regardless.

All of this means recognizing that the golden age of farming wasn’t a golden age for everybody, and that our future may look different from what we have become used to. If so, that future might be better for those who go hungry today, and maybe for the planet as a whole. It may be hard to reckon with, but our spectacular global food system isn’t what will stop people from starving—it’s exactly why they starve in the first place. 
