Your Computer Is on Fire

MIT Press

15 min. read
7 key ideas
Audio & Text

What's it about?

For technology to benefit all, people must stop seeing it as inherently unbiased.


Editorial Rating

9

Qualities

  • Applicable
  • Eye Opening
  • Hot Topic

Recommendation

Technology touches pretty much every aspect of people’s lives today, yet few stop to think about its origins or where it might take society. Many forget that humans are involved at every stage of technological design, development and deployment, and that these individuals’ worldviews shape the technology they create. This collection of essays by authors from social science and STEM backgrounds argues against the view that technology is inherently “neutral,” and reveals the biases and power structures built into technological designs and tools.

Take-Aways

  • Technological progress without input from the social sciences is dangerous because it ignores technology’s political, social and economic implications.
  • The seemingly virtual world of e-commerce and technology is profoundly physical.
  • Artificial intelligence is never purely artificial – it always depends on human input.
  • Sexism brought the UK computing industry to its knees.
  • Technology is neither neutral nor objective, but, rather, often reinforces negative racial and gender stereotypes.
  • The QWERTY keyboard serves as a site of both exclusion and creative possibility.
  • Contrary to what the tech industry wants you to believe, it is not a meritocracy.

Summary

Technological progress without input from the social sciences is dangerous because it ignores technology’s political, social and economic implications.

Technological development always has the potential to improve lives. But even when an innovation aims to solve a serious societal problem, it often generates unintended consequences: Google played a key role in mapping the internet for the masses, but it also opened the door to invasive data mining and user privacy violations. Usually, people with little or no power in society suffer the most negative fallout from new technologies. 

“We are witnessing a period in which it is becoming ever more urgent to recognize that technological progress without social accountability is not real progress.”

Tech developers’ lack of concern for the widespread effects of their creations stems, in part, from a lack of input from other disciplines, such as the social and political sciences. Leaving technical experts without sufficient knowledge of history or psychology – or even just a sufficient incentive to fact-check – in charge of systems that touch every aspect of people’s lives can lead to dangerous results. Think, for example, of how a lack of governmental oversight allowed Boeing to install flawed autopilot systems in its 737 Max airliners – resulting in the deaths of hundreds of people.

Many of today’s technologies and systems have power disparities and discrimination built into their makeup. Facebook founder Mark Zuckerberg’s first social media platform was voyeuristic and objectified his female classmates, so Facebook’s problematic history of privacy violations should not come as a surprise. Early computers and technology helped win wars and put people into space, but people often forget the women – in particular the Black women – involved in these ventures. Computing history is therefore also a history of the dominance of affluent white males.

The seemingly virtual world of e-commerce and technology is profoundly physical.

The cloud is a collection of technology resources that people can access via the internet, such as web hosting, server-based applications or data warehousing. People often think of the cloud as something without any physical presence, because of the seamless service it usually provides, and the invisibility of the labor and equipment involved. Yet the cloud is essentially a large factory, with a complex and extensive physical infrastructure. It is one of the largest consumers of electricity and water, and one of the largest sources of pollution in the world. Cloud servers and computers depend on rare materials, such as lithium, tin or cobalt, which companies source from all over the world.

“In rendering invisible the material infrastructure that makes possible the digital economy, the metaphor of the cloud allows the computer industry to conceal and externalize a whole host of problems, from energy costs to e-waste pollution.”

The cloud’s global supply chain raises political, social and environmental issues, ranging from worker safety to political and economic relations between countries. It also raises questions as to why big e-commerce companies are exempt from certain social, political and environmental controls. For example, despite being, in essence, a modern version of a mail-order catalog firm, Amazon receives tax subsidies by self-identifying as an e-commerce entrepreneur.

Artificial intelligence is never purely artificial – it always depends on human input.

All social media sites rely on artificial intelligence (AI) to monitor and moderate content. They use basic AI to filter keywords, IP addresses and URLs, as well as much more advanced tools such as hashing technologies, sentiment and forecasting tools, pixel analysis and machine learning. The use of AI might appear to remove potential biases from the process – but instead, it removes accountability. People forget that human decisions and actions lie behind the algorithms, programs and processes that comprise AI.
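
Even the most basic layer of automated moderation reflects human choices. A minimal Python sketch of a keyword-and-URL filter – with entirely hypothetical blocklists and post fields, not any platform’s real rules – might look like this:

```python
# Minimal sketch of rule-based content filtering, the simplest layer of
# automated moderation. Blocklists and post structure are hypothetical.
BLOCKED_KEYWORDS = {"bannedword", "scamoffer"}
BLOCKED_DOMAINS = {"malware.example.com"}

def flag_for_review(post: dict) -> bool:
    """Return True if a post should be escalated to human moderators."""
    text = post.get("text", "").lower()
    if any(word in text for word in BLOCKED_KEYWORDS):
        return True
    if any(domain in text for domain in BLOCKED_DOMAINS):
        return True
    return False

# Every line above encodes a human judgment: someone chose the keywords,
# the domains, and what happens to a flagged post.
print(flag_for_review({"text": "Visit malware.example.com now!"}))  # True
```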

The quality of AI decision-making depends on the technology’s source materials. If you put poor information in, you get poor information out. For example, policing technologies tend to rely on crime statistics. Yet crime statistics often give a skewed picture because of the over-policing of lower-income neighborhoods, and the resulting frequency of arrests of non-white people. In recent years, most of the big tech firms, including Facebook, Twitter and YouTube, have had to increase their human workforce for content moderation because of scandals around election influence, fake accounts or automated content recommendations.
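
A toy simulation makes this dynamic concrete. In the hypothetical sketch below, two neighborhoods share the same underlying offense rate, but one receives twice the patrol hours; the resulting arrest counts look like evidence that it is more dangerous, and a naive predictive model would direct even more patrols there:

```python
import random

# Hypothetical illustration: both areas share the SAME true offense rate,
# but area B is patrolled twice as heavily, so more incidents get recorded.
OFFENSE_RATE = 0.05
patrol_hours = {"A": 100, "B": 200}

arrests = {
    area: sum(random.random() < OFFENSE_RATE for _ in range(hours))
    for area, hours in patrol_hours.items()
}
print(arrests)  # B will typically record about twice as many arrests as A

# A model trained on these counts "learns" that B is higher-crime and
# recommends more patrols there, amplifying the original skew.
```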

“AI is presently and will always be both made up of and reliant upon human intelligence and intervention.”

Human input is critical in dealing with crimes such as child pornography. While algorithms can detect a potentially pornographic image, deciding whether it qualifies as child pornography relies completely on human interpretation and review. In addition, there are known problems with image recognition software: For example, most of these tools struggle to accurately identify racial minorities, because many of the data sets used to train the algorithms use white male adults as a default.

Sexism brought the UK computing industry to its knees.

The first computer operators and programmers in the UK were women, who worked on code-breaking computers during World War II. Women held these roles not because no men were available to do them, but because people considered computing akin to factory work.

“With computing work becoming aligned with power, women computer workers who possessed all of the technical skills to perform the jobs found themselves increasingly squeezed out by new hiring rubrics that favored untested men trainees with no technical skills.”

When computers became a more important part of the economy in the 1960s, the government changed its hiring practices for technical jobs. Women who possessed technical skills had to train male colleagues, who then took on managerial roles. Still, men perceived computing work as feminized, so they often completed the government training and then left to take up managerial, non-computing positions elsewhere. This led to a massive labor shortage, but the government still refused to employ or promote women for technical managerial positions. One such woman was Stephanie “Steve” Shirley.

After being passed over for promotion in the Civil Service several times, Shirley decided to start her own software company, drawing on the large pool of female technical talent, whom she employed on a flexible basis. Because the government and other big businesses were short of talent, they began to outsource computing work to companies such as Shirley’s.

Ultimately, to tackle the continued labor shortage, the government decided to invest in a solution that would require fewer workers. The government forced British computer companies to merge into a single company – International Computers Limited (ICL) – and demanded it focus production on large, technologically advanced mainframes in return for government grants and contracts. But by the time these mainframes were ready, the world had moved on to smaller and more flexible solutions, leaving ICL – and therefore the British computer industry – behind.

Technology is neither neutral nor objective, but, rather, often reinforces negative racial and gender stereotypes.

Speech technologies make many people’s lives easier. Yet those who don’t speak standard English find themselves excluded from services that depend on these technologies, because their design reinforces accent bias. Research has found that people perceive those who speak with an accent as less intelligent, loyal and competent, and that non-native speech often leads to discrimination in housing, employment or the justice system. Language has always been a tool of imperialism: The language of the colonizer became the language of power, which the colonized had to adopt if they wanted to have a voice. Similarly, the inability of speech technologies to understand non-standard English forces non-native or accented speakers to change the way they speak.

“Digital media – and by extension, the language of digital media – is arguably one of the most powerful tools of neo-imperialism.”

Because collecting data to train speech technology is expensive, most companies still use a corpus made up, primarily, of American Midwestern accents. In addition to reinforcing power structures and discrimination, technology also strengthens negative stereotypes. For example, in many Hollywood films, “good” robots tend to be white, while black or dark-colored robots are bad.

The QWERTY keyboard serves as a site of both exclusion and creative possibility.

Businesses began using the QWERTY keyboard, which is named for the first six keys on its top letter row, in the 1870s. Today, it is the globally accepted interface. Yet over 50% of the global population, including Arabic, Hindi, Korean and Chinese speakers, cannot use it to write their native script. This is because the QWERTY keyboard has a set of built-in features that don’t cater to the specifics of these scripts. For example, Arabic letters connect and change shape and are written from right to left, which contradicts QWERTY’s isomorphism (letters don’t change shape), monospatialism (all letters take up the same horizontal space) and unidirectionality. Hangul letters, from the Korean script, change size and position depending on context, and connect both vertically and horizontally, which contradicts isomorphism, isoposition (letters don’t change position), isolation (letters don’t connect) and monolinearity (letters sit on a single baseline).

“Creating a special class of writing systems and labeling them as ‘complex’ omits from discussion the question of how and why these writing systems came to be ‘complex’ in the first place.”

Initially, people tried to change these non-Latin scripts to make them work with the QWERTY keyboard – removing letters from the Thai alphabet, moving the bottom half of Korean glyphs to the right of the graph, and separating letters in the Arabic script. Eventually, people started to use the QWERTY keys as a “retrieval system”: pressing combinations of keys that “describe” to the machine the character they want to retrieve. To retrieve a graph, you don’t need to enter every one of its strokes; on average, the first five to six are enough for the machine to recognize which graph you mean. This means that 22 stroke-letter combinations – which fit on the QWERTY keyboard – allow you to write the approximately 2,000 most commonly used Chinese graphs. Rather than rejecting the problematic keyboard, creative thinkers learned to transcend it.
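
The retrieval logic is simple enough to sketch in a few lines of Python. The stroke codes below are invented placeholders – not a real input-method encoding – but they show how a short key sequence can “describe” a character instead of printing it:

```python
# Minimal sketch of QWERTY-as-retrieval: keys describe a character rather
# than print it. Stroke codes here are invented placeholders, not an
# actual input-method standard.
STROKE_INDEX = {
    "hwsg": "中",
    "hwsn": "串",
    "rrrr": "人",
}

def retrieve(prefix: str) -> list[str]:
    """Return every character whose stroke code starts with the typed keys."""
    return [char for code, char in STROKE_INDEX.items() if code.startswith(prefix)]

print(retrieve("hws"))  # ['中', '串'] – typing more keys narrows the candidates
```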

Contrary to what the tech industry wants you to believe, it is not a meritocracy.

Despite claims that anyone with the will and skills can make it to the top, there is very little diversity within the tech industry. In the United States, a number of initiatives, such as Code.org and Girls Who Code, aim to teach coding to school-age kids, and, thus, set more women and minorities on the path to enter the tech industry. In India, tech-skills development programs aimed at young people and women promise a way to overcome social barriers and get higher-paid jobs.

The belief that a skills gap is the primary barrier to entry to big tech for women and other underrepresented groups implies that they are themselves at fault; it suggests they have failed because they didn’t put enough effort into acquiring the necessary abilities. Yet such initiatives ignore the fact that hiring practices at major tech companies often play a more important role in limiting diversity than any lack of skills. By and large, tech companies still favor employees from elite educational institutions with the “correct” cultural background: people who conform to the stereotype of the “white, male nerd.” Recruitment also often happens through recommendations from existing employees, whose networks tend to be composed of people like themselves.

“We have Black Girls Code, but we don’t have ‘White Boys Collaborate’ or ‘White Boys Learn Respect.’ Why not, if we want to nurture the full set of skills needed in computing?”

The overvaluing of technical skills is yet another problem. Hiring managers often ignore the importance of social skills when filling tech positions. This view disadvantages female employees, who tend to work more collaboratively and try to iron out problems before deadlines.

About the Authors

Thomas S. Mullaney is professor of history at Stanford University. Benjamin Peters is Hazel Rogers Associate Professor and chair of Media Studies at the University of Tulsa. Mar Hicks is associate professor of history at Illinois Institute of Technology. Kavita Philip is professor of history at the University of California, Irvine.
