High-tech redlining: AI is quietly upgrading institutional racism

How an outlawed form of institutionalized discrimination is being quietly upgraded for the 21st century.
During the heyday of redlining, the areas most frequently discriminated against were black inner-city neighborhoods. For example, in Atlanta in the 1980s, a Pulitzer Prize-winning series of articles by investigative reporter Bill Dedman showed that banks would often lend to lower-income whites but not to middle-income or upper-income blacks.
During the Great Depression, the federal government created the Home Owners’ Loan Corporation, which made low-interest home loans, and the Federal Housing Administration, which guaranteed mortgages made by private banks. The people running HOLC didn’t know much, if anything, about local borrowers, so they constructed “residential safety maps” that graded neighborhoods on a scale of A to D, with D neighborhoods color-coded in red to denote undesirable, high-risk areas. These “redlined” maps were also used by FHA and private businesses, and spilled over into banking, insurance, and retail stores, creating a vicious cycle of restricted services and deteriorating neighborhoods.
Many private banks had their own redlined maps. In California, for example, Security First National Bank created a Los Angeles neighborhood rating system. Most neighborhoods in Central L.A. were redlined, often with explicit notations about “concentrations of Japanese and Negroes.” Boyle Heights was said to be “honeycombed with diverse and subversive elements.” Watts was redlined because it was a melting pot of not only Blacks and Japanese, but also Germans, Greeks, Italians, and Scots.
The 1968 Fair Housing Act outlawed redlining. However, in the age of Big Data, employment, insurance, and loan applications are increasingly being evaluated by data mining models that are not as overt but may be even more pernicious than color-coded maps, because they are not limited by geographic boundaries, and because their inner workings are often hidden.
No one, not even the programmers who write the code, knows exactly how black-box algorithms make their assessments, but it is almost certain that these algorithms directly or indirectly consider gender, race, ethnicity, sexual orientation, and the like: call it high-tech redlining. It is neither moral nor ethical to penalize individuals because they share group characteristics that a black-box algorithm has found to be statistically correlated with behavior.
Many algorithms for evaluating job candidates identify statistical patterns in the characteristics of current employees. The chief scientist for one company acknowledged that some of the factors chosen by its software do not make sense. For example, the software found that several good programmers in its database visited a particular Japanese manga site frequently, so it decided that people who visit this site are likely to be good programmers. The chief scientist said that, “Obviously, it’s not a causal relationship,” but argued that it was still useful because there was a strong statistical correlation. This is an excruciating example of the ill-founded belief, even by people who should know better, that statistical patterns are more important than common sense.
The CEO also said that the company’s algorithm looks at dozens of factors, and constantly changes the variables considered important as correlations come and go. She believes that the ever-changing list of variables demonstrates the model’s power and flexibility. A more compelling interpretation is that the algorithm captures transitory coincidental correlations that are of little value. If these were causal relationships, they would not come and go. They would persist and be useful. An algorithm that uses coincidental correlations to evaluate job applicants is almost surely biased. How fair is it if a Mexican-American female does not spend time at a Japanese manga site that is popular with white male software engineers?
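The “correlations come and go” pattern is easy to reproduce. Here is a minimal sketch in plain Python (all data is simulated; no real hiring data or any particular vendor’s method is involved): generate an outcome and a batch of completely irrelevant binary features, let a naive screener keep whichever feature looks most predictive on a small training sample, then re-measure that “signal” on fresh data.

```python
import random

random.seed(1)

N_TRAIN, N_FRESH, N_FEATURES = 50, 50_000, 100

def corr(xs, ys):
    """Phi coefficient (correlation) between two 0/1 sequences."""
    n = len(xs)
    px, py = sum(xs) / n, sum(ys) / n
    pxy = sum(x * y for x, y in zip(xs, ys)) / n
    denom = (px * (1 - px) * py * (1 - py)) ** 0.5
    return (pxy - px * py) / denom if denom else 0.0

# "Good programmer" and every candidate feature are independent coin flips,
# so no feature carries any real information about the outcome.
train_y = [random.randint(0, 1) for _ in range(N_TRAIN)]
train_X = [[random.randint(0, 1) for _ in range(N_TRAIN)]
           for _ in range(N_FEATURES)]

# A naive screener keeps whichever feature looks most predictive in-sample.
best = max(range(N_FEATURES), key=lambda j: abs(corr(train_X[j], train_y)))
train_corr = corr(train_X[best], train_y)

# Re-measure the "discovered" feature on fresh, independent data.
fresh_y = [random.randint(0, 1) for _ in range(N_FRESH)]
fresh_x = [random.randint(0, 1) for _ in range(N_FRESH)]
fresh_corr = corr(fresh_x, fresh_y)

print(f"in-sample correlation of chosen feature: {train_corr:+.2f}")
print(f"same feature measured on fresh data:     {fresh_corr:+.2f}")
```

With 100 candidate features and only 50 training examples, the screener reliably finds a feature with a sizable in-sample correlation, and that correlation collapses toward zero on fresh data, which is exactly the come-and-go behavior the CEO mistakes for flexibility.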
Similarly, Amazon recently abandoned an attempt to develop customized algorithms for evaluating the resumes of applicants. The algorithms trained on the resumes of job applicants over the previous ten years, and favored people who were like the (mostly male) people Amazon had hired in the past. Candidates who went to all-women’s colleges were downgraded because men who worked at Amazon hadn’t gone to those colleges. Ditto with candidates who played on female sports teams.
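The Amazon episode illustrates a broader point: dropping the protected attribute from the inputs does not remove the bias, because correlated features act as proxies for it. A hypothetical sketch (the counts below are invented for illustration, not Amazon’s data) of a naive scorer that never consults gender, yet still downgrades applicants from women’s colleges because the historical hiring it learns from was skewed:

```python
# Each record: (gender, attended_womens_college, hired_in_the_past).
# The labels encode the historical skew: most past hires were men.
history = (
    [("M", False, True)] * 80 +   # many men were hired
    [("M", False, False)] * 20 +
    [("F", True,  True)] * 5 +    # few women's-college applicants hired...
    [("F", True,  False)] * 45 +  # ...many were rejected
    [("F", False, True)] * 10 +
    [("F", False, False)] * 40
)

def past_hire_rate(predicate):
    """Fraction of past applicants matching `predicate` who were hired."""
    rows = [h for h in history if predicate(h)]
    return sum(1 for _, _, hired in rows if hired) / len(rows)

# A naive "learned" score: how often did applicants with this feature
# value get hired in the past? Gender is never consulted.
score_college = past_hire_rate(lambda h: h[1])       # women's college
score_other   = past_hire_rate(lambda h: not h[1])   # everyone else

print(f"score if attended a women's college: {score_college:.2f}")
print(f"score otherwise:                     {score_other:.2f}")
```

The women’s-college feature earns a far lower score than everything else, even though the gender column was never used: the feature is a near-perfect proxy, so the historical bias passes straight through.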
A Chinese algorithm for evaluating loan applications looks at cell-phone usage: for example, how frequently incoming and outgoing calls are answered, and whether users keep their phones fully charged. Which of these metrics mark a phone user as a good credit risk, and which as a bad one? Any uncertainty you feel demonstrates the arbitrary nature of these markers.
These are coincidental correlations that are temporary and meaningless, but may well discriminate. When this credit rating system was first disclosed in China, answering all incoming calls was considered to be a sign of being a good credit risk. Who knows how it is interpreted now. But it may well be biased: How fair is it if certain religions are not supposed to answer the phone on certain days or at certain times of the day?
Data gathered on social media platforms offers companies a new font of dubious qualitative insights. Admiral Insurance, Britain’s largest car insurance company, planned to launch firstcarquote, which would base its car insurance rates on a computer analysis of an applicant’s Facebook posts: for example, word choices and whether a person likes Michael Jordan or Leonard Cohen. Then, like other black-box algorithms, the criteria drifted off into patterns hidden inside the black box. There are surely biases in Facebook posts. How fair is it if a black male likes Michael Jordan and a white female likes Leonard Cohen? How fair is it if Facebook word choices that are related to gender, race, ethnicity, or sexual orientation happen to be coincidentally correlated with car insurance claims?
Algorithmic criminology is increasingly common in pre-trial bail determination, post-trial sentencing, and post-conviction parole decisions. One developer wrote that, “The approach is ‘black box,’ for which no apologies are made.” He gives an alarming example: “If I could use sun spots or shoe size or the size of the wristband on their wrist, I would. If I give the algorithm enough predictors to get it started, it finds things that you wouldn’t anticipate.” Things we don’t anticipate are things that don’t make sense, but happen to be coincidentally correlated.
Some predictors (wristband sizes?) may well be proxies for gender, race, sexual orientation, and other factors that should not be considered. People should not face onerous bail, be given unreasonable sentences, or be denied parole because of their gender, race, or sexual orientation, simply because they belong to certain groups.
The future of algorithmic suspicion can be seen in China, where the government is implementing a nationwide system of social credit scores intended to track what people buy, where they go, what they do, and anything else that might suggest that a person is untrustworthy–not just less likely to repay a loan, but also more likely to foment political unrest. The country’s security services are also investing heavily in face recognition technology, which could bring new data to credit-scoring type tools. Two Chinese researchers recently reported that they could predict with 89.5% accuracy whether a person is a criminal by applying their computer algorithm to scanned facial photos. Their program found “some discriminating structural features for predicting criminality, such as lip curvature, eye inner corner distance, and the so-called nose-mouth angle.” As one blogger wrote,
What if they just placed the people that look like criminals into an internment camp? What harm would that do? They would just have to stay there until they went through an extensive rehabilitation program. Even if some went that were innocent; how could this adversely affect them in the long run?
What can we do to police these systems? Demand, by law if necessary, more transparency. Citizens should be able to check the accuracy of the data used by algorithms and should have access to enough information to test whether an algorithm has an illegal disparate impact.
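One concrete check that such transparency would enable is the "four-fifths rule" U.S. regulators use as a screen for adverse impact: if one group's selection rate falls below 80% of the most-favored group's rate, the outcome warrants scrutiny. A small sketch, with hypothetical approval counts:

```python
def selection_rates(outcomes):
    """outcomes maps group -> (selected, applicants); returns group -> rate."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def adverse_impact(outcomes, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` of the best rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items() if rate / best < threshold}

# Hypothetical loan-approval counts: group B is approved at 30% vs group A's
# 48%, a ratio of 0.625, well under the 0.8 screen.
flagged = adverse_impact({"group A": (48, 100), "group B": (27, 90)})
print(flagged)
```

A screen like this cannot prove discrimination on its own, but it is exactly the kind of test citizens cannot run today without access to an algorithm's inputs and outcomes.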
Fortunately, there’s a growing recognition of the role algorithms play in our lives. A survey published last week by the Pew Research Center found that many Americans are concerned about bias and unfairness when computers use math to make decisions, like assigning personal finance scores, making criminal risk assessments, or screening job applicants’ resumes and interviews. Curiously, the Pew survey also found that the public’s concern about AI scoring depends heavily on context: about 30% of people think it’s acceptable for companies to offer deals and discounts based on customers’ personal and behavioral data, but about 50% believe it’s okay for criminal justice systems to use algorithms to predict whether a parolee will commit another crime.
Is our faith in computers so blind that we are willing to trust algorithms to reject job applications and loan applications, set insurance rates, determine the length of prison sentences, and put people in internment camps? Favoring some individuals and mistreating others because they happen to have irrelevant characteristics selected by a mindless computer program isn’t progress: it’s a high-tech return to a previous era of unconscionable discrimination.
Source: fastcompany.com
