Channel: The Markup Award-Winning Work - Online Journalism Awards

Denied

Crystal Marie McDaniels
About the Project

Discriminatory lending practices have been well-documented throughout the years. Lenders’ response to researchers and journalists has been that reporters don’t have enough relevant data to make a true determination and, if they did, the disparities would disappear.

The Markup’s investigation Denied debunks the lenders’ argument that the inclusion of certain financial characteristics would eliminate apparent bias in mortgage approval decisions. We analyzed more than two million mortgage applications and found that people of color were denied at higher rates than similarly qualified White applicants. The analysis includes 17 variables, including debt-to-income and combined loan-to-value ratios. Lenders previously said including those specific variables, which were not public at the time, would explain what appeared to be racial disparities in lending. But when we included the newly released variables, we found that wasn’t true.

Instead, lenders were 40 percent more likely to turn down Latino applicants for loans, 50 percent more likely to deny Asian/Pacific Islander applicants, 70 percent more likely to deny Native American applicants, and 80 percent more likely to reject Black applicants than similarly qualified White applicants. These are the national rates. We also found disparities in 89 metro areas across the country.
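The "more likely to be denied" figures above are relative comparisons between similarly qualified groups. As a minimal, hypothetical sketch (synthetic counts, not The Markup's data or model, which controlled for 17 variables), this is how denial counts translate into an "X percent more likely" odds-ratio figure:

```python
# Hedged sketch: computing a "more likely to be denied" figure as an
# odds ratio from denial counts. The counts below are synthetic; the
# actual analysis used a regression model over 2M+ applications.

def denial_odds_ratio(denied_a, approved_a, denied_b, approved_b):
    """Odds of denial for group A relative to group B."""
    odds_a = denied_a / approved_a
    odds_b = denied_b / approved_b
    return odds_a / odds_b

# Hypothetical example: group A denied 180 of 1,000 applications,
# group B denied 100 of 1,000.
ratio = denial_odds_ratio(180, 820, 100, 900)
print(f"Group A is {(ratio - 1) * 100:.0f}% more likely to be denied")
```

A ratio of 1.4 corresponds to "40 percent more likely"; a ratio of 1.0 would mean no disparity.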

Our story explained that part of the reason these disparities persist, 50 years after the Fair Housing Act outlawed the racist housing policy known as redlining, can be tied to factors considered by algorithms used in the mortgage application process.

The investigation shows that Fannie Mae and Freddie Mac, two quasi-governmental companies that de facto set the standard for mortgage lending, rely on algorithms that can disproportionately harm people of color.

The two companies require lenders to use a credit-scoring algorithm called “Classic FICO” to determine whether applicants qualify for a conventional loan. But that credit-scoring model was built using data from the 1990s and is more than 15 years old. It’s considered detrimental to people of color because it rewards people who have access to traditional and mainstream types of credit, from which people of color have historically been shut out.

The most important algorithms influencing mortgage decisions are automated underwriting systems developed by Fannie and Freddie. The pair buy half of all mortgages in America, so most lenders use these algorithms for approval decisions. Ultimately, no one outside Fannie and Freddie knows exactly what factors their underwriting algorithms use or how they are weighted. Not even the companies’ government regulator, the Federal Housing Finance Agency, knows exactly how these algorithms work.

In the story’s sidebar, we identified specific mortgage lenders with the most egregious disparities. When we investigated their backgrounds, we found that all of them had faced criticism from at least one government agency in recent years for their business practices. Another pattern: Three of these lenders are affiliated with the nation’s largest home builders. Each lender in our sidebar was at least 100 percent more likely to deny Black and Latino applicants than similarly qualified White ones, and these lenders concentrated their loans in upper- and middle-class neighborhoods.

The post Denied appeared first on Online Journalism Awards.


The Markup

$
0
0
Brianna Hernandez, Jonathan King, their daughter, Zariyah, 5, and son Jonathan, 4, outside a hotel in Elgin, Ill., where they lived for a time after receiving an eviction notice.
Driver advocate Lenny Sanchez of Chicago leans on his car.
Robert Gomez, owner of startup 4Q Brands, in his warehouse in Buford, Ga., on October 6, 2021. For more than two years, his coffee grinder had been one of his best sellers on Amazon.
About the Project

The Markup launched just over two years ago as an investigative nonprofit that seeks to illuminate the ways that technology affects society. We remain the only tech-focused investigative newsroom in the nation, and we’re proud to use cutting-edge data and technology tools to examine the hidden biases in algorithms, from the mortgage approval processes used by lenders to crime prediction software sold to police. In each case, the algorithms were purported to be without bias. But again and again, using novel data scraping, large-scale data analysis, and statistical techniques, we lifted the veil and showed how massive companies are hurting people of color through automated decisions and reneging on their promises to the public.

Investigating private companies is among the most difficult reporting—and it’s even harder when you focus on tech giants, which hire armies of public relations staffers to thwart those efforts. So we have developed creative uses of technology to gather the information to hold them accountable. For instance, we built a tool that allowed us to collect anonymized data from individuals’ Facebook feeds, which we called Citizen Browser, and then hired a survey research firm to assemble a national panel of Facebook users willing to install it. Citizen Browser has fueled nearly two dozen investigations and powered two important persistent monitoring tools for the public: Split Screen, a dashboard that showed what stories were most popular in the feeds of users of different genders, age groups, and voting preferences; and Trending on Facebook, a Twitter bot that tweeted out the most popular websites seen by our panelists each day.

We’ve included interactive elements in our investigations to help readers comprehend the real-world effects algorithms have on their lives, and we’ve released a browser extension, Amazon Brand Detector, to help readers make more informed decisions as consumers.

To get our reporting in front of as many people as possible, we collaborate with news organizations to distribute our content (the Associated Press and others) and to co-report and co-publish (The New York Times, Consumer Reports, and others). We also publish under a Creative Commons license, use popular social tools like Instagram Reels to explain our reporting, and use interactive polling features on Twitter and Instagram to hear our readers’ perspectives. We’re committed to accessibility, providing alt text on our images, captioning on our videos, and high-contrast graphics for readability.

Our peers have taken notice: Our Citizen Browser work is a finalist for a Scripps Howard Award for Excellence in Innovation. Emmanuel Martinez, Lauren Kirchner, and Malena Carollo’s reporting for the Denied series on mortgage approval algorithms has so far earned a Deadline Award for Reporting by Independent Digital Media (finalist for public service) and a Headliner Award for online investigative reporting. The Citizen Browser project and Prediction: Bias, on bias in predictive policing algorithms, were both shortlisted for Sigma Awards.


Still Loading


Pamela Jackson-Walters, whose Detroit neighborhood is almost entirely Black, has endured slow internet speeds and weeks-long outages.
About the Project

Our investigation “Still Loading” is the culmination of an eight-month effort that began with an attempt to see what speeds and prices internet service providers were offering to households across the country. In the process, it transformed into something even more urgent: an exposé revealing how a quartet of telecom giants had neglected to upgrade their networks with high-speed infrastructure in socioeconomically disadvantaged and racially diverse neighborhoods.

Still Loading found that four major national internet service providers (ISPs) disproportionately offered lower-income, least-White, and historically redlined neighborhoods slow internet service for the same price as for speedy connections in other parts of town. We produced the first nationwide disparate impact analysis of the speeds and prices internet service providers offered directly to consumers, and we’re the first to show where inequitable effects of tier flattening (charging internet customers the same rate for differing levels of service) have occurred.
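Tier flattening, as defined above, becomes concrete when prices are normalized by speed. A minimal sketch with hypothetical offers (synthetic numbers, not drawn from our dataset) shows how the same monthly bill can mean very different value:

```python
# Hedged sketch: "tier flattening" means different neighborhoods pay
# the same monthly price for very different speeds. Comparing price
# per Mbps makes the disparity concrete. All figures are hypothetical.

def price_per_mbps(price_dollars, download_mbps):
    """Monthly cost per megabit per second of advertised download speed."""
    return price_dollars / download_mbps

offers = [
    # (neighborhood, download Mbps, monthly price) -- synthetic examples
    ("historically redlined", 25, 55.00),
    ("rest of town", 200, 55.00),
]

for neighborhood, mbps, price in offers:
    print(f"{neighborhood}: ${price_per_mbps(price, mbps):.2f} per Mbps")
```

Under these assumed numbers, the slower neighborhood pays eight times as much per unit of speed for an identical bill.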

The people who live in neighborhoods offered the worst internet deals aren’t just being ripped off; they’re being denied the ability to participate in remote learning, well-paying remote jobs, and even family connection and recreation—ubiquitous elements of modern life. The worst part: Until The Markup published its investigation, the customers receiving the worst deals had no idea.

To pull this off, Aaron Sankin manually tested the websites of all the major internet service providers to determine which engaged in this pricing scheme, while Leon Yin developed custom-built software to collect data from each service provider’s website. To show what was happening, we had to collect more detailed data than what the FCC had received from the telecom companies themselves.

Additionally, our reporters are now key experts on the digital divide, and we believe their work was the first nationwide address-level mass survey of internet speeds and prices released publicly anywhere.

Our reporters also made extensive efforts to share their findings and methodology with those impacted by the digital divide by facilitating original reporting in nine local news outlets, and releasing a guide that empowers anybody with a computer and internet access to collect a representative sample of internet plans and test for disparities.

Judges’ Comments

A nearly perfect example of what modern technology journalism can be. The Markup’s exposure of the lasting consequences of geographic inequality in nearly every major American city is both deeply reported and elegantly displayed. It has the potential for real impact by taking its thesis out of the abstract and providing easy-to-use tools that connect that thesis directly to the reader’s life. In an extremely competitive category, the utility and user experience of this project helped it stand out among excellent peers.


L.A.’s Scoring System for Subsidized Housing Gives Black and Latino People Experiencing Homelessness Lower Priority Scores


Illustration of an arm with an open hand, palm facing up.
A form that is all filled in.
A woman in a jean jacket stands against a white background.
Three tents pitched on a sidewalk in front of a fence.
About the Project

In Los Angeles, a secret is kept from people at the moment they are most vulnerable. When someone loses their home and is taken in by the homeless services authority, case managers must ask them a series of intensely personal questions—Do they use drugs? Do they owe someone money? Have they taken an ambulance recently?—all without revealing that they are being scored in a process that potentially decides whether or not they are provided a new home.

A Markup investigation, co-published with the Los Angeles Times, pulled back the curtain on this scoring system, confirming something that advocates for the unhoused had long suspected: The system has for years rated Black people experiencing homelessness as less vulnerable than White people, making them a lower priority for permanent housing.

We obtained more than 130,000 scoring surveys going back to 2016 via public records requests. The surveys are part of a scoring system known as the Vulnerability Index-Service Prioritization Decision Assistance Tool, or VI-SPDAT, and a variation of VI-SPDAT for younger adults, the Next Step Tool. In 2021, among adults under 25, 67 percent of White people scored in the highest priority group, compared with 46 percent of Black people. Among older adults that year, 39 percent of White people scored in the highest priority group, compared with 33 percent of Black people.

The scoring disparities came despite ample evidence that Black people in Los Angeles are particularly vulnerable to homelessness. They make up 9 percent of the county’s population but 30 percent of those without homes. A 2018 report from the homeless services authority attributed this to factors like “structural racism, discrimination, and implicit bias.”

We talked to case managers and housing “matchers” who described how central the scoring system is to housing decisions in Los Angeles. Experts explained how the stigmatizing questions in the system’s survey could lead to lower scores for Black people and provided evidence that the tool does a bad job at predicting the sort of vulnerability—to hospitalization or death—it is intended to measure.

We also explained how people experiencing homelessness in Los Angeles could work around these flaws and pointed them to resources that could help them do so. We found a formerly unhoused person, Chantel Jones, willing to share with us her story of losing her first chance at permanent housing due to a low score.

We detailed our data analysis in an accompanying methodology story. We also published a story recipe to encourage journalists to investigate the impact of the scoring tool in dozens of other cities where it has been adopted.


The Markup


About the Project

By the start of 2022, it was clear that Americans had drastically accelerated the migration of their lives online as ad hoc means of coping with the COVID-19 pandemic became permanent.

As the nation’s only tech-focused investigative newsroom, The Markup knew it needed to educate people about the trade-offs involved in the apps, websites, and data pools that were making them more efficient—to ask difficult questions and to answer them in innovative ways.

To understand how internet access is provisioned in the U.S., we built custom software, consulted with statisticians, and made hundreds of phone calls, establishing that poorer, historically redlined neighborhoods paid the same price for slow internet connections as other parts of town paid for speedy ones.

To understand what information Facebook collected through its ubiquitous Meta Pixel web tracker, we forged a partnership with the maker of the Firefox web browser, allowing people to easily give us their browsing data. This enabled us to show that hospital patient portals, telehealth providers, online tax preparation services, and even the U.S. government’s financial aid site were oversharing sensitive personal data.

To understand the harms of increasingly common software algorithms, we obtained public records showing Los Angeles’ vulnerability scoring system, critical to housing decisions, had for years consistently ranked Black unhoused people as less in need of homes than their White counterparts. In Wisconsin, we showed that an algorithm purporting to predict which teens might drop out of high school disproportionately raised false alarms about Black and Hispanic students.

And to understand an epidemic of carjackings against Uber and Lyft drivers, we painstakingly collected personal stories to illustrate the frustration and harm Uber was causing, slowing police work after on-the-job carjackings or shootings.

Today’s digital technology is highly complex and it can be hard to illustrate its harms or make people care about them, so we had to think creatively about how to showcase our findings. As part of our Uber coverage, we created the only database tracking carjackings or attempted carjackings of gig workers nationwide. To illustrate broadband pricing discrepancies, we created an interactive map featuring almost a million addresses across 45 U.S. cities. And we used everything from story-topping data illustrations (on the dropout algorithm story) to searchable data tables (on our Meta Pixel series) to pull people into article pages.

Collaboration was key to spreading the word about these efforts, whether with news organizations, readers, or activists. We partnered with news organizations like the Washington Post, Los Angeles Times, and Associated Press; spoke with readers on social media about our reporting in Q&A sessions; and created tools like the United States Place Sampler and “story recipes” for activists and others who could extend our investigations.

Our peers are honoring this work. Our broadband pricing investigation won a Sigma Award and a bronze medal from the Society for News Design for investigative infographics. The Society for Advancing Business Editing and Writing gave awards to our Uber carjacking and broadband pricing stories.


Denied

$
0
0
Crystal Marie McDaniels Crystal Marie McDaniels
About the Project

Discriminatory lending practices have been well-documented throughout the years. Lenders’ response to researchers and journalists has been that reporters don’t have enough relevant data to make a true determination and, if they did, the disparities would disappear.

The Markup’s investigation Denied debunks the lenders’ argument that the inclusion of certain financial characteristics would eliminate apparent bias in mortgage approval decisions. We analyzed more than two million mortgage applications and found that people of color were denied at higher rates than similarly qualified White applicants. The analysis includes 17 variables, including debt-to-income and combined loan-to-value ratios. Lenders previously said including those specific variables, which were not public at the time, would explain what appeared to be racial disparities in lending. But when we included the newly released variables, we found that wasn’t true.

Instead, lenders were 40 percent more likely to turn down Latino applicants for loans, 50 percent more likely to deny Asian/Pacific Islander applicants, 70 percent more likely to deny Native American applicants, and 80 percent more likely to reject Black applicants than similarly qualified White applicants. These are the national rates. We also found disparities in 89 metro areas across the country.

Our story explained that part of the reason these disparities persist, 50 years after the Fair Housing Act outlawed the racist housing policy known as redlining, can be tied to factors considered by algorithms used in the mortgage application process.

The investigation shows that Fannie Mae and Freddie Mac, two quasi-governmental companies that de facto set the standard for mortgage lending, rely on algorithms that can disproportionately harm people of color.

The two companies require lenders to use a credit-scoring algorithm called “Classic FICO” to determine whether applicants qualify for a conventional loan. But that credit-scoring model was built using data from the 1990s and is more than 15 years old. It’s considered detrimental to people of color because it rewards people who have access to traditional and mainstream types of credit, from which people of color have historically been shut out.

The most important algorithms influencing mortgage decisions are automated underwriting systems developed by Fannie and Freddie. The pair buy half of all mortgages in America, so most lenders use these algorithms for approval decisions. Ultimately, no one outside Fannie and Freddie knows exactly what factors their underwriting algorithms use or how they are weighted. Not even the companies’ government regulator, the Federal Housing Finance Agency, knows exactly how these algorithms work.

In the story’s sidebar, we identified specific mortgage lenders with the most egregious disparities. When we investigated their backgrounds, we found that all of them had faced criticism from at least one government agency in recent years for their business practices. Another pattern: Three of these lenders are affiliated with the nation’s largest home builders. Each lender in our sidebar was at least 100 percent more likely to deny Black and Latino applicants than similarly qualified White ones, and these lenders concentrated their loans in upper- and middle-class neighborhoods.

The post Denied appeared first on Online Journalism Awards.

The Markup

$
0
0
Brianna Hernandez, Jonathan King, their daughter, Zariyah, 5, and son Jonathan, 4, outside a hotel in Elgin, Ill., where they lived for a time after receiving an eviction notice. Driver advocate Lenny Sanchez of Chicago leans on his car Robert Gomez, owner of startup 4Q Brands, in his warehouse in Buford, GA on October 6th, 2021. For more than two years, his coffee grinder had been one of his best sellers on Amazon.
About the Project

The Markup launched just over two years ago as an investigative nonprofit that seeks to illuminate the ways that technology affects society. We remain the only tech-focused investigative newsroom in the nation, and we’re proud to use cutting-edge data and technology tools to examine the hidden biases in algorithms, from the mortgage approval processes used by lenders to crime prediction software sold to police. In each case, the algorithms were purported to be without bias. But again and again, using novel data scraping, massive data analysis, and statistical techniques, we lifted the veil and showed how massive companies are hurting people of color through automated decisions and reneging on their promises to the public.

Investigating private companies is among the most difficult reporting—and it’s even harder when you focus on tech giants, which hire armies of public relations staffers to thwart those efforts. So we have developed creative uses of technology to gather the information to hold them accountable. For instance, we built a tool that allowed us to collect anonymized data from individuals’ Facebook feeds, which we called Citizen Browser, and then hired a survey research firm to assemble a national panel of Facebook users willing to install it. Citizen Browser has fueled nearly two dozen investigations and powered two important persistent monitoring tools for our the public: Split Screen, a dashboard that showed what stories were most popular in the feeds of users of different genders, age groups, and voting preferences; and Trending on Facebook, a Twitter bot that tweeted out the most popular websites seen by our panelists each day. We’ve included interactive elements in our investigations to help readers comprehend the real-world effects algorithms have on their lives, and we’ve released a browser extension, Amazon Brand Detector, to help readers make more informed decisions as consumers. In order to get our reporting in front of as many people as possible, we collaborate with news organizations to distribute our content (the Associated Press and others) and co-report and co-publish (The New York Times, Consumer Reports, and others). We also publish under a Creative Commons license and use social platforms’ popular tools like Instagram Reels to explain our reporting and interactive polling features on Twitter and Instagram to hear our readers’ perspectives. We’re committed to accessibility, providing alt text on our images, captioning on our videos, and high-contrast graphics for readability.

Our peers have taken notice: Our Citizen Browser work is a finalist for a Scripps Howard Award for Excellence in Innovation. Emmanuel Martinez, Lauren Kirchner, and Malena Carollo’s reporting for the Denied series on mortgage approval algorithms has so far earned a Deadline Award for Reporting by Independent Digital Media (finalist for public service) and a Headliner Award for online investigative reporting. The Citizen Browser project and Prediction: Bias, on bias in predictive policing algorithms, were both shortlisted for Sigma Awards.

The post The Markup appeared first on Online Journalism Awards.

Still Loading

$
0
0

Pamela Jackson-Walters, whose Detroit neighborhood is almost entirely Black, has endured slow internet speeds and weeks-long outages.
About the Project

Our investigation “Still Loading” is the culmination of an eight-month effort that began with an attempt to see what speeds and prices internet service providers were offering to households across the country. In the process, it transformed into something even more urgent: an exposé revealing how a quartet of telecom giants had neglected to upgrade their networks with high-speed infrastructure in socioeconomically disadvantaged and racially diverse neighborhoods.

Still Loading found that four major national internet service providers (ISPs) disproportionately offered lower-income, least-White, and historically redlined neighborhoods slow internet service for the same price as for speedy connections in other parts of town. We produced the first nationwide disparate impact analysis of the speeds and prices internet service providers offered directly to consumers, and we’re the first to show where inequitable effects of tier flattening (charging internet customers the same rate for differing levels of service) have occurred.

The people who live in neighborhoods offered the worst internet deals aren’t just being ripped off; they’re being denied the ability to participate in remote learning, well-paying remote jobs, and even family connection and recreation—ubiquitous elements of modern life. The worst part: Until The Markup published its investigation, the customers receiving the worst deals had no idea.

To pull this off, Aaron Sankin manually tested the websites of all the major internet service providers to determine which engaged in this pricing scheme, while Leon developed custom-built software to collect data from each service providers’ website. To show what was happening, we had to collect more detailed data than what the FCC had received from telecom companies themselves.

Additionally, our reporters are now key experts on the digital divide, and we believe their work was the first nationwide address-level mass survey of internet speeds of internet speeds and prices released publicly anywhere.

Our reporters also made extensive efforts to share their findings and methodology with those impacted by the digital divide by facilitating original reporting in nine local news outlets, and releasing a guide that empowers anybody with a computer and internet access to collect a representative sample of internet plans and test for disparities.

Judges Comments

A nearly perfect example of what modern technology journalism can be. The Markup’s exposure of the lasting consequences of geographic inequality in nearly every major American city is both deeply reported and elegantly displayed. It has the potential for real impact by taking its thesis out of the abstract and providing easy-to-use tools that connect that thesis directly to the reader’s life. In an extremely competitive category, the utility and user experience of this project helped it stand out among excellent peers.

The post Still Loading appeared first on Online Journalism Awards.


L.A.’s Scoring System for Subsidized Housing Gives Black and Latino People Experiencing Homelessness Lower Priority Scores

$
0
0

Illustration of an arm with an open hand, palm facing up A form that is all filled in A woman in a jean jacket stands against a white background Three tents pitched on a sidewalk in front of a fence
About the Project

In Los Angeles, a secret is kept from people at the moment they are most vulnerable. When someone loses their home and is taken in by the homeless services authority, case managers must ask them a series of intensely personal questions—Do they use drugs? Do they owe someone money? Have they taken an ambulance recently?—all without revealing that they are being scored in a process that potentially decides whether or not they are provided a new home.

A Markup investigation, co-published with the Los Angeles Times, pulled back the curtain on this scoring system, confirming something that advocates for the unhoused had long suspected: The system has for years rated Black people experiencing homelessness as less vulnerable than White people, making them a lower priority for permanent housing.

We obtained more than 130,000 scoring surveys going back to 2016 via public records requests. The surveys are part of a scoring system known as the Vulnerability Index-Service Prioritization Decision Assistance Tool, or VI-SPDAT, and a variation of VI-SPDAT for younger adults, the Next Step Tool. In 2021, among adults under 25, 67 percent of White people scored in the highest priority group, compared with 46 percent of Black people. Among older adults that year, 39 percent of White people scored in the highest priority group, compared with 33 percent of Black people.

The scoring disparities came despite ample evidence that Black people in Los Angeles are particularly vulnerable to homelessness. They make up 9 percent of the county’s population but 30 percent of those without homes. A 2018 report from the homeless services authority attributed this to factors like “structural racism, discrimination, and implicit bias.”

We talked to case managers and housing “matchers” who described how central the scoring system is to housing decisions in Los Angeles. Experts explained how the stigmatizing questions in the system’s survey could lead to lower scores for Black people and provided evidence that the tool does a bad job at predicting the sort of vulnerability—to hospitalization or death—it is intended to measure.

We also explained how people experiencing homelessness in Los Angeles could work around these flaws and pointed them to resources that could help them do so. We found a formerly unhoused person, Chantel Jones, willing to share with us her story of losing her first chance at permanent housing due to a low score.

We detailed our data analysis in an accompanying methodology story. We also published a story recipe to encourage journalists to investigate the impact of the scoring tool in dozens of other cities where it has been adopted.

The post L.A.’s Scoring System for Subsidized Housing Gives Black and Latino People Experiencing Homelessness Lower Priority Scores appeared first on Online Journalism Awards.

The Markup

$
0
0

The Markup The Markup The Markup
About the Project

By the start of 2022, it was clear that Americans had drastically accelerated the migration of their lives online as ad-hoc means of coping with the COVID-19 pandemic became permanent.

As the nation’s only tech-focused investigative newsroom, The Markup knew it needed to educate people about the trade-offs involved in the apps, websites, and data pools that were making them more efficient—to ask difficult questions and to answer them in innovative ways.

To understand how internet access is provisioned in the U.S., we built custom software, consulted with statisticians, and made hundreds of phone calls, establishing that poorer, historically redlined neighborhoods paid the same price for slow internet connections as other parts of town paid for speedy ones.

To understand what information Facebook collected through its ubiquitous Meta Pixel web tracker, we forged a partnership with the maker of the Firefox web browser, allowing people to easily give us their browsing data. This enabled us to show that hospital patient portals, telehealth providers, online tax preparation services, and even the U.S. government’s financial aid site were oversharing sensitive personal data.

To understand the harms of increasingly common software algorithms, we obtained public records showing Los Angeles’ vulnerability scoring system, critical to housing decisions, had for years consistently ranked Black unhoused people as less in need of homes than their White counterparts. In Wisconsin, we showed that an algorithm purporting to predict which teens might drop out of high school disproportionately raised false alarms about Black and Hispanic students.

And to understand an epidemic of carjackings against Uber and Lyft drivers, we painstakingly collected personal stories to illustrate the frustration and harm Uber was causing by slowing police work after on-the-job carjackings or shootings.

Today’s digital technology is highly complex and it can be hard to illustrate its harms or make people care about them, so we had to think creatively about how to showcase our findings. As part of our Uber coverage, we created the only database tracking carjackings or attempted carjackings of gig workers nationwide. To illustrate broadband pricing discrepancies, we created an interactive map featuring almost a million addresses across 45 U.S. cities. And we used everything from story-topping data illustrations (on the dropout algorithm story) to searchable data tables (on our Meta Pixel series) to pull people into article pages.

Collaboration was key to spreading the word about these efforts, whether with news organizations, readers, or activists. We partnered with news organizations like the Washington Post, Los Angeles Times, and Associated Press; spoke with readers on social media about our reporting in Q&A sessions; and created tools like the United States Place Sampler and “story recipes” for activists and others who could extend our investigations.

Our peers have honored this work. Our broadband pricing investigation won a Sigma Award and a bronze medal from the Society for News Design for investigative infographics. The Society for Advancing Business Editing and Writing gave awards to our Uber carjacking and broadband pricing stories.


Languages of Misinformation


About the Project

Misinformation and the deepfakes polluting our media ecosystem are a known problem to much of the U.S. population. But this awareness does not always extend to immigrant communities whose members don’t speak English. We wanted to change this.

The Markup started by doing an information needs assessment with one community: Vietnamese people above the age of 50 in Oakland, California. We conducted a focus group interview with 30 people, did individual interviews, and analyzed the YouTube archive of one volunteer who donated her YouTube viewing history (17,000+ videos). We learned that many community members got their news on YouTube, and from specific influencers who translated sites like Newsmax and Breitbart into Vietnamese. Our first story focused on shedding light on what we found: for some immigrant communities, the only news that is accessible to them is translated misinformation.

With this as our base, we set out to amplify existing community solutions and develop new resources that would equip the community with the ability to access better information.

We co-wrote a first-person narrative with a 67-year-old Vietnamese grandmother, Bùi Như Mai, who shared her story as an oral history. Bùi started translating mainstream media, like the New York Times and the Atlantic, into Vietnamese because she was worried about the misinformation consumed by her community. Bùi has her own news segment on the YouTube channel of Vietnamese news publication Người Việt.

Our story was published in English and Vietnamese at The Markup and Vietnamese grassroots organization Viet Fact Check. Recently, we kicked off a new video series with Bùi: We wrote scripts about how Vietnamese folks can find better information, and Bùi broadcast them to her steady following on YouTube. We wanted to provide vital information where the Vietnamese community already was: on YouTube, with a trusted voice.

We also wanted to give people actionable information on how to navigate misinformation with their families. Many children of immigrants have struggled to talk to their parents about the misinformation their parents have amplified. So our next story became a guide for all second-generation immigrants on how to talk to their loved ones about misinformation, released just before the December holiday season, when families are likely to gather in person.

Finally, we started a workshop series with the Vietnamese elders that we featured in our first story. People had told us about a problem, so we worked to provide solutions. We built two workshops. The first explained the main misinformation issues the community was facing and showcased tools like Google Translate for the community to use. It was service journalism delivered in person and in Vietnamese, and it was received with excitement. A handful of Vietnamese community members installed the Google Translate app. The second workshop focused on explaining artificial intelligence, its capacity to create deepfakes, and how people can spot them. When we gave this workshop, none of the community members in the room had heard of artificial intelligence, so we broke the topic down for them.


Digital Book Banning


About the Project

In the wake of high-profile book bans in schools and public libraries across the country, and amid politically charged debates over LGBTQ+ rights and abortion access, The Markup sought to understand the scope and impact of less-examined “digital book bans,” in which schools censor K-12 students’ internet access.

A team of Markup journalists spent months requesting records from school districts, challenging records denials, traveling to talk to students, writing software to test school district blocking patterns, analyzing censorship records, and interviewing students, teachers, attorneys, constitutional law scholars, and advocates.

We anchored our reporting in the experiences of students who are routinely censored, so we could investigate why. To reach them, reporter Tara García Mathewson traveled to the suburbs of St. Louis, Missouri, to talk to students and families in the Rockwood School District, which blocked LGBTQ+ resources, abortion information, Wikipedia, YouTube, social networking, search sites, and more, representing one of the most aggressive filtering systems of any district examined by The Markup. She spent hours doing phone and Zoom interviews with students in Texas, California, Michigan, and New York to round out her understanding of the filters’ harms.

The team revealed how web filters across 16 school districts in 11 states thwarted basic research and web browsing, forcing students to resort to workarounds like using personal cellphones, and relegating students with no other internet access to an inferior educational experience. It also showed inequities across districts, where some students were blocked from health and safety resources, including suicide prevention resources for LGBTQ+ teens, abortion information, and sex education, that other districts made available.

The investigation also raised questions about whether districts have improperly exceeded the scope of the regulations that mandate internet filtering. To qualify for federal internet subsidies, school districts must keep students from seeing obscene and harmful images over their schools’ networks. But experts said some blocking the team identified (for example, of supportive LGBTQ+ sites while anti-LGBTQ+ sites are allowed) crossed the line into unconstitutional discrimination. Other censorship, like blanket blocks against all social media sites, ran afoul of guidance from federal regulators. The vast majority of blocks the team reviewed were not required by federal rules.

The Markup presented its findings to readers through five distinct articles: A main story with an interactive chart letting readers explore the blocked websites, a gallery showcasing the experiences of five individual students with online censorship, an interactive survey to let students test and report the extent of blocking at their schools, a detailed explanation of the team’s reporting methodology, and a guide to help high school students obtain public records showing blocked websites. A team of engineers and designers created eye-catching visuals, including animated graphics and stylized pullquotes.

Judges Comments

The focus on information available at public school libraries is often on physical books. This deeply sourced and dogged reporting digs into the topic of online information available in public schools across the U.S. The main article provides a plain-language explanation of what’s happening in school libraries and internet access. It includes the harms, or potential harms, that the blocked information might have on students: in the short term, for a class assignment, and in the longer term, for what they might come to feel or believe about themselves or, as an example, a member of the LGBTQIA+ community. This investigation uses technology, FOIA, and crowdsourcing to uncover content censorship criteria and presents them in a clear and beautiful way. The team also explains its behind-the-scenes effort so everyone can learn from, or get inspired by, their cases, amplifying the potential impact.


The Markup

About the Project

The Markup challenges technology to serve the public good. Our journalism gives people control over the technology affecting their lives. We accomplish this in three ways.

1. Data- and Software-Driven Reporting

Using nearly 100 accounts, The Markup conducted the first field audit of Instagram’s content moderation algorithms and found that they routinely limited the reach of posts supporting Palestine (specifically nongraphic images of war, captions, comments, and hashtags) and denied users the ability to appeal.

Our uniquely comprehensive “digital book ban” investigation used data to reveal how web filters across 16 U.S. school districts in 11 states kept students from doing homework, exacerbated inequities, and discriminated ideologically—blocking suicide-prevention resources for LGBTQ+ teens while keeping anti-LGBTQ+ sites available.

Finally, The Markup’s testing revealed that NYC’s AI chatbot told business owners to break the law. The Markup asked the bot dozens of questions and found it was frequently wrong, advising visitors to discriminate in housing and to take workers’ tips. Business owners also reported receiving false information in response to their own questions.

2. Tools that Give People Superpowers

After the Washington Post reported X was throttling links to competitors, The Markup published a tool that lets anyone check if X throttles any link. Readers ran hundreds of tests and found delays reaching Patreon, WhatsApp, and Messenger, leading to a second story with Patreon creators explaining how the delays hurt their income.
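The core timing idea behind such a link-checking tool can be sketched in a few lines of Python. This is a minimal illustration, not The Markup’s actual implementation: the threshold, the function names, and the stubbed fetch are all assumptions.

```python
import time
import urllib.request

THROTTLE_THRESHOLD_SECONDS = 2.5  # assumed cutoff; the real tool's logic may differ

def measure_redirect_delay(tco_url: str, fetch=urllib.request.urlopen) -> float:
    """Time how long the t.co shortener takes to answer a redirect request."""
    start = time.monotonic()
    fetch(tco_url)  # follows the redirect; we only care about elapsed time
    return time.monotonic() - start

def looks_throttled(delay_seconds: float) -> bool:
    """Classify a measured delay: far above a normal redirect time is suspect."""
    return delay_seconds >= THROTTLE_THRESHOLD_SECONDS

# Example with a stubbed fetch (no real network traffic):
def slow_stub(url):
    time.sleep(0.01)  # stand-in for a deliberately delayed redirect

delay = measure_redirect_delay("https://t.co/example", fetch=slow_stub)
print(looks_throttled(5.0))   # True
print(looks_throttled(0.3))   # False
```

In practice a check like this would compare many timed requests against a baseline of unthrottled domains rather than relying on a single fixed cutoff.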

Readers have also used our real-time privacy inspector, Blacklight, to scan over 13 million websites for trackers, exposing abusive tracking by OB-GYNs, online pharmacies, and edtech companies. We recently upgraded Blacklight to scan EU sites, which are governed by more stringent privacy laws, to compare mobile and desktop versions of a site, and more.
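Blacklight’s detection is far more sophisticated, but the basic idea of spotting third-party trackers on a page can be sketched as follows; the tiny blocklist and the naive domain matching here are illustrative assumptions, not Blacklight’s method.

```python
import re
from urllib.parse import urlparse

# Tiny illustrative blocklist; a real scanner uses curated tracker databases.
KNOWN_TRACKER_DOMAINS = {"google-analytics.com", "doubleclick.net", "facebook.net"}

def third_party_trackers(html: str, site_domain: str) -> set:
    """Return known tracker domains loaded by <script src=...> tags on a page."""
    found = set()
    for src in re.findall(r'<script[^>]+src="([^"]+)"', html):
        host = urlparse(src).netloc
        # naive registrable-domain guess: keep the last two labels
        base = ".".join(host.split(".")[-2:])
        if base != site_domain and base in KNOWN_TRACKER_DOMAINS:
            found.add(base)
    return found

page = '''<html><head>
<script src="https://www.google-analytics.com/analytics.js"></script>
<script src="https://example.com/app.js"></script>
</head></html>'''
print(third_party_trackers(page, "example.com"))  # {'google-analytics.com'}
```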

3. Partnership Between Communities and Newsrooms

In late 2022, The Markup exposed how major internet providers systematically give the worst deals to poorer and least-White neighborhoods, a data-driven investigation that eventually won a Philip Meyer Award. Since then, we’ve equipped affected communities to fight that discrimination, teaching readers how to find a better internet deal and how to fact-check companies’ claims to the FCC (over 5,000 of them did, including the Detroit Documenters), and publishing “magic spreadsheets” that allow anyone to analyze internet speeds for disparities (a Chicago community youth group did so, presenting their findings to the mayor). Following our work, the FCC approved rules against digital discrimination and Los Angeles became the first city to outlaw it entirely.
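The kind of disparity check those spreadsheets enable can be sketched in plain Python: group offered plans and compare what share in each group falls below a speed floor. The field names and the 25 Mbps floor are illustrative assumptions, not the published spreadsheets’ exact layout.

```python
from collections import defaultdict

SLOW_MBPS = 25  # assumed floor for a "slow" offer

def slow_share_by_group(offers):
    """offers: list of dicts with 'group' and 'speed_mbps' keys.
    Returns each group's share of offers below the speed floor."""
    totals, slow = defaultdict(int), defaultdict(int)
    for o in offers:
        totals[o["group"]] += 1
        if o["speed_mbps"] < SLOW_MBPS:
            slow[o["group"]] += 1
    return {g: slow[g] / totals[g] for g in totals}

# Same price in both neighborhoods, very different speeds:
offers = [
    {"group": "redlined", "price": 55, "speed_mbps": 5},
    {"group": "redlined", "price": 55, "speed_mbps": 10},
    {"group": "other",    "price": 55, "speed_mbps": 200},
    {"group": "other",    "price": 55, "speed_mbps": 15},
]
shares = slow_share_by_group(offers)
print(shares)  # {'redlined': 1.0, 'other': 0.5}
```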

We sought even closer collaboration with communities for reporting on misinformation, doing stories based on what 30+ Vietnamese immigrants in Oakland, California, told us they needed. We wrote about how most news-related Vietnamese YouTube videos are by influencers translating misinformation, amplified the work of a 67-year-old Vietnamese grandmother who translates articles from mainstream outlets to combat this misinformation, and created a guide for second-generation immigrants on how to talk to loved ones about misinformation. Finally, we put on two workshops for the community: one on misinformation and one on how to spot deepfakes.

Judges Comments

The judges said the winner produced outstanding, actionable journalism that, given its subject matter, could not have come at a more consequential moment.

