
the ways of the Global North in order to participate. But there is a lot of knowledge embedded in how other cultures engage and utilize their own indigenous technologies. As a result of this flattening, we are losing really important cultural artifacts, like mother tongue.”

Balancing Privacy and Community

While the work of Nias and Project Elevate Black Voices shows that AI can be used to aid and uplift, the potential to harm users and cultures is difficult to ignore. David F. Green, Ph.D., associate chair of writing in the Literature and Writing Department, worked on the Project Elevate Black Voices team to help researchers identify and categorize aspects of AAE dialects. Green, who has a background in African American, cultural, hip-hop, and technological rhetoric, is a member of the MLA-CCCC Joint Task Force on Writing and AI. The task force was created to develop resources, guidelines, and professional standards on AI and writing. Through working papers, guidelines, and collections of teachers’ personal experiences, the task force is providing a framework for educators to teach ethical and responsible AI use.

One of Green’s chief concerns with AI is how it can affect communication. “It flattens language,” he explained. “If you’re only drawing on a limited set of authors, writers, and thinkers, it limits the possibilities and the capabilities of how expression occurs. And so, cultural influences disappear. The unique identity markers begin to disappear.”

This flattening is a major reason that research such as Project Elevate Black Voices is essential. Users, especially Black users and others with distinct cultural ways of speaking and expressing themselves, may not hear or see themselves reflected in the technology and may self-censor while using it, further ingraining biases. This cycle can have real-world consequences, as documented in a 2024 study published in Nature titled “AI Generates Covertly Racist Decisions About People Based on Their Dialect.” The study found that current language models “embody covert racism in the form of dialect prejudice, exhibiting racial linguistic stereotypes about speakers of African American English that are more negative than any human stereotypes about African Americans ever experimentally recorded.” As new AI tools are proposed in everything from housing to hiring to criminal sentencing, there is a real risk that they will recreate or even exacerbate discrimination.

Nias made a similar point while discussing a presentation by Pennsylvania social workers on how AI is being used in a recommendation system that helps determine whether children should be removed from their homes. “For years, these systems have been a part of our judicial system,” said Nias. “It’s been present; we just didn’t have a grounding in what it meant and what it could do, and I think now people are a little more aware of that power and how it can harm or help.”

Even as researchers find ways in which AI can build up communities, the technology inescapably requires massive amounts of data, such as art, music, films, research papers, and social media posts. Harvesting this data presents major concerns related to privacy and ownership. “There’s a mad dash to gain access to more and more data — and that really means more and more of people’s writings — to begin to use datasets to train the technology,” said Green. “A lot of times people are not aware of the ways that their public or published writings become a part of these data sets because you’ve turned over your rights to that information. Those long and insightful Facebook posts or blog posts published to some of these companies’ websites can become fair game that can be used in ways you may not have intended, with a purpose you may not have intended as well.”

Environmental Demands of Digital Devices

AI’s impact isn’t limited to digital spaces. Companies like Amazon and Google operate legions of data centers, each housing thousands of servers that consume enormous amounts of electricity, along with the water used to maintain their cooling systems. According to research from the Massachusetts Institute of Technology, the electricity consumption of these centers is expected to approach 1,050 terawatt-hours by 2026, which would make data centers the fifth-largest consumer of electricity in the world.

“The data servers and the technologies themselves use massive amounts of energy to do the work that they continue to do for folks,” said Green. “The impact on the environment includes ongoing global warming concerns. The massive [amount of] heat and the amount of energy that’s produced and used is, in some ways, similar to a nuclear power plant in terms of just the amount of disruption it does.”

This disruption is not dispersed equally. Across the country, coal, natural gas, and other fossil-fuel “dirty energy” plants are disproportionately sited in economically impoverished, rural Black communities. In Colleton County, South Carolina, for example, a 2024 proposal would repurpose a previously shuttered coal plant into a natural gas plant, which could threaten air quality for the predominantly Black residents of the area. As reported in Capital B News, the project is intended to generate power for the state’s expanding data centers. The power these data centers require may also raise local residents’ electricity bills, disproportionately impacting Black households, where electricity takes up, on average, a larger share of the household budget, as noted by the ACEEE and the Department of Energy.

Like any digital technology, from laptops to the phones in our pockets, new AI-reliant tools depend on rare-earth elements and other critical minerals that are essential to modern electronics manufacturing but demand immense mining and processing of raw ore before they can be used. Extracting and refining these materials carries significant environmental risks, many of them shouldered by people in the Global South.

For recent Howard graduate Becca Haynesworth, the technology is inextricably tied to colonization. “When people make assumptions that AI just appears out of nowhere on our laptops, we have to really trace the footsteps of this piece of technology,” Haynesworth explained. “When you do, it’s rooted in material like cobalt, which is very abundant in the Democratic Republic of Congo.” Cobalt is a critical mineral used in semiconductors and in the lithium-ion batteries that power everything from phones to laptops to electric vehicles. There is evidence that the Congolese cobalt industry has led to water pollution, forced displacement, and slavery, as reported by NPR, Amnesty International, and African Resources Watch. These human costs did not begin with AI’s growth, but they place it in a long history of how technological progress for some can mean exploitation, often unseen, for many.

Next-Gen Concerns

There has been no shortage of headlines on generative AI’s impact on education, painting a picture of college students readily replacing their own critical thinking with AI-generated work. While it would be dishonest to say there are no concerns about how the technology is affecting young learners, among Howard’s students the conversation is more nuanced.

“It might be unpopular, but I’m very anti-AI,” said rising junior Olivia Ocran, an English major with a minor in education. “At least in the writing world, I don’t see it having a place because of the way it’s designed. It’s taking what it finds on the internet and it’s regurgitating it out. It’s stealing the work and the years that people have put into their writing that they have found the courage to share.”

Ocran has seen the impacts of AI on younger students first-hand during observation hours in middle and high school classrooms. She worries that young students today are losing the skills that made her pursue writing and teaching in the first place, such as forming an argument, conducting research, and going to the library to ask for information.

In her education courses, Ocran heard conversations about how AI could be used as an educational tool. However, she remains unconvinced. “We have created well-educated, successful people without AI,” she said. “I feel like because students have gotten used to abusing it, keeping it in the classroom is not going to be any help at all.”

Even among students who regularly engage with AI tools, there is apprehension about just how quickly the technology has changed our everyday lives. Computer engineering sophomore Kamili Campbell works alongside Dr. Nias at the Human-Centered AI Institute. Campbell, an international student who has been interested in coding since high school, credits the immense diversity of cultures she experienced growing up in Trinidad and Tobago and then coming to the United States as the catalyst for her interest in the work of Nias’ Brave IDEAS Lab.

In addition to her computer engineering work, Campbell has embraced generative AI in her organizational work as co-president of the Howard chapter of the American Red Cross, where she uses it to help refine ideas for outreach events. For Campbell, AI is a brainstorming tool that demands a measure of personal responsibility.

Campbell also has long-term concerns about AI’s impact on her prospective career field, just as she’s beginning to enter it. Certain sectors, such as computer programming, are facing steep declines in jobs, and the role of the computer scientist, along with the skills essential to it, is shifting. “There are companies [that are] getting mid-level engineering scripts from AI technology like ALO with no human employees, just straight AI,” Campbell said. “It’s great because of the efficiency; they definitely decrease human error. As somebody who’s working hard in school to become a mid-level engineer, what do I do?”

Out of the Lab and Into the Classroom (and the Recording Studio, and the Office, and the ...)

For Howard professors and students alike, AI has quickly become a part of daily life, and how best to approach its entrance into the classroom is far from settled.

FACING REALITY Autumn McDonald incorporates AI as she teaches students the business of music and the arts. Photo courtesy Autumn McDonald.
