At the heart of the debate is anxiety over how to weigh the risks of AI against its reality as an increasingly essential tool for students as they begin their professional lives. Some of the most exciting discussions and debates about AI’s role are happening in Howard’s Music Department.

Department Director Caroll Dashiell has been a longstanding fixture in the music industry and at Howard. A generational alum and the father of three Howard grads — including fellow music professor Christie Dashiell — Caroll’s music career has included playing jazz bass in multiple orchestras and with international performers, as well as recording, producing, and directing. Dashiell also has three decades of academic service at East Carolina University. He sees AI technology affecting all aspects of music and feels a duty to ensure his students are prepared for the future.

“It’s so important,” Dashiell said. “I always say to people who are against it — because we have people who are actually against it — you’ve been using it all the time. Have you ever used your GPS or asked Siri a question?”

As someone who must balance the creative and business aspects of his work, Dashiell has found uses for AI that enhance what he is able to do creatively, rather than stripping the human element away.

“As an artist, I don’t want AI to generate the bass,” he explained. “But as a producer, it always comes down to economics. I look at the orchestra and the strings when we’re doing shows. It used to be that it would be a full string section, but now from a financial standpoint, I can’t afford to pay for a full string section. I can pay for two string players, four at the most, then I’m using AI to double parts. In the performance hall when people are listening, they hear a full orchestra.
It’s four people playing, but it’s enhanced and overdubbed and stacked.”

Dashiell understands the hesitancy of other faculty members and agrees there needs to be a greater understanding of where AI can be helpful in education and what its limitations are. To that end, he formed an exploratory committee with other music faculty members whose attitudes toward AI broadly differ.

Angela Powell Walker, a member of the Music Department’s AI working group alongside professors Matthew Franke and Autumn McDonald, began using AI in her own life at the urging of her husband, a graphic designer who uses the technology on a daily basis. Describing herself as an AI newbie, Powell Walker now uses it for everything from conducting research to crafting quizzes.

“I love using it as a research tool because
I can type in, ‘Can you name five composers from the impressionistic period that wrote songs about flowers?’” she said. “It will get that information out to you in like five seconds. And then you can go do the research accordingly on those composers.”

Powell Walker shares Dashiell’s belief that students should adapt to using the tool responsibly, rather than avoiding it altogether.

“Knowledge is power,” she added. “Students are going to find what makes life the easiest for them. Not just the students, everybody’s going to do that. If the tool’s there, I think it’s important that we don’t necessarily resist it but that we learn about it and how we can make it a successful, useful thing for the kids.”

Musicology professor Matthew Franke, Ph.D., is far more skeptical, having found generative AI tools to be far too unreliable.

“I try to avoid it, honestly, because I’ve just seen too many errors from students who are using it,” Franke said. “I routinely have to fail students because, say, I’ll ask, ‘give me a bibliography on this topic,’ and they’ll [use] ChatGPT. Sixty percent of the bibliography are books and articles that don’t exist.”

He cautions his students against relying on any generative AI outputs and said he’s concerned about the inherently unequal power dynamics between users and the tech companies that run these tools. Franke describes it as a catch-22, especially in relation to authenticity and plagiarism concerns.

“I think students are being told if they don’t jump on, they will be left behind, but if they do jump on board then they have become fake,” he said. “They lose authenticity, and this can also be used against them, so you might as well just be yourself, which is the last thing you can be if you’re using AI.”

This authenticity is especially important at a time when researchers are under intense, often politically motivated scrutiny.
“I tell my students if you use AI, this is another thing that people who are hostile can point to and [say] ‘go look at this,’” he explained, concerned that overuse or misuse of AI could come back to hurt students later in their academic careers, much like accusations of plagiarism.

On the other end of the spectrum, music and business professor Autumn McDonald has fully incorporated AI into her professional life, using it in everything from anthropological research to crafting presentations for her market research company, ADM Insights & Strategy. Her courses, such as marketing for the arts, prepare students for the realities of approaching the arts as a business. To her, it is unfair to ignore AI in the classroom when it is already shaping the professional world.

“If I teach an entire semester and don’t have the students engage with AI during my course, then I am doing a disservice to those students,” said McDonald. “I’m not sufficiently preparing them for the world that awaits them,
so I am very intentional and thoughtful about having certain assignments in which they are required to use AI.”

As someone teaching the next generation of Black entrepreneurs, McDonald is more concerned with the consequences of not utilizing AI and widening the technological gap faced by Black communities.

“The fear of being left behind is a valid fear if we look at ways that technology has entered the landscape over time,” she explained. “We know, for example, that there has been a gap between Afro-descendant households and non-Afro-descendant households in their access to laptops in the home. During the pandemic, Black households did not necessarily have access to high-speed Wi-Fi or personal laptops in the same way that white households did. That creates a learning gap, and a gap in access to information. Many of us want to be sure that there is not a similar gap that comes to bear as it pertains to this current landscape of artificial intelligence.”
Being the Bridge
The discussions in the Music Department — and across campus — are emblematic of Howard’s unique position as a research institution. As the only historically Black university with a Research One (R1) classification, Howard serves not only as a leader in science and technology, but also as a bastion of culture and advocacy for people who have historically been left out of the conversation around technological progress, even as they bear the brunt of its unintended effects.

For Dr. Williams, the answer lies in embracing the community during the development process. For Project Elevate Black Voices, this meant compensating participants, hosting group discussions to hear concerns about the technology from the outset, and updating them via newsletter after the data set was completed.

“The goal is to be a bridge,” said Williams. “At Howard, we are in the community; that’s why we had community activations as a part of the recruiting mechanism. We wanted to make sure we had that touch point because we didn’t want to just be passive, to have people say, ‘you’re just taking our data.’”

Instead, empowerment and ownership are required to ensure the massive potential benefits and risks of AI are shared equitably. For students and faculty alike, this means taking a “knowledge is power” approach. For researchers, it means taking an active role in analyzing how technology affects everyone.

“You naturally have to be in the community talking to people and building relationships. Trust is a big part of that,” said Williams. “I think that once researchers and technologists get away from the desk and [get] outside talking to organizations and communities, you’ll find it becomes easier to do.”
ELEVATING VOICES Above: Dr. Lucretia Williams leads a pioneering effort to ensure that AI recognizes Black voices. Photo courtesy of Lucretia Williams. Below: Angela Powell Walker is helping students use AI responsibly. Photo courtesy of Justin D. Knight.
Howard Magazine
Fall 2025