Former CNN journalist Jim Acosta has sparked controversy by conducting an interview with an artificial intelligence recreation of a deceased Parkland shooting victim to advocate for gun control measures. The interview, which Acosta promoted as “one of a kind” and a “show you don’t want to miss,” featured an AI recreation of Joaquin Oliver, who was killed in the 2018 shooting at Marjory Stoneman Douglas High School.
During the segment, broadcast from his home studio, Acosta engaged with the AI-generated avatar in what many viewers found to be a disturbing exchange. The interaction began with Acosta asking about Oliver’s death, to which the AI responded in a notably artificial voice about being “taken too soon due to gun violence while at school.”
The conversation continued with Acosta questioning the AI about potential solutions to gun violence. The digital recreation offered predictable AI-generated responses, suggesting a combination of stricter gun control legislation, mental health support, and community engagement initiatives. The interview also included seemingly out-of-place questions about the NBA and Star Wars.
Acosta defended the controversial segment, stating it was meant to inspire hope and continue the fight against gun violence. He acknowledged the potentially startling nature of the AI recreation but emphasized it was “an expression of love from the Oliver family for their son.”
The segment included a follow-up interview with Oliver’s father, who had previously utilized similar AI technology in 2024 to pressure Congress on gun control legislation. In that earlier instance, the AI likeness delivered a message highlighting congressional inaction on gun violence in the years following the Parkland tragedy.
The interview has drawn significant criticism across social media platforms, with many viewers expressing concern about the ethical implications of using AI to recreate deceased individuals. Parkland residents and others have condemned the segment as “gross” and “morally evil,” with some calling for legislation to protect the memory of the deceased from such technological exploitation.
The controversial segment comes shortly after Acosta announced his departure from traditional media to pursue independent journalism. He revealed on CNN that he had declined an offer involving reduced airtime and compensation, opting instead to launch his own Substack platform, “The Jim Acosta Show.”
The backlash against the AI interview has been particularly intense, with critics pointing out the problematic nature of scripting responses for a deceased person, even with family consent. Many viewers expressed discomfort with the format, suggesting it undermined the seriousness of the gun violence issue rather than advancing meaningful dialogue.
The incident reflects broader concerns about the intersection of artificial intelligence and journalism, raising questions about ethical boundaries and appropriate uses of AI technology in covering sensitive topics. While the Oliver family supported the project, the public response indicates significant discomfort with this novel approach to advocacy journalism.
The controversy adds to ongoing discussions about media ethics and the role of artificial intelligence in shaping public discourse around critical social issues. As AI technology continues to evolve, this incident may serve as a cautionary tale about the limits of technological innovation in addressing deeply personal and tragic events.
