Is the human still in the loop?
Image generated by OpenAI’s DALL-E via ChatGPT
Lauren is a thought leader in Digital Journal’s Insight Forum.
Accuracy, fairness, and independence have long defined journalistic standards. As newsrooms embrace data-driven processes, human judgment and expertise are increasingly confronted by algorithmic systems that prioritize optimization over those traditional values.
The rise of algorithms has reshaped journalism, raising urgent questions about journalistic authority and who controls the narrative. Journalistic language is shifting from informing audiences to appeasing search engines, and traditional values now bow to production demands.
With increased access to information comes increased competition to be found, trusted, and read. Human judgment, independence, and expertise must now contend with opaque algorithmic requirements simply to get a journalist’s perspective front and centre.
What difference, if any, do these human standards make when machines become essential members of the news-production team? How are these principles transformed when algorithms are introduced into journalism, influencing everything from story selection to content distribution?
Who decides what we see?
From story selection to distribution, algorithms now shape every stage of the news cycle, guided by audience-engagement metrics. While technology is often perceived as objective, algorithms are fundamentally influenced by the values of their human creators and the datasets those creators select.
This becomes particularly controversial when such algorithms are used to prioritize content in the newsroom based on predicted audience engagement.
Social media platforms have become gatekeepers of news, with algorithms deciding what content gets seen. This reliance forces newsrooms to prioritize algorithm-friendly content, often at the expense of journalistic integrity and independence.
Sensationalism prioritized over depth, or even fact, together with ever more personalized feeds, becomes an unintended consequence of adopting a supposedly “objective” technology, slowly but surely eroding journalistic integrity in favour of click-driven content.
The battle for impartiality
The algorithmic landscape’s first substantive challenge to journalistic standards lies in exacerbating an already complex tension: the aspiration for impartiality versus the need for engagement.
With algorithms dictating publication and distribution, journalists face mounting pressure to craft stories that both inform and captivate. This balancing act is further complicated by the ways algorithms shape not just distribution but the content itself: the articles we read are increasingly produced by large language models or assembled through algorithmic compilation, a trend that makes the impact of algorithms more concerning still.
Automated journalism excels at summarizing data but lacks the nuance, depth, and critical perspective that define high-quality reporting.
This naturally fuels fears of journalists being replaced and diverse human perspectives being eroded. In a surprising turn, it is precisely human bias, the lack of perceived objectivity, that gives journalism its edge over machine-generated content.
It is human judgment, contextual understanding, and the ethical considerations of journalists that algorithms cannot replicate. Consider, for example, a breaking news story about a local protest. An algorithm might prioritize articles based on engagement metrics, boosting headlines that feature sensational phrases like “clashes erupt” or “violent riots.” A journalist on the ground, however, brings a nuanced perspective: they can recognize the protest’s underlying social and economic issues, interview participants to understand their motivations, and ensure that the coverage reflects diverse voices, not just the most clickable angles.
Algorithms lack the ability to question the narratives they promote or to see beyond the data. This is where human judgment is indispensable: it ensures that the story isn’t reduced to a sensational headline but instead provides readers with a comprehensive understanding of the event, fostering informed discourse rather than polarization.
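To make that contrast concrete, here is a deliberately simplified, hypothetical sketch of engagement-based ranking. The headlines, click counts, keyword list, and scoring weights are invented for illustration and do not represent any platform’s actual system; the point is only that nothing in such a scoring function asks whether the most clickable framing is the most accurate or complete one.

```python
# Hypothetical sketch: rank stories purely on predicted engagement.
# All headlines, numbers, and weights below are illustrative assumptions.

SENSATIONAL_TERMS = {"clashes erupt", "violent riots", "chaos", "outrage"}

def engagement_score(headline: str, predicted_clicks: int) -> float:
    """Score a story on predicted clicks, with a bonus for sensational phrasing."""
    score = float(predicted_clicks)
    if any(term in headline.lower() for term in SENSATIONAL_TERMS):
        score *= 1.5  # sensational wording is rewarded, never questioned
    return score

stories = [
    ("Clashes erupt at downtown protest", 12_000),
    ("Why residents marched: housing costs and lost jobs", 4_000),
    ("Organizers and police describe a largely peaceful day", 3_500),
]

# The most clickable framing wins, regardless of which story offers the most context.
for headline, clicks in sorted(stories, key=lambda s: engagement_score(*s), reverse=True):
    print(f"{engagement_score(headline, clicks):>10.1f}  {headline}")
```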
Maintaining this position amid the pull of click-based sensationalism then becomes a challenge to journalistic authority and authenticity: is it still journalism if accuracy and the public interest are subordinated to engagement-based models?
The transparency problem
Transparency and disclosure have emerged as central themes in this debate. As algorithms become more embedded in journalism, the need to openly acknowledge their influence on news production is rising to the forefront.
Despite this growing influence, the industry struggles to disclose algorithmic involvement to audiences. Without proper disclosure, the line between human editorial judgment and algorithmic decision-making blurs, raising serious ethical questions.
By now we have all heard that bias and transparency are not exactly strong points for algorithms, especially when they are trained on historical (and historically biased) data. Large data models and artificial intelligence act as a magnifying glass, perpetuating and amplifying existing societal biases and inequalities.
Disclosure of the use of algorithms and AI to generate content and push it to readers is not yet regulated, which leaves it to publishers and their readership to determine who is accountable for transparency and for the erosion of the human in the loop.
To me, the human in the loop represents more than just oversight — it’s about maintaining a vital connection between technology and humanity. It’s the journalist’s ability to apply ethics, context, and critical thinking to ensure the stories we consume reflect the complexity of the world, not just the calculations of an algorithm and the priorities of the people who created it.
Without this human element, journalism risks losing the empathy and insight that make it more than just a delivery system for information; that human element is the soul of storytelling itself.
Perception is reality in journalism
We are now living in an era where perception often is reality, especially in journalism and media. The way stories are framed, the headlines we scroll past, and the platforms that surface them shape what audiences believe to be true.
Algorithms, designed to optimize for engagement, have exacerbated this phenomenon by prioritizing content that aligns with existing biases, incites strong emotions, or simply keeps users scrolling. In this system, the loudest voices and most sensational stories dominate, while nuance and context are often left behind. The danger is clear: when algorithms amplify perception over fact, they don’t just distort the narrative—they redefine it entirely.
Consider the rise of “fake news” during election cycles or the selective visibility of certain voices in social movements. An algorithm prioritizing divisive content because it drives clicks might inadvertently tilt public perception, making a fringe viewpoint appear mainstream or a misleading headline go viral. In this reality, perception is not just shaped by the truth — it becomes the truth for many consumers.
Without human judgment to interrogate these narratives, journalism risks becoming a tool of manipulation rather than a force for accountability. The human in the loop, then, isn’t just a safeguard; it’s the last line of defense against a world where algorithms define reality.
Redefining journalism in the algorithmic age
The key question isn’t what to do about journalism’s changing landscape — it’s how these shifts reshape news-reporting standards. When algorithms dictate content visibility, core journalistic principles like accuracy, independence, and public trust are forced to adapt. Can these values survive in an engagement-driven system, or must they evolve? As technology continues to dominate, journalism faces a crossroads: reconcile traditional ideals with algorithmic realities, or risk losing its relevance and authority.
Is the human still in the loop?