The New Gatekeepers: Social Media, AI, and the Power to Shape Truth

Session Summary

Important Quotations

"Suicide rates have increased not only among teens but also among tweens, children as young as 10. We cannot afford to put our basic humanity, our children, at risk by rushing products into their hands that we know are not safe."
Julie Scelfo
"When all of our preferences are perfectly met and all our systems are customized to us, we end up in a cocoon of cognitive bias, confirmation bias, and dopamine reinforcement. It becomes very difficult to bring people back into the cooperative environment needed to drive change anywhere."
Sophie Schmidt
"Over the past decade and a half, social media has fundamentally reshaped how we get information, what information we get, and how we understand the truth. Generative AI is changing that again, shifting even more power into the hands of a few large technology companies."
Jason Dean
"When people only see information from one side, they become more extreme in their beliefs and less tolerant of other people and ideas."
John Gable

Key Takeaways

  • AI Accountability Crisis: AI is fundamentally changing information creation – most AI-generated content is now created by tech companies themselves, unlike social media’s user-generated model. This creates new accountability challenges as AI handles both content creation and moderation, “decoupling the mechanism for accountability.” 

  • The Personalization Trap: Hyper-personalized AI experiences create “cocoons of cognitive bias, confirmation bias, and dopamine” that make cooperative societal engagement increasingly difficult. This threatens our ability to bring people together for collective problem-solving.

  • Human Connection as Technology: Research shows that structured conversations between people with opposing viewpoints create lasting attitude changes that remain measurable two months later. Human connection itself should be viewed as a technology that counters digital isolation.

  • Suppression vs. Inoculation: Content moderation fails in the long term because “information will find a new home.” “Inoculation” strategies that help people identify and reject misinformation independently prove more effective than suppression approaches.

Action Items

  • Educational Leaders & Parents: Call for transparency before AI implementation in schools. Join advocacy organizations like MAMA to collectively pressure for better standards. Advocate for safeguards in AI products designed for children.

  • Policymakers & Government: Implement comprehensive safeguards for AI products that interact with children. Establish independent monitoring systems for AI safety compliance. Create accountability mechanisms for companies causing unnecessary harm.

  • Communities & Civil Society: Develop community-level inoculation programs that teach misinformation identification. Foster real-world connection opportunities through schools, sports teams, and community organizations. Support dialogue initiatives that bring different perspectives together.

  • Technology Companies & Private Sector: Prioritize “people first versus technology first” approaches. Build solutions faster than government regulation, focusing on bridging societal divides. Develop transparent bias detection systems that promote balanced information exposure. Create platforms that enable real relationships rather than superficial interactions.

  • Media & Information Literacy: Shift focus from content suppression to user empowerment. Develop comprehensive digital literacy programs that address multifactorial belief formation. Create tools that help people navigate personalized information environments more critically.
