Content provided by the Plutopia News Network. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by the Plutopia News Network or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://hu.player.fm/legal.

Sophie Nightingale: Our Minds on Digital Technology

1:02:07
Manage episode 517601026 series 2292604

The Plutopia podcast hosts Dr. Sophie Nightingale, a psychologist at Lancaster University, to discuss how digital technology — especially social media, generative AI, and the constant flow of online information — shapes human memory, judgment, and vulnerability to deception. She explains that people struggle to critically evaluate the sheer volume of information they encounter, so they’re more likely to accept content that aligns with their preexisting beliefs, and this helps misinformation spread. Nightingale traces her research from early work on how taking photos can impair memory to current studies showing that most people can spot fake or AI-generated images only slightly better than chance, and that even training improves performance only modestly. She and the hosts dig into the limits of AI “guardrails,” the uneven global landscape of AI regulation, the rise of misogynistic online spaces, and the troubling growth of AI-enabled nonconsensual intimate imagery, arguing that legal reform, platform accountability, and public education are all needed to reduce harm.

One of the things that tends to make people quite susceptible is just information overload, purely that we live in an age where we are accessing so much information all the time we can’t possibly interpret, or critically think about, everything. So we might well just accept things that we wouldn’t otherwise. There’s quite a lot of evidence showing that’s especially the case, if that information coincides with your pre-existing beliefs. So for example, if I happen to be a huge fan of Donald Trump, let’s say, and I saw some misinformation around Donald Trump that was positive about him, then I would probably be more likely to believe that than somebody who was not a fan of Donald Trump already, if you see what I mean. So those biases definitely exist. There’s a lot of evidence showing that. And then I think, you know, it kind of comes back as well to — if you want to believe something, you will.


274 episodes


