#eyetracking — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #eyetracking, aggregated by home.social.
-
DATE: May 13, 2026 at 08:00AM
SOURCE: PSYPOST.ORG
** Research quality varies widely, from fantastic to small exploratory studies. Please check research methods when conclusions are very important to you. **
-------------------------------------------------
TITLE: Your eyes reveal how strongly you believe fake news before you even make a choice
A recent study published in the Proceedings of the National Academy of Sciences suggests that our preexisting beliefs deeply influence how we learn new information in our daily lives. By tracking eye movements and decision-making during a simulated news evaluation game, scientists found that people readily learn from rewards that match their existing views but struggle to adapt when rewards challenge their preconceived notions.
These findings provide evidence for the cognitive pathways that allow misinformation to persist in the modern digital landscape. This dynamic explains why simply presenting factual corrections often fails to change minds.
People increasingly rely on social media platforms for their daily news consumption, where automated algorithms tend to filter content to match users’ existing preferences. This digital environment provides a fertile ground for disinformation to spread rapidly across large populations, raising the question of why individuals continue to believe false content even when objective fact-checking is readily available.
“I began seriously considering this line of research in 2021, after witnessing firsthand the damage misinformation caused during the COVID-19 pandemic, particularly in relation to the vaccination campaign,” said study author Stefano Lasaponara, an associate professor in the department of psychology at Sapienza University of Rome. “That experience led me to wonder to what extent fake news might affect not only what people believe, but also how they learn from feedback and experience.”
Lasaponara and his colleagues sought to understand how a person’s preexisting judgments and internal confidence interact with the way they learn from external feedback. They designed the study to test whether our tendency to favor belief-consistent information might be rooted in basic, everyday learning mechanisms. By examining these fundamental learning processes, the authors hoped to uncover why people find it so difficult to update their opinions when faced with misleading news stories.
To explore these questions, the scientists recruited a final sample of 28 healthy young adults, aged between 18 and 36, to participate in a detailed three-part experiment. In the first phase, participants viewed a set of 324 news headlines that had recently circulated on popular social media platforms. Half of these selected headlines contained real news events, and the other half contained entirely false information. Participants had to read each headline on a computer screen and judge whether it was true or fake.
They also wagered a virtual amount of money, ranging from zero to 99 cents, on their provided answer. This financial bet served as a measurable indicator of their internal confidence regarding each specific news item. Based on these answers, the scientists grouped the headlines into four personalized categories for each individual participant. These customized categories included news judged as true with high confidence, true with low confidence, fake with high confidence, and fake with low confidence.
During this phase, the researchers used specialized eye-tracking glasses to measure the participants’ pupil dilation as they read. Pupil dilation is an involuntary physical response that indicates mental effort, focused attention, and physiological arousal. Measuring this subtle response allowed the team to track brain engagement in real time without interrupting the participants.
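Pupil traces are typically analyzed relative to a pre-stimulus baseline rather than as raw diameters. As a rough illustration of that idea (a minimal sketch, not the authors' actual pipeline; window size and units are assumptions), a subtractive baseline correction might look like:

```python
def baseline_correct(trace, baseline_samples=10):
    """Subtractive baseline correction for one trial's pupil-diameter trace.

    The mean of the first `baseline_samples` readings (pre-stimulus) is
    subtracted from every sample, so positive values indicate dilation
    relative to baseline. Sample count and units here are illustrative.
    """
    baseline = sum(trace[:baseline_samples]) / baseline_samples
    return [sample - baseline for sample in trace]

# Example: a flat 3.0 mm baseline followed by a 0.2 mm dilation
trial = [3.0] * 10 + [3.2] * 5
corrected = baseline_correct(trial)
```

Correcting each trial against its own baseline is what lets dilation be compared across trials and participants despite individual differences in resting pupil size.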
In the second phase, the researchers tested how well participants could learn new rules based on their previous judgments. Participants played a computer game where they had to choose between pairs of the headlines they had just rated in the first phase. The goal was to select the specific headline that would win them a 20-cent virtual monetary reward. Unknown to the participants, the rewards were not randomly assigned throughout the game.
In different rounds of the game, an 83 percent chance of winning the reward was tied to specific categories established during the initial evaluation. For example, in one round, picking headlines the participant had previously judged as true provided the reward; in another round, picking headlines judged as fake did. Other rounds rewarded choices based on high or low confidence, and a single round gave rewards entirely at random to serve as a baseline comparison.
The third and final phase tested whether the learning game had changed the participants’ minds regarding the news items. The scientists showed the participants the original headlines again, along with their initial true or false judgments and their associated confidence wagers. Participants were given the option to either confirm their original judgment or change their mind completely. If their final answer matched the actual real or fake status of the news, they kept their wagered money as a final payout.
The outcomes of the learning phase showed that participants learned very differently depending on the hidden rules of the computer game. When the game rewarded participants for choosing headlines they already believed to be true, they learned the winning strategy quickly and earned high scores. On the other hand, performance dropped when the game rewarded them for picking headlines they believed were fake. Participants also struggled to figure out the game’s hidden rules when rewards were tied to their confidence levels rather than their beliefs about truth.
“One important takeaway is that our prior beliefs can begin shaping our decisions even before we explicitly express a judgment,” Lasaponara said. “In our study, these pre-existing convictions were strong enough to influence learning itself. More broadly, this suggests that we should approach new information as critically and as openly as possible, trying, when we can, to evaluate it without immediately filtering it through our preconceptions.”
To understand the underlying mental strategies at play, the scientists used computational modeling, which involves creating mathematical simulations of human decision-making processes. The models revealed that when the rewards matched a participant’s belief in the truth, they used broad, generalized rules to make their choices.
When the rewards no longer matched their sense of truth, the participants abandoned these broad generalization strategies. Instead, they reverted to simply reacting to positive and negative feedback on a trial-by-trial basis, which proved a much less effective way to navigate the game.
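The trial-by-trial feedback learning described here is commonly modeled with a delta-rule (Rescorla-Wagner-style) value update. As a hedged sketch of that general approach, not the authors' fitted model, and with all parameter values invented for illustration, a learner facing the study's 83 percent reward contingency could be simulated like this:

```python
import random

def simulate_learner(n_trials=1000, p_reward=0.83, alpha=0.1, epsilon=0.2, seed=0):
    """Epsilon-greedy delta-rule learner choosing between two headline categories.

    Category 0 pays out with probability p_reward (83% in the study),
    category 1 with 1 - p_reward. After each choice, the chosen option's
    value estimate is nudged toward the outcome:
        V <- V + alpha * (reward - V)
    All parameter values here are illustrative, not fitted to data.
    """
    rng = random.Random(seed)
    values = [0.5, 0.5]  # initial value estimates for the two categories
    for _ in range(n_trials):
        if rng.random() < epsilon:
            choice = rng.randrange(2)            # occasional exploration
        else:
            choice = 0 if values[0] >= values[1] else 1  # pick higher-valued option
        p = p_reward if choice == 0 else 1.0 - p_reward
        reward = 1.0 if rng.random() < p else 0.0
        values[choice] += alpha * (reward - values[choice])  # prediction-error update
    return values

values = simulate_learner()
```

After enough trials the learned values approximate the underlying reward probabilities, which illustrates why a rule that contradicts prior beliefs forces slow, feedback-by-feedback relearning rather than quick generalization.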
The eye-tracking data provided physical evidence that our beliefs engage our nervous systems before we even make a conscious choice. In the initial phase, participants’ pupils dilated more when they were looking at headlines they would later judge with high confidence. This noticeable dilation suggests that strong subjective beliefs trigger an early physical arousal response within the body. During the learning phase, pupils dilated when participants faced a mental conflict, such as having to choose between a strongly held belief and a competing reward signal.
“I expected to find pupillary effects related to the moment of decision itself, but I did not expect to observe them at an earlier stage, during the formation of a belief-consistent choice tendency,” Lasaponara noted. “That was particularly interesting because it suggests that the influence of prior beliefs may begin unfolding before an overt response is made.”
When participants received feedback that went against their established beliefs, their pupils also widened, indicating cognitive surprise and an increased mental load. In the final feedback phase, participants showed a strong tendency to stick to their original opinions about the headlines. They rarely changed their minds, especially if they had placed a high confidence wager during the very first phase of the experiment.
Interestingly, high confidence made people resistant to changing their minds regardless of whether the headline was actually true or false in reality. Participants were slightly more willing to update their beliefs if they had initially expressed low confidence in their judgment. While the study provides detailed evidence on how subjective beliefs shape learning, there are potential misinterpretations and limitations to keep in mind.
Because the study required participants to experience all the different reward rules back to back, the learned rules from one round might have affected how they behaved in the next round. “An important caveat is that this study does not yet allow us to make strong claims about correcting misinformation, or about when and how people truly change their minds after learning,” Lasaponara explained. “Our results show that prior beliefs can bias reinforcement learning, but they do not yet tell us how to reliably undo that bias. This is something we are currently addressing in follow-up work.”
The experiment also relied exclusively on political and social news headlines, meaning these learning patterns might look different if the topics were neutral or completely unrelated to current events. Future research could expand on these physiological findings by using different types of information to see if this learning behavior applies to other areas of human life.
“Our broader goal is not only to better understand why people believe fake news, but also to identify the conditions under which misinformation becomes less effective,” Lasaponara added. “In follow-up studies, we are investigating whether different reinforcement structures can lead to varying degrees of belief updating and how computational models can help explain when people remain resistant to correction and when they become more flexible.”
Scientists could also design experiments that explicitly present participants with direct evidence contradicting their beliefs, rather than just changing a computer game’s reward rules. This alternative approach would help map out the exact conditions that might finally encourage people to update their most stubborn opinions.
“The title is also a small nod to Metallica, whom I am a big fan of,” Lasaponara added. “More importantly, this work would not have been possible without my co-authors, especially Valentina Piga and Silvana Lozito, whose contributions were fundamental to the project.”
The study, “Eye of the beholder: Pupillary response reflects how subjective prior beliefs shape reinforcement learning with fake news,” was authored by Silvana Lozito, Valentina Piga, Sara Lo Presti, Angelica Scuderi, Fabrizio Doricchi, Massimo Silvetti, and Stefano Lasaponara.
-------------------------------------------------
DAILY EMAIL DIGEST: Email [email protected] -- no subject or message needed.
Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org
Unofficial Psychology Today Xitter-to-toot feed: Psych Today Unofficial Bot @PTUnofficialBot
NYU Information for Practice puts out 400-500 good-quality health-related research posts per week, but it's too much for many people, so that bot is limited to just subscribers. You can read it or subscribe at @PsychResearchBot
Since 1991 The National Psychologist has focused on keeping practicing psychologists current with news, information and items of interest. Check them out for more free articles, resources, and subscription information: https://www.nationalpsychologist.com
EMAIL DAILY DIGEST OF RSS FEEDS -- SUBSCRIBE: http://subscribe-article-digests.clinicians-exchange.org
READ ONLINE: http://read-the-rss-mega-archive.clinicians-exchange.org
It's primitive... but it works... mostly...
-------------------------------------------------
#psychology #counseling #socialwork #psychotherapy @psychotherapist @psychotherapists @psychology @socialpsych @socialwork @psychiatry #mentalhealth #psychiatry #healthcare #depression #psychotherapist #FakeNews #BeliefDrivenLearning #EyeTracking #PupillaryResponse #Misinformation #ReinforcementLearning #CognitiveBias #MediaLiteracy #FactCheck #NewsResearch
-
New global research shows eye movements reveal how native languages shape reading
#Science #Language #Reading #Immigration #Education #Research #Multilingual #Literacy #EyeTracking #Learning #GlobalStudies #Migration #LanguageSkills #EyeMovement
https://the-14.com/new-global-research-shows-eye-movements-reveal-how-native-languages-shape-reading/
-
# Hospice project finally moving forward
We finally got hold of the hospice's technical service provider to connect us to the nurse call system. He said it won't be entirely straightforward; apparently it is not a simple DECT system. (Maybe press the button with servos after all?)
# Tutorial for Optikey
We are currently preparing our first big tutorial: an introduction to Optikey, in video and text form.
-
How do student teachers and experienced teachers differ in detecting classroom disruptions? Ann-Sophie Grub, Antje Biermann, and Roland Brünken analyze eye-tracking studies on the professional perception of classroom management, with a focus on "noticing" and "reasoning".
#ClassroomManagement #Lehrerbildung #ProfessionalVision #EyeTracking #Bildungsforschung
#peDOCS
⬇️
https://www.pedocs.de/frontdoor.php?source_opus=21187&pk_campaign=SocMed&pk_kwd=mastodonsource_opus=21704
-
The GPN was great! Thank you all! 💜
Here is our talk:
FOSS eye tracking for people with ALS
Also, if you want to get involved, please get in touch: https://eyes-on-disabilities.org/de/06-miscellaneous/contact.html
What is needed, for example:
- Writing texts
- Trying out setups
- Making the software simpler
- Finding more software and hardware
- Solving head movement
- 3D modeling and printing
- Mobile dev
-
We are giving a talk at GPN 🤩: https://cfp.gulas.ch/gpn23/talk/QXC9UX/
2025-06-22 13:45–14:05, ZKM Medientheater
Also, next month we are visiting our first hospice.
#gpn23 #EyesOnDisabilities #eyetracking #headtracking #disabilities #inklusion
-
»Here’s what’s inside #Meta’s experimental new smart glasses: with advanced #eyetracking and #handtracking capabilities.« https://www.theverge.com/news/679707/meta-aria-gen-2-upgrades-specs-ai?ARglass.es #ARglasses #SmartGlasses #AugmentedReality #AR
-
GREAT NEWS:
We will soon be making regular visits to an everyday-life and care assistance service. We want to better understand the lives of people affected by ALS and MS, and develop our eye-tracking solutions further together with them. We are really looking forward to the people we will meet and hope we can help them.
The IFB-Stiftung made this contact possible. Thank you for the trust: https://ifb-stiftung.de/
#EyesOnDisabilities #eyetracking #headtracking #disabilities #inklusion
-
Virtual Reality Experiment Generator For Research.
#Vizard #SightLabVR
#EyeTracking #HandTracking #FaceTracking
https://www.worldviz.com/virtual-reality-experiment-generator-for-research
-
Eye-tracking study reveals where women and men look when viewing a female butt https://www.psypost.org/eye-tracking-study-reveals-where-women-and-men-look-when-viewing-a-female-butt/?utm_source=dlvr.it&utm_medium=mastodon #EyeTracking #GenderDifferences #ResearchStudy #HumanBehavior #VisualAttention
-
News about #EyesOnDisabilities can now be found here:
https://cccwi.social/@eyes_on_disabilities
-
#EyesOnDisabilities this week:
We were at the Chaos Communication Congress and were able to demonstrate our results so far. Thanks to everyone who was interested.
-
Researchers test machine learning’s potential to reveal personality traits through eye tracking https://www.psypost.org/researchers-test-machine-learnings-potential-to-reveal-personality-traits-through-eye-tracking/?utm_source=dlvr.it&utm_medium=mastodon #MachineLearning #PersonalityTraits #EyeTracking #Machiavellianism #Extraversion
-
In our latest FinnBrain article, we examined the reciprocal relations between mother-child interaction and child engagement with faces from infancy to preschool age. We used Emotional Availability Scales to examine mothers’ emotional availability in interaction and eye tracking to examine attention dwell time for pictured faces and non-face patterns under distraction at 8, 30, and 60 months.
1/3
-
Does human attention differ from that of a transformer model?
To understand language and draw inferences, a human reasons by relying on world knowledge and common sense. Although large language models have achieved considerable success in natural language processing, commonsense reasoning remains one of the hardest skills. The most common way to evaluate models' commonsense reasoning is the Winograd Schema Challenge (WSC), named after Terry Winograd, a professor of computer science at Stanford University. The test is based on resolving syntactic ambiguity. Consider an example from the Winograd schema: "The trophy does not fit in the brown suitcase because it is too big." What is too big in this case: the suitcase or the trophy? For a human the answer is obvious, but for a model? We describe our study comparing human and model attention, analyzing which words a human and a model attend to when solving a Winograd schema. Although human attention and transformer attention appear completely different, some of the results point to a relationship between them.
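The "model attention" being compared with human gaze in such studies is, at each transformer head, just a softmax over scaled dot products between a query and the token keys. A minimal, library-free sketch of that computation (illustrative only, not the study's code; the toy vectors are invented):

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_weights(query, keys):
    """Scaled dot-product attention weights of one query over token keys.

    weight_i = softmax(q . k_i / sqrt(d)); these per-token weights are the
    quantity extracted from each transformer head and compared with where
    human readers look.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    return softmax(scores)

# Toy example: the query aligns with the first key, so that token gets more weight
weights = attention_weights([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```

Comparing these per-token weights with eye-tracking fixation measures is one common way to operationalize "does the model attend to the same words as a human".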
https://habr.com/ru/companies/sberbank/articles/839634/
#natural_language_processing #selfattention #transformers #eyetracking
-
DIY Eye and Face Tracking for the Valve Index VR Headset https://hackaday.com/2024/05/19/diy-eye-and-face-tracking-for-the-valve-index-vr-headset/ #VirtualReality #mouthtracking #facetracking #eyetracking #valve #vr
-
DIY Eye and Face Tracking for the Valve Index VR Headset - The Valve Index VR headset has been around for a few years now. It doesn’t come wi... - https://hackaday.com/2024/05/19/diy-eye-and-face-tracking-for-the-valve-index-vr-headset/ #virtualreality #mouthtracking #facetracking #eyetracking #valve #vr
-
Overrated or underrated: #AppleVisionPro for #cognitiveScience?
Researchers and Clinicians: how might you use it?
Is it "a beacon of advancement in psychological research and therapy"? (https://doi.org/10.3389/fpsyg.2023.1280213)
Pre-order on Jan 19 at 5:00 a.m. PT. Available Feb 2: https://www.apple.com/apple-vision-pro/
#science #Tech #AR #VR #wearables #positionTracking #eyeTracking #faceTracking #mouthTracking #handTracking
-
🌐 "Eye and Face Tracking in VR: Avatar Embodiment and Enfacement with Realistic and Cartoon Avatars."
Our latest article explores the impact of eye and face tracking on avatar experiences in immersive settings and discusses the current state of eye- and face-tracking technology. 📖💡
Read the full article here: https://doi.org/10.1145/3626705.3627793 📚✨
#VirtualReality #MUM23 #EyeTracking #FaceTracking #VRExperience 🚀👾
-
#ISOQOL content that conference attendees liked on other platforms:
Julie Ratcliffe presented among other things on their work including insights from the application of #EyeTracking technology when filling out the #EQ5D
https://link.springer.com/article/10.1007/s11136-023-03488-w
🔥For papers published in Quality of Life Research in 2023, this is so far the most accessed paper!🥳
-
This is attempt #4 at DIY eye tracking in the Vive Pro 2.
It... works, but I'm going for a revision 6 to maybe move all the LEDs behind the shroud.
Yes I skipped 5 because it was worse.
In VR I can use single eye tracking reliably but dual eye leads to so many crossed eye moments.
-
Last week "Quality of Life Research" #ISOQOL published 6 papers:
https://link.springer.com/journal/11136/online-first
For example,
the use of #EyeTracking technology to investigate use of the #EQ5D
https://link.springer.com/article/10.1007/s11136-023-03488-w
the Norwegian Fatigue Characteristics and Interference Measure for #stroke survivors
https://link.springer.com/article/10.1007/s11136-023-03477-z #Rasch #Psychometrics
and social factors of health-related quality of life in older adults
https://link.springer.com/article/10.1007/s11136-023-03472-4
-
Wow. In 24 hours, we have gone from zero to 4.4K followers, that‘s crazy. Thank you for a warm welcome and excellent tips. I gave up on replying to all of you after someone pointed out that I was spamming thousands of people – sorry! Also, please do not read too much into it if we do not respond or take a long time responding, we are a busy bunch and may simply sometimes miss your post or messages. Mastodon allows long posts so I am taking advantage of that, so here are a few things that you may – or may not – want to know.
—Who are we?—
Research in the Icelandic Vision Lab (https://visionlab.is) focuses on all things visual, with a major emphasis on higher-level or “cognitive” aspects of visual perception. It is co-run by five Principal Investigators: Árni Gunnar Ásgeirsson, Sabrina Hansmann-Roth, Árni Kristjánsson, Inga María Ólafsdóttir, and Heida Maria Sigurdardottir. Here on Mastodon, you will most likely be interacting with me – Heida – but other PIs and potentially other lab members (https://visionlab.is/people) may occasionally also post here as this is a joint account. If our posts are stupid and/or annoying, I will however almost surely be responsible!
—What do we do?—
Current and/or past research at IVL has looked at several visual processes, including #VisualAttention , #EyeMovements , #ObjectPerception , #FacePerception , #VisualMemory , #VisualStatistics , and the role of #Experience / #Learning effects in #VisualPerception . Some of our work concerns the basic properties of the workings of the typical adult #VisualSystem . We have also studied the perceptual capabilities of several unique populations, including children, synesthetes, professional athletes, people with anxiety disorders, blind people, and dyslexic readers. We focus on #BehavioralMethods but also make use of other techniques including #Electrophysiology , #EyeTracking , and #DeepNeuralNetworks
—Why are we here?—
We are mostly here to interact with other researchers in our field, including graduate students, postdoctoral researchers, and principal investigators. This means that our activity on Mastodon may sometimes be quite niche. This can include boosting posts from others on research papers, conferences, or work opportunities in specialized fields, partaking in discussions on debates in our field, data analysis, or the scientific review process. Science communication and outreach are hugely important, but this account is not about that as such. So we take no offence if that means that you will unfollow us, that is perfectly alright :)
—But will there still sometimes be stupid memes as promised?—
Yes. They may or may not be funny, but they will be stupid.
#VisionScience #CognitivePsychology #CognitiveScience #CognitiveNeuroscience #StupidMemes
-
#psycholinguistics #linguistics
#LanguageProcessing #EyeTracking
Is there any reason why it should be more difficult to comprehend noun phrases with 'this' or 'that' type deictic determiners instead of generic 'the'? Can anyone point me to some relevant literature?
I'm utterly flummoxed by some eye-tracking results that I keep replicating for reasons that are utterly beyond me.
-
I just posted the first wrap-up video in the Pupillometry online course we started this week, with Professor Bruno Laeng answering some questions from learners.
It is not too late to join if you want to learn about #Pupillometry and #EyeTracking. The course runs for six weeks, with a workload of 2-3 hours per week. And it is free!
-
Reading "Eye Movements and Vision" by Yarbus and am fascinated by the amount of techniques these early researchers had to master in order to conduct their experiments. What are some modern examples of people tinkering in their labs and creating unique experimental apparatus?
#CognitivePsychology #Vision #VisualPerception #EyeTracking @psychology @cogneurophys @cognition
-
Look what the CUNY Interlibrary Loan delivered! They truly are the unsung heroes of academia. @psychology @cogneurophys
#EyeTracking #CognitivePsychology #Vision #VisualPerception #Neuroscience
-
New week, time for an intro: I'm an assistant professor in #Translation at the University of Warwick. I work primarily with #audiovisual translation, #subtitling and #technologies, and everything in between. I'm currently working on #eyetracking, #machinetranslation implementation by experts and non-experts, #GlobalSouth and #GlobalNorth cooperation, and #translator training.
-
How Advances in Eye-and-Hand-Tracking Technologies Will Enrich XR - Extended reality is expected to be a $300 billion market by 2024, and it’s growing... - https://readwrite.com/how-advances-in-eye-and-hand-tracking-technologies-will-enrich-xr/ #hand-tracking #xrtechnology #eyetracking #hand-eye #tech #xr
-
Web app tracks pupil size in people, mice
https://www.spectrumnews.org/news/web-app-tracks-pupil-size-in-people-mice/
#machinelearning #pupillometry #eyetracking #biomarkers #attention #Toolbox #autism #News
-
heise+ | Hands-free coding: software development without mouse and keyboard
Programming can now also be done with speech, gestures, and facial expressions instead of the usual mouse-and-keyboard combination.
-
Connecting autism-linked genetic variation to infant social behavior
https://www.spectrumnews.org/opinion/viewpoint/connecting-autism-linked-genetic-variation-to-infant-social-behavior/
#sensoryperception #commonvariants #faceprocessing #genetictesting #brainimaging #eyetracking #Viewpoint #babysibs #autism #EEG
-
Mobalytics raises $11M and adds eye tracking metrics to its automated gaming coach - Back in 2016, Mobalytics wowed the judges at Disrupt SF with its data-based coach for the exploding... more: http://feedproxy.google.com/~r/Techcrunch/~3/odE-tiwHQMA/ #leagueoflegends #fundings&exits #recentfunding #battlefield #eyetracking #mobalytics #startups #funding #gadgets #gaming #tobii #tc
-
Open Source Headset With Inside-Out Tracking, Video Passthrough - The folks behind the Atmos Extended Reality (XR) headset want to provide improved accessibility with... more: https://hackaday.com/2019/06/15/open-source-headset-with-inside-out-tracking-video-passthrough/ #insideouttracking #extendedreality #virtualreality #computerhacks #wearablehacks #developerkit #handtracking #headtracking #eyetracking #atmos #ar #vr #xr