home.social

#visionscience — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #visionscience, aggregated by home.social.

  1. This is the kind of science headline that instantly sparks the imagination: a report of a living, 3D-printed cornea aimed at restoring vision.
    #Bioprinting #VisionScience #MedicalInnovation #Cornea #Biotech #FutureMedicine #news

  2. Gearing up for our Spectacular Science Celebration at #NDSU tomorrow and thought I'd post a quick video of one of the demos we'll have out: The tiny Ames Room! #visionscience #stemeducation youtube.com/watch?v=n3QcyZG1z2c

  3. We benchmarked the direct competitor of the EyeLink 1000 Plus: VPixx's TRACKPixx3 eye-tracker. Across 8 tasks (see graph), accuracy was mostly similar, especially at the fixation level. A key difference emerged at the sample level: the TRACKPixx3 uses an undocumented internal filter that yields smoother sample data but may affect saccade kinematics and fixation onsets. Check the paper for fine-grained results:

    osf.io/preprints/psyarxiv/vucg

    #EyeTracking #VisionScience #Psycholinguistics #ResearchMethods
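
The smoothing effect described above can be illustrated with a toy example. This is not the TRACKPixx3's actual filter (which the post notes is undocumented); it is a generic moving-average filter applied to a synthetic gaze trace, showing how pre-filtering can shift a velocity-threshold saccade-onset estimate:

```python
import numpy as np

def moving_average(x, k=5):
    """Symmetric moving-average filter (a stand-in for an unknown internal filter)."""
    return np.convolve(x, np.ones(k) / k, mode="same")

def saccade_onset(position, fs=1000.0, threshold=30.0):
    """Index of the first sample whose absolute velocity (deg/s) exceeds threshold."""
    velocity = np.abs(np.gradient(position)) * fs
    above = np.flatnonzero(velocity > threshold)
    return int(above[0]) if above.size else None

# Synthetic 1D gaze trace at 1 kHz: fixation, a 10-deg saccade lasting 30 ms, fixation.
fs = 1000.0
t = np.arange(300) / fs
position = np.where(t < 0.1, 0.0, np.where(t < 0.13, (t - 0.1) / 0.03 * 10.0, 10.0))

raw_onset = saccade_onset(position, fs)
smoothed_onset = saccade_onset(moving_average(position), fs)
# The filtered trace crosses the velocity threshold at a different sample than the
# raw trace does -- the kind of onset bias the post warns about.
print(raw_onset, smoothed_onset)
```

Real pipelines use more robust velocity estimation than a single gradient, but the sensitivity to pre-filtering is the same.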

  7. My first research project, back in '08, was on the efficiency of cortical networks responsible for processing lighting direction when viewing Lambertian surfaces. As an undergraduate, I was surprised by how well we can perceive things with eyes that have such low quantum efficiency.

    #VisionScience #Psychology #Neuroscience
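
The "Lambertian surfaces" mentioned above follow Lambert's cosine law: reflected radiance depends only on the cosine of the angle between the surface normal and the light direction. A minimal sketch (a generic illustration, not code from the original project):

```python
import numpy as np

def lambertian_radiance(albedo, normal, light_dir):
    """Ideal diffuse reflection: albedo times the clamped cosine of the
    angle between the unit surface normal and the unit light direction."""
    n = np.asarray(normal, dtype=float)
    l = np.asarray(light_dir, dtype=float)
    n = n / np.linalg.norm(n)
    l = l / np.linalg.norm(l)
    return albedo * max(0.0, float(np.dot(n, l)))

# Light head-on vs. 60 degrees off-normal: radiance halves, since cos(60 deg) = 0.5.
head_on = lambertian_radiance(0.8, [0, 0, 1], [0, 0, 1])
oblique = lambertian_radiance(0.8, [0, 0, 1], [np.sin(np.pi / 3), 0, np.cos(np.pi / 3)])
```

Because this falloff is viewpoint-independent, lighting direction has to be inferred from the shading gradient itself, which is part of what makes the perceptual question interesting.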

  8. tl;dr summary: it's the ipRGCs (intrinsically photosensitive retinal ganglion cells).

    "[O]ur results suggest cone photoreceptors do not play a measurable role in the effects of light on #melatonin suppression and subjective alertness at night."

    cell.com/iscience/fulltext/S25

    #HumanVision #VisionScience #Cones #Eye

  9. 👀🔬 Groundbreaking discovery: apparently eye movements are somehow linked to how fast we see things! 🚀 Who knew?! In other news, Nature.com recommends upgrading from your ancient browser, suggesting that even their website is tired of living in the past. 💻🔧
    nature.com/articles/s41467-025 #eyeMovements #visionScience #browserUpgrade #techNews #HackerNews #ngated

  12. My seminar series on "The phenomenon of Colour" starts this Friday. Sadly, I haven't been able to figure out Zoom, but a couple of spots for in-person attendees are still open until Thursday!

    tusharchauhan.com/courses/2025

    #colour #color #course #vision #VisionScience #iap #colorscience

  13. Funded PhD opportunity! Work in Leeds and Hull with Briony Yorke, Yvonne Nyathi, Tyler Howell-Bray and me on understanding rhodopsin biochemistry in fish vision, as part of the Yorkshire Bioscience BBSRC DTP

    #FishSci #VisionScience #Biochemistry #Evolution #AcademicChatter #PhD #cichlids #Yorkshire

    findaphd.com/phds/project/seei

  14. I will be holding my seminar series "The Phenomenon of Colour" again in 2025.

    I am thinking of designing an info-poster for it. If any former students/participants have artwork that I can use for the posters, please feel free to send it my way!

    Details of the 2025 edition:
    tusharchauhan.com/courses/2025

    I will be adding a new seminar: "Colour and Technology" to the programme.

    #Vision #VisionScience #Color #Colour #ColorVision #ColourVision #ColorScience #ColourScience #TPOC #IAP #MIT #Harvard

  15. After more than 3 years of work, the official version of my paper on human perception of perspective—spanning concepts in human #visionscience, #arthistory, and #computationalphotography —is now online! jov.arvojournals.org/article.a

  16. The Centre for Vision Research (CVR) at York University is hosting their annual 'CVR-VISTA Vision Science Summer School' program for undergraduates considering graduate studies in vision science

    I attended it back in 2009 and it was well worth the experience, more info here:

    yorku.ca/cvr/summer-school/

    #psychology #psychophysics #neuroscience #VisionScience #cvr #yorkuniversity

  17. I will be holding a short seminar series starting this Monday, titled:

    "The Phenomenon of Color: From Newton to Darwin".

    More details here: tusharchauhan.com/courses/the-

    I'm afraid I couldn't figure out the Zoom logistics in time, so it is in-person only.

    #Vision #VisionScience #Color #Colour #ColorVision #ColourVision #MIT #Harvard

  18. I was asked to review this paper last year for inclusion in the journal "Developmental Science" and agreed on the condition that I could share my review publicly, and they asked that I wait until the article had been published. It now has!

    doi.org/10.1101/2022.06.06.494

    Brilliant paper, give it a read if you're into #ColourVision!

    #PreReview
    #ColorVision #Color #Colour
    #DevelopmentalScience #VisionScience
    @prereview

  19. @icevislab

    @Myndex is currently researching visual readability for self-illuminated displays, i.e. web content. Creator of the APCA contrast method and guidelines. #color #colour #visualPerception #readability #visionScience #typography
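
APCA has its own perceptually tuned formula (see the lab's links for the method itself); as a familiar baseline for the kind of readability computation being discussed, here is the older WCAG 2.x contrast ratio, which APCA was designed to improve upon:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an 8-bit sRGB triple."""
    def linearize(c8):
        c = c8 / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG 2.x contrast ratio, ranging from 1:1 up to 21:1."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

ratio = contrast_ratio((0, 0, 0), (255, 255, 255))  # black on white -> 21.0
```

Unlike this symmetric ratio, APCA is polarity-aware (dark-on-light and light-on-dark are scored differently), which is one of its key departures from WCAG 2.x.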

  20. Wow. In 24 hours, we have gone from zero to 4.4K followers, that's crazy. Thank you for a warm welcome and excellent tips. I gave up on replying to all of you after someone pointed out that I was spamming thousands of people – sorry! Also, please do not read too much into it if we do not respond or take a long time responding; we are a busy bunch and may simply miss your posts or messages. Mastodon allows long posts, so I am taking advantage of that. Here are a few things that you may – or may not – want to know.

    —Who are we?—

    Research in the Icelandic Vision Lab (visionlab.is) focuses on all things visual, with a major emphasis on higher-level or “cognitive” aspects of visual perception. It is co-run by five Principal Investigators: Árni Gunnar Ásgeirsson, Sabrina Hansmann-Roth, Árni Kristjánsson, Inga María Ólafsdóttir, and Heida Maria Sigurdardottir. Here on Mastodon, you will most likely be interacting with me – Heida – but other PIs and potentially other lab members (visionlab.is/people) may occasionally also post here as this is a joint account. If our posts are stupid and/or annoying, I will however almost surely be responsible!

    —What do we do?—

    Current and/or past research at IVL has looked at several visual processes, including #VisualAttention, #EyeMovements, #ObjectPerception, #FacePerception, #VisualMemory, #VisualStatistics, and the role of #Experience / #Learning effects in #VisualPerception. Some of our work concerns the basic properties of the workings of the typical adult #VisualSystem. We have also studied the perceptual capabilities of several unique populations, including children, synesthetes, professional athletes, people with anxiety disorders, blind people, and dyslexic readers. We focus on #BehavioralMethods but also make use of other techniques, including #Electrophysiology, #EyeTracking, and #DeepNeuralNetworks.

    —Why are we here?—

    We are mostly here to interact with other researchers in our field, including graduate students, postdoctoral researchers, and principal investigators. This means that our activity on Mastodon may sometimes be quite niche. This can include boosting posts from others on research papers, conferences, or work opportunities in specialized fields, partaking in discussions on debates in our field, data analysis, or the scientific review process. Science communication and outreach are hugely important, but this account is not about that as such. So we take no offence if that means that you will unfollow us; that is perfectly alright :)

    —But will there still sometimes be stupid memes as promised?—

    Yes. They may or may not be funny, but they will be stupid.

    #VisionScience #CognitivePsychology #CognitiveScience #CognitiveNeuroscience #StupidMemes

  21. Hello everyone, here's my not so original #introduction:

    I'm Aurelio, I am a research fellow in #ExperimentalPsychology and #CognitiveNeuroscience @YorkPsychology.

    My research is mainly focused on: #VisionScience, #VisualPerception, #VisualCognition, #TimePerception, #Metacognition and #IndividualDifferences.

    Preferred #Methods: #Psychophysics, #MEG, #EEG, #pupillometry, and #Computational Modeling.

    I am particularly interested in studying temporal deficits in #clinical populations.