home.social

#deepneuralnetworks — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #deepneuralnetworks, aggregated by home.social.

  1. One Open-source Project Daily

    Turn your two-bit doodles into fine artworks with deep neural networks, generate seamless textures from photos, transfer style from one image to another, perform example-based upscaling, but wait... there's more! (An implementation of Semantic Style Transfer.)

    https://github.com/alexjc/neural-doodle

    #1ospd #opensource #deeplearning #deepneuralnetworks #imagegeneration #imagemanipulation #imageprocessing

  2. Do you have a recommendation for a #CloudComputing provider (with #GPU, suitable for training #deepNeuralNetworks)? We are looking for options that maximize:
    - #GreenIT, low CO2 footprint, #sustainability
    - #DataPrivacy

    #followerpower

  3. 🚀 We've released a new version of DIANNA, our open-source #ExplainableAI (#XAI) tool designed to help researchers get insights into predictions of #DeepNeuralNetworks.

    What's new:
    👉improved dashboard
    👉extensive documentation
    👉added tutorials

    MORE: esciencecenter.nl/news/new-rel

  4. Does anyone know the URL for the "observatory" website (I think that's what they called it) where one of the AI/DNN labs analysed various machine vision models and built a map of all of the nodes?

    You could click on each node to see the images (and sometimes text) that triggered it, as well as images generated by exciting that node while clamping others (like Deep Dream).

    I can't remember who it was and can't find it.

    #AI #DeepNeuralNetworks #NeuralNets #YOLO #deepdream

  5. With the success of #DeepNeuralNetworks in building #AI systems, one might wonder if #Bayesian models are no longer significant. New paper by Thomas Griffiths and colleagues argues the opposite: these approaches complement each other, creating new opportunities to use #Bayes to understand intelligent machines 🤖

    📔 "Bayes in the age of intelligent machines", Griffiths et al. (2023)
    🌍 arxiv.org/abs/2311.10206

    #DNN #NeuralNetworks

  6. Why #DeepNeuralNetworks need #Logic:

    Nick Shea (#UCL/#Oxford) suggests

    (1) Generating novel stuff (e.g., #Dalle's art, #GPT's writing) is cool, but slow and inconsistent.

    (2) Just a handful of logical inferences can be used *across* loads of situations (e.g., #modusPonens works the same way every time).

    So (3) by #learning Logic, #DNNs would be able to recycle a few logical moves on a MASSIVE number of problems (rather than generate a novel solution from scratch for each one).

    #CompSci #AI
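
    Shea's point (2) can be made concrete: one inference rule, written once, applies unchanged across unrelated domains. A minimal sketch in Python (the function name and the example facts are illustrative, not from the post):

    ```python
    def modus_ponens(facts, rules):
        """Repeatedly apply modus ponens (from P and "P implies Q", conclude Q)
        until no new facts can be derived."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for premise, conclusion in rules:
                if premise in facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    # The very same rule works in two unrelated domains:
    weather = modus_ponens({"raining"}, [("raining", "streets wet")])
    math = modus_ponens({"n divisible by 4"},
                        [("n divisible by 4", "n divisible by 2"),
                         ("n divisible by 2", "n is even")])
    print(weather)
    print(math)
    ```

    The rule itself never changes between the two calls; only the facts do, which is the sense in which a handful of logical moves can cover a massive number of problems.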

  7. Wow. In 24 hours, we have gone from zero to 4.4K followers; that's crazy. Thank you for the warm welcome and excellent tips. I gave up on replying to all of you after someone pointed out that I was spamming thousands of people – sorry! Also, please do not read too much into it if we do not respond or take a long time responding; we are a busy bunch and may simply sometimes miss your post or messages. Mastodon allows long posts, so I am taking advantage of that: here are a few things that you may – or may not – want to know.

    —Who are we?—

    Research in the Icelandic Vision Lab (visionlab.is) focuses on all things visual, with a major emphasis on higher-level or “cognitive” aspects of visual perception. It is co-run by five Principal Investigators: Árni Gunnar Ásgeirsson, Sabrina Hansmann-Roth, Árni Kristjánsson, Inga María Ólafsdóttir, and Heida Maria Sigurdardottir. Here on Mastodon, you will most likely be interacting with me – Heida – but other PIs and potentially other lab members (visionlab.is/people) may occasionally also post here as this is a joint account. If our posts are stupid and/or annoying, I will however almost surely be responsible!

    —What do we do?—

    Current and/or past research at IVL has looked at several visual processes, including #VisualAttention , #EyeMovements , #ObjectPerception , #FacePerception , #VisualMemory , #VisualStatistics , and the role of #Experience / #Learning effects in #VisualPerception . Some of our work concerns the basic properties of the workings of the typical adult #VisualSystem . We have also studied the perceptual capabilities of several unique populations, including children, synesthetes, professional athletes, people with anxiety disorders, blind people, and dyslexic readers. We focus on #BehavioralMethods but also make use of other techniques including #Electrophysiology , #EyeTracking , and #DeepNeuralNetworks

    —Why are we here?—

    We are mostly here to interact with other researchers in our field, including graduate students, postdoctoral researchers, and principal investigators. This means that our activity on Mastodon may sometimes be quite niche: boosting posts on research papers, conferences, or work opportunities in specialized fields, or taking part in discussions about debates in our field, data analysis, or the scientific review process. Science communication and outreach are hugely important, but this account is not about that as such. So we take no offence if that means you unfollow us; that is perfectly alright :)

    —But will there still sometimes be stupid memes as promised?—

    Yes. They may or may not be funny, but they will be stupid.

    #VisionScience #CognitivePsychology #CognitiveScience #CognitiveNeuroscience #StupidMemes

  8. Through scaling #DeepNeuralNetworks we have found in two different domains, #ReinforcementLearning and #LanguageModels, that these models learn to learn (#MetaLearning).

    They spontaneously learn internal models, with memory and learning capability, which are able to exhibit #InContextLearning much faster and much more effectively than any of our standard #backpropagation-based deep neural networks can.

    These rather alien #LearningModels embedded inside the deep learning models are emulated by #neuron layers, but aren't necessarily deep learning models themselves.

    I believe it is possible to extract these internal models which have learned to learn, out of the scaled up #DeepLearning #substrate they run on, and run them natively and directly on #hardware.

    This allows those much more efficient learning models to be used either as #LearningAgents themselves, or as a further substrate for further meta-learning.

    I have on-going #embodiment #research with a related goal, focused specifically on extracting (or distilling) the models out of the meta-models, here:
    github.com/keskival/embodied-e

    It is of course an open research problem how to do this, but I have a lot of ideas!

    If you're inspired by this, or if you think the same, let's chat!

  9. @rachelwilliams, yes, the #DeepNeuralNetworks exhibit true #intuition and #creativity. However, so much #compute is required because we are using traditional #computers – which are #synchronous, #dense and #sequential – to emulate #NeuralNetworkArchitectures that are #asynchronous, #sparse and massively #parallel.
    With proper #cores they should take much less power than the human #brain, which draws about 12 W.

  10. Another reason (among many) why I became disenchanted with #Cybernetics and had to “invent” #Kihbernetics.

    Warren Sturgis McCulloch, the co-inventor of the first computational model of a #Neuron that was the precursor for #AI and #DeepNeuralNetworks, uses a racial slur to incorrectly suggest that Cybernetics is somehow the result of the “interbreeding” between the Natural and the Artificial in the preface he wrote for Gordon Pask’s book:

    goodreads.com/en/book/show/396

  11. I got interested in #BiologicallyInspiredComputing when I learned about #ArtificialLife #ALife. At that time the computing resources available were limited compared to today. Now we have #DeepNeuralNetworks #DNN but it is widely agreed (including by me) that they do not replace natural #cognition. #BiologicallyInspiredComputing can be used as an application technology, but how do we use #Computing to understand #Cognition?

  12. Tesla acquires computer vision startup DeepScale in push towards robotaxis - Tesla has acquired DeepScale, a Silicon Valley startup that uses low-wattage processors to power mo... more: feedproxy.google.com/~r/Techcr #deepneuralnetworks #forrestiandola #autonomouscar #california #co-founder #autopilot #deepscale #hyperloop #real-time #transport #elonmusk #driver #tesla #cars #ceo #tc