home.social

Search

1000 results for “sparse_array”

  1. 'Outlier Robust and Sparse Estimation of Linear Regression Coefficients', by Takeyuki Sasai, Hironori Fujisawa.

    jmlr.org/papers/v26/23-1583.ht

    #outlier #outliers #robust

  2. @liamvhogan #Canberra was pretty sparse and to me felt reasonably egalitarian in the 1970s/1980s. The cost and availability of housing now is a huge problem. Isn’t it nationwide - in fact international? #ausecon

  3. The #Python/Scipy sparse linalg solver couldn't solve the quadratic placement problem for the industry standard 'adaptec1' benchmark. It ran out of memory (16 GB of RAM). This isn't even a large problem by today's standards. #vlsi #algebra
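    A hedged sketch (my own, not the poster's actual code) of the kind of call involved: a direct sparse solve (`spsolve`) factorizes the matrix, and the fill-in from factorization is what typically exhausts RAM on large placement problems, while an iterative solver such as conjugate gradient only ever stores a few vectors.

```python
# Solving a sparse symmetric positive-definite system with SciPy's
# iterative conjugate-gradient solver instead of a direct factorization.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 2_000
# 1-D Laplacian: a small stand-in for a quadratic-placement-style matrix
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x, info = spla.cg(A, b)  # info == 0 means the iteration converged
```

    Whether CG (or MINRES/GMRES for non-SPD systems) fits a given problem depends on the matrix; the point is only that iterative methods trade factorization memory for iterations.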

  4. 'Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees', by Alexander Terenin et al.

    jmlr.org/papers/v25/22-1170.ht

    #interpolation #sparse #gaussian

  5. Interesting #math video on sparse rulers, which have applications ranging from #stellar #interferometry to designing voltage taps on transformers in #ElectricalEngineering (ok, these days we have switch-mode power supplies, but it's still cool).

    youtube.com/watch?v=JQkFyuwAEd

    #discrete #maths #proofs
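    The sparse-ruler idea can be checked in a few lines (my own sketch, not from the video): a ruler of length L is complete if every integer distance from 1 to L occurs between some pair of marks, and a sparse ruler achieves this with few marks.

```python
from itertools import combinations

def is_complete_ruler(marks):
    """True if every integer distance 1..max(marks) occurs
    between some pair of marks."""
    length = max(marks)
    dists = {abs(a - b) for a, b in combinations(marks, 2)}
    return dists >= set(range(1, length + 1))

# Six marks suffice to measure every distance from 1 to 13:
print(is_complete_ruler({0, 1, 2, 6, 10, 13}))  # True
# Dropping a mark breaks completeness:
print(is_complete_ruler({0, 1, 2, 6, 13}))      # False
```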

  6. Wikipedia on disadvantages of sparse files:
    "Loading executables on 32 bit Windows (exe or dll) which are sparse takes a much longer time since the file cannot be memory mapped in the limited 4 GB address space, and are not cached as there is no codepath for caching 32 bit sparse executables"

    Who thought "I will make a dll that's full of 0 bytes, and let the filesystem handle it as a sparse file" could possibly be a solution to their problem?

    en.m.wikipedia.org/wiki/Sparse

    #CursedTech
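    My own illustration (not from the quoted article) of what makes a file sparse: it has a large apparent size, but the filesystem allocates blocks only for the regions actually written, and the unwritten "holes" read back as zero bytes.

```python
# Create a sparse file by seeking far past the start before writing.
import os
import tempfile

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.seek(100 * 1024 * 1024)  # jump 100 MiB forward: leaves a hole
    f.write(b"end")            # only this tail needs real disk blocks

st = os.stat(f.name)
apparent = st.st_size           # 100 MiB + 3 bytes
allocated = st.st_blocks * 512  # usually far smaller (POSIX-only field)
os.remove(f.name)
```

    Whether the hole actually stays unallocated depends on the filesystem; on filesystems without sparse-file support the full apparent size is written out.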

  7. “we identified a sparse subset of neurons in the primary visual cortex (V1) and higher visual areas that respond emergently to ICs [illusory contours]. We found that these highly selective "IC-encoders" mediate the neural representation of IC inference. Strikingly, selective activation of these neurons using two-photon holographic optogenetics was sufficient to recreate IC representation in the rest of the V1 network, in the absence of any visual stimulus.”

    From: “Recurrent pattern completion drives the neocortical representation of sensory inference” by Shin et al. 2023 biorxiv.org/content/10.1101/20

    #neuroscience #NeuroPixels #mouse #CerebralCortex

  8. "The Desperation of Sparse Structure", a tetraptych in the Dark Proteome series. #darkProteome #G3BP1

  9. A connection between sparse and low rank matrices. Let S be a sparse similarity matrix, for example the distances to the 3 nearest neighbours in a low dimensional manifold. Can you recover S if you have a low rank (dense) matrix L from a high dimensional space? This paper provides a geometric interpretation for S = max(0, L). It proposes a decomposition algorithm that can be modelled as a ReLU neural network layer.

    #MachineLearning #SparseDecomposition #LowRank #TMLR
    openreview.net/forum?id=p8gncJ
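    A toy illustration of the relation S = max(0, L) described above: the elementwise ReLU of a dense low-rank matrix can be sparse. The construction and numbers here are my own, not the paper's.

```python
# Build a dense low-rank matrix whose positive part is sparse.
import numpy as np

rng = np.random.default_rng(0)
n, r = 200, 5
U = rng.normal(size=(n, r))
V = rng.normal(size=(n, r))
L = U @ V.T - 3.0        # dense; rank <= r + 1 (the shift adds rank 1)
S = np.maximum(0.0, L)   # elementwise ReLU, as in S = max(0, L)

rank_L = np.linalg.matrix_rank(L)   # small
zero_frac = np.mean(S == 0)         # mostly zeros: S is sparse
```

    The shift pushes most entries negative, so the ReLU zeroes them out; the paper's decomposition runs this in reverse, recovering a low-rank L from the sparse S.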

  10. 'EiGLasso for Scalable Sparse Kronecker-Sum Inverse Covariance Estimation', by Jun Ho Yoon, Seyoung Kim.

    jmlr.org/papers/v23/21-0511.ht

    #kronecker #sparse #gaussian

  11. A few scattered things I did this past week:
    Helped my parents with the olive harvest, saw Bibi again (their new kitten), made some mini vegan cinnamon rolls, got back into Animal Crossing after months, and got a new tooth (I'll spare you the photo :D)...
    Right now I'm sipping a "beer" that tastes and feels like a strawberry smoothie.
    But afterwards I'll definitely need a proper IPA!
    #gattinisegreti #cucinaSegreta #birrette

  14. DeepSeek AI Releases DeepSeek-V4: Compressed Sparse Attention and Heavily Compressed Attention Enable One-Million-Token Contexts. DeepSeek-AI has released a preview version of the DeepSeek-V4 series...

    #AgenticAI #AIInfrastructure #AIPaperSummary #AIShorts #Applications #Artificial

  15. Looks like #bhyve backed with sparse zvols presents virtual NVME disks with 512b sectors, but ahci and vtbd with 16K sectors. Huh.

    Same underlying storage, why change the sector size? #freebsd

  16. 🌪 FOMO alert
    🌧 Scattered FUD showers
    🎈 Gusts of HYPE

    📈 High pressure over the fundamentals
    ☀️ Stable weather over Fair Value

    Medium-to-long-term forecast:
    Dash is built to last. 😎

    #CFV #DashTo5000 #Dash #CryptoWeather #CryptoTwitter #BuiltToLast #Crypto #DAO #Privacy #Freedom #Trustless #Sovereignty #Meme #Web3 #FairValue

    x.com/ItaliaDash/status/201094

  17. A new tool called Sparse compresses fine-tuned AI models as a "delta" (the difference) against the base model.

    Highlights:
    - Compresses a 14GB model down to 1.4GB (lossless) or 50MB (comparable to LoRA).
    - Works on ANY already-trained model, with no intervention needed at training time.
    - Extremely fast restore (about 4 seconds).
    - A great fit for storing and distributing specialized AI models.

    #AI #MachineLearning #Sparse #Compression #DeepLearning #CongNghe #TriTueNh
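    A heavily hedged sketch of the general idea (my own toy, not the actual Sparse tool or its format): store a fine-tuned model as the base weights plus a sparse delta holding only the entries that changed, then restore losslessly by applying the delta on top of the base.

```python
# Delta-compress a "fine-tuned" weight vector against its base.
import numpy as np

rng = np.random.default_rng(1)
base = rng.normal(size=100_000).astype(np.float32)

# Simulate fine-tuning that only moved 1% of the weights.
tuned = base.copy()
idx = rng.choice(base.size, size=1_000, replace=False)
tuned[idx] += rng.normal(scale=0.1, size=idx.size).astype(np.float32)

# The "delta" archive: indices + changed values only.
changed = np.flatnonzero(tuned != base)
delta = {"idx": changed, "val": tuned[changed]}

# Restore: apply the delta on top of the base weights.
restored = base.copy()
restored[delta["idx"]] = delta["val"]
assert np.array_equal(restored, tuned)  # lossless round trip
```

    Real fine-tunes change most weights by tiny amounts, so practical tools presumably combine this with entropy coding of the differences rather than exact-match sparsity.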

  18. I added a short wikipage on Sparse Sets to my wiki: sheeeeeeeep.art/ds-sparse-set.

    Also, I added a favicon with my et nihilum signature.

    #SiteUpdates
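    The sparse set is a classic structure; a minimal sketch (my own, not the linked wiki page's code) for integers in 0..capacity-1, with O(1) add, membership, and remove via swap-with-last:

```python
# A textbook sparse set: a "sparse" index array plus a packed "dense"
# list of members. Membership works without ever initializing sparse[].
class SparseSet:
    def __init__(self, capacity):
        self.sparse = [0] * capacity  # value -> position in dense
        self.dense = []               # packed list of members

    def __contains__(self, v):
        i = self.sparse[v]
        return i < len(self.dense) and self.dense[i] == v

    def add(self, v):
        if v not in self:
            self.sparse[v] = len(self.dense)
            self.dense.append(v)

    def remove(self, v):
        if v in self:
            i = self.sparse[v]
            last = self.dense.pop()
            if last != v:              # move the last member into the gap
                self.dense[i] = last
                self.sparse[last] = i

s = SparseSet(10)
s.add(3); s.add(7); s.remove(3)
print(7 in s, 3 in s)  # True False
```

    Clearing is just emptying the dense list, which is why the structure is popular in entity-component systems.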

  19. Everything you need to know about torch.sparse

    The PyTorch developers provide the torch.sparse module for working with sparse tensors, where most elements are zero. Why does this matter? Think of a graph adjacency matrix, a heavily pruned network, or a point cloud: storing such data as a dense array is needlessly wasteful. A sparse structure keeps only the nonzero elements and their indices, which saves a great deal of memory and speeds up computation. For example, a 10,000 x 10,000 matrix with 100,000 nonzero float values takes about 2 MB in sparse COO format instead of 400 MB. Promising as it is, the sparse tensor API in PyTorch is still in beta and small details may change. Be prepared: some operations are supported and some are not, and some autograd paths currently work only for COO; for CSR, for example, gradients are not computed. But first things first.

    habr.com/ru/companies/otus/art

    #ml #Data_Science #разрежённые_тензоры #PyTorch #оптимизация_памяти #torchsparse #матричное_умножение
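    The memory arithmetic from the post, checked in code (a sketch; torch.sparse is in beta, so details may shift between releases): the same 10,000 x 10,000 float32 matrix with 100,000 nonzeros, dense versus COO.

```python
# Build a COO sparse tensor and compare its footprint to dense storage.
import torch

n, nnz = 10_000, 100_000
idx = torch.randint(0, n, (2, nnz))
val = torch.randn(nnz)
s = torch.sparse_coo_tensor(idx, val, (n, n)).coalesce()

dense_bytes = n * n * 4  # float32 storage: 400 MB
sparse_bytes = (s.indices().numel() * s.indices().element_size()
                + s.values().numel() * s.values().element_size())
# Indices are int64, so COO costs (2*8 + 4) bytes per nonzero: ~2 MB
```

    Note the indices dominate the sparse footprint, which is why CSR (one compressed row-pointer array instead of a full row-index array) is cheaper still for row-major access.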

  20. quadrable: Authenticated multi-version database: sparse binary merkle tree with compact partial-tree proofs lobste.rs/s/eae7p2 #databases #distributed #merkle-trees
    github.com/hoytech/quadrable