home.social

#intel-arc — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #intel-arc, aggregated by home.social.

  1. Newest #IntelArc #GPU family member is here, the Panther Lake Arc B390... and it... purrs? 🖖 🥺 🐈‍⬛
    My OpenCL-Benchmark on the B390 measures ~7.4 TFlops FP32 and ~120GB/s memory bandwidth. hw-smi also works with the B390.
    #FluidX3D benchmarks here: github.com/ProjectPhysX/FluidX
    And the #OpenCL info:
    - Arc B390: opencl.gpuinfo.org/displayrepo
    - Core Ultra X7 358H: opencl.gpuinfo.org/displayrepo

  2. The DMA/IOMMU improvements for newer Intel systems in kernels 6.19.10/11 are already pretty nice. It's the first time in ages that my Lunar Lake has churned through several days of work without the integrated Arc, driven by the Xe driver, giving up.

    #Linux #Intel #IntelArc

  3. Has anyone with an Intel Arc GPU encountered this kind of stuttering when recording with the Quick Sync AV1 encoder in OBS? Same result with SVT-AV1. Would love to know how you solved it.

    youtu.be/A2gS4KJWjHg

    EDIT: Just had a new driver release (32.0.101.8629) and that solved this issue!

    EDIT2: It seems the AV1 encoding is fine; it's playback that's the issue! Before the new driver, viewing recordings in VLC showed the exact same stutter, yet after the update those AV1 files play back perfectly fine. I still get the stutter in DaVinci Resolve, though, so the AMD hardware decoder could be the culprit. I solved it by disabling AMD decoding and leaving Intel Quick Sync as the only decoder: DaVinci Resolve -> Preferences -> Decoding Options -> Disable AMD.

    #intelarc #gaming

  4. Instead of targeting gamers, these cards are aimed squarely at AI and professional workloads, signaling Intel’s strategic pivot toward high-memory, workstation-class GPUs over consumer gaming flagships.

    wccftech.com/big-battlemage-gp

    #Intel #IntelArc #Battlemage #GPU #AIHardware #WorkstationGPU #GDDR6 #GraphicsCard #TechNews #Semiconductors

  5. Intel’s long-awaited “Big Battlemage” GPU has finally arrived as the Arc Pro B70 and B65, both packing a massive 32GB of GDDR6 memory and built on the flagship BMG-G31 die, marking Intel’s most powerful discrete GPU yet.

    However, instead of targeting gamers, these cards are aimed squarely at AI and professional workloads, signaling Intel’s strategic pivot toward high-memory, workstation-class GPUs over consumer gaming flagships.

    wccftech.com/big-battlemage-gp

    #Intel #IntelArc #Battlemage #GPU #AIHardware #WorkstationGPU #GDDR6 #GraphicsCard #TechNews #Semiconductors

  6. Intel’s long-awaited “Big Battlemage” GPU has finally arrived as the Arc Pro B70 and B65, both packing a massive 32GB of GDDR6 memory and built on the flagship BMG-G31 die, marking Intel’s most powerful discrete GPU yet.
    wccftech.com/big-battlemage-gp

    #Intel #IntelArc #Battlemage #GPU #AIHardware #WorkstationGPU #GDDR6 #GraphicsCard #TechNews #Semiconductors #tech

  7. PSA: Do not support Crimson Desert with your money. They used AI slop and actively block execution on Intel GPUs.

    #CrimsonDesert #ai #IntelArc #gaming

  8. @niccolove @niccolo_ve

    Comment on this video (which is not shown on my Fedi instance, yay federation!)

    tube.kockatoo.org/w/63cGWTvhzq

    I cannot believe that that GPU is not usable for video work. That is what GPUs are made for!

    Unfortunately, there are a few caveats:
    - There are always more DEcoders supported than ENcoders. I guess not being able to play a video at all (it would still play, just decoded less efficiently on the CPU) is worse than being bound to #h265 instead of #av1 for recording.
    - It thus makes sense that GPUs generally work flawlessly for gaming, as that is just displaying video (or rendering content, which is kind of different; no idea why video codecs are harder than rendering methods).
    - For every video format out there an encoder exists. If you use software encoding (on the CPU), you can use ANYTHING, like encoding to the modern (and free!) AV1 with #svtav1 on any old CPU.
    - If you want to encode videos on a GPU, you need drivers, but also a different encoder. On #NVIDIA you use #NVENC (NVidia ENCoder), for example ffmpeg's hevc_nvenc for H.265/HEVC video. On #IntelArc you use #QSV (Quick Sync Video), which on Linux builds on top of the open-source #VAAPI stack (AMD GPUs also encode via VAAPI, with their own hardware). So you can encode to AV1 with av1_qsv, for example.

    Which encoders are supported depends a lot on the GPU. Intel Arc seems to have supported the free, efficient, and future-proof AV1 codec the earliest, along with the older free codecs VP9 and VP8 (also used in WebM), as in a GPU I recently bought used. My older NVIDIA GPU only supports H.265, which is proprietary and often less efficient than AV1 (not always; video codecs are black magic).

    As OBS can use your GPU for video capture (encoding), the issue is purely one of software support.

    I only have experience with #ffmpeg for re-encoding videos, but hardware encoding is also built into many video editing tools. Try whether you can get it working somehow.

    For NVIDIA I needed the latest proprietary drivers (#NixOS makes this extremely easy, can recommend) and ffmpeg; NVENC then worked out of the box.

    For Intel Arc I used the regular drivers included in the kernel, but after adding the QSV runtime to my "hardware.graphics.extraPackages" I needed to recompile the entire kernel. Using a #longterm kernel (currently 6.18) makes this more viable... longterm XD.
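
    A note for anyone trying this with ffmpeg from the command line: the current encoder names are `hevc_nvenc` for NVENC and `av1_qsv`/`hevc_qsv` for Quick Sync. Below is a minimal sketch of picking the right one per vendor and codec; the `encode_cmd` helper and its mapping table are made up for illustration, only the encoder names are real ffmpeg identifiers:

```python
# Sketch: assemble ffmpeg argument lists for GPU video encoding.
# The encoder names hevc_nvenc (NVIDIA NVENC) and av1_qsv / hevc_qsv
# (Intel Quick Sync) are real ffmpeg encoders; the helper function
# and the mapping table are illustrative, not from any library.

HW_ENCODERS = {
    ("nvidia", "hevc"): "hevc_nvenc",
    ("intel", "av1"): "av1_qsv",
    ("intel", "hevc"): "hevc_qsv",
    ("cpu", "av1"): "libsvtav1",  # SVT-AV1 software fallback
}

def encode_cmd(src, dst, vendor, codec):
    """Return an ffmpeg argv list that re-encodes src to dst."""
    encoder = HW_ENCODERS.get((vendor, codec))
    if encoder is None:
        raise ValueError(f"no encoder mapped for {vendor}/{codec}")
    # Copy the audio stream unchanged; only the video is re-encoded.
    return ["ffmpeg", "-i", src, "-c:v", encoder, "-c:a", "copy", dst]

print(encode_cmd("in.mkv", "out.mkv", "intel", "av1"))
```

    Actually running the resulting command of course requires an ffmpeg build with the matching hardware support compiled in.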

    #MicrosoftSurface #KDENlive #Shotcut #VideoEditing

  9. Any #Linux users on #IntelArc here?

    I bought an Intel Arc A750 and it looks fancy and works for gaming and AV1 encoding (after adding "vpl-gpu-rt" and recompiling the kernel for that, I guess), but I cannot control the fan at all!
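
    For the record, that NixOS change looks roughly like the snippet below. This is a sketch: `vpl-gpu-rt` is the package named in this post, `hardware.graphics.extraPackages` is the option mentioned in a related post, and older NixOS releases used `hardware.opengl.extraPackages` instead:

```nix
# configuration.nix sketch: add the Intel VPL runtime so that
# ffmpeg/OBS can reach the Arc's Quick Sync (QSV) AV1 encoder.
# Assumes a recent NixOS; older releases used hardware.opengl.*.
{ pkgs, ... }: {
  hardware.graphics = {
    enable = true;
    extraPackages = with pkgs; [ vpl-gpu-rt ];
  };
}
```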

    The fan mode is "automatic" and on full power all the time which is crazy loud.

    I tried everything: updated my kernel to 6.18 longterm, used a Windows install to run that Intel tool to upgrade the GPU firmware; no change.

    Now I plan to connect the fan to the case fan header and control it through the BIOS (I can read the temperature now, so that is okay-ish, I guess).

    I couldn't find a single report of this being a thing at all. Do these GPUs really have THAT shitty drivers? I thought it was just performance issues (which I didn't really see).

    Discourse Thread:

    discourse.nixos.org/t/76041

    @[email protected] @pcmasterrace @[email protected] @buildapc @[email protected] @[email protected] @intel_arc #PcBuilding #GraphicsCard #NixOS

  10. Was testing with and a couple of games, getting it ready to sell, when it crashed.

    Huh. Power it back up and it won't POST. Stuffed around with it for a bit. Suspected it wasn't detecting the old GPU, put the old GPU in the dock, and now it's not detecting that either.

    I think my bit the dust to protest me buying an

    I never have any luck selling old PCs 2/2

  11. I thought swapping graphics cards between my desktops (Intel ARC A380 and Radeon RX 570 Nitro+) would be a painless job: #IntelARC into machine one, #Radeon into machine two, since on machine two I want to try out everything, and the ARC has hardly any support in, say, BSD.

    All fine otherwise, but #Debian refused to boot on the ARC! I first had to install the `firmware-intel-graphics` package from recovery mode; being non-free, it of course wasn't installed by default. #atkjuttuja #linux #floss

  12. #IntelArc places really harsh limits on which free operating systems are worth trying on my test machine. All the #BSD:s, #FreeBSD included, have either refused to enter graphical mode or rebooted themselves while trying; even among Linuxes, #GuixSystem had screen-drawing problems and eventually #Kwin stopped launching (you could reach the desktop, but windows had no borders and were just a fixed area in the top-left corner of the screen). #linux #floss #atkjuttuja

  13. #CachyOS is quite okay, though maybe not for me: I like the speed, but not the fact that the look and feel comes pre-tailored into such an electronic house of cards that you don't dare start tweaking it to suit yourself.

    Now it's #FreeBSD's turn again, though I fear the attempt will be short-lived unless I can get the #IntelARC graphics card working under its #X11 or #Wayland. #atkjuttuja #linux #bsd #floss

  14. Should I swap the graphics cards between my first and second desktop machines… The second machine would suit experiments better if its #näytönohjain (graphics card) were something other than #IntelARC, whose support is patchy even in the #BSD:s. #atkjuttuja #Linux #FLOSS

  15. I need more #JustJosh interviews like this.

    The guy just pulls out the checklist and cites it line by line to the people who are (at least somewhat) responsible.

    youtube.com/watch?v=AzGFbkKZE7A

    "No consumer is going to understand this [X and H nomenclature]. How do we solve this?"

    #CES #Technology #Computer #Intel #Computers #PC #PCHardware #CPU #APU #IntelGraphics #IntelArc #ArrowLake #PantherLake #IntelCore

  16. Intel Core Ultra Series 3 is official. Panther Lake and the 18A process are supposed to save Team Blue's honor

    This is the moment every investor and tech fan has been waiting for. At CES 2026, Intel officially presented the Core Ultra Series 3 processors (codename Panther Lake).

    These are the first chips produced on the long-awaited Intel 18A process node. The new silicon will reach laptops later this month.


    18A: to be or not to be for Intel

    The headline here isn't the gigahertz but the way the chips are made. The compute tile in the new processors is fabricated on 18A lithography. This is the technology that is supposed to be the "return of the king" and proof that Intel's fabs can compete with Taiwan's TSMC. The success of Panther Lake is crucial not just for laptop sales but for Intel's entire business plan (Intel Foundry), which wants to manufacture chips for external customers.

    Architecture: a step back to take two forward?

    Panther Lake is, in a sense, a retreat from the philosophy of the previous series (Lunar Lake / Core Ultra 200V). Those chips had the RAM built into the processor package (as in Apple Silicon) and were manufactured mostly by TSMC.

    With Series 3, Intel returns to its roots, though in a modern form, with a tile-based (Foveros) construction:

    • Compute Tile (CPU + NPU): made on Intel 18A (this is where the magic happens).
    • Platform Controller Tile (I/O): made by TSMC.
    • Graphics Tile (GPU): the strong variant (12 cores) by TSMC; the weaker variant (4 cores) on the older Intel 3 process.

    The "X" series and the performance split

    Intel introduces an interesting naming distinction that is meant to help with picking hardware:

    • Core Ultra X9 and X7: chips for ultramobile machines without a dedicated graphics card. They carry a powerful 12-core Intel Arc B390 iGPU and support for very fast LPDDR5x-9600 memory. They have fewer PCIe lanes (12), though, since pairing them with an external GPU isn't anticipated.
    • Core Ultra 9 and 7: chips intended for gaming laptops and workstations with dedicated graphics (e.g. NVIDIA GeForce). They have a weaker iGPU (4 cores) but a full 20 PCIe lanes to drive a powerful graphics card.

    Performance and battery

    Intel is making bold claims: Panther Lake is supposed to be up to 60% faster in multithreaded tasks and up to 77% faster in graphics than its predecessor (Lunar Lake).

    The energy efficiency looks impressive, too. A reference Lenovo laptop with the Core Ultra X9 388H managed 27 hours of 1080p Netflix streaming. Laboratory conditions, of course, but a striking result.

    On the AI front, the integrated NPU reaches 50 TOPS. That is enough to meet the Microsoft Copilot+ PC requirement (40 TOPS), though the competition is pulling ahead a bit (AMD Ryzen AI offers 60 TOPS, and the new Snapdragons as much as 80 TOPS).

    The first Panther Lake laptops hit store shelves on January 27.


    #CES2026 #Intel18A #IntelArc #IntelCoreUltraSeries3 #news #PantherLake #procesoryDoLaptopów

  17. A shame. Even the current kernel still has problems when you want to use #Firefox with #IntelArc (#LunarLake). At first everything runs fine, but over time everything responds more and more slowly, to the point that even keyboard events are no longer registered. Keypress without release, anyone? Then every few seconds a block of events gets through again. As soon as you quit or minimize Firefox, everything is back to normal. The same system runs without complaint on Iris Xe.

  18. So how's the #IntelArc idle power draw issue these days, under #Linux? Also, anyone got any stats comparing the idle power across different models, maybe even across Alchemist and Battlemage?

  19. Dang it. I was gonna order an Intel ARC B580 LE this weekend, but now the ARC Pro B50 is an option and I’m not sure what to get.

    It’s not like I’m a gamer, but if my system allowed it, I might.

    Focus is gonna be on Blender, Davinci Resolve, and some AI experimentation, all in Linux of course.

    I know the spec differences: core counts, bus speeds, etc. There aren't many reviews comparing the two, as they are different classes of card.

    I think my system is really the current limiter, being a 10th-gen Intel i7 with PCI Express v3, but the card can go into my next system as well.

    Suggestions? Input? Things to consider or think about?

    #IntelArc #blender #blender3d #davinciresolve

  20. The NVIDIA & Intel announcement reads:

    For personal computing, Intel will build and offer to the market x86 system-on-chips (SOCs) that integrate NVIDIA RTX GPU chiplets. These new x86 RTX SOCs will power a wide range of PCs that demand integration of world-class CPUs and GPUs.

    I really hope this doesn't mean that Intel Arc is being discontinued. We really need that 3rd player to break up the AMD/NVIDIA GPU duopoly.

    nvidianews.nvidia.com/news/nvi

    #intel #AMD #NVIDIA #GPU #graphics #IntelArc

  21. I stand by my assertion that, especially for us here in #Europe where power is fairly expensive, the #IntelArc GPUs are currently the best option on the market

    (except when you're one of the few people who really wants to play the latest triple-A games at the highest settings on the most expensive latest GPU; then you're kind of locked in with whatever team green puts out, as these games are usually only optimized for their cards).

    But for everyone else: Intel Arc. Especially when you're on #Linux.