home.social

#hardwarechoices β€” Public Fediverse posts

Live and recent posts from across the Fediverse tagged #hardwarechoices, aggregated by home.social.

  1. Debating TinyBox (TinyCorp) vs NVIDIA DGX Station is like choosing between a tactical off-grid cabin and a corporate panic room.
    TinyBox: OSS-friendly, sips power, runs whisper-quiet—great for AI nerds who want root.
    DGX: Raw CUDA firepower, but you’ll need NV cash and maybe a colocation rack.
    Depends if you want control… or just raw scale.
    #AI #TinyBox #DGX #SelfHosted #OpenSource #HardwareChoices

  2. Planning to locally host an LLM 🤖. Struggling to figure out the sweet spot on the price curve.

    A used NVIDIA Jetson Xavier NX can be acquired for ~$250 πŸ’°.
    A 4060 with 16 GB of VRAM for ~$400 💳.

    Not sure I really want to do either πŸ€”. Waiting for NPU support on Transformer models may be a better approach.

    #LLM #LocalHosting #NVIDIA #AI #MachineLearning #NPU #Transformers #TechDilemmas #HardwareChoices
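
    The trade-off in the post above can be roughed out as price per gigabyte of memory, since model size is usually the binding constraint for local LLM hosting. A minimal sketch, assuming the used Jetson Xavier NX is the common 8 GB variant (the specs below are assumptions, not from the post):

    ```python
    # Rough price-per-GB-of-memory comparison for the two options in the post.
    # Assumed specs: used Jetson Xavier NX with 8 GB shared memory,
    # and a 4060-class card with 16 GB VRAM.
    options = {
        "Jetson Xavier NX (used)": {"price_usd": 250, "mem_gb": 8},
        "4060 16GB": {"price_usd": 400, "mem_gb": 16},
    }

    for name, spec in options.items():
        per_gb = spec["price_usd"] / spec["mem_gb"]
        print(f"{name}: ${per_gb:.2f} per GB")
    ```

    On these assumed numbers the discrete GPU is cheaper per gigabyte ($25.00/GB vs $31.25/GB) as well as faster, though the Jetson wins on power draw and form factor.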
