#ollama4j — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #ollama4j, aggregated by home.social.

  1. Is Java ready for real #AI? With #Llama & #Ollama4J, you can build chatbots, assistants, or content filters—all running on your machine, not in the cloud. Lutske de Leeuw walks you through the architecture.

    Try it yourself: javapro.io/2025/10/15/tame-you

    @ollama #LLM #JAVAPRO #Java

  2. Running #LLMs locally in #Java used to mean pain: Python wrappers, cloud hacks, or workarounds. With @ollama & #Ollama4J, it’s now native, clean, and offline.
    Lutske de Leeuw explains the setup, tuning, & use cases.

    See what’s possible: javapro.io/2025/10/15/tame-you

    @ollama #JAVAPRO

  3. Exploring AI with Groovy™ using Ollama4j, LangChain4J, Spring AI, Embabel, Micronaut, & Quarkus (Spring AI updated to 1.1.0, added Micronaut, Quarkus, and AI tools examples):
    groovy.apache.org/blog/groovy-
    @ApacheGroovy #TheASF #embabel #groovylang #ollama4j #langchain4j #springai #Micronaut #Quarkus #holidaytips

  4. Thanks to Lutske de Leeuw’s walkthrough, you can now run #Llama 2-7B locally in #Java—with no APIs, no latency, and no surprises. Ideal for chatbots, summaries, or code generation.

    Build your own local #AI stack: javapro.io/2025/10/15/tame-you

    @ollama #Ollama4J #LLM #JAVAPRO

  5. Want to use #LLMs in #Java without sending your data to the cloud? With Lutske de Leeuw’s guide to running #Llama locally via #Ollama4J, #AI becomes private, predictable, and fully under your control.

    Learn how to set it up: javapro.io/2025/10/15/tame-you

    @ollama #JAVAPRO