#ollamawebui — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #ollamawebui, aggregated by home.social.
-
Perplexica is a system similar to Copilot and Perplexity.ai.
You know:
- You ask a question
- AI searches the internet
- Then AI summarises all it has found
- Then presents the result with references to the original websites
- Plus a list of images and YouTube videos on the right
- Plus follow-up questions ready for you to click, if you'd like to explore the topic a bit more

BUT: instead of asking Copilot or Perplexity.ai and telling the whole world what you are after, you can now host a similar service on your own PC or laptop!

See for yourself: https://www.glukhov.org/post/2024/08/selfhosting-perplexica-ollama/
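The ask → search → summarise → cite loop described above can be sketched in a few lines. This is a hypothetical illustration of the workflow only; the function names are made up and are not Perplexica's actual API:

```python
# Illustrative sketch of the search-then-summarise pipeline the post describes.
# `search` and `summarise` are stand-ins for a web-search backend and an LLM.
def answer(question, search, summarise):
    results = search(question)                    # AI searches the internet
    summary = summarise(question, results)        # LLM condenses what it found
    references = [r["url"] for r in results]      # keep links to the sources
    return {"summary": summary, "references": references}

# Toy usage with stub backends:
fake_search = lambda q: [{"url": "https://example.org", "text": "..."}]
fake_summarise = lambda q, rs: f"Answer to {q!r} from {len(rs)} source(s)"
print(answer("what is RAG?", fake_search, fake_summarise))
```

A real deployment swaps the stubs for a search engine (Perplexica uses a local metasearch backend) and a local model served by Ollama.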
#AI #LLM #perplexity.ai #perplexica #ollama #ollamawebui #LLMs
-
@dpowr @dsoft same for me. Why send my data over the wire when I can have all that power locally on my machine? For once, I do not want to be the product.
With regards to interfacing with #Ollama, I use #MindMac, #LMStudio and #Diffusionbee. I installed the #OllamaWebUI too. Have to play with it some more.
Another great feature is that the Ollama API is compatible with OpenAI’s now. That allows Ollama to work with so many more tools and libraries out there.
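That OpenAI compatibility means any OpenAI-style client can simply point its base URL at the local Ollama server. A minimal stdlib-only sketch of what such a request looks like, assuming Ollama's default local port (11434) and a model name like "llama3" that you have already pulled:

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint (default local port assumed).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at Ollama."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer ollama"},  # token is ignored by Ollama
        method="POST",
    )

req = build_request("llama3", "Why host an LLM locally?")
print(req.full_url)  # → http://localhost:11434/v1/chat/completions
```

With a server running, `urllib.request.urlopen(req)` returns the same JSON shape an OpenAI client expects, which is why so many existing tools and libraries work with Ollama unmodified.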