Vane (Perplexica 2.0) Quickstart With Ollama and llama.cpp

Self-hosted AI search with local LLMs

Vane is one of the more pragmatic entries in the “AI search with citations” space: a self-hosted answering engine that mixes live web retrieval with local or cloud LLMs, while keeping the whole stack under your control.

Self-hosting Perplexica - with Ollama

Running a copilot-style service locally? Easy!

That’s very exciting! Instead of calling Copilot or perplexity.ai and broadcasting to the world what you are after, you can now host a similar service on your own PC or laptop.
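The usual way to wire Perplexica to a local Ollama instance is through its `config.toml` (copied from the repo's sample config) before bringing the stack up with Docker Compose. The fragment below is a sketch from memory, not the authoritative file: the exact section and key names may differ between Perplexica versions, so check the sample config shipped with your release.

```toml
# Hypothetical config.toml fragment pointing Perplexica at a local Ollama.
# Key names are assumptions; verify against your release's sample.config.toml.
[API_ENDPOINTS]
# When Perplexica runs inside Docker, "host.docker.internal" reaches the
# Ollama server running on the host machine (default Ollama port is 11434).
OLLAMA = "http://host.docker.internal:11434"
```

With the endpoint set, starting the containers (typically `docker compose up -d` from the repo root) should let Perplexica discover whatever models you have pulled into Ollama.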