FAIRplexica is an open-source AI assistant for research data management (RDM). Based on Perplexica, it uses the metasearch engine SearXNG to retrieve relevant RDM resources and local LLMs (via Ollama) to answer user questions. API providers such as OpenAI, Groq, Anthropic and Google, as well as custom API endpoints, are also supported.
*Demo video: fairplexica.mp4*
1. Ensure Docker is installed and running on your system.

2. Clone the FAIRplexica repository:

   ```bash
   git clone https://github.com/UB-Mannheim/FAIRplexica
   ```

3. Navigate to the project directory:

   ```bash
   cd FAIRplexica
   ```
4. Duplicate the `sample.config.toml` file and rename it to `config.toml`.
   - Fill in the empty fields depending on the LLM providers you want to use, e.g. provide valid API keys for `OpenAI`, `Anthropic` etc.
   - Set your `SearXNG` URL.
   - Set your `Ollama` URL if you have Ollama running and want to use it.

   **Note**: You can change API keys after starting FAIRplexica inside the Settings dashboard.
   ```toml
   [GENERAL]
   ...
   GLOBAL_CONTEXT = "research data management"
   ...

   [ADMIN]
   # Provide a username for the admin account
   USERNAME = "admin"
   # Provide a secure password for the admin account
   PASSWORD = "changeme"
   # Provide a base64 string with a length of at least 32 characters.
   # You can create your JWT_SECRET with a common shell command like:
   # openssl rand -base64 32.
   JWT_SECRET = "replace-with-secure-secret"
   ...

   [MODELS.OLLAMA]
   # If running Ollama models make sure the correct Ollama port is used
   # depending on your setup
   API_URL = "http://host.docker.internal:11434"
   API_KEY = ""
   ...

   [API_ENDPOINTS]
   # Make sure the correct SearXNG port is used depending on your setup
   SEARXNG = "http://host.docker.internal:4001/"
   ```
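A suitable `JWT_SECRET` can be generated with the `openssl` one-liner mentioned in the comment above; a quick sketch (32 random bytes, base64-encoded, which yields a 44-character string and so satisfies the minimum length):

```bash
# Generate a JWT secret: 32 random bytes, base64-encoded
# (44 characters, comfortably above the 32-character minimum)
JWT_SECRET=$(openssl rand -base64 32)
echo "$JWT_SECRET"
```

Paste the printed value into the `JWT_SECRET` field of your `config.toml`.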
5. Change to the main project directory (containing the `docker-compose.yaml`) and execute:

   ```bash
   docker compose up -d
   ```

   Docker will pull the image, install everything and start the server. This may take a while.
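To confirm the containers came up, Docker Compose's status command helps (a sketch assuming Compose v2, as in the command above; the fallback messages are only there so the snippet degrades gracefully):

```bash
# Check the containers after `docker compose up -d`
# (run from the directory containing docker-compose.yaml)
if command -v docker >/dev/null 2>&1; then
  docker compose ps || echo "stack not running (or wrong directory)"
else
  echo "docker not found; install Docker first (step 1)"
fi
```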
6. Open `http://localhost:3000/` in your browser to check if everything works correctly.
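The same check can be scripted with `curl` (a hypothetical smoke test, not part of FAIRplexica itself; `-f` makes curl fail on HTTP errors instead of printing the error page):

```bash
# Prints "200" once the server at http://localhost:3000/ is up
curl -fsS -o /dev/null -w '%{http_code}\n' http://localhost:3000/ \
  || echo "FAIRplexica not reachable yet"
```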
7. Navigate to `http://localhost:3000/admin` and log into the Admin Settings with the `USERNAME` and `PASSWORD` you have set in step 4.
   - Inside the dashboard you can set your LLM and embedding models as well as API keys and your `Ollama` URL.