Customize your local AI assistant using Dify
Dify is an AI application development platform. It's one of the key open-source projects that Olares integrates to help you build and manage AI applications while maintaining full data ownership. Additionally, you can integrate your personal knowledge base documents into Dify for more personalized interactions.
Before you begin
To use local AI models on Dify, ensure you have:
- Ollama installed and running in your Olares environment
- Open WebUI installed with your preferred language models downloaded
TIP
For optimal performance, consider using lightweight yet powerful models like gemma2 or qwen, which offer a good balance between speed and capability.
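Before installing Dify, you can optionally verify that Ollama is running and that your preferred model has already been downloaded. Below is a minimal Python sketch, assuming Ollama is reachable at its default port 11434 from wherever you run the check; it queries Ollama's /api/tags endpoint, which lists locally downloaded models:

```python
import requests

# Assumption: Ollama answers on its default port from the machine running this check.
OLLAMA_URL = "http://localhost:11434"

# GET /api/tags returns the models that have been pulled locally.
resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10)
resp.raise_for_status()

models = [m["name"] for m in resp.json().get("models", [])]
print("Downloaded models:", models)

# The model you plan to use in Dify (gemma2 in this guide) should appear in the list.
if not any(name.startswith("gemma2") for name in models):
    print("gemma2 not found -- download it via Open WebUI before continuing.")
```

If the request fails or the model is missing, revisit the prerequisites above before continuing.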
Install Dify
Install Dify from Market based on your role:
- For admin: Install both "Dify For Cluster" and "Dify".
- For team members: Ensure your admin has installed "Dify For Cluster", and install "Dify" only.
Create an AI assistant app
Open Dify, navigate to the Studio tab, and select Create from Blank to create an app for the AI assistant. Here, we created an agent named "Ashia".
Click Go to settings on the right to access the model provider configuration page. You can choose between remote models and locally hosted models.
Add Ollama as model provider
1. Set up the Ollama access entrance:

   a. Navigate to Settings > Application > Ollama > Entrances, and set the authentication level for Ollama to Internal. This allows other applications within the local network to access Ollama services without authentication.

   b. From the Entrances page, enter the Set up endpoint page to retrieve the default route ID for Ollama (39975b9a). You now have the local access address for Ollama: https://39975b9a.local.{your username}.olares.com, for example, https://39975b9a.local.kevin112.olares.com.
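Before wiring this address into Dify, you can optionally send a test request through the entrance to confirm that the Internal authentication level works as expected. A minimal Python sketch, assuming the example route ID and Olares ID above (replace them with your own) and a downloaded gemma2 model; it calls Ollama's /api/chat endpoint:

```python
import requests

# Placeholder address built from the example route ID and Olares ID -- substitute your own.
OLLAMA_ENTRANCE = "https://39975b9a.local.kevin112.olares.com"

# With the entrance set to Internal, no authentication headers are required.
payload = {
    "model": "gemma2",  # the model you will configure in Dify
    "messages": [{"role": "user", "content": "Reply with a short greeting."}],
    "stream": False,
}
resp = requests.post(f"{OLLAMA_ENTRANCE}/api/chat", json=payload, timeout=60)
resp.raise_for_status()

print(resp.json()["message"]["content"])
```

A successful reply confirms that Dify will be able to reach Ollama through this address.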
2. In Dify, navigate to Settings > Model Provider.
3. Select Ollama as the model provider, with the following configurations:

   - Model Name: Enter the model name. For example: gemma2.
   - Base URL: Enter Ollama's local address you get in step 1, for example, https://39975b9a.local.kevin112.olares.com.
TIP
You can keep default values for other required fields.
4. Click Save.
Configure Ashia
Navigate to Dify's Studio tab and enter Ashia.
From the model list on the right, select the Gemma2 model you just configured.
Click Publish. Now you can chat with Ashia in the Debug & Preview window.
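Besides the Debug & Preview window, the published app can also be called programmatically through Dify's service API. A minimal Python sketch, assuming you have generated an API key on the app's API Access page; the API base URL and key below are placeholders, so copy the real values from that page:

```python
import requests

# Placeholders -- copy the actual values from the app's API Access page in Dify.
DIFY_API_BASE = "https://your-dify-address/v1"
DIFY_API_KEY = "app-xxxxxxxxxxxxxxxx"

headers = {
    "Authorization": f"Bearer {DIFY_API_KEY}",
    "Content-Type": "application/json",
}
payload = {
    "inputs": {},
    "query": "Hi Ashia, introduce yourself in one sentence.",
    "response_mode": "blocking",  # wait for the complete answer instead of streaming
    "user": "kevin112",           # any stable identifier for the end user
}

resp = requests.post(f"{DIFY_API_BASE}/chat-messages",
                     headers=headers, json=payload, timeout=120)
resp.raise_for_status()

print(resp.json()["answer"])
```

This uses Dify's chat-messages endpoint, which becomes available for chat and agent apps once they are published.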
Set up local knowledge base
- In Dify, navigate to the Knowledge tab.
- Locate your default knowledge base. It will be named after your Olares ID and monitors the /Documents folder in Files.
- Enter /Documents and add documents to the knowledge base.
- In Ashia's orchestration page, click Add to add context support for Ashia.
- Click Publish. Now try asking a domain-specific question with the help of the knowledge base.