Mattermost AI Service

Background

This project demonstrates two types of chat interactions with a Mattermost instance, leveraging the Mattermost OpenAPI v3 spec and Spring AI.

Use-cases:

  • Ingest a Mattermost channel into a VectorStore, then chat with that point-in-time corpus of knowledge
  • Chat in real time by configuring tool-calling to ask for insights

Getting started

I got started with:

  • A GitHub account
  • A Stackhero account
  • This Spring Initializr configuration
  • Mattermost
    • credentials (e.g., username, password)
  • An LLM provider
    • one of Groq Cloud, OpenAI, or Ollama

Prerequisites

  • Docker Compose
  • Git CLI
  • An OpenAI or Groq Cloud account
  • Java SDK 21
  • Maven 3.9.9
  • Mattermost OpenAPI v3 spec
  • yq CLI
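
A quick way to sanity-check the toolchain is to print each tool's version (output will vary by platform and install method):

git --version
docker compose version
java -version
mvn -v
yq --version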

How to clone

with Git CLI

git clone https://github.com/pacphi/mattermost-ai-service

with GitHub CLI

gh repo clone pacphi/mattermost-ai-service

Keeping up-to-date with OpenAPI spec changes

Visit https://api.mattermost.com/ in your favorite browser periodically to grab any updates that may have been introduced to the Mattermost OpenAPI specification.

Click on the Download button next to the Download OpenAPI specification label at the top of the page.

Copy the downloaded file into the resources directory here, overwriting the existing openapi.json.

Execute the following command in a terminal shell in that directory:

cat openapi.json | yq eval -P '.' > mattermost-openapi-v3.yml

You are strongly encouraged to compare the version in Git commit history with the version you just fetched; there are likely to be breaking changes that result in compilation failures for the generated sources.
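
One way to make that comparison, assuming the converted spec is kept at src/main/resources/mattermost-openapi-v3.yml (adjust the path to wherever the resources directory lives in this repository):

git diff HEAD -- src/main/resources/mattermost-openapi-v3.yml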

How to build

Open a terminal shell, then execute:

cd mattermost-ai-service
mvn clean package

For more exotic build and packaging alternatives, refer to the guide here.

How to run

Set these environment variables

with username and password

export MATTERMOST_BASE_URL=
export MATTERMOST_USERNAME=
export MATTERMOST_PASSWORD=

Add appropriate values for each of the required MATTERMOST_ prefixed environment variables above.
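
For example, when pointing at a local Mattermost instance, the exports might look like this (the username and password below are placeholders, not project defaults):

export MATTERMOST_BASE_URL=http://localhost:8065
export MATTERMOST_USERNAME=your-username
export MATTERMOST_PASSWORD=your-password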

with a personal access token

export MATTERMOST_BASE_URL=
export MATTERMOST_PERSONAL_ACCESS_TOKEN=

Likewise, add appropriate values for each of the required MATTERMOST_ prefixed environment variables above.

Refer to this guide to understand the various run configuration alternatives available.
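
For instance, if you opt for the Docker Compose alternative, the launch typically looks something like this (this assumes a compose file at the repository root; consult the guide for the exact invocation):

docker compose up -d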

If you choose to launch containers with Docker Compose, you will first need to create a Mattermost account. Visit http://localhost:8065 to do that.

Make sure to use the same credentials as you exported above. If you forgot to set the environment variables, just use the defaults declared in application.yml.

Open your favorite browser and visit http://localhost:8080.
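
Alternatively, a quick check from the terminal confirms the service is responding (this simply prints the HTTP status code):

curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080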

Back in the terminal shell, press Ctrl+C to shut down.

If you launched containers with Docker Compose, you can clean everything up by running:

docker stop $(docker ps -a -q) && docker rm $(docker ps -a -q) && docker volume rm $(docker volume ls -q)

How to deploy to Tanzu Platform for Cloud Foundry

Refer to the instructions here.
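
As a rough sketch, and assuming the linked instructions supply a suitable manifest for the packaged application, the deployment generally boils down to authenticating and pushing:

cf login -a <your-api-endpoint>
cf push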
