- When using OpenAI, set up your API key via the OPENAI_API_KEY environment variable
- The OpenAI cost for running this project with the default model should be a few cents at most.
- See application.properties for more details
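For example, on Linux or macOS the key can be exported in the shell that launches the application (the key value below is a placeholder; use your real key):

```shell
# Export the key for the current shell session (placeholder value).
export OPENAI_API_KEY="sk-your-key-here"

# Confirm the variable is visible to child processes such as the Spring Boot app.
echo "${OPENAI_API_KEY:+OPENAI_API_KEY is set}"
```

On Windows, the equivalent is `set OPENAI_API_KEY=...` (cmd) or `$env:OPENAI_API_KEY="..."` (PowerShell).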
- See MovieService and MusicServiceTest
- See ImageService and ImageServiceTest
- See ChatMemory and ChatMemoryTest
- See BookService and BookServiceTest
- Install Ollama and run it locally
- Inside application.properties, configure the model that you are running inside Ollama (the default is `spring.ai.ollama.chat.model=llama3.2`)
- Inside pom.xml, change the dependency `spring-ai-openai-spring-boot-starter` into `spring-ai-ollama-spring-boot-starter`
- Run all the tests (some will fail, as local models do not support all of the features)
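As a sketch, the dependency swap in pom.xml could look like the following (the `org.springframework.ai` groupId and the omitted version assume a Spring AI BOM is in use; check the actual pom.xml in this repo, since starter artifact names have changed across Spring AI releases):

```xml
<!-- Before: OpenAI starter
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
</dependency>
-->
<!-- After: Ollama starter, so chat requests go to the local Ollama server -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
</dependency>
```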