
Use Local LLMs with Oracle AI Vector Search

7 min read
Adrian Png
Director of Innovation, AI and Cloud Solutions @ Insum, a Talan Company

Oracle Database 23ai and Ollama logos together.

There are a few reasons why we might want to perform "generative AI" tasks, such as generating embeddings, summaries, and text, using local resources rather than relying on external, third-party providers. Perhaps the most important ones are data privacy, cost, and the freedom to choose which large language models (LLMs) to use. About four months ago, I wrote that, with the Oracle Database 23ai 23.5.0.24.06 release on the Oracle Autonomous Database (ADB), it was possible to create credentials for accessing these third-party LLMs through their REST APIs. Fast-forward to today, and my ADB has been upgraded to version 23.6.0.24.10, which adds support for "local" third-party REST providers, specifically Ollama. In this article, I will demonstrate how you can utilise other Oracle Cloud Infrastructure (OCI) managed resources to quickly get started.
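To give a sense of what this unlocks, here is a minimal sketch (not the author's exact code from the walkthrough that follows) of calling DBMS_VECTOR with Ollama as the REST provider on a 23.6 database. The Ollama endpoint URL and the model name are illustrative placeholders; substitute the host and model you actually run.

```sql
-- Minimal sketch: generate an embedding through a locally hosted Ollama
-- server using DBMS_VECTOR. The "url" and "model" values below are
-- placeholders for illustration, not values from this article.
select dbms_vector.utl_to_embedding(
         'Hello, Oracle AI Vector Search!',
         json('{
           "provider": "ollama",
           "host"    : "local",
           "url"     : "http://localhost:11434/api/embeddings",
           "model"   : "nomic-embed-text"
         }')
       ) as embedding
from dual;
```

The key difference from the earlier, cloud-provider approach is that no credential object is needed here: the database simply calls the Ollama REST endpoint you point it at.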