Holly Cummins

The Power of LLMs in Java – Leveraging Quarkus and LangChain4j

2024-10-24 IBM TechXchange holly cummins ai

Generative AI has taken the world by storm over the last year, and it seems like every executive leader out there is telling us “regular” Java application developers to “add AI” to our applications. Does that mean we need to drop everything we’ve built and become data scientists now? Fortunately, thanks to some new projects out there, we can infuse AI models built by actual AI experts into our applications fairly easily. We promise it’s not as complicated as you might think! Thanks to the ease of use and superb developer experience of Quarkus and the AI integration capabilities that the LangChain4j libraries offer, it becomes trivial to start working with AI and make your stakeholders happy :)

In this session, you’ll explore a variety of AI capabilities. We’ll start from the Quarkus Dev UI, where you can try out AI models even before writing any code. Then we’ll get our hands dirty with some code and explore LangChain4j features such as prompting, chaining, and preserving state; agents and function calling; enriching your AI model’s knowledge with your own documents using retrieval-augmented generation (RAG); and ways to run (and train) models locally using tools like Ollama and/or Podman AI Lab. In addition, we’ll take a look at observability and fault tolerance of the AI integration and compile the app to a native binary.
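
To give a flavour of what that integration looks like, here’s a minimal sketch of a LangChain4j AI service in Quarkus. It assumes the quarkus-langchain4j-openai (or another model provider) extension is on the classpath and a model/API key is configured; the TriageService and SummaryResource names are made up for illustration.

```java
// TriageService.java
// A hypothetical AI service: we declare the interface, and Quarkus + LangChain4j
// generate the implementation that talks to the configured model.
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService
public interface TriageService {

    @SystemMessage("You are a helpful assistant for a Java developer conference.")
    @UserMessage("Summarise the following attendee question in one sentence: {question}")
    String summarise(String question);
}

// SummaryResource.java
// The AI service is injected like any other CDI bean.
import jakarta.inject.Inject;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.QueryParam;

@Path("/summary")
public class SummaryResource {

    @Inject
    TriageService triage;

    @GET
    public String summary(@QueryParam("question") String question) {
        return triage.summarise(question);
    }
}
```

Because the model provider is driven by configuration, the same interface can typically be backed by a hosted model or a local one (for example via the Ollama extension) without touching the Java code.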