Taking Action with Tools and Function Calling
Explore how to empower large language models with tools and function calling to access real-time data, execute complex tasks, and improve responsiveness. Understand the structured interaction loop between the LLM and external functions to produce accurate, dynamic answers in applications.
In our last lesson, we built a powerful RAG pipeline. We successfully provided our model with an “open book” to read from, enabling it to answer questions about specific, static documents. It is now a brilliant expert researcher.
But a researcher’s knowledge is based on the books in their library. What happens when a user asks a question that can’t be answered by a static document?
“What’s the current weather in London?”
“What’s the price of a ticket from NYC to LA next Tuesday?”
“What is the status of my recent order?”
The RAG pipeline is useless in such scenarios. The model’s internal knowledge is outdated or irrelevant. It needs the ability to fetch live information and interact with the outside world. How do we give our LLM “hands”?
From knower to doer: Function calling
The solution is a technique called function calling (also known as “tool use”). This is the mechanism that allows us to break the LLM out of its digital confinement and let it interact with our own code. It enables the model to:
Access real-time information: Connect to live APIs for weather, stock prices, or flight information.
Interact with private systems: Query ...
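To make the mechanism concrete, here is a minimal sketch of the tool-dispatch half of the loop. Everything in it is illustrative: `get_current_weather` is a hypothetical stubbed function standing in for a live API, `TOOL_SPECS` uses an OpenAI-style JSON Schema description as an assumed format, and `run_tool_call` shows how our code would execute a structured call the model emits instead of a plain-text answer.

```python
import json

def get_current_weather(city: str) -> str:
    """Hypothetical tool: a real app would call a live weather API here."""
    return json.dumps({"city": city, "temp_c": 17, "conditions": "cloudy"})

# Registry mapping tool names to the actual Python functions.
TOOLS = {"get_current_weather": get_current_weather}

# Schema the model sees, describing what the tool does and what
# arguments it takes (OpenAI-style JSON Schema, assumed format).
TOOL_SPECS = [{
    "name": "get_current_weather",
    "description": "Get live weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

def run_tool_call(tool_call: dict) -> str:
    """Execute one model-requested tool call and return its result."""
    fn = TOOLS[tool_call["name"]]
    args = json.loads(tool_call["arguments"])  # model sends args as a JSON string
    return fn(**args)

# Simulated model output: rather than answering directly, the model
# emits a structured request naming a tool and its arguments.
model_response = {"name": "get_current_weather",
                  "arguments": '{"city": "London"}'}
result = run_tool_call(model_response)
# `result` would then be appended to the conversation and sent back to
# the model, which uses it to compose the final natural-language answer.
```

The key design point is that the model never executes anything itself: it only produces a structured request, and our code decides whether and how to run it.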