
LLM Structured Output for Function Calling with Ollama

I explain how function calling works with an LLM. This is an often confused concept: the LLM doesn't call a function itself, it returns a JSON response with the values to be used for a function call in your environment. In this example I'm using the Sparrow agent to call a function.
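
Here is the idea in code, as a minimal sketch rather than Sparrow's actual implementation: the model is asked to emit JSON describing the call, and the surrounding Python code parses that JSON and invokes the function itself. The `ollama` Python package, the llama3.1 model name, and the get_invoice_total function are assumptions for illustration.

```python
# Minimal sketch (not Sparrow's code) of function calling via structured JSON
# output with Ollama. Assumptions: the `ollama` Python package is installed,
# an Ollama server is running locally, and both the model name ("llama3.1")
# and get_invoice_total() are illustrative placeholders.
import json
import ollama


def get_invoice_total(invoice_id: str) -> float:
    # Hypothetical local function; a real app would query a database or API.
    return 1234.56


PROMPT = """You have one tool available:
  get_invoice_total(invoice_id: str) -> float
Respond ONLY with JSON in the form:
  {"function": "get_invoice_total", "arguments": {"invoice_id": "<value>"}}
Question: What is the total for invoice INV-001?"""

# Ask the model for structured output; format="json" constrains Ollama
# to return valid JSON.
response = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": PROMPT}],
    format="json",
)

# The LLM never executes anything; it only describes the call as JSON.
call = json.loads(response["message"]["content"])

# Your environment (here, plain Python) dispatches the actual function call.
if call.get("function") == "get_invoice_total":
    result = get_invoice_total(**call["arguments"])
    print(f"Function result: {result}")
```

The key point is visible in the last few lines: the model only produces the function name and arguments, and it is the local code that decides whether and how to execute the call.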

 
