Channel: Andrej Baranovskij Blog

LLM Structured Output with Local Haystack RAG and Ollama

Haystack 2.0 provides functionality to validate LLM output and enforce a proper JSON structure based on a predefined Pydantic class. In this post I show how to run this on your local machine with Ollama, using the OllamaGenerator class available through Haystack.
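As a rough sketch of the validation step described above: define a Pydantic class for the expected JSON shape, then check each LLM reply against it and re-prompt on failure. The schema fields and helper below are illustrative, not from the original post; in the real pipeline the `reply` string would come from Haystack's `OllamaGenerator` running against a local Ollama server.

```python
from typing import Optional
from pydantic import BaseModel, ValidationError

class CityInfo(BaseModel):
    # Hypothetical schema the LLM is asked to follow.
    city: str
    country: str
    population: int

def validate_reply(reply: str) -> Optional[CityInfo]:
    """Return the parsed object if the reply is valid JSON matching the
    schema, otherwise None (the caller would then re-prompt the LLM,
    typically feeding the validation error back into the prompt)."""
    try:
        return CityInfo.model_validate_json(reply)
    except ValidationError:
        return None

# Simulated LLM reply; with Ollama this would come from OllamaGenerator.
reply = '{"city": "Vilnius", "country": "Lithuania", "population": 593000}'
parsed = validate_reply(reply)
```

Looping until `validate_reply` returns a non-None value (with a retry cap) is the usual way to make a local model reliably emit schema-conforming JSON.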
