How to use the Ollama Python library to give context before a query

Sending context to the LLM before it generates a response

import ollama

# Define the context and question
context = """
Indian shooting legend Abhinav Bindra believes that despite the country having a population of 1.4 billion people, the “playing population” of India remains significantly less.

Bindra was speaking in the aftermath of India’s 2024 Paris Olympics campaign, where the nation won just six medals (one silver and five bronze), despite much being expected from a majority of the athletes.

And on another six occasions in the Paris Games, Indian athletes endured fourth-place finishes. Had those fourth-place finishes been converted into medals, India’s medals tally might very well have gone into double digits.

"""
question = "How many medals did India win at the 2024 Olympics? Use only the given context to answer."

# Structure the messages
messages = [
    {"role": "system", "content": context},
    {"role": "user", "content": question}
]

# Make the API call
response = ollama.chat(model="llama3.1", messages=messages)

# Print the response
print(response["message"]["content"])

Output:

India won 6 medals at the 2024 Paris Olympics. 

These 6 medals consist of:
- 1 Silver
- 5 Bronze

Reference: the context above is copied from a news article on firstpost.com
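The pattern above (context in a system message, question in a user message) can be wrapped in a small reusable helper. This is a minimal sketch; the function name build_context_messages and the extra "answer only from the context" instruction are my own additions, not part of the Ollama API. The actual ollama.chat call is shown only as a comment, since it requires a running Ollama server with the model pulled.

```python
def build_context_messages(context: str, question: str) -> list[dict]:
    """Assemble a chat payload that pins the model to the supplied context."""
    # Prepend an explicit instruction so the model does not fall back
    # on its training data when the context lacks the answer.
    system_prompt = (
        "Answer using ONLY the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        + context
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": question},
    ]

# Usage (requires a running Ollama server with llama3.1 pulled):
# import ollama
# response = ollama.chat(model="llama3.1",
#                        messages=build_context_messages(context, question))
# print(response["message"]["content"])
```

Keeping the context in the system message and the question in the user message keeps the two concerns separate, so the same helper can be reused with a different model or a fresh question without rebuilding the prompt by hand.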
