
In this blog post, we will query for details about Shadow of the Erdtree, the recently released expansion for Elden Ring, the critically acclaimed game from 2022, using the Tavily search tool together with the ChatDeepInfra model.
Using this boilerplate, you can automate the process of searching the web for information and turning the results into well-written responses. It is a great starting point for a chatbot that interacts with users and provides them with the information they need.
First, let's create a virtual environment and activate it:
python3 -m venv venv
source venv/bin/activate
Next, install the required packages:
pip install python-dotenv langchain langchain-community
Before we start, we need to load our DeepInfra API key and Tavily API key. You can get your DeepInfra API key from the DeepInfra dashboard and your Tavily API key from the Tavily website. After obtaining the keys, create a .env file in the root directory of your project and add the following lines:
DEEPINFRA_API_TOKEN=YOUR_DEEPINFRA_API_KEY
TAVILY_API_KEY=YOUR_TAVILY_API_KEY
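To confirm that both keys are picked up from the .env file, you can run a quick sanity check (a small illustrative snippet, separate from the agent script below):
from dotenv import load_dotenv, find_dotenv
import os

# Load variables from the .env file into the process environment
load_dotenv(find_dotenv())

# Both should print True; never log the actual secret values
print("DEEPINFRA_API_TOKEN set:", bool(os.getenv("DEEPINFRA_API_TOKEN")))
print("TAVILY_API_KEY set:", bool(os.getenv("TAVILY_API_KEY")))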
After installing the required packages and setting up the environment, we can create a LangChain agent that uses the ChatDeepInfra model and the Tavily tool to search for information on the web. The ChatDeepInfra model is a powerful conversational model that can generate human-like responses to user queries. The Tavily tool allows us to search the web for information and retrieve the search results.
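To get a feel for what the Tavily tool returns, you can also invoke it on its own before wiring it into an agent. This is just an illustrative snippet; the exact shape of the result depends on the Tavily API response:
from dotenv import load_dotenv, find_dotenv
from langchain_community.tools.tavily_search import TavilySearchResults

load_dotenv(find_dotenv())

# Standalone call to the search tool; returns a list of result dicts
search = TavilySearchResults(max_results=1)
results = search.invoke("Elden Ring Shadow of the Erdtree release date")
print(results)  # e.g. [{"url": "...", "content": "..."}]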
Here's the complete Python script to create and run a LangChain agent using the ChatDeepInfra model:
from dotenv import load_dotenv, find_dotenv
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_community.chat_models import ChatDeepInfra
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_core.prompts import ChatPromptTemplate

# Load DEEPINFRA_API_TOKEN and TAVILY_API_KEY from the .env file
_ = load_dotenv(find_dotenv())

model_name = "meta-llama/Meta-Llama-3-70B-Instruct"

if __name__ == "__main__":
    # The Tavily search tool fetches web results for the agent
    tools = [TavilySearchResults(max_results=1)]

    # Chat model served by DeepInfra
    llm = ChatDeepInfra(model=model_name)

    prompt = ChatPromptTemplate.from_messages(
        [
            (
                "system",
                "You are a helpful assistant. Make sure to use the tavily_search_results_json tool for information.",
            ),
            ("placeholder", "{chat_history}"),
            ("human", "{input}"),
            ("placeholder", "{agent_scratchpad}"),
        ]
    )

    # Build the tool-calling agent and its executor
    agent = create_tool_calling_agent(llm, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, stream_runnable=False)

    question = "Why is the hype for Shadow of the Erdtree so high?"
    result = agent_executor.invoke({"input": question})
    print(result["output"])
    # According to the search results, the new DLC for Elden Ring is called "Shadow of the Erdtree".
One crucial point is to set stream_runnable=False in the AgentExecutor; with streaming enabled, the tool calls may not be parsed correctly from the model output with this integration, so the agent never actually uses the Tavily tool.
The stage is yours now! You can further extend the agent with more tools and models to improve your workflows, which will be the topic of another blog post.
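As a small taste of that, here is a minimal sketch of how an extra tool could be registered alongside Tavily. The word_count tool below is a made-up example for illustration, not part of the original script, and it reuses the llm and prompt defined above:
from langchain_core.tools import tool

@tool
def word_count(text: str) -> int:
    """Count the number of words in a piece of text."""
    return len(text.split())

# Pass both tools to the same agent; the model decides which one to call
tools = [TavilySearchResults(max_results=1), word_count]
agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, stream_runnable=False)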
Stay tuned for more updates and happy coding!