Introduction
LangChain is one of the most popular frameworks for building LLM-powered applications, and adding web search to a LangChain agent is one of the most common patterns. Tavily is often the default choice thanks to its built-in integration, but Keiro offers a more comprehensive and affordable alternative. In this guide, we show you how to integrate Keiro into LangChain as a set of powerful tools.
Prerequisites
```shell
pip install langchain langchain-openai requests
```
You will also need a Keiro API key from kierolabs.space and an OpenAI API key.
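Hard-coding the key works for a quick demo, but for anything you commit to a repo it is safer to read it from the environment. A minimal sketch (the `KEIRO_API_KEY` variable name is our own convention, not something the Keiro API requires):

```python
import os

def load_keiro_key() -> str:
    """Read the Keiro API key from the environment instead of hard-coding it."""
    key = os.environ.get("KEIRO_API_KEY")
    if not key:
        raise RuntimeError("Set the KEIRO_API_KEY environment variable first.")
    return key
```

You can then replace the `KEIRO_API_KEY = "your-keiro-api-key"` constant in the snippets below with `KEIRO_API_KEY = load_keiro_key()`.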
Creating a Keiro Search Tool
The simplest integration is a LangChain Tool that wraps Keiro's /search endpoint:
```python
import requests
from langchain.tools import Tool

KEIRO_API_KEY = "your-keiro-api-key"
KEIRO_BASE = "https://kierolabs.space/api"

def keiro_search(query: str) -> str:
    """Search the web using the Keiro API."""
    response = requests.post(f"{KEIRO_BASE}/search", json={
        "apiKey": KEIRO_API_KEY,
        "query": query,
    })
    results = response.json().get("results", [])
    if not results:
        return "No results found."
    output = []
    for r in results[:5]:
        output.append(f"Title: {r.get('title', 'N/A')}")
        output.append(f"URL: {r.get('url', 'N/A')}")
        output.append(f"Content: {r.get('content', r.get('snippet', 'N/A'))}")
        output.append("---")
    return "\n".join(output)

search_tool = Tool(
    name="keiro_web_search",
    description="Search the web for current information. Use this when you need up-to-date facts, news, or data.",
    func=keiro_search,
)
```
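If you want to unit-test the result rendering without making network calls, the formatting loop can live in a small pure helper. This is a refactoring sketch using the same assumed response fields (`title`, `url`, `content`/`snippet`) as above:

```python
def format_search_results(results: list, limit: int = 5) -> str:
    """Render Keiro-style result dicts into the plain-text block the agent sees."""
    if not results:
        return "No results found."
    output = []
    for r in results[:limit]:
        output.append(f"Title: {r.get('title', 'N/A')}")
        output.append(f"URL: {r.get('url', 'N/A')}")
        output.append(f"Content: {r.get('content', r.get('snippet', 'N/A'))}")
        output.append("---")
    return "\n".join(output)
```

With this in place, `keiro_search` shrinks to the HTTP call plus `return format_search_results(response.json().get("results", []))`, and the formatting can be tested against fixture data.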
Creating a Keiro Research Tool
Keiro's /research endpoint is perfect for complex queries that need multi-step investigation:
```python
def keiro_research(query: str) -> str:
    """Perform deep research on a topic using the Keiro API."""
    response = requests.post(f"{KEIRO_BASE}/research", json={
        "apiKey": KEIRO_API_KEY,
        "query": query,
    })
    data = response.json()
    summary = data.get("summary", "No summary available.")
    sources = data.get("sources", [])
    output = f"Research Summary:\n{summary}\n\nSources:\n"
    for s in sources[:5]:
        output += f"- {s.get('title', 'N/A')}: {s.get('url', 'N/A')}\n"
    return output

research_tool = Tool(
    name="keiro_deep_research",
    description="Perform deep, multi-step research on a complex topic. Use this for questions that require synthesizing information from multiple sources.",
    func=keiro_research,
)
```
Creating a Keiro Answer Tool
```python
def keiro_answer(query: str) -> str:
    """Get a direct, sourced answer using the Keiro API."""
    response = requests.post(f"{KEIRO_BASE}/answer", json={
        "apiKey": KEIRO_API_KEY,
        "query": query,
    })
    data = response.json()
    answer = data.get("response", "No answer available.")
    sources = data.get("sources", [])
    source_list = "\n".join([f"- {s.get('url', '')}" for s in sources[:3]])
    return f"{answer}\n\nSources:\n{source_list}"

answer_tool = Tool(
    name="keiro_answer",
    description="Get a direct answer to a factual question with source citations. Best for straightforward questions.",
    func=keiro_answer,
)
```
Creating a Keiro Web Crawler Tool
```python
def keiro_crawl(url: str) -> str:
    """Extract clean content from a web page using the Keiro API."""
    response = requests.post(f"{KEIRO_BASE}/web-crawler", json={
        "apiKey": KEIRO_API_KEY,
        "url": url,
    })
    data = response.json()
    content = data.get("content", "Could not extract content.")
    title = data.get("title", "Unknown")
    return f"Page Title: {title}\n\nContent:\n{content[:5000]}"

crawl_tool = Tool(
    name="keiro_web_crawler",
    description="Extract the full content of a specific web page URL. Use this when you need detailed information from a known URL.",
    func=keiro_crawl,
)
```
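All four tools share the same failure modes: timeouts and transient HTTP errors. A small retry helper can wrap any of the `requests.post` calls above; this is a sketch with arbitrary defaults (three attempts, exponential backoff), not values Keiro mandates:

```python
import time

def call_with_retry(fn, attempts: int = 3, backoff: float = 1.0):
    """Retry a flaky zero-argument callable with exponential backoff.

    `fn` can be e.g. `lambda: keiro_search("langchain agents")`. Retries
    on any exception; re-raises after the final failed attempt.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * (2 ** attempt))
```

In production you would likely narrow the `except` clause to `requests.RequestException` rather than retrying on every error.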
Building a LangChain Agent with Keiro Tools
Now let us combine all four tools into a LangChain agent:
```python
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder

# Initialize the LLM
llm = ChatOpenAI(model="gpt-4o", temperature=0)

# Define the tools
tools = [search_tool, research_tool, answer_tool, crawl_tool]

# Create the prompt
prompt = ChatPromptTemplate.from_messages([
    ("system", (
        "You are a helpful research assistant with access to web search tools. "
        "Use the appropriate tool for each task:\n"
        "- keiro_web_search: Quick lookups and current information\n"
        "- keiro_deep_research: Complex topics requiring multi-source analysis\n"
        "- keiro_answer: Simple factual questions\n"
        "- keiro_web_crawler: Extract content from a specific URL\n"
        "Always cite your sources."
    )),
    MessagesPlaceholder(variable_name="chat_history", optional=True),
    ("human", "{input}"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])

# Create the agent
agent = create_openai_functions_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

# Run a query
result = agent_executor.invoke({
    "input": "Research the current state of autonomous vehicles and summarize the key players and their progress"
})
print(result["output"])
```
Building a Retrieval Chain
For simpler use cases where you always want to search before generating, use a retrieval chain:
```python
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

retrieval_prompt = PromptTemplate(
    input_variables=["context", "question"],
    template=(
        "Use the following search results to answer the question.\n\n"
        "Search Results:\n{context}\n\n"
        "Question: {question}\n\n"
        "Answer (cite sources):"
    ),
)
retrieval_chain = LLMChain(llm=llm, prompt=retrieval_prompt)

def search_and_answer(question: str) -> str:
    context = keiro_search(question)
    result = retrieval_chain.invoke({"context": context, "question": question})
    return result["text"]

answer = search_and_answer("What are the newest AI regulations in the EU?")
print(answer)
```
Comparison: Keiro vs Tavily in LangChain
| Aspect | Keiro (Custom Tool) | Tavily (Built-in) |
|---|---|---|
| Setup Time | 5 minutes (copy-paste the code above) | 2 minutes (pip install) |
| Tools Available | Search, Research, Answer, Crawler | Search only |
| Monthly Cost (10k queries) | $5.99 | ~$400 |
| Research Capability | Built-in /research endpoint | Not available |
| Batch Processing | Free | Not available |
The 3 extra minutes of setup for Keiro saves you hundreds of dollars per month and gives you access to four tools instead of one.
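As a sanity check on the savings claim, here is the arithmetic, assuming the per-10k-query rates in the table scale linearly (real pricing tiers may differ):

```python
def monthly_cost(queries: int, price_per_10k: float) -> float:
    """Linear extrapolation of the table's per-10k-query prices."""
    return queries / 10_000 * price_per_10k

keiro = monthly_cost(50_000, 5.99)    # ~$29.95 at 50k queries/month
tavily = monthly_cost(50_000, 400.0)  # ~$2000.00 at the same volume
```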
Advanced: Memory-Augmented Agent
Combine Keiro's /memory-search with LangChain's memory for context-aware conversations:
```python
def keiro_memory_search(query: str) -> str:
    """Search with conversation context using Keiro memory search."""
    response = requests.post(f"{KEIRO_BASE}/memory-search", json={
        "apiKey": KEIRO_API_KEY,
        "query": query,
    })
    data = response.json()
    results = data.get("results", [])
    output = []
    for r in results[:5]:
        output.append(f"{r.get('title', 'N/A')}: {r.get('content', r.get('snippet', 'N/A'))}")
    return "\n".join(output) if output else "No results found."

memory_search_tool = Tool(
    name="keiro_memory_search",
    description="Search the web with awareness of the conversation context. Use for follow-up questions.",
    func=keiro_memory_search,
)
```
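The endpoint name suggests Keiro may track context server-side, but the snippet above only sends the raw query. One client-side option, sketched here as our own convention (not a documented Keiro feature), is to fold recent turns into the query string before calling the tool:

```python
def contextualize_query(query: str, history: list, max_turns: int = 3) -> str:
    """Prepend recent (user, assistant) turns so follow-ups like
    'what about its pricing?' carry enough context on their own."""
    recent = history[-max_turns:]
    lines = [f"User: {u}\nAssistant: {a}" for u, a in recent]
    lines.append(f"Current question: {query}")
    return "\n".join(lines)
```

A thin wrapper can then call `keiro_memory_search(contextualize_query(q, chat_history))` to include the agent's chat history in the search.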
Conclusion
Integrating Keiro with LangChain gives you a suite of powerful web interaction tools at a fraction of the cost of alternatives. Whether you need a simple search tool or a full research agent, Keiro's multiple endpoints map naturally to LangChain's tool abstraction. The setup takes minutes, and the savings are immediate.
Get your Keiro API key at kierolabs.space and supercharge your LangChain agents today.