Integrate Google Search into a chatbot using LangGraph
In today’s information-rich world, harnessing the vast resources of the internet is essential. Imagine a system that seamlessly connects to Google, retrieves relevant information, and uses an advanced language model to deliver precise answers, all automatically. With the LangChain framework, this is possible. In this tutorial, we’ll show you how to integrate Google Search with LangGraph, transforming simple queries into comprehensive, intelligent responses. Let’s get started!
To access Google search content programmatically, you’ll need two crucial pieces: a Search Engine ID and a Google Search API Key. These elements enable your application to interact with Google’s search infrastructure securely and efficiently.
Steps to get a Search Engine ID and Google Search API key
1. If you don’t already have a Google account, sign up. If you have never created a Google APIs Console project, create a project in the Google API Console.
2. Enable the Custom Search API:
   - Navigate to the APIs & Services → Dashboard panel in Cloud Console.
   - Click Enable APIs and Services.
   - Search for Custom Search API, click on it, and then click Enable.
   - URL for it: https://console.cloud.google.com/apis/library/customsearch.googleapis.com
3. Create an API key:
   - Navigate to the APIs & Services → Credentials panel in Cloud Console.
   - Select Create credentials, then select API key from the drop-down menu.
   - The API key created dialog box displays your newly created key. You now have an API key.
4. Set up a Custom Search Engine so you can search the entire web:
   - Create a custom search engine here: https://programmablesearchengine.google.com/.
   - In What to search, pick the Search the entire web option. After the search engine is created, click on it to find the Search engine ID.
You now have an API key and a Search Engine ID. To use them in the code, install the Google API client package:
pip install google-api-python-client
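Before wiring anything into LangGraph, you can sanity-check your credentials against the Custom Search JSON API, the REST endpoint that google-api-python-client wraps. The sketch below (a standard-library helper of our own, `build_search_url`, not part of any library) just constructs the request URL; fetching it with your real key and Search Engine ID should return JSON search results.

```python
from urllib.parse import urlencode

def build_search_url(api_key: str, cse_id: str, query: str, num: int = 5) -> str:
    """Build a Custom Search JSON API request URL.

    The endpoint takes the API key (`key`), the Search Engine ID (`cx`),
    the query (`q`), and an optional result count (`num`, max 10).
    """
    params = urlencode({"key": api_key, "cx": cse_id, "q": query, "num": num})
    return f"https://www.googleapis.com/customsearch/v1?{params}"

# With real credentials, opening this URL in a browser (or with urllib.request)
# returns a JSON document whose "items" field holds the search results.
print(build_search_url("MY_API_KEY", "MY_CSE_ID", "langgraph tutorial"))
```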
Now we will explore how to build a chatbot using LangChain, LangGraph, and the Google Search API. We’ll walk through initializing the model and tools, defining the state and nodes, and finally executing the graph.
Step 1: Initialize the Model and Tools
First, we need to initialize our language model and tools. We’ll use ChatAnthropic
as our language model and bind it with a search tool.
from typing import Annotated
from langchain_anthropic import ChatAnthropic
from typing_extensions import TypedDict
import os
from langgraph.checkpoint.sqlite import SqliteSaver
from langgraph.graph import StateGraph
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from langchain_google_community import GoogleSearchAPIWrapper
from langchain_core.tools import Tool
# Set up environment variables for API keys
os.environ['ANTHROPIC_API_KEY'] = ""
os.environ["GOOGLE_CSE_ID"] = ""
os.environ["GOOGLE_API_KEY"] = ""
# Initialize in-memory checkpointer
memory = SqliteSaver.from_conn_string(":memory:")
# Initialize language model
llm = ChatAnthropic(model="claude-3-haiku-20240307")
# Initialize Google search tool
search = GoogleSearchAPIWrapper()
tool = Tool(
    name="google_search",
    description="Search Google for recent results.",
    func=search.run,
)
tools = [tool]
llm_with_tools = llm.bind_tools(tools)
Step 2: Define the State
We need to define the state schema for our graph. Our State holds a list of LangChain Message objects, with the add_messages reducer so that each node’s output is appended to the conversation rather than overwriting it.
class State(TypedDict):
    messages: Annotated[list, add_messages]
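The add_messages annotation tells LangGraph how to merge each node’s output into the state: append, don’t replace. A stripped-down, standard-library imitation of that idea (a hypothetical `append_reducer`; the real reducer also handles message IDs and in-place updates) makes the behavior concrete:

```python
def append_reducer(current: list, update: list) -> list:
    """Toy stand-in for LangGraph's add_messages reducer:
    concatenate the update onto the existing message list."""
    return current + update

state = {"messages": []}
# Each node returns {"messages": [new_message]}; the reducer appends it,
# so the full conversation accumulates across nodes.
state["messages"] = append_reducer(state["messages"], [("user", "hi")])
state["messages"] = append_reducer(state["messages"], [("assistant", "hello!")])
print(state["messages"])  # both turns are retained
```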
Step 3: Define the Graph Nodes
We define two main nodes: the chatbot node and the tools node. The chatbot node calls the LLM to generate messages, and the tools node invokes the requested tools.
def chatbot(state: State):
    return {"messages": [llm_with_tools.invoke(state["messages"])]}
graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)
tool_node = ToolNode(tools=[tool])
graph_builder.add_node("tools", tool_node)
Step 4: Define Entry Point and Graph Edges
We set the entry point for graph execution and define the edges. A conditional edge determines whether the agent should run the tools or finish. When compiling, we pass in the checkpointer so the graph remembers the previous chat.
graph_builder.add_conditional_edges(
"chatbot",
tools_condition,
)
graph_builder.add_edge("tools", "chatbot")
graph_builder.set_entry_point("chatbot")
graph = graph_builder.compile(checkpointer=memory)
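The prebuilt tools_condition routes on whether the model’s last message requested a tool call. A minimal standard-library sketch of that decision (a hypothetical `route` helper; the real prebuilt works on LangChain message objects and END sentinels) looks like:

```python
def route(last_message: dict) -> str:
    """Route to the tools node if the model asked for a tool call,
    otherwise end the graph run."""
    if last_message.get("tool_calls"):
        return "tools"
    return "__end__"

# A plain answer ends the run; a tool request loops through the tools node.
print(route({"content": "Paris is the capital of France."}))
print(route({"tool_calls": [{"name": "google_search", "args": {"query": "news"}}]}))
```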
Step 5: Execute the Graph
Finally, we execute the graph in a loop, allowing continuous user interaction.
while True:
    config = {"configurable": {"thread_id": "1"}}
    user_input = input("Query: ")
    events = graph.stream(
        {"messages": [("user", user_input)]}, config, stream_mode="values"
    )
    for event in events:
        if "messages" in event:
            event["messages"][-1].pretty_print()
The config thread ID identifies the conversation: the graph remembers the chat that occurred under that particular thread ID. If we use a different thread ID, it won’t remember the chat from a previous conversation that used a different thread ID. You can try this by changing the thread_id in the config.
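You can picture the checkpointer as a store keyed by thread ID: each thread accumulates its own history, and a new thread starts empty. A toy, standard-library model of this (a hypothetical `Memory` class, not the actual SqliteSaver) illustrates the isolation:

```python
from collections import defaultdict

class Memory:
    """Toy checkpointer: one independent message history per thread_id."""

    def __init__(self):
        self.threads = defaultdict(list)

    def append(self, thread_id: str, message: tuple) -> None:
        """Record a message under the given thread."""
        self.threads[thread_id].append(message)

    def history(self, thread_id: str) -> list:
        """Return a copy of the thread's conversation so far."""
        return list(self.threads[thread_id])

memory = Memory()
memory.append("1", ("user", "My name is Ada."))
memory.append("1", ("assistant", "Nice to meet you, Ada."))
print(memory.history("1"))  # thread "1" keeps both turns
print(memory.history("2"))  # a fresh thread remembers nothing
```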
By following these steps, you can create a robust chatbot that utilizes powerful language models and external tools to provide informative responses.
You can also do this with LangChain alone instead of LangGraph; kindly check this link.
References: LangGraph Docs
We hope you found this guide helpful and that you learned something about chatbot development. Thank you for taking the time to read through this tutorial. If you have any questions or feedback, feel free to reach out. Happy coding!