In this hands-on tutorial, we bring the core ideas of the Model Context Protocol (MCP) to life by implementing a lightweight, context-aware assistant using LangChain, LangGraph, and Google's Gemini language model. While a full MCP integration typically involves dedicated servers and communication protocols, this simplified version demonstrates how the same ideas (context retrieval, tool invocation, and dynamic interaction) can be recreated in a single notebook using a modular agent architecture. The assistant can respond to natural-language queries and selectively route them to external tools (such as a custom knowledge base), mimicking how MCP clients interact with context providers in real-world setups.
!pip install langchain langchain-google-genai langgraph python-dotenv
!pip install google-generativeai
First, we install the essential libraries. The first command installs LangChain, LangGraph, the LangChain wrapper for Google Generative AI, and environment-variable support via python-dotenv. The second command installs Google's official generative AI client, which enables interaction with Gemini models.
import os
os.environ["GEMINI_API_KEY"] = "Your API Key"
Here, we set the Gemini API key as an environment variable so the model can access it securely without hard-coding it into the code base. Replace "Your API Key" with your actual key from Google AI Studio.
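Since python-dotenv is installed above, you could also keep the key in a `.env` file rather than setting it inline. Either way, a small helper that fails fast when the key is missing avoids a confusing authentication error later. This is a sketch under that assumption; `get_gemini_key` is a hypothetical helper, not part of the original tutorial:

```python
import os

# Hypothetical helper (not in the tutorial): raise a clear error if
# GEMINI_API_KEY is unset, instead of a cryptic auth failure later on.
def get_gemini_key() -> str:
    key = os.environ.get("GEMINI_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "GEMINI_API_KEY is not set; export it or add it to a .env file."
        )
    return key
```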
from langchain.tools import BaseTool
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema.messages import HumanMessage, AIMessage
from langgraph.prebuilt import create_react_agent
import os

model = ChatGoogleGenerativeAI(
    model="gemini-2.0-flash-lite",
    temperature=0.7,
    google_api_key=os.getenv("GEMINI_API_KEY")
)
class SimpleKnowledgeBaseTool(BaseTool):
    name: str = "simple_knowledge_base"
    description: str = "Retrieves basic information about AI concepts."

    def _run(self, query: str):
        knowledge = {
            "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic designed to connect AI assistants with external data sources, enabling real-time, context-rich interactions.",
            "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses by dynamically retrieving relevant external documents."
        }
        return knowledge.get(query, "I don't have information on that topic.")

    async def _arun(self, query: str):
        return self._run(query)

kb_tool = SimpleKnowledgeBaseTool()
tools = [kb_tool]
graph = create_react_agent(model, tools)
In this block, we initialize the Gemini language model (gemini-2.0-flash-lite) using LangChain's ChatGoogleGenerativeAI, with the API key loaded securely from the environment. We then define a custom tool named SimpleKnowledgeBaseTool that simulates an external knowledge source by returning predefined answers to queries about AI concepts such as "MCP" and "RAG". This tool acts as a basic context provider, similar to how an MCP server would operate.
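One limitation of the dictionary lookup above is that it is exact-match: the agent must pass the query as "MCP" or "RAG" verbatim. A small normalization step (an assumption, not part of the original tool) makes the lookup tolerant of casing and stray whitespace:

```python
# Sketch of a more forgiving lookup: keys are stored lowercase and the
# incoming query is normalized before the dictionary access.
knowledge = {
    "mcp": "Model Context Protocol (MCP) is an open standard by Anthropic for connecting AI assistants with external data sources.",
    "rag": "Retrieval-Augmented Generation (RAG) enhances LLM responses by retrieving relevant external documents.",
}

def lookup(query: str) -> str:
    # strip() and lower() let "MCP", " mcp ", and "Mcp" all resolve.
    return knowledge.get(query.strip().lower(),
                         "I don't have information on that topic.")
```

The same normalization could be dropped into `SimpleKnowledgeBaseTool._run` unchanged.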
import nest_asyncio
import asyncio
nest_asyncio.apply()

async def chat_with_agent():
    inputs = {"messages": []}
    print("🤖 MCP-Like Assistant ready! Type 'exit' to quit.")

    while True:
        user_input = input("\nYou: ")
        if user_input.lower() == "exit":
            print("👋 Ending chat.")
            break

        inputs["messages"].append(HumanMessage(content=user_input))

        async for state in graph.astream(inputs, stream_mode="values"):
            last_message = state["messages"][-1]
            if isinstance(last_message, AIMessage):
                print("\nAgent:", last_message.content)

        inputs["messages"] = state["messages"]

await chat_with_agent()
Finally, we set up an asynchronous chat loop to interact with the MCP-inspired assistant. Using nest_asyncio, we enable asynchronous code to run inside the notebook's existing event loop. The chat_with_agent() function captures user input, feeds it to the ReAct agent, and streams the model's responses in real time. On each turn, the assistant uses tool-aware reasoning to decide whether to answer directly or invoke the custom knowledge-base tool, emulating how an MCP client consults context providers to deliver dynamic, context-rich responses.
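The turn-by-turn state handling in that loop can be illustrated without the model in the picture: each user message is appended to a running list, and the local state is then replaced with whatever the agent returns. Below, `stub_agent` and `Msg` are hypothetical stand-ins (for `graph.astream` and LangChain's message classes) used purely to show the accumulation pattern:

```python
from dataclasses import dataclass

@dataclass
class Msg:
    role: str      # "human" or "ai"
    content: str

def stub_agent(messages):
    # Stand-in for the real agent: returns the history plus a canned AI reply.
    reply = Msg("ai", f"You said: {messages[-1].content}")
    return messages + [reply]

state = []
for user_text in ["hello", "what is MCP?"]:
    state.append(Msg("human", user_text))   # add the user's turn
    state = stub_agent(state)               # adopt the agent's updated state
```

After two turns, `state` holds four messages, alternating human and AI, exactly as `inputs["messages"]` grows in the real loop.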
In conclusion, this tutorial provides a practical foundation for building context-aware agents inspired by the MCP standard. By combining LangChain's tool interface, LangGraph's agent framework, and Gemini's powerful language generation, we have created a working prototype that demonstrates on-demand tool use and external knowledge retrieval. Although the setup is simplified, it captures the essence of MCP's architecture: modularity, interoperability, and intelligent context injection. From here, you can extend the assistant to integrate real APIs, local documents, or dynamic search tools, evolving it into a production-ready AI system aligned with the principles of the Model Context Protocol.
Here is the Colab notebook.
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for social good. His most recent venture is the launch of an artificial intelligence media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a broad audience. The platform boasts over 2 million monthly views, illustrating its popularity among readers.