Function calling lets an LLM act as a bridge between natural-language prompts and real-world code or APIs. Instead of simply generating text, the model decides when to invoke a predefined function, emits a structured JSON call with the function name and arguments, and then waits for your application to execute that call and return the results. This back-and-forth can repeat, potentially invoking several functions in sequence, enabling rich multi-step interactions entirely under conversational control. In this tutorial, we will implement a weather assistant with Gemini 2.0 Flash to demonstrate how to set up and manage that function-calling loop, covering several variants of function calling.

By integrating function calling, we turn a chat interface into a dynamic tool for real-time tasks, whether fetching live weather data, checking order statuses, scheduling appointments, or updating databases. Users no longer fill out complex forms or navigate multiple screens; they simply describe what they need, and the LLM orchestrates the underlying actions seamlessly. This natural-language automation makes it easy to build AI agents that can access external data sources, perform transactions, or trigger workflows, all within a single conversation.
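To make that round trip concrete, here is a minimal sketch of the contract between model and application. The payload shapes below are illustrative only, not literal SDK output; the real SDK wraps them in typed objects, as we will see later.

# Illustrative only: the shape of a function call a model might emit,
# and the result the application feeds back to it.
model_emits = {
    "name": "get_weather_forecast",          # which tool to run
    "args": {"location": "Berlin, Germany",  # arguments the model filled in
             "date": "2025-03-04"},
}

app_returns = {
    "result": {"2025-03-04T12:00": 7.5}      # tool output returned to the model
}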
Function Calling with Google Gemini 2.0 Flash
!pip install "google-genai>=1.0.0" geopy requests
We install the Gemini Python SDK (google-genai ≥ 1.0.0), together with geopy to convert location names into coordinates and requests to make HTTP calls, making sure all the basic pieces for our Colab weather assistant are in place.
import os
from google import genai
GEMINI_API_KEY = "Use_Your_API_Key"
client = genai.Client(api_key=GEMINI_API_KEY)
model_id = "gemini-2.0-flash"
We import the Gemini SDK, set the API key, and create a genai.Client instance configured to use the gemini-2.0-flash model, establishing the foundation for all subsequent function-calling requests.
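Note that the snippet imports os but never uses it; in practice you would read the key from an environment variable rather than hard-coding it. A minimal sketch, assuming you have exported GEMINI_API_KEY in your shell or Colab secrets:

import os
from google import genai

# Prefer an environment variable over a hard-coded key; fall back to the
# placeholder so the notebook still runs if the variable is unset.
GEMINI_API_KEY = os.environ.get("GEMINI_API_KEY", "Use_Your_API_Key")
client = genai.Client(api_key=GEMINI_API_KEY)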
res = client.models.generate_content(
    model=model_id,
    contents="Tell me 1 good fact about Nuremberg."
)
print(res.text)
We send a user prompt ("Tell me 1 good fact about Nuremberg.") to the Gemini 2.0 Flash model via generate_content, then print the model's text reply, demonstrating a basic end-to-end text-generation call with the SDK.
Function Calling with a JSON Schema
weather_function = {
    "name": "get_weather_forecast",
    "description": "Retrieves the weather using the Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g., San Francisco, CA"
            },
            "date": {
                "type": "string",
                "description": "The forecast date, in the format yyyy-mm-dd"
            }
        },
        "required": ["location", "date"]
    }
}
Here, we define a JSON schema for our get_weather_forecast tool, specifying its name, a description that guides Gemini on when to use it, and its input parameters (location and date) with their types, descriptions, and required fields, so the model can emit valid function calls.
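The same pattern scales to multiple tools: each one is simply another declaration in the same list. As a sketch, a hypothetical second tool (get_air_quality is our own invention for illustration, not part of this tutorial) would sit alongside the weather schema like this:

# Hypothetical second tool, for illustration only.
air_quality_function = {
    "name": "get_air_quality",
    "description": "Returns the air-quality index for a given city.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g., San Francisco, CA"
            }
        },
        "required": ["location"]
    }
}

# Both declarations can be registered under a single tool entry.
tools = [{"function_declarations": [weather_function, air_quality_function]}]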
from google.genai.types import GenerateContentConfig

config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API. Today is 2025-03-04.",
    tools=[{"function_declarations": [weather_function]}],
)
We create a GenerateContentConfig that tells Gemini to act as a weather assistant and registers the weather function under tools. The model then knows to generate structured function calls whenever forecast data is requested.
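By default the model decides for itself whether to call a tool. If you want to force (or forbid) tool use, the SDK exposes a tool_config with a function-calling mode. A minimal sketch, assuming the current google-genai API surface:

from google.genai import types

# Force Gemini to always respond with a function call ("ANY");
# "AUTO" lets the model decide, "NONE" disables tool calls entirely.
forced_config = types.GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API. Today is 2025-03-04.",
    tools=[{"function_declarations": [weather_function]}],
    tool_config=types.ToolConfig(
        function_calling_config=types.FunctionCallingConfig(mode="ANY")
    ),
)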
response = client.models.generate_content(
    model=model_id,
    contents="Whats the weather in Berlin today?"
)
print(response.text)
This call sends the prompt ("Whats the weather in Berlin today?") without the config, and therefore without any function definitions, so Gemini falls back to plain-text completion and offers generic advice instead of invoking the weather tool.
response = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)
for part in response.candidates[0].content.parts:
    print(part.function_call)
When we pass the config (which includes the JSON-schema tool), Gemini recognizes that it should call get_weather_forecast instead of replying with unstructured text. The loop over response.candidates[0].content.parts prints each emitted function call, so we can inspect the chosen tool name and its arguments.
from google.genai import types
from geopy.geocoders import Nominatim
import requests

geolocator = Nominatim(user_agent="weather-app")

def get_weather_forecast(location, date):
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}

functions = {
    "get_weather_forecast": get_weather_forecast
}

def call_function(function_name, **kwargs):
    return functions[function_name](**kwargs)
def function_call_loop(prompt):
    contents = [types.Content(role="user", parts=[types.Part(text=prompt)])]
    response = client.models.generate_content(
        model=model_id,
        config=config,
        contents=contents
    )
    for part in response.candidates[0].content.parts:
        contents.append(types.Content(role="model", parts=[part]))
        if part.function_call:
            print("Tool call detected")
            function_call = part.function_call
            print(f"Calling tool: {function_call.name} with args: {function_call.args}")
            tool_result = call_function(function_call.name, **function_call.args)
            function_response_part = types.Part.from_function_response(
                name=function_call.name,
                response={"result": tool_result},
            )
            contents.append(types.Content(role="user", parts=[function_response_part]))
            print("Calling LLM with tool results")
            func_gen_response = client.models.generate_content(
                model=model_id, config=config, contents=contents
            )
            contents.append(types.Content(role="model", parts=func_gen_response.candidates[0].content.parts))
    return contents[-1].parts[0].text.strip()

result = function_call_loop("Whats the weather in Berlin today?")
print(result)
We implement a complete "agent" loop: send the prompt to Gemini, check the response for a function call, run get_weather_forecast (using geopy plus an Open-Meteo HTTP request), and then feed the tool result back into the model so it can produce and return the final conversational reply.
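The loop above handles a single round of tool use. Since, as noted in the introduction, the model may chain several calls in sequence, a more general variant keeps iterating until the model stops emitting function calls. This is a sketch built on the same config and helpers defined above, with an assumed max_turns safety cap:

def agent_loop(prompt, max_turns=5):
    """Sketch of a multi-turn variant: keep executing tool calls until
    the model answers in plain text (or we hit max_turns)."""
    contents = [types.Content(role="user", parts=[types.Part(text=prompt)])]
    for _ in range(max_turns):
        response = client.models.generate_content(
            model=model_id, config=config, contents=contents
        )
        content = response.candidates[0].content
        contents.append(content)
        calls = [p.function_call for p in (content.parts or []) if p.function_call]
        if not calls:  # no tool call -> this is the final answer
            return response.text.strip()
        parts = []
        for call in calls:  # run every requested tool
            result = call_function(call.name, **call.args)
            parts.append(types.Part.from_function_response(
                name=call.name, response={"result": result}))
        contents.append(types.Content(role="user", parts=parts))
    return "Stopped after reaching max_turns."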
Function Calling with Python Functions
from geopy.geocoders import Nominatim
import requests

geolocator = Nominatim(user_agent="weather-app")

def get_weather_forecast(location: str, date: str) -> dict:
    """
    Retrieves the weather using the Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.

    Args:
        location (str): The city and state, e.g., San Francisco, CA
        date (str): The forecast date, in the format yyyy-mm-dd

    Returns:
        Dict[str, float]: A dictionary with the time as key and the temperature as value
    """
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}
The get_weather_forecast function uses geopy's Nominatim to convert a city-and-state string into coordinates, then sends an HTTP request to the Open-Meteo API to retrieve hourly temperature data for the given date, returning a dictionary that maps each timestamp to its temperature. It also handles failures gracefully, returning an error message if the location is not found or the API call fails.
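Before wiring the function to Gemini, you can sanity-check it directly. This performs a live geocoding lookup and a live Open-Meteo request, so it needs network access and the exact values will vary:

# Quick standalone test of the tool; expected output shape:
# {"2025-03-04T00:00": 2.1, "2025-03-04T01:00": 1.8, ...}
print(get_weather_forecast("Berlin, Germany", "2025-03-04"))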
from google.genai.types import GenerateContentConfig

config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that can help with weather related questions. Today is 2025-03-04.",  # gives the LLM context on the current date
    tools=[get_weather_forecast],
    automatic_function_calling={"disable": True}
)
This config registers the Python get_weather_forecast function itself as a callable tool. It sets a system instruction (including the current date) for context, and disables automatic_function_calling so that Gemini emits the function-call payload instead of invoking the function internally.
r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)
for part in r.candidates[0].content.parts:
    print(part.function_call)
Sending the prompt with this custom config (the Python tool attached but automatic calling disabled) captures Gemini's raw function-call decision. The loop then prints each part's .function_call, letting us inspect exactly which tool the model wants to invoke and with which arguments.
from google.genai.types import GenerateContentConfig

config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API. Today is 2025-03-04.",  # gives the LLM context on the current date
    tools=[get_weather_forecast],
)
r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="Whats the weather in Berlin today?"
)
print(r.text)
With this config (which includes the get_weather_forecast function and leaves automatic function calling enabled by default), calling generate_content makes Gemini invoke the weather tool behind the scenes and then return a natural-language reply. Printing r.text outputs that final answer, including the actual temperature forecast for Berlin on the requested date.
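When automatic function calling is enabled, the SDK also records the intermediate tool exchange. A sketch, assuming the response object exposes automatic_function_calling_history as in recent google-genai releases (treat the attribute name as an assumption):

# Inspect the tool round trip that automatic function calling performed
# behind the scenes: the emitted call, the tool response, and any text.
for content in r.automatic_function_calling_history:
    for part in (content.parts or []):
        print(part.function_call or part.function_response or part.text)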
from google.genai.types import GenerateContentConfig

config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API.",
    tools=[get_weather_forecast],
)

prompt = f"""
Today is 2025-03-04. You are chatting with Andrew, you have access to more information about him.

User Context:
- name: Andrew
- location: Nuremberg

User: Can I wear a T-shirt later today?"""

r = client.models.generate_content(
    model=model_id,
    config=config,
    contents=prompt
)
print(r.text)
We extend the assistant with personal context, telling Gemini the user's name (Andrew) and location (Nuremberg), and ask whether it is T-shirt weather, with the get_weather_forecast tool running under the hood. We then print the model's natural-language recommendation, which is based on the actual forecast for that day.
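For a genuinely conversational assistant, the SDK's chat interface keeps the history (including tool results) across turns for you. A minimal sketch, assuming the google-genai chats API and reusing the config and prompt above (the follow-up question is our own example):

# Multi-turn variant: the chat object carries conversation state,
# so follow-up questions can build on earlier tool calls.
chat = client.chats.create(model=model_id, config=config)
print(chat.send_message(prompt).text)
print(chat.send_message("And what about tomorrow evening?").text)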
In conclusion, we now know how to define functions (via JSON schema or Python signatures), configure Gemini 2.0 Flash to detect and emit function calls, and implement the "agent" loop that executes those calls and composes the final reply. With these building blocks, we can extend any LLM into a capable, tool-enabled assistant that automates workflows, retrieves live data, and interacts with your code or APIs as effortlessly as chatting with a colleague.