    Hands‑On with Agents SDK: Your First API‑Calling Agent

By Team_AIBS News | July 22, 2025


The hype around LLMs is now evolving into the hype of Agentic AI. While I hope this article doesn't fall into an "over-hyped" category, I personally believe this topic is important to learn. Coming from a data and analytics background, I find that getting familiar with it can be very useful in day-to-day work and in preparing for how it may eventually reshape current processes.

My own journey with Agentic AI is still quite new (after all, it's a relatively new topic), and I'm still learning along the way. In this series of articles, I'd like to share a beginner-friendly, step-by-step guide to developing Agentic AI based on my personal experience, focusing on the OpenAI Agents SDK framework. Some topics I plan to cover in this series include: tool-use agents, multi-agent collaboration, structured output, generating data visualizations, chat features, and more. So stay tuned!

In this article, we'll start by building a basic agent and then enhance it into a tool-using agent capable of retrieving data from an API. Finally, we'll wrap everything in a simple Streamlit UI so users can interact with the agent we build.

Throughout this guide, we'll stick to a single use case: creating a weather assistant app. I chose this example because it's relatable for everyone and covers most of the topics I plan to share. Since the use case is simple and generic, you can easily adapt this guide to your own projects.

The link to the GitHub repository and the deployed Streamlit app is provided at the end of this article.

A Brief Intro to the OpenAI Agents SDK

The OpenAI Agents SDK is a Python-based framework that allows us to create an agentic AI system in a simple and easy-to-use way [1]. As a beginner myself, I found this statement to be quite true, which makes the learning journey feel less intimidating.

At the core of this framework are "Agents": Large Language Models (LLMs) that we can configure with specific instructions and tools they can use.

As we already know, an LLM is trained on a vast amount of data, giving it strong capabilities in understanding human language and producing text or images. When combined with clear instructions and the ability to interact with tools, it becomes more than just a generator: it can act, and it becomes an agent [2].

One practical use of tools is enabling an agent to retrieve factual data from external sources. This means the LLM no longer relies solely on its (often outdated) training data, allowing it to produce more accurate and up-to-date results.

In this article, we will focus on this advantage by building an agent that can retrieve "real-time" data from an API. Let's get started!

Set Up the Environment

Create a requirements.txt file containing the following two essential packages. I prefer using requirements.txt for two reasons: reusability and preparing the project for Streamlit deployment.

    openai-agents
    streamlit

Next, set up a virtual environment named venv and install the packages listed above. Run the following commands in your terminal:

python -m venv venv
source venv/bin/activate      # On Windows: venv\Scripts\activate
pip install -r requirements.txt

Finally, since we will use the OpenAI API to call the LLM, you need to have an API key (get your API key here). Store this key in a .env file as follows. Important: make sure you add .env to your .gitignore file if you are using Git for this project.

    OPENAI_API_KEY=your_openai_key_here
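For reference, a minimal .gitignore for this project could look like the lines below. The venv/ entry is my own addition for the virtual environment created above; adjust it to your setup.

# .gitignore
.env
venv/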

Once everything is set up, you're good to go!
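Optionally, before moving on, you can verify that the key actually loads from the .env file with a quick throwaway script like the one below (check_env.py is just a hypothetical name; it is not part of the project files):

# check_env.py - optional sanity check
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current directory

# Print only whether the key is present, never the key itself
print("OPENAI_API_KEY loaded:", bool(os.getenv("OPENAI_API_KEY")))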

A Simple Agent

Let's begin with a simple agent by creating a Python file called 01-single-agent.py.

    Import Libraries

The first thing we need to do in the script is import the necessary libraries:

from agents import Agent, Runner
import asyncio

from dotenv import load_dotenv
load_dotenv()

From the Agents SDK package, we use Agent to define the agent and Runner to run it. We also import asyncio to allow our program to perform multiple tasks without waiting for one to finish before starting another.

Finally, load_dotenv from the dotenv package loads the environment variables we defined earlier in the .env file. In our case, this includes OPENAI_API_KEY, which will be used by default when we define and call an agent.
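If you prefer not to rely on a .env file, the SDK also exposes a configuration helper for setting the key programmatically. The snippet below is a small sketch based on my reading of the SDK's configuration docs; treat set_default_openai_key as something to verify against the official documentation before using it.

from agents import set_default_openai_key

# Sketch: pass the key in directly, e.g. read from a secrets manager of your choice.
set_default_openai_key("sk-...")  # placeholder value, never hard-code a real key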

Define a Simple Agent

Structure of the Weather Assistant agent (diagram generated using GraphViz).

Next, we will define a simple agent called Weather Assistant.

agent = Agent(
    name="Weather Assistant",
    instructions="You provide accurate and concise weather updates based on user queries in plain language."
)

An agent can be defined with several properties. In this simple example, we only configure the name and the instructions for the agent. If needed, we can also specify which LLM model to use. For instance, if we want to use a smaller model such as gpt-4o-mini (currently, the default model is gpt-4o), we can add the configuration as shown below.

agent = Agent(
    name="Weather Assistant",
    instructions="You provide accurate and concise weather updates based on user queries in plain language.",
    model="gpt-4o-mini"
)

There are several other parameters that we will cover later in this article and in the next one. For now, we will keep the model configuration simple as shown above.

After defining the agent, the next step is to create an asynchronous function that will run the agent.

async def run_agent():
    result = await Runner.run(agent, "What is the weather like today in Jakarta?")
    print(result.final_output)

The Runner.run(agent, ...) method calls the agent with the query "What is the weather like today in Jakarta?". The await keyword pauses the function until the task is complete, allowing other asynchronous tasks (if any) to run in the meantime. The result of this task is stored in the result variable. To view the output, we print result.final_output to the terminal.
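As a side note, if you don't need asynchronous execution at all, the SDK also provides a synchronous runner. The variant below is a minimal sketch assuming Runner.run_sync behaves as described in the SDK documentation (it blocks until the run finishes, so no async/await or asyncio.run is needed):

def run_agent_sync():
    # Blocking call: suitable for a simple script without an event loop
    result = Runner.run_sync(agent, "What is the weather like today in Jakarta?")
    print(result.final_output)

run_agent_sync()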

The last part we need to add is the program's entry point to execute the function when the script runs. We use asyncio.run to execute the run_agent function.

    if __name__ == "__main__":
        asyncio.run(run_agent())

Run the Simple Agent

Now, let's run the script in the terminal by executing:

    python 01-single-agent.py

The result will most likely be that the agent says it cannot provide the information. This is expected because the LLM was trained on past data and doesn't have access to real-time weather conditions.

I can't provide real-time information, but you can check a reliable weather website or app for the latest updates on Jakarta's weather today.

In the worst case, the agent might hallucinate by returning a random temperature and giving suggestions based on that value. To handle this situation, we will later enable the agent to call an API to retrieve the actual weather conditions.

Using Trace

One of the useful features of the Agents SDK is Trace, which allows you to visualize, debug, and monitor the workflow of the agent you've built and executed. You can access the tracing dashboard here: https://platform.openai.com/traces.

For our simple agent, the trace will look like this:

Screenshot of the Trace dashboard for the simple agent run.

In this dashboard, you can find useful information about how the workflow is executed, including the input and output of each step. Since this is a simple agent, we only have one agent run. However, as the workflow becomes more complex, this trace feature will be extremely helpful for monitoring and troubleshooting the process.
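You can also group several runs under a single workflow name so they appear as one entry in the dashboard. The snippet below is a small sketch assuming the trace helper exported by the SDK works as documented; the workflow name is my own example:

from agents import trace

async def run_agent_traced():
    # Both runs are grouped under one workflow entry in the Trace dashboard
    with trace("Weather Assistant workflow"):
        first = await Runner.run(agent, "What is the weather like today in Jakarta?")
        second = await Runner.run(agent, "And what about tomorrow?")
        print(first.final_output)
        print(second.final_output)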

User Interface with Streamlit

Previously, we built a simple script to define and call an agent. Now, let's make it more interactive by adding a user interface with Streamlit [3].

Let's create a script named 02-single-agent-app.py as shown below:

from agents import Agent, Runner
import asyncio
import streamlit as st
from dotenv import load_dotenv

load_dotenv()

agent = Agent(
    name="Weather Assistant",
    instructions="You provide accurate and concise weather updates based on user queries in plain language."
)

async def run_agent(user_input: str):
    result = await Runner.run(agent, user_input)
    return result.final_output

def main():
    st.title("Weather Assistant")
    user_input = st.text_input("Ask about the weather:")

    if st.button("Get Weather Update"):
        with st.spinner("Thinking..."):
            if user_input:
                agent_response = asyncio.run(run_agent(user_input))
                st.write(agent_response)
            else:
                st.write("Please enter a question about the weather.")

if __name__ == "__main__":
    main()

Compared to the previous script, we now import the Streamlit library to build an interactive app. The agent definition remains the same, but we modify the run_agent function to accept user input and pass it to the Runner.run function. Instead of printing the result directly to the console, the function now returns it.

In the main function, we use Streamlit components to build the interface: setting the title, adding a text box for user input, and creating a button that triggers the run_agent function.

The agent's response is stored in agent_response and displayed using the st.write component. To run this Streamlit app in your browser, use the following command:

    streamlit run 02-single-agent-app.py
Screenshot of the single-agent app built with Streamlit.

To stop the app, press Ctrl + C in your terminal.

To keep the article focused on the Agents SDK framework, I kept the Streamlit app as simple as possible. However, that doesn't mean you need to stop here. Streamlit offers a wide variety of components that allow you to get creative and make your app more intuitive and engaging. For a complete list of components, check the Streamlit documentation in the reference section.
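As one example of going further, Streamlit's chat components can turn the same agent call into a chat-like interface. The sketch below is optional and not part of the repository scripts; it only uses standard Streamlit APIs (st.chat_input, st.chat_message) together with the agent we already defined, and the file name chat_app_sketch.py is my own.

# chat_app_sketch.py - optional chat-style variant of the same app
import asyncio
import streamlit as st
from agents import Agent, Runner
from dotenv import load_dotenv

load_dotenv()

agent = Agent(
    name="Weather Assistant",
    instructions="You provide accurate and concise weather updates based on user queries in plain language."
)

st.title("Weather Assistant (Chat)")

user_input = st.chat_input("Ask about the weather...")
if user_input:
    with st.chat_message("user"):
        st.write(user_input)
    with st.chat_message("assistant"):
        with st.spinner("Thinking..."):
            result = asyncio.run(Runner.run(agent, user_input))
            st.write(result.final_output)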

From this point onward, we will continue using this basic Streamlit structure.

A Tool-Use Agent

As we saw in the previous section, the agent struggles when asked about the current weather conditions. It may return no information or, worse, produce a hallucinated answer. To ensure our agent uses real data, we can allow it to call an external API so it can retrieve actual information.

This process is a practical example of using Tools in the Agents SDK. In general, tools enable an agent to take actions, such as fetching data, running code, calling an API (as we will do shortly), or even interacting with a computer [1]. Using tools and taking actions is one of the key capabilities that distinguishes an agent from a typical LLM.

Let's dive into the code. First, create another file named 03-tooluse-agent-app.py.

    Import Libraries

We'll need the following libraries:

from agents import Agent, Runner, function_tool
import asyncio
import streamlit as st
from dotenv import load_dotenv
import requests

load_dotenv()

Notice that from the Agents SDK, we now import an additional module: function_tool. Since we will call an external API, we also import the requests library.

Define the Function Tool

The API we will use is Open-Meteo [4], which offers free access for non-commercial use. It provides many features, including weather forecasts, historical data, air quality, and more. In this article, we will start with the simplest feature: retrieving current weather data.

As an additional note, Open-Meteo provides its own library, openmeteo-requests. However, in this guide I use a more generic approach with the requests module, with the intention of making the code reusable for other purposes and APIs.

Here is how we can define a function to retrieve the current weather for a specific location using Open-Meteo:

@function_tool
def get_current_weather(latitude: float, longitude: float) -> dict:
    """
    Fetches current weather data for a given location using the Open-Meteo API.

    Args:
        latitude (float): The latitude of the location.
        longitude (float): The longitude of the location.

    Returns:
        dict: A dictionary containing the weather data, or an error message if the request fails.
    """
    try:
        url = "https://api.open-meteo.com/v1/forecast"
        params = {
            "latitude": latitude,
            "longitude": longitude,
            "current": "temperature_2m,relative_humidity_2m,dew_point_2m,apparent_temperature,precipitation,weathercode,windspeed_10m,winddirection_10m",
            "timezone": "auto"
        }
        response = requests.get(url, params=params)
        response.raise_for_status()  # Raise an error for HTTP issues
        return response.json()
    except requests.RequestException as e:
        return {"error": f"Failed to fetch weather data: {e}"}

The function takes latitude and longitude as inputs to identify the location and construct an API request. The requested parameters include metrics such as temperature, humidity, and wind speed. If the API request succeeds, it returns the JSON response as a Python dictionary. If an error occurs, it returns an error message instead.
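If you're curious what this JSON looks like, you can call the same endpoint directly in a scratch script before wiring it into the agent. A quick exploratory check, using Jakarta's approximate coordinates as my own example values:

import requests

# Quick manual check of the Open-Meteo response shape (Jakarta is roughly -6.2, 106.8)
response = requests.get(
    "https://api.open-meteo.com/v1/forecast",
    params={
        "latitude": -6.2,
        "longitude": 106.8,
        "current": "temperature_2m,relative_humidity_2m,windspeed_10m,weathercode",
        "timezone": "auto",
    },
)
response.raise_for_status()
# The "current" block holds the values we asked for, keyed by variable name
print(response.json()["current"])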

To make the function available to the agent, we decorate it with @function_tool, allowing the agent to call it when the user's query is related to current weather data.

Additionally, we include a docstring in the function, providing both a description of its purpose and details of its arguments. Including a docstring is extremely helpful for the agent to understand how to use the function.

Define a Tool-Use Agent

Structure of the Weather Specialist Agent, a tool-use agent (diagram generated using GraphViz).

After defining the function, let's move on to defining the agent.

weather_specialist_agent = Agent(
    name="Weather Specialist Agent",
    instructions="You provide accurate and concise weather updates based on user queries in plain language.",
    tools=[get_current_weather],
    tool_use_behavior="run_llm_again"
)

async def run_agent(user_input: str):
    result = await Runner.run(weather_specialist_agent, user_input)
    return result.final_output

For the most part, the structure is the same as in the previous section. However, since we are now using tools, we need to add some extra parameters.

The first is tools, which is a list of tools the agent can use. In this example, we only provide the get_current_weather tool. The next is tool_use_behavior, which configures how tool usage is handled. For this agent, we set it to "run_llm_again", which means that after receiving the response from the API, the LLM will process it further and present it in a clear, easy-to-read format. Alternatively, you can use "stop_on_first_tool", where the LLM won't process the tool's output any further. We'll experiment with this option later.

The rest of the script follows the same structure we used earlier to build the main Streamlit function.

def main():
    st.title("Weather Assistant")
    user_input = st.text_input("Ask about the weather:")

    if st.button("Get Weather Update"):
        with st.spinner("Thinking..."):
            if user_input:
                agent_response = asyncio.run(run_agent(user_input))
                st.write(agent_response)
            else:
                st.write("Please enter a question about the weather.")

if __name__ == "__main__":
    main()

Make sure to save the script, then run it in the terminal:

    streamlit run 03-tooluse-agent-app.py

Now you can ask a question about the weather in your city. For example, when I asked about the current weather in Jakarta, at the time of writing this (around 4 o'clock in the morning), the response was as shown below:

Screenshot of the tool-use agent app built with Streamlit, showing the response for Jakarta weather.

Now, instead of hallucinating, the agent can provide a human-readable current weather report for Jakarta. You may recall that the get_current_weather function requires latitude and longitude as arguments. In this case, we rely on the LLM to supply them, as it is likely trained with basic location information. A future improvement would be to add a tool that retrieves a more accurate geographical location based on a city name, as sketched below.
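As a sketch of that improvement, Open-Meteo also exposes a free geocoding endpoint that maps a city name to coordinates. The function below is a hypothetical additional tool, not part of the article's final script; the endpoint URL and response fields are taken from the Open-Meteo geocoding documentation, so double-check them before relying on this.

# Hypothetical extra tool; assumes the same imports as 03-tooluse-agent-app.py
@function_tool
def get_coordinates(city: str) -> dict:
    """
    Looks up the latitude and longitude of a city using the Open-Meteo geocoding API.

    Args:
        city (str): The city name, e.g. "Jakarta".

    Returns:
        dict: A dictionary with 'latitude' and 'longitude', or an error message.
    """
    try:
        response = requests.get(
            "https://geocoding-api.open-meteo.com/v1/search",
            params={"name": city, "count": 1},
        )
        response.raise_for_status()
        results = response.json().get("results") or []
        if not results:
            return {"error": f"No coordinates found for {city}"}
        return {"latitude": results[0]["latitude"], "longitude": results[0]["longitude"]}
    except requests.RequestException as e:
        return {"error": f"Failed to fetch coordinates: {e}"}

# The agent could then list both tools:
# tools=[get_coordinates, get_current_weather]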

(Optional) Use "stop_on_first_tool"

Out of curiosity, let's try changing the tool_use_behavior parameter to "stop_on_first_tool" and see what it returns.

Screenshot of the tool-use agent app with the stop_on_first_tool option.

As expected, without the LLM's help to parse and transform the JSON response, the output is harder to read. However, this behavior can be useful in scenarios where you need a raw, structured result without any further processing by the LLM.

Improved Instructions

Now, let's change the tool_use_behavior parameter back to "run_llm_again".

As we've seen, using an LLM is very helpful for parsing the result. We can take this a step further by giving the agent more detailed instructions, specifically asking for a structured output and practical suggestions. To do this, update the instructions parameter as follows:

instructions = """
You are a weather assistant agent.
Given current weather data (including temperature, humidity, wind speed/direction, precipitation, and weather codes), provide:
1. A clear and concise explanation of the current weather conditions.
2. Practical suggestions or precautions for outdoor activities, travel, health, or clothing based on the data.
3. If any severe weather is detected (e.g., heavy rain, thunderstorms, extreme heat), highlight important safety measures.

Format your response in two sections:
Weather Summary:
- Briefly describe the weather in plain language.

Suggestions:
- Offer actionable advice relevant to the weather conditions.
"""

After saving the changes, rerun the app. Using the same question, you should now receive a clearer, well-structured response along with practical suggestions.

Screenshot of the tool-use agent app with the improved instructions.
The final script of 03-tooluse-agent-app.py can be seen here:
from agents import Agent, Runner, function_tool
import asyncio
import streamlit as st
from dotenv import load_dotenv
import requests

load_dotenv()

@function_tool
def get_current_weather(latitude: float, longitude: float) -> dict:
    """
    Fetches current weather data for a given location using the Open-Meteo API.

    Args:
        latitude (float): The latitude of the location.
        longitude (float): The longitude of the location.

    Returns:
        dict: A dictionary containing the weather data or an error message if the request fails.
    """
    try:
        url = "https://api.open-meteo.com/v1/forecast"
        params = {
            "latitude": latitude,
            "longitude": longitude,
            "current": "temperature_2m,relative_humidity_2m,dew_point_2m,apparent_temperature,precipitation,weathercode,windspeed_10m,winddirection_10m",
            "timezone": "auto"
        }
        response = requests.get(url, params=params)
        response.raise_for_status()  # Raise an error for HTTP issues
        return response.json()
    except requests.RequestException as e:
        return {"error": f"Failed to fetch weather data: {e}"}

weather_specialist_agent = Agent(
    name="Weather Specialist Agent",
    instructions="""
    You are a weather assistant agent.
    Given current weather data (including temperature, humidity, wind speed/direction, precipitation, and weather codes), provide:
    1. A clear and concise explanation of the current weather conditions.
    2. Practical suggestions or precautions for outdoor activities, travel, health, or clothing based on the data.
    3. If any severe weather is detected (e.g., heavy rain, thunderstorms, extreme heat), highlight important safety measures.

    Format your response in two sections:
    Weather Summary:
    - Briefly describe the weather in plain language.

    Suggestions:
    - Offer actionable advice relevant to the weather conditions.
    """,
    tools=[get_current_weather],
    tool_use_behavior="run_llm_again"  # or "stop_on_first_tool"
)

async def run_agent(user_input: str):
    result = await Runner.run(weather_specialist_agent, user_input)
    return result.final_output

def main():
    st.title("Weather Assistant")
    user_input = st.text_input("Ask about the weather:")

    if st.button("Get Weather Update"):
        with st.spinner("Thinking..."):
            if user_input:
                agent_response = asyncio.run(run_agent(user_input))
                st.write(agent_response)
            else:
                st.write("Please enter a question about the weather.")

if __name__ == "__main__":
    main()

    Conclusion

At this point, we have explored how to create a simple agent and why we need a tool-using agent, one powerful enough to answer specific questions about real-time weather conditions that a simple agent cannot handle. We have also built a simple Streamlit UI to interact with this agent.

This first article focuses solely on the core concept of how agentic AI can interact with a tool, rather than relying only on its training data to generate output.

In the next article, we will shift our focus to another important concept of agentic AI: agent collaboration. We'll cover why a multi-agent system can be more effective than a single "super" agent, and explore different ways agents can interact with each other.

I hope this article has provided useful insights to start your journey into these topics.

    References

[1] OpenAI. (2025). OpenAI Agents SDK documentation. Retrieved July 19, 2025, from https://openai.github.io/openai-agents-python/

[2] Bornet, P., Wirtz, J., Davenport, T. H., De Cremer, D., Evergreen, B., Fersht, P., Gohel, R., Khiyara, S., Sund, P., & Mullakara, N. (2025). Agentic Artificial Intelligence: Harnessing AI Agents to Reinvent Business, Work, and Life. World Scientific Publishing Co.

    [3] Streamlit Inc. (2025). Streamlit documentation. Retrieved July 19, 2025, from https://docs.streamlit.io/

    [4] Open-Meteo. Open-Meteo API documentation. Retrieved July 19, 2025, from https://open-meteo.com/en/docs


You can find the complete source code used in this article in the following repository: agentic-ai-weather | GitHub Repository. Feel free to explore, clone, or fork the project to follow along or build your own version.

If you'd like to see the app in action, I've also deployed it here: Weather Assistant Streamlit

Finally, let's connect on LinkedIn!


