Friday, November 22, 2024

Creating an Earnings Reporting Agent with Swarm Framework


Imagine if you could automate the tedious task of analyzing earnings reports, extracting key insights, and making informed recommendations, all without lifting a finger. In this article, we'll explain how to create a multi-agent system using OpenAI's Swarm framework, designed to handle exactly these tasks. You'll learn how to set up and orchestrate three specialized agents: one to summarize earnings reports, another to analyze sentiment, and a third to generate actionable recommendations. By the end of this tutorial, you'll have a modular, scalable solution for streamlining financial analysis, with potential applications beyond earnings reporting.

Learning outcomes

  • Understand the fundamentals of OpenAI's Swarm framework for multi-agent systems.
  • Learn how to create agents for summarization, sentiment analysis, and recommendations.
  • Explore the use of modular agents for earnings report analysis.
  • Securely manage API keys using a .env file.
  • Implement a multi-agent system to automate earnings report processing.
  • Learn about real-world applications of multi-agent systems in finance.
  • Set up and run a multi-agent workflow using OpenAI's Swarm framework.

This article was published as part of the Data Science Blogathon.

What is OpenAI Swarm?

Swarm is a lightweight, experimental OpenAI framework focused on multi-agent orchestration. It lets us coordinate multiple agents, each handling a specific task, such as summarizing content, performing sentiment analysis, or recommending actions. In our case, we'll design three agents:

  • Summary Agent: Provides a concise summary of the earnings report.
  • Sentiment Agent: Analyzes the sentiment of the report.
  • Recommendation Agent: Recommends stocks based on the sentiment analysis.

Use cases and benefits of multi-agent systems

You can extend the multi-agent system built here to a variety of use cases.

  • Portfolio management: Automate the monitoring of multiple company reports and suggest portfolio changes based on sentiment trends.
  • Financial news summarization: Integrate real-time news feeds with these agents to detect potential market moves early.
  • Sentiment monitoring: Use sentiment analysis to anticipate stock movements or crypto trends based on positive or negative market news.

By breaking tasks down into modular agents, you can reuse individual components across different projects, allowing for flexibility and scalability.

Step 1: Set up your project environment

Before we dive into coding, it is essential to lay a solid foundation for the project. In this step, you'll create the necessary folders and files and install the required dependencies to keep everything running smoothly.

mkdir earnings_report
cd earnings_report
mkdir agents utils
touch main.py agents/__init__.py utils/__init__.py .gitignore

Install dependencies

pip install git+https://github.com/openai/swarm.git openai python-dotenv

Step 2: Store your API key securely

Security is critical, especially when working with sensitive data like API keys. This step shows how to store your OpenAI API key securely in a .env file, keeping your credentials safe. Create a .env file in the project root with the following line:

OPENAI_API_KEY=your-openai-api-key-here

This ensures that your API key is never exposed in your code.
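If you prefer to create the file from the terminal, the two commands below write the placeholder line to .env and add it to the .gitignore created in Step 1, so git never tracks the key (substitute your real key for the placeholder):

```shell
# Write the placeholder key to .env and keep the file out of version control.
echo 'OPENAI_API_KEY=your-openai-api-key-here' > .env
echo '.env' >> .gitignore
```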

Step 3: Implement the agents

Now it's time to bring your agents to life! In this step, you'll create three independent agents: one to summarize the earnings report, another to analyze its sentiment, and a third to generate actionable recommendations based on that sentiment.

Summary Agent

The Summary Agent extracts the first 100 characters of the earnings report as a summary.

Create agents/summary_agent.py:

from swarm import Agent

def summarize_report(context_variables):
    report_text = context_variables["report_text"]
    return f"Summary: {report_text[:100]}..."

summary_agent = Agent(
    name="Summary Agent",
    instructions="Summarize the key points of the earnings report.",
    functions=[summarize_report]
)

Sentiment Agent

This agent checks whether the word "profit" appears in the report to decide if the sentiment is positive.

Create agents/sentiment_agent.py:

from swarm import Agent

def analyze_sentiment(context_variables):
    report_text = context_variables["report_text"]
    sentiment = "positive" if "profit" in report_text else "negative"
    return f"The sentiment of the report is: {sentiment}"

sentiment_agent = Agent(
    name="Sentiment Agent",
    instructions="Analyze the sentiment of the report.",
    functions=[analyze_sentiment]
)

Recommendation Agent

Depending on the sentiment, this agent recommends "Buy" or "Hold".

Create agents/recommendation_agent.py:

from swarm import Agent

def generate_recommendation(context_variables):
    sentiment = context_variables["sentiment"]
    recommendation = "Buy" if sentiment == "positive" else "Hold"
    return f"My recommendation is: {recommendation}"

recommendation_agent = Agent(
    name="Recommendation Agent",
    instructions="Recommend actions based on the sentiment analysis.",
    functions=[generate_recommendation]
)
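Before wiring the agents into Swarm, it can help to sanity-check their core logic with plain dictionaries standing in for context_variables. The sketch below duplicates the three function bodies so it runs without Swarm installed; the sample report text is hypothetical:

```python
# Standalone copies of the three agent functions, testable without Swarm.
def summarize_report(context_variables):
    # The first 100 characters of the report serve as the summary.
    return f"Summary: {context_variables['report_text'][:100]}..."

def analyze_sentiment(context_variables):
    # Keyword check: "profit" in the text counts as positive sentiment.
    sentiment = "positive" if "profit" in context_variables["report_text"] else "negative"
    return f"The sentiment of the report is: {sentiment}"

def generate_recommendation(context_variables):
    # Positive sentiment maps to "Buy"; anything else maps to "Hold".
    recommendation = "Buy" if context_variables["sentiment"] == "positive" else "Hold"
    return f"My recommendation is: {recommendation}"

report = "Company XYZ reported a 20% increase in profit compared to the previous quarter."
print(summarize_report({"report_text": report}))
print(analyze_sentiment({"report_text": report}))          # ...: positive
print(generate_recommendation({"sentiment": "positive"}))  # ...: Buy
```

Because each function takes only a dictionary and returns a string, the same logic behaves identically whether it is invoked directly, as here, or as a tool call made by Swarm.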

Step 4: Add a helper function to load data

Loading data efficiently is an important part of any project. Here, you'll create a helper function that streamlines reading the earnings report file, making it easy for your agents to access the data. Create utils/helpers.py:

def load_earnings_report(filepath):
    with open(filepath, "r") as file:
        return file.read()

Step 5: Tie everything together in main.py

With your agents ready, it's time to put everything together. In this step, you'll write the main script that orchestrates the agents, letting them work in concert to analyze the earnings report and deliver their findings.

from swarm import Swarm
from agents.summary_agent import summary_agent
from agents.sentiment_agent import sentiment_agent
from agents.recommendation_agent import recommendation_agent
from utils.helpers import load_earnings_report
import os
from dotenv import load_dotenv

# Load environment variables from the .env file
load_dotenv()

# Set the OpenAI API key from the environment variable
os.environ["OPENAI_API_KEY"] = os.getenv("OPENAI_API_KEY")

# Initialize the Swarm client
client = Swarm()

# Load the earnings report
report_text = load_earnings_report("sample_earnings.txt")

# Run the summary agent
response = client.run(
    agent=summary_agent,
    messages=[{"role": "user", "content": "Summarize the report"}],
    context_variables={"report_text": report_text}
)
print(response.messages[-1]["content"])

# Pass the report to the sentiment agent
response = client.run(
    agent=sentiment_agent,
    messages=[{"role": "user", "content": "Analyze the sentiment"}],
    context_variables={"report_text": report_text}
)
print(response.messages[-1]["content"])

# Extract the sentiment and run the recommendation agent
sentiment = response.messages[-1]["content"].split(": ")[-1].strip()
response = client.run(
    agent=recommendation_agent,
    messages=[{"role": "user", "content": "Give a recommendation"}],
    context_variables={"sentiment": sentiment}
)
print(response.messages[-1]["content"])
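The hand-off between the sentiment and recommendation agents hinges on one line of parsing: the script keeps only the text after the last ": " in the sentiment reply. A minimal illustration, assuming the reply follows the agent's fixed template:

```python
# Extract the bare sentiment label from the templated reply.
reply = "The sentiment of the report is: positive"
sentiment = reply.split(": ")[-1].strip()
print(sentiment)  # prints "positive"
```

This simple split works only because the reply follows a fixed template; if the model rephrased its answer, sturdier parsing (or returning structured data from the tool function) would be needed.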

Step 6: Create a sample earnings report

To test your system, you need data! This step shows how to create a sample earnings report for your agents to process, making sure everything is ready for action. Save the following text as sample_earnings.txt in the project root:

Company XYZ reported a 20% increase in profit compared to the previous quarter.
Sales grew by 15%, and the company expects continued growth in the next fiscal year.
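One way to create the file is straight from the terminal; the heredoc below writes the sample text (the filename matches the one loaded in main.py):

```shell
# Write the sample earnings report that main.py loads.
cat > sample_earnings.txt <<'EOF'
Company XYZ reported a 20% increase in profit compared to the previous quarter.
Sales grew by 15%, and the company expects continued growth in the next fiscal year.
EOF
```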

Step 7: Run the program

Now that everything is set up, it's time to run the program and watch your multi-agent system in action as it summarizes the earnings report, performs sentiment analysis, and delivers a recommendation.

python main.py

Expected output:

If the agents call their tool functions as instructed, the program prints three lines: the report summary, "The sentiment of the report is: positive", and "My recommendation is: Buy".

Conclusion

We have built a multi-agent solution using OpenAI's Swarm framework to automate the analysis of earnings reports. With just a few agents, we can process financial information and produce practical recommendations. You can easily extend this solution by adding new agents for deeper analysis or by integrating real-time financial APIs.

Try it yourself and see how you can enhance it with additional data sources or agents for more advanced analysis.

Key takeaways

  • Modular architecture: Splitting the system into multiple agents and utilities keeps the code maintainable and scalable.
  • Swarm framework power: Swarm enables seamless handoffs between agents, making it easy to build complex multi-agent workflows.
  • Security via .env: Managing the API key with dotenv ensures that sensitive data is never hard-coded into the project.
  • This project can be extended to handle live financial data through API integration, allowing it to provide real-time recommendations to investors.

Frequently asked questions

Q1. What is OpenAI's Swarm framework?

A. OpenAI's Swarm is an experimental framework designed to coordinate multiple agents performing specific tasks. It is well suited to building modular systems where each agent has a defined function, such as summarizing content, performing sentiment analysis, or generating recommendations.

Q2. What are the key components of a multi-agent system?

A. In this tutorial, the multi-agent system consists of three key agents: the Summary Agent, the Sentiment Agent, and the Recommendation Agent. Each performs a specific function, such as summarizing an earnings report, analyzing its sentiment, or recommending stocks based on that sentiment.

Q3. How do I protect my OpenAI API key in this project?

A. You can store your API key securely in a .env file. That way, the key is never directly exposed in your code, maintaining security. The .env file can be loaded using the python-dotenv package.

Q4. Can I extend this project to handle live financial data?

A. Yes, the project can be extended to handle live data by integrating financial APIs. You can create additional agents to fetch real-time earnings reports and analyze trends to provide up-to-date recommendations.

Q5. Can I reuse the agents in other projects?

A. Yes, the agents are designed to be modular, so you can reuse them in other projects. You can adapt them to different tasks, such as summarizing news articles, performing sentiment analysis on text, or making recommendations based on any kind of structured data.

The media shown in this article is not owned by Analytics Vidhya and is used at the author's discretion.

