Guides
Creating Agents
Learn how to create different types of agents using Moya. This section covers:
- OpenAI Agents: Integrate with OpenAI's API.
- Bedrock Agents: Use AWS Bedrock for generating responses.
- Remote Agents: Communicate with external APIs.
- Ollama Agents: Connect to locally hosted models.
- CrewAI Agents: Integrate with CrewAI for collaborative workflows.
Creating an OpenAI Agent
import os

from moya.agents.openai_agent import OpenAIAgent, OpenAIAgentConfig

# Set up the agent configuration
agent_config = OpenAIAgentConfig(
    system_prompt="You are a helpful AI assistant.",
    model_name="gpt-4o",
    temperature=0.7,
    max_tokens=2000,
    api_key=os.getenv("OPENAI_API_KEY")  # Optional, defaults to the OPENAI_API_KEY environment variable
)

# Create the agent
agent = OpenAIAgent(
    agent_name="openai_agent",
    description="An agent that uses OpenAI's API",
    agent_config=agent_config
)

# Set up the agent
agent.setup()
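Once setup() has completed, the agent can be driven through an orchestrator. The sketch below simply reuses the SimpleOrchestrator and AgentRegistry pattern shown in the streaming guide later on this page, so it makes no assumptions beyond what this document already covers.

from moya.orchestrators.simple_orchestrator import SimpleOrchestrator
from moya.registry.agent_registry import AgentRegistry

# Register the agent and route a message to it (same pattern as the
# streaming example further down this page).
agent_registry = AgentRegistry()
agent_registry.register_agent(agent)

orchestrator = SimpleOrchestrator(
    agent_registry=agent_registry,
    default_agent_name="openai_agent"
)

response = orchestrator.orchestrate(
    thread_id="thread_1",
    user_message="What can you help me with?"
)
print(response)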
Creating a Bedrock Agent
from moya.agents.bedrock_agent import BedrockAgent, BedrockAgentConfig

# Set up the agent configuration
agent_config = BedrockAgentConfig(
    system_prompt="You are a helpful AI assistant.",
    model_id="anthropic.claude-v2",  # Specify the Bedrock model ID
    region="us-east-1",              # AWS region
    temperature=0.7,
    max_tokens_to_sample=2000
)

# Create the agent
agent = BedrockAgent(
    agent_name="bedrock_agent",
    description="An agent that uses AWS Bedrock",
    agent_config=agent_config
)

# Set up the agent
agent.setup()
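Bedrock calls use your AWS credentials (environment variables, shared credentials file, or an instance role). As an optional pre-flight check before calling setup(), you can confirm that boto3 can resolve credentials for the target region; this snippet is plain boto3 and not part of Moya.

import boto3

# Optional sanity check (plain boto3, not a Moya API): verify that AWS
# credentials resolve before BedrockAgent.setup() tries to use them.
sts = boto3.client("sts", region_name="us-east-1")
print(sts.get_caller_identity()["Account"])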
Creating an Ollama Agent
from moya.agents.ollama_agent import OllamaAgent, OllamaAgentConfig

# Set up the agent configuration
agent_config = OllamaAgentConfig(
    system_prompt="You are a helpful AI assistant.",
    model_name="llama2",                # Specify the Ollama model
    base_url="http://localhost:11434",  # Ollama API endpoint
    temperature=0.7,
    context_window=4096
)

# Create the agent
agent = OllamaAgent(
    agent_name="ollama_agent",
    description="An agent that uses locally hosted models via Ollama",
    agent_config=agent_config
)

# Set up the agent
agent.setup()
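The Ollama agent talks to a locally running Ollama server, so make sure the server is running and the model has been pulled (for example with `ollama pull llama2`) before calling setup(). A quick connectivity check like the one below can catch a missing server early; it uses only the Python standard library and is not part of Moya.

import urllib.request

# Quick connectivity check (standard library only, not a Moya API):
# the Ollama server answers plain GET requests on its base URL.
try:
    with urllib.request.urlopen("http://localhost:11434", timeout=5) as resp:
        print(resp.read().decode())  # typically "Ollama is running"
except OSError:
    print("Ollama does not appear to be running on localhost:11434")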
Managing Memory
Moya provides tools for managing conversation memory. This guide explains how to:
- Store and retrieve messages.
- Summarize conversations.
- Use memory tools effectively.
Example of Using Memory Tool
from moya.tools.memory_tool import MemoryTool
from moya.memory.in_memory_repository import InMemoryRepository
from moya.tools.tool_registry import ToolRegistry

# Set up memory components
memory_repo = InMemoryRepository()
memory_tool = MemoryTool(memory_repository=memory_repo)

# Create a tool registry and register the memory tool
tool_registry = ToolRegistry()
tool_registry.register_tool(memory_tool)

# Store a message
memory_tool.store_message(thread_id="thread_1", sender="user", content="Hello, how are you?")

# Retrieve messages
messages = memory_tool.get_last_n_messages(thread_id="thread_1", n=5)
print(messages)

# Get a summary of the conversation
summary = memory_tool.get_thread_summary(thread_id="thread_1")
print(summary)
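To let an agent call these memory tools itself, the tool registry can be handed to the agent when it is constructed. The snippet below is only a sketch: it assumes OpenAIAgent accepts a tool_registry argument, so check the constructor signature in your Moya version before relying on it.

from moya.agents.openai_agent import OpenAIAgent, OpenAIAgentConfig

# Sketch only: the tool_registry keyword is an assumption; confirm that your
# version of OpenAIAgent accepts it.
memory_aware_agent = OpenAIAgent(
    agent_name="memory_aware_agent",
    description="An agent that can read and write conversation memory",
    agent_config=OpenAIAgentConfig(
        system_prompt="You are a helpful AI assistant.",
        model_name="gpt-4o"
    ),
    tool_registry=tool_registry
)
memory_aware_agent.setup()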
Building Multi-Agent Systems
Multi-agent systems allow you to create specialized agents for different tasks and route messages to the appropriate agent. This guide covers:
- Creating multiple specialized agents.
- Setting up a classifier for message routing.
- Using the MultiAgentOrchestrator.
Example of a Multi-Agent System
from moya.agents.openai_agent import OpenAIAgent, OpenAIAgentConfig
from moya.classifiers.llm_classifier import LLMClassifier
from moya.orchestrators.multi_agent_orchestrator import MultiAgentOrchestrator
from moya.registry.agent_registry import AgentRegistry

# Create specialized agents
english_agent = OpenAIAgent(
    agent_name="english_agent",
    description="English language specialist",
    agent_config=OpenAIAgentConfig(
        system_prompt="You are a helpful assistant that always responds in English.",
        model_name="gpt-4o"
    )
)
english_agent.setup()

spanish_agent = OpenAIAgent(
    agent_name="spanish_agent",
    description="Spanish language specialist",
    agent_config=OpenAIAgentConfig(
        system_prompt="Eres un asistente que siempre responde en español.",
        model_name="gpt-4o"
    )
)
spanish_agent.setup()

# Create a classifier agent
classifier_agent = OpenAIAgent(
    agent_name="classifier",
    description="Message router",
    agent_config=OpenAIAgentConfig(
        system_prompt="You are a classifier that routes messages to appropriate agents.",
        model_name="gpt-4o"
    )
)
classifier_agent.setup()

# Create a classifier using the classifier agent
classifier = LLMClassifier(
    llm_agent=classifier_agent,
    default_agent="english_agent"
)

# Register agents with the agent registry
agent_registry = AgentRegistry()
agent_registry.register_agent(english_agent)
agent_registry.register_agent(spanish_agent)

# Create the multi-agent orchestrator
orchestrator = MultiAgentOrchestrator(
    agent_registry=agent_registry,
    classifier=classifier
)

# Use the orchestrator to handle messages
response = orchestrator.orchestrate(
    thread_id="thread_1",
    user_message="¿Cómo estás hoy?"
)
print(response)  # Should be routed to the Spanish agent
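Sending a follow-up message in English on the same thread exercises the other routing path: the classifier should hand it to english_agent (or fall back to the configured default agent if it cannot decide).

# A follow-up in English should be routed to english_agent.
response = orchestrator.orchestrate(
    thread_id="thread_1",
    user_message="What is the weather like today?"
)
print(response)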
Streaming Responses
Moya supports streaming responses for a more interactive user experience. This guide explains how to:
- Set up streaming with different agent types.
- Handle streaming callbacks.
- Integrate streaming with orchestrators.
Example of Streaming Responses
from moya.agents.openai_agent import OpenAIAgent, OpenAIAgentConfig
from moya.orchestrators.simple_orchestrator import SimpleOrchestrator
from moya.registry.agent_registry import AgentRegistry

# Create an agent
agent = OpenAIAgent(
    agent_name="streaming_agent",
    description="An agent with streaming capabilities",
    agent_config=OpenAIAgentConfig(
        system_prompt="You are a helpful assistant.",
        model_name="gpt-4o"
    )
)
agent.setup()

# Register the agent
agent_registry = AgentRegistry()
agent_registry.register_agent(agent)

# Create an orchestrator
orchestrator = SimpleOrchestrator(
    agent_registry=agent_registry,
    default_agent_name="streaming_agent"
)

# Define a streaming callback function
def stream_callback(chunk):
    print(chunk, end="", flush=True)

# Use the orchestrator with streaming
response = orchestrator.orchestrate(
    thread_id="thread_1",
    user_message="Tell me a short story about a robot.",
    stream_callback=stream_callback
)
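The callback simply receives text chunks as they arrive, so it is easy to both display them live and accumulate them into the full response. A small variation on the example above:

# Collect chunks into a list while still printing them as they arrive.
chunks = []

def collecting_callback(chunk):
    chunks.append(chunk)
    print(chunk, end="", flush=True)

orchestrator.orchestrate(
    thread_id="thread_2",
    user_message="Tell me a short story about a robot.",
    stream_callback=collecting_callback
)
full_text = "".join(chunks)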