Quickstart Guide
Installation
The Moya library ships with several example agents and tools. To get started, install it from PyPI:

```shell
pip install moya-ai
```
Prerequisites
Before running the examples, you'll need:
- Python 3.8 or higher
- An OpenAI API key (for OpenAI examples)
- AWS credentials (for Bedrock examples)
- Ollama installed locally (for Ollama examples)
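Before running the cloud examples, it helps to confirm your credentials are actually set. The short script below is an illustrative helper, not part of Moya; the environment variable names follow the standard OpenAI and AWS conventions (`OPENAI_API_KEY`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`):

```python
import os

# Environment variables conventionally read by each provider's SDK.
# Adjust this mapping to the examples you actually plan to run.
REQUIRED_VARS = {
    "OpenAI examples": ["OPENAI_API_KEY"],
    "Bedrock examples": ["AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"],
}

def missing_prereqs(env):
    """Return (example group, variable) pairs that are unset or empty."""
    return [
        (group, var)
        for group, variables in REQUIRED_VARS.items()
        for var in variables
        if not env.get(var)
    ]

# Report anything missing from the current environment.
for group, var in missing_prereqs(os.environ):
    print(f"Missing {var} (needed for {group})")
```

Ollama has no key to check; its examples only need the local server running (see the Ollama section below).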
The repository includes several pre-configured example files in `moya/examples/`:

- `quick_start_openai.py` - Basic OpenAI chat agent
- `quick_start_bedrock.py` - AWS Bedrock integration
- `quick_start_ollama.py` - Local Ollama models
- `quick_start_multiagent.py` - Multiple agent orchestration
- `dynamic_agents.py` - Runtime agent creation
Basic Usage
Let's walk through a simple example that demonstrates Moya's core functionality:
```python
from moya.agents.openai_agent import OpenAIAgent, OpenAIAgentConfig

# Create an agent configuration
agent_config = OpenAIAgentConfig(
    system_prompt="You are a helpful AI assistant.",
    model_name="gpt-4o"
)

# Initialize the agent
agent = OpenAIAgent(
    agent_name="my_assistant",
    description="A general-purpose AI assistant",
    agent_config=agent_config
)

# Set up the agent
agent.setup()

# Send a message and get a response
response = agent.handle_message("Hello, how can you help me?")
print(response)
```
This example demonstrates:
- Agent Configuration: Setting up an agent with specific parameters like the system prompt and model.
- Agent Initialization: Creating an agent instance with a unique name and description.
- Message Handling: Basic interaction with the agent through message passing.
Running Examples
Moya comes with several example scripts that demonstrate different capabilities:
1. Simple Chat Agent
The `quick_start_openai.py` example shows how to create an interactive chat agent with memory:

```shell
python examples/quick_start_openai.py
```
This example demonstrates:
- Setting up a conversation memory system
- Streaming responses in real-time
- Maintaining conversation context across messages
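At its core, conversation memory means replaying prior turns alongside each new request. The sketch below illustrates that idea in plain Python; it is not Moya's actual memory API, just a minimal stand-in:

```python
class ConversationMemory:
    """Minimal rolling transcript: each turn is a (role, text) pair."""

    def __init__(self, max_turns=20):
        self.max_turns = max_turns
        self.turns = []

    def add(self, role, text):
        self.turns.append((role, text))
        # Keep only the most recent turns so the context stays bounded.
        self.turns = self.turns[-self.max_turns:]

    def as_context(self):
        """Render the transcript as it would be prepended to a prompt."""
        return "\n".join(f"{role}: {text}" for role, text in self.turns)

memory = ConversationMemory(max_turns=4)
memory.add("user", "My name is Ada.")
memory.add("assistant", "Nice to meet you, Ada.")
memory.add("user", "What is my name?")
# The model receives the whole transcript, so it can answer from context.
print(memory.as_context())
```

The `max_turns` cap is one simple eviction policy; real memory systems may instead summarize old turns or select them by relevance.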
2. Multi-Agent System
The `quick_start_multiagent.py` example shows how to create a system with multiple specialized agents:

```shell
python examples/quick_start_multiagent.py
```
This example showcases:
- Creating multiple agents with different specializations
- Automatic message routing between agents
- Shared memory across agents
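Conceptually, a multi-agent system pairs a router with a pool of specialized agents that write to one shared memory. The keyword-based sketch below is illustrative only; Moya's actual orchestrator and routing logic live in the library and are more sophisticated:

```python
# Each "agent" here is just a function; keywords decide the route.
AGENTS = {
    "code_agent": (["python", "bug", "function"], lambda m: f"code_agent: {m}"),
    "travel_agent": (["flight", "hotel", "trip"], lambda m: f"travel_agent: {m}"),
}

shared_memory = []  # every agent's replies land in one transcript

def route(message, default="code_agent"):
    """Pick the first agent whose keywords match, else fall back."""
    lowered = message.lower()
    for name, (keywords, _) in AGENTS.items():
        if any(k in lowered for k in keywords):
            return name
    return default

def dispatch(message):
    """Route the message, run the chosen agent, and record the reply."""
    name = route(message)
    _, handler = AGENTS[name]
    reply = handler(message)
    shared_memory.append((name, reply))
    return reply

print(dispatch("Book a flight to Tokyo"))  # routed to travel_agent
```

A production router would typically classify with a model rather than keywords, but the shape is the same: classify, dispatch, and record into shared memory so every agent sees the full conversation.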
3. Local Model Integration
For users who prefer to run models locally, the Ollama integration example demonstrates using local models:
```shell
# First, install and start Ollama
# Install from https://ollama.ai
ollama run llama2  # Downloads and runs the model

# Then run the example
python examples/quick_start_ollama.py
```
This example shows:
- Integration with locally hosted models
- Privacy-focused deployment options
- Reduced latency for local inference
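Under the hood, Ollama serves a REST API on `http://localhost:11434`, which is what local integrations talk to. The stdlib-only sketch below shows the shape of a non-streaming request to Ollama's `/api/generate` endpoint; it is a hypothetical helper, not Moya's integration code, and calling `generate` requires a running Ollama server:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(model, prompt):
    """Assemble a non-streaming generate request for Ollama's REST API."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt):
    """POST the prompt to the local Ollama server and return its reply text."""
    req = request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["response"]

# Usage (needs `ollama run llama2` in another terminal):
# generate("llama2", "Say hello in one short sentence.")
```

Because everything stays on localhost, prompts and completions never leave your machine, which is where the privacy and latency benefits above come from.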
Next Steps
Now that you have Moya up and running, you can:
- Explore the Guides to learn about different agent types and features
- Follow the Tutorials to build more complex systems
- Check out the Explanations to understand how Moya works
- Browse the Reference for detailed API documentation