Commit [CLEANUP]
Your Name committed Nov 19, 2024
1 parent 831c61d commit aebe574
Showing 5 changed files with 1,253 additions and 39 deletions.
194 changes: 192 additions & 2 deletions README.md
@@ -1,11 +1,201 @@
[![Multi-Modality](agorabanner.png)](https://discord.com/servers/agora-999382051935506503)

# Myriad: Multi-Agent LLM Social Network 🌐

[![Join our Discord](https://img.shields.io/badge/Discord-Join%20our%20server-5865F2?style=for-the-badge&logo=discord&logoColor=white)](https://discord.gg/agora-999382051935506503) [![Subscribe on YouTube](https://img.shields.io/badge/YouTube-Subscribe-red?style=for-the-badge&logo=youtube&logoColor=white)](https://www.youtube.com/@kyegomez3242) [![Connect on LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue?style=for-the-badge&logo=linkedin&logoColor=white)](https://www.linkedin.com/in/kye-g-38759a207/) [![Follow on X.com](https://img.shields.io/badge/X.com-Follow-1DA1F2?style=for-the-badge&logo=x&logoColor=white)](https://x.com/kyegomezb)



Myriad is the first enterprise-grade multi-agent LLM social network that enables dynamic, autonomous interactions between AI personas. It creates an emergent social fabric where AI agents engage in natural conversations, form relationships, and interact based on personality similarities and shared interests.

## 🌟 Key Features

- **Dynamic Agent Matching**: Sophisticated vector similarity-based persona matching
- **Natural Conversations**: Multi-turn dialogues with context awareness
- **Scalable Architecture**: Built for handling thousands of concurrent agent interactions
- **Detailed Analytics**: Comprehensive logging and interaction tracking
- **Personality Persistence**: Consistent agent behaviors and relationship memory
- **Enterprise Security**: Production-ready security and monitoring capabilities
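To make the matching idea concrete, here is a minimal sketch of vector-similarity persona matching with numpy. The function names and toy embeddings are illustrative, not Myriad's actual API:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(query: np.ndarray, candidates: dict) -> str:
    """Return the persona whose embedding is closest to the query embedding."""
    return max(candidates, key=lambda name: cosine_similarity(query, candidates[name]))

# Toy embeddings: personas with similar interests get similar vectors
embeddings = {
    "Ada": np.array([0.9, 0.1, 0.0]),
    "Grace": np.array([0.5, 0.5, 0.1]),
    "Alan": np.array([0.0, 0.1, 0.9]),
}
print(best_match(np.array([0.85, 0.15, 0.05]), embeddings))  # Ada
```

In practice the embeddings come from a sentence-embedding model rather than hand-written vectors, but the nearest-neighbor lookup works the same way.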

## 🏗️ Architecture

### Core Components

```mermaid
graph TB
    subgraph Frontend
        UI[Web Interface]
        API[API Gateway]
    end
    subgraph "Core Services"
        PS[Persona Service]
        MS[Matching Service]
        CS[Conversation Service]
    end
    subgraph "Vector Store"
        VS[Local Vector Store]
        EM[Embedding Model]
    end
    subgraph "LLM Layer"
        LLM[OpenAI GPT-4]
        ST[Sentence Transformers]
    end
    subgraph Storage
        JSON[JSON Storage]
        Logs[Loguru Logs]
    end
    UI --> API
    API --> PS & MS & CS
    PS --> VS
    MS --> VS
    VS --> EM
    CS --> LLM
    PS & MS & CS --> JSON
    PS & MS & CS --> Logs
```

### Key Components Explained

1. **Persona Management**
- PersonaHub dataset integration
- Dynamic persona creation and embedding
- Personality consistency maintenance

2. **Vector Store**
- Local vector similarity search
- Efficient agent matching
- Embedding cache management

3. **Conversation System**
- Multi-turn dialogue management
- Context awareness
- Natural language generation

4. **Logging & Analytics**
- Comprehensive logging with Loguru
- Conversation history tracking
- Performance metrics
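The multi-turn dialogue management in (3) can be sketched as a rolling message history with a bounded context window. The class and field names here are illustrative, not the project's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Conversation:
    """Tracks a multi-turn dialogue with a bounded context window."""
    max_context: int = 8                       # turns kept as context (illustrative)
    history: list = field(default_factory=list)

    def add_turn(self, speaker: str, message: str) -> None:
        self.history.append({"speaker": speaker, "message": message})

    def context(self) -> list:
        """Most recent turns, used to keep replies context-aware."""
        return self.history[-self.max_context:]

convo = Conversation(max_context=2)
convo.add_turn("Ada", "Hi, I hear you like mathematics.")
convo.add_turn("Alan", "I do - especially computability.")
convo.add_turn("Ada", "Then we should talk about the halting problem.")
print(len(convo.context()))  # 2
```

Capping the context keeps prompt sizes bounded as conversations grow, which matters once many agents talk concurrently.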

## 🚀 Getting Started

### Prerequisites

```bash
python >= 3.8
```

### Installation

```bash
# Clone the repository
git clone https://github.com/yourusername/myriad.git
cd myriad

# Install dependencies
pip install -r requirements.txt

# Set up environment variables
cp .env.example .env
```

### Configuration

```bash
# Example .env configuration
OPENAI_API_KEY=your_api_key
NUM_AGENTS=10
TURNS_PER_CONVERSATION=4
NUM_CONVERSATIONS=5
```
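These values are read from the environment at startup (the project uses python-dotenv's `load_dotenv()` to populate the environment from `.env`). A minimal stdlib sketch, with defaults mirroring the example above:

```python
import os

# Defaults mirror the example configuration; real values come from .env
api_key = os.getenv("OPENAI_API_KEY")  # required, no default
num_agents = int(os.getenv("NUM_AGENTS", "10"))
turns_per_conversation = int(os.getenv("TURNS_PER_CONVERSATION", "4"))
num_conversations = int(os.getenv("NUM_CONVERSATIONS", "5"))
```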

### Basic Usage

```python
from myriad import DynamicSocialNetwork

# Initialize the network
network = DynamicSocialNetwork(
    api_key="your_openai_key",
    num_agents=10
)

# Run conversations
conversations = network.run_conversations(
    num_conversations=5,
    turns_per_conversation=4
)
```

## 📊 Monitoring & Analytics

Myriad provides comprehensive logging and monitoring capabilities:

- **Detailed Logs**: All interactions and system events
- **Conversation Analytics**: Length, quality, and engagement metrics
- **Performance Metrics**: Response times and system health
- **Export Capabilities**: JSON export of all interactions
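The JSON export might look like this minimal sketch (the record structure is illustrative; the repository tracks a `conversation_history.json` file that starts out as an empty object):

```python
import json
from pathlib import Path

def export_history(conversations: dict, path: str = "conversation_history.json") -> None:
    """Write all tracked conversations to a JSON file for later analysis."""
    Path(path).write_text(json.dumps(conversations, indent=2))

export_history({"conversation_1": [{"speaker": "Ada", "message": "Hello!"}]})
```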

## 🔧 Advanced Configuration

```python
# Advanced network configuration
network = DynamicSocialNetwork(
    api_key=api_key,
    num_agents=10,
    embedding_model="all-MiniLM-L6-v2",
    temperature=1.2,
    max_loops=1
)
```

## 📈 Performance

- Supports up to 1000 concurrent agents
- Average conversation initialization: <500ms
- Vector similarity search: <100ms
- Message generation: 1-2s
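Figures like these depend on hardware and API latency; they can be reproduced with a simple timing harness such as this sketch (the timed function here is a stand-in, not Myriad's API):

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed milliseconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, (time.perf_counter() - start) * 1000

result, ms = timed(sorted, range(1000, 0, -1))
print(f"sorted 1000 items in {ms:.2f} ms")
```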

## 🤝 Contributing

We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details.

## 📄 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## 📚 Citation

If you use Myriad in your research, please cite:

```bibtex
@software{myriad2024,
  author = {Your Organization},
  title = {Myriad: Multi-Agent LLM Social Network},
  year = {2024},
  url = {https://github.com/yourusername/myriad}
}
```

## 🙏 Acknowledgments

- OpenAI for GPT-4
- Sentence Transformers team
- PersonaHub dataset creators

## 📧 Contact

For enterprise inquiries: [email protected]

For support: [email protected]

---
Built with ❤️ by Kye Gomez


# ENVS

1 change: 1 addition & 0 deletions conversation_history.json
@@ -0,0 +1 @@
{}
106 changes: 69 additions & 37 deletions v2.py
@@ -21,21 +21,23 @@
     level="DEBUG",  # Changed to DEBUG for more detailed logging
     format="{time:YYYY-MM-DD at HH:mm:ss} | {level} | {function}:{line} | {message}",
     backtrace=True,  # Enable backtrace
-    diagnose=True  # Enable diagnosis
+    diagnose=True,  # Enable diagnosis
 )
 
 # Add console logging for immediate feedback
 logger.add(
     lambda msg: print(msg),
     level="DEBUG",
-    format="{level} | {message}"
+    format="{level} | {message}",
 )
 
 load_dotenv()
 
+
 @dataclass
 class Persona:
     """Represents a persona from PersonaHub"""
+
     name: str
     description: str
     input_persona: str
@@ -45,41 +47,39 @@ class Persona:
     @classmethod
     def from_dataset(cls, entry) -> "Persona":
         """Create a Persona instance from a dataset entry"""
-        logger.debug(f"Processing dataset entry: {entry}")
-
         try:
-            if hasattr(entry, 'keys'):
-                entry = dict(entry)
-            elif hasattr(entry, '_asdict'):
-                entry = entry._asdict()
-            # Print entry for debugging
-            logger.debug(f"Raw entry: {entry}")
-
-            # Extract synthesized_text first for debugging
-            synthesized_text = str(entry.get("synthesized_text", "")) if isinstance(entry, dict) else str(getattr(entry, "synthesized_text", ""))
-            logger.debug(f"Extracted synthesized_text: {synthesized_text[:100]}...")  # Log first 100 chars
+            # Extract fields directly from the dataset entry
+            synthesized_text = str(entry['synthesized_text']) if 'synthesized_text' in entry else ""
+            input_persona = str(entry['input_persona']) if 'input_persona' in entry else ""
+            description = str(entry['description']) if 'description' in entry else ""
 
             # Extract name from synthesized text
             name_match = re.search(r"Name:\s*([^,\n]+)", synthesized_text)
-            if not name_match:
-                logger.warning(f"No name found in synthesized text. Using fallback name.")
-                name = f"Unknown-{random.randint(1000,9999)}"
-            else:
-                name = name_match.group(1).strip()
+            name = name_match.group(1).strip() if name_match else "Unknown"
 
-            # Get other fields
-            description = str(entry.get("description", "")) if isinstance(entry, dict) else str(getattr(entry, "description", ""))
-            input_persona = str(entry.get("input_persona", "")) if isinstance(entry, dict) else str(getattr(entry, "input_persona", ""))
+            logger.debug(f"""
+            Creating persona with:
+            - Name: {name}
+            - Description length: {len(description)}
+            - Input persona length: {len(input_persona)}
+            - Synthesized text length: {len(synthesized_text)}
+            """)
 
-            logger.debug(f"Created persona with name: {name}")
             return cls(
                 name=name,
                 description=description,
                 input_persona=input_persona,
-                synthesized_text=synthesized_text,
+                synthesized_text=synthesized_text
             )
 
         except Exception as e:
-            logger.exception(f"Error creating persona from dataset entry")
-            raise
+            logger.error(f"Error creating persona: {e}")
+            logger.error(f"Entry that caused error: {entry}")
+            raise ValueError(f"Failed to create persona from entry: {e}")
 
 
 class LocalVectorStore:
     """Local vector store for persona matching using numpy"""
@@ -165,7 +165,7 @@ def __init__(self, persona: Persona, **kwargs):
         )
 
     def _generate_system_prompt(self) -> str:
-        out = f"""You are {self.persona.name}. {self.persona.description}
+        out = f"""You are {self.persona.name}. {self.persona.description}
 Background: {self.persona.input_persona}
@@ -306,24 +306,56 @@ def __init__(self, api_key: str, num_agents: int = 10):
     def _load_personas(self, num_agents: int) -> List[Persona]:
         """Load specified number of personas from PersonaHub"""
         try:
-            dataset = load_dataset(
-                "proj-persona/PersonaHub", "instruction"
-            )
+            # Load the dataset
+            dataset = load_dataset("proj-persona/PersonaHub", "npc")  # Changed to "npc" split
             personas = []
-            for entry in dataset["train"][:num_agents]:
+
+            # Debug print the dataset structure
+            logger.debug("Dataset Structure:")
+            logger.debug(f"Available splits: {dataset.keys()}")
+            logger.debug(f"First entry structure: {dataset['train'][0]}")
+            logger.debug(f"First entry keys: {dataset['train'][0].keys() if hasattr(dataset['train'][0], 'keys') else 'No keys method'}")
+
+            # Process entries
+            for i, entry in enumerate(dataset['train'][:num_agents]):
                 try:
-                    persona = Persona.from_dataset(entry)
+                    # Convert entry to dictionary if it's not already
+                    if not isinstance(entry, dict):
+                        entry = dict(entry)
+
+                    logger.debug(f"Processing entry {i}:")
+                    logger.debug(f"Entry type: {type(entry)}")
+                    logger.debug(f"Entry content: {entry}")
+
+                    # Create persona with proper field access
+                    persona = Persona(
+                        name="Unknown",  # Will be updated from synthesized_text
+                        description=str(entry.get('description', '')),
+                        input_persona=str(entry.get('input_persona', '')),
+                        synthesized_text=str(entry.get('synthesized_text', ''))
+                    )
+
+                    # Try to extract name from synthesized text
+                    name_match = re.search(r"Name:\s*([^,\n]+)", persona.synthesized_text)
+                    if name_match:
+                        persona.name = name_match.group(1).strip()
+                    else:
+                        persona.name = f"Persona_{i}"
+
                     personas.append(persona)
+                    logger.info(f"Successfully loaded persona: {persona.name}")
+
                 except Exception as e:
-                    logger.error(
-                        f"Error loading individual persona: {e}"
-                    )
-
-            logger.info(
-                f"Successfully loaded {len(personas)} personas"
-            )
+                    logger.error(f"Error loading individual persona {i}: {e}")
+                    logger.error(f"Entry that caused error: {entry}")
+                    continue
+
+            if not personas:
+                logger.warning("No personas were successfully loaded!")
+
+            logger.info(f"Successfully loaded {len(personas)} personas")
             return personas
 
         except Exception as e:
             logger.error(f"Error loading personas from dataset: {e}")
             raise
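The `Name:` extraction regex used in these hunks behaves like this on a typical `synthesized_text` (the sample string is illustrative):

```python
import re

synthesized_text = "Name: Ada Lovelace, Occupation: Mathematician"
name_match = re.search(r"Name:\s*([^,\n]+)", synthesized_text)
name = name_match.group(1).strip() if name_match else "Unknown"
print(name)  # Ada Lovelace
```

The capture group stops at the first comma or newline, so trailing fields like `Occupation` are not swallowed into the name.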