ProYaro Master Instructions

For AI Assistants: This folder contains comprehensive documentation and code templates for building applications using the ProYaro AI infrastructure.


šŸ“‹ Table of Contents

  1. Quick Start
  2. File Guide
  3. How to Use
  4. Service Selection

šŸš€ Quick Start

For AI Assistants (Claude, Gemini, etc.)

When helping users build applications:

  1. READ FIRST: AI_ASSISTANT_INSTRUCTIONS.md - Core directive for using ProYaro services
  2. CHECK NETWORK: NETWORK_TOPOLOGY.md - Find service locations and ports
  3. CHECK SPECS: MACHINES_INFRASTRUCTURE.md - Understand capabilities and limits
  4. API REFERENCE: API_INTEGRATION_GUIDE.md - Complete API documentation
  5. USE SKILLS: skills/ folder - Ready-to-use code templates

For Developers

This folder provides:

  • Complete API documentation for all ProYaro AI services
  • Network topology and infrastructure details
  • Ready-to-use code templates (skills) for common integrations
  • Best practices and examples

šŸ“ File Guide

Core Documentation

| File | Purpose | Read This When... |
|------|---------|--------------------|
| AI_ASSISTANT_INSTRUCTIONS.md | AI assistant directive | You're an AI helping with development |
| API_INTEGRATION_GUIDE.md | Complete API reference | You need API endpoint details |
| NETWORK_TOPOLOGY.md | Network diagram & access | You need to know service locations |
| MACHINES_INFRASTRUCTURE.md | Hardware specs & limits | You need to understand capabilities |

Skills (Code Templates)

| Skill | Service | Use For... |
|-------|---------|------------|
| mlx-chat-skill.md | MLX FastAPI | Text generation, chatbots |
| skills/ folder | Various | Additional integration patterns |

šŸ“– How to Use

Scenario 1: Building a Chatbot

1. Read: AI_ASSISTANT_INSTRUCTIONS.md → Confirms MLX is available
2. Check: NETWORK_TOPOLOGY.md → MLX at localhost:8004
3. Reference: API_INTEGRATION_GUIDE.md → MLX Chat API section
4. Template: skills/mlx-chat-skill.md → Ready-to-use code
5. Implement: Copy & customize the TypeScript/Python template (a minimal sketch follows below)
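
Putting those steps together, a minimal TypeScript sketch of the chatbot call. The request and response fields (prompt, max_tokens, data.text) mirror Pattern 1 further down; confirm the exact schema in API_INTEGRATION_GUIDE.md before relying on it.

// Minimal chat helper against the Mac Mini MLX endpoint (localhost:8004).
// Field names follow Pattern 1 below -- verify them in API_INTEGRATION_GUIDE.md.
async function chat(prompt: string): Promise<string> {
  const res = await fetch('http://localhost:8004/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt, max_tokens: 500 }),
  });
  if (!res.ok) throw new Error(`MLX request failed with status ${res.status}`);
  const data = await res.json();
  return data.text;
}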

Scenario 2: Adding Voice Features

1. Read: AI_ASSISTANT_INSTRUCTIONS.md → Confirms Whisper + XTTS available
2. Check: NETWORK_TOPOLOGY.md → Ubuntu server (api.proyaro.com)
3. Reference: API_INTEGRATION_GUIDE.md → STT & TTS sections
4. Implement: Use the job-based API pattern (see the sketch below)
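
Step 4 in TypeScript, following the job-based pattern shown later in this README. The job_type and parameters shape come from Pattern 2 below, token comes from the Authentication section, and the audio path is a placeholder; confirm the full schema in API_INTEGRATION_GUIDE.md.

// Create a Whisper transcription job on the Ubuntu backend.
const sttResponse = await fetch('https://api.proyaro.com/jobs', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${token}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    job_type: 'speech_to_text',
    parameters: { audio_path: '/path/to/recording.wav' }, // placeholder path
  }),
});
const sttJob = await sttResponse.json();
// Poll /jobs or listen on the WebSocket for the transcript (see Pattern 2).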

Scenario 3: Image Generation

1. Read: AI_ASSISTANT_INSTRUCTIONS.md → Two options: Mac or Ubuntu ComfyUI
2. Decide:
   - Mac Mini: Direct API, faster for single images
   - Ubuntu: Job queue, better for batches
3. Reference: API_INTEGRATION_GUIDE.md → ComfyUI sections
4. Implement: Follow the chosen approach (a direct Mac Mini sketch follows below)
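
For the Mac Mini route, a sketch of submitting a workflow directly to ComfyUI. The /prompt endpoint and prompt_id response are standard ComfyUI behaviour, the port comes from the decision matrix below, and the workflow graph itself (exported from the ComfyUI UI) is omitted here.

// Queue an API-format workflow on the Mac Mini ComfyUI instance.
const workflow = { /* API-format workflow JSON exported from ComfyUI */ };
const queueResponse = await fetch('http://localhost:8188/prompt', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ prompt: workflow }),
});
const { prompt_id } = await queueResponse.json();
// Fetch the finished images later from /history/<prompt_id>.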

šŸŽÆ Service Selection Guide

Decision Matrix

| Need | Use Service | Location | Skill File |
|------|-------------|----------|------------|
| Text Generation | MLX | Mac Mini :8004 | mlx-chat-skill.md |
| Image (Fast) | ComfyUI | Mac Mini :8188 | TBD |
| Image (Queue) | ComfyUI | Ubuntu :8188* | TBD |
| Speech-to-Text | Whisper | Ubuntu :8001* | TBD |
| Text-to-Speech | XTTS-v2 | Ubuntu :8002* | TBD |
| Embeddings (Small) | MLX | Mac Mini :8004 | TBD |
| Embeddings (Large) | E5-Large | Ubuntu :8003* | TBD |

* Access via Ubuntu backend API (/jobs endpoint)


šŸ’” Common Patterns

Pattern 1: Direct API (Mac Mini)

// For: Text generation, MLX embeddings, Mac ComfyUI
const response = await fetch('http://localhost:8004/v1/chat/completions', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' }, // JSON body needs an explicit content type
  body: JSON.stringify({ prompt: '...', max_tokens: 500 }),
});
const data = await response.json();
console.log(data.text);

Pattern 2: Job-Based API (Ubuntu)

// For: STT, TTS, embeddings (Ubuntu), image gen (Ubuntu)
// Step 1: Create job
const jobResponse = await fetch('https://api.proyaro.com/jobs', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${token}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    job_type: 'speech_to_text',
    parameters: { audio_path: '...' }
  }),
});
const job = await jobResponse.json(); // parse the response before reading job.id

// Step 2: Poll or use WebSocket for the result
const result = await pollJob(job.id);
// OR
websocket.onmessage = (update) => { /* handle result */ };
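
pollJob is not defined above; here is a minimal sketch, assuming the backend exposes GET /jobs/{id} returning a status field and a result payload. Confirm the exact route and field names in API_INTEGRATION_GUIDE.md.

// Hypothetical polling helper -- the GET /jobs/{id} route and the
// status/result field names are assumptions, not confirmed API.
async function pollJob(jobId: string, intervalMs = 2000, maxAttempts = 60) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetch(`https://api.proyaro.com/jobs/${jobId}`, {
      headers: { 'Authorization': `Bearer ${token}` },
    });
    const status = await res.json();
    if (status.status === 'completed') return status.result;
    if (status.status === 'failed') throw new Error(`Job ${jobId} failed`);
    await new Promise((resolve) => setTimeout(resolve, intervalMs)); // wait before retrying
  }
  throw new Error(`Job ${jobId} did not complete within the polling window`);
}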

šŸ” Authentication

All services require JWT authentication except direct MLX/ComfyUI on Mac Mini (internal only).

# Get token
POST /api/auth/login
{
  "email": "admin@a2zadd.com",
  "password": "..."
}

# Use token
Authorization: Bearer <token>
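
In application code, that flow might look like the sketch below. The login route is assumed to live on the Ubuntu backend, and the access_token response field name is an assumption; check both in API_INTEGRATION_GUIDE.md.

// Obtain a JWT and attach it to authenticated requests.
// The response field name (access_token) is an assumption -- verify it.
const loginResponse = await fetch('https://api.proyaro.com/api/auth/login', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    email: 'admin@a2zadd.com',
    password: process.env.PROYARO_PASSWORD, // keep credentials out of source
  }),
});
const { access_token: token } = await loginResponse.json();

const jobsResponse = await fetch('https://api.proyaro.com/jobs', {
  headers: { 'Authorization': `Bearer ${token}` },
});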

šŸ—ļø Architecture Overview

Your Application
      ↓
ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│  Choose Access Path │
ā”œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¤
│                     │
│ → Direct (Internal) │ → Mac Mini MLX/ComfyUI
│                     │   (http://10.0.0.188:xxxx)
│                     │
│ → Ubuntu API (Prod) │ → Ubuntu Backend
│                     │   (https://api.proyaro.com)
│                     │   ↓
│                     │   Routes to:
│                     │   • Whisper (STT)
│                     │   • XTTS (TTS)
│                     │   • Embeddings
│                     │   • ComfyUI (GPU)
│                     │   • MLX (proxied)
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
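
One way to encode that choice in code is a small service-to-base-URL map. The hosts and ports mirror the diagram and decision matrix above; the internal Mac Mini address is only reachable on the local network.

// Access paths per the architecture above. Treat these as configuration --
// adjust if NETWORK_TOPOLOGY.md changes.
const SERVICE_BASE_URLS = {
  textGeneration: 'http://10.0.0.188:8004',  // Mac Mini MLX, internal only
  imageFast:      'http://10.0.0.188:8188',  // Mac Mini ComfyUI, internal only
  jobs:           'https://api.proyaro.com', // Ubuntu backend: STT, TTS, embeddings, GPU ComfyUI
} as const;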

šŸ› ļø Development Checklist

When starting a new project:

  • Read AI_ASSISTANT_INSTRUCTIONS.md
  • Identify required AI capabilities
  • Check NETWORK_TOPOLOGY.md for service locations
  • Review MACHINES_INFRASTRUCTURE.md for limits
  • Find relevant endpoints in API_INTEGRATION_GUIDE.md
  • Copy code template from skills/ folder
  • Implement with proper error handling (see the retry sketch after this list)
  • Test service availability
  • Deploy
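
For the error-handling step, a generic fetch wrapper with a per-request timeout and simple retries. Nothing here is ProYaro-specific; adapt the limits and backoff to your needs.

// Generic fetch wrapper: timeout per attempt plus retry with linear backoff.
async function requestWithRetry(url: string, init: RequestInit = {}, retries = 2): Promise<Response> {
  for (let attempt = 0; attempt <= retries; attempt++) {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), 30_000); // 30 s timeout per attempt
    try {
      const res = await fetch(url, { ...init, signal: controller.signal });
      if (res.ok || res.status < 500) return res; // success or client error: don't retry
    } catch (err) {
      if (attempt === retries) throw err; // out of retries: surface the network error
    } finally {
      clearTimeout(timer);
    }
    await new Promise((resolve) => setTimeout(resolve, 1000 * (attempt + 1))); // backoff
  }
  throw new Error(`Request to ${url} failed after ${retries + 1} attempts`);
}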

šŸ“š Additional Resources

In This Folder

  • All documentation is versioned and dated
  • Skills are production-ready templates
  • API guide includes curl, TypeScript, and Python examples

External


šŸ”„ Updates

This documentation is actively maintained. Check the "Last Updated" date in each file.

  • Core Docs: Updated as infrastructure changes
  • Skills: Updated when new patterns emerge
  • API Guide: Updated when APIs change

šŸŽ“ Learning Path

Beginner

  1. Read README (this file)
  2. Read AI_ASSISTANT_INSTRUCTIONS.md
  3. Try mlx-chat-skill.md example
  4. Explore API_INTEGRATION_GUIDE.md examples

Intermediate

  1. Understand NETWORK_TOPOLOGY.md
  2. Review MACHINES_INFRASTRUCTURE.md
  3. Implement job-based workflows
  4. Add WebSocket real-time updates

Advanced

  1. Combine multiple services
  2. Build custom workflows
  3. Optimize performance
  4. Implement caching strategies

⚔ Quick Commands

# Navigate to master instructions
cd /Users/yaro/Documents/a2zadd/master-instruction

# List all files
ls -la

# Read a specific doc
cat AI_ASSISTANT_INSTRUCTIONS.md

# View skills
ls skills/

# Search for a topic
grep -r "embeddings" .

šŸ¤ Support

If you're an AI assistant and need clarification:

  • Ask the user (Yaro) about infrastructure details
  • Refer to the latest documentation in this folder
  • Use the provided skills as starting points

If you're a developer:

  • All services are documented in API_INTEGRATION_GUIDE.md
  • Check service health endpoints before debugging
  • Review skills folder for working examples

šŸ“Š Service Status

To check if services are running:

# Mac Mini
curl http://localhost:8004/health              # MLX
curl http://localhost:8188/system_stats        # ComfyUI
curl http://localhost:3000/api/system/status   # Backend (all)

# Ubuntu
curl https://api.proyaro.com/health            # API
ssh root@10.0.0.11 "cd /mnt/storage/new-stack && docker compose ps"
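
The same checks from TypeScript, if you prefer to wire them into a startup probe. The URLs are the ones listed above; only the HTTP status is checked here.

// Ping each health endpoint listed above and log which services respond.
const HEALTH_ENDPOINTS: Record<string, string> = {
  mlx: 'http://localhost:8004/health',
  comfyui: 'http://localhost:8188/system_stats',
  backend: 'http://localhost:3000/api/system/status',
  ubuntuApi: 'https://api.proyaro.com/health',
};

for (const [name, url] of Object.entries(HEALTH_ENDPOINTS)) {
  try {
    const res = await fetch(url, { signal: AbortSignal.timeout(5000) });
    console.log(`${name}: ${res.ok ? 'up' : `responded with ${res.status}`}`);
  } catch {
    console.log(`${name}: unreachable`);
  }
}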

Welcome to ProYaro AI Infrastructure!

Everything you need to build amazing AI-powered applications is documented here. Start with the Quick Start section above, and happy building! šŸš€


Version: 1.0 • Last Updated: 2025-01-01
