Multi-provider framework in Elixir
Updated Jul 9, 2025 - Elixir
AI Gateway: Claude Pro, Copilot, Gemini subscriptions → OpenAI/Anthropic/Gemini APIs. No API keys needed.
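As a rough sketch of the pattern these multi-provider projects share (the module and function names below are hypothetical, not this repository's actual API), an Elixir gateway typically defines a provider behaviour, implements it once per backend, and picks the active provider from configuration:

```elixir
# Hypothetical multi-provider sketch: a shared behaviour, one module per
# backend, and config-driven dispatch. Responses are stubbed for brevity.
defmodule Gateway.Provider do
  @callback chat(prompt :: String.t(), opts :: keyword()) ::
              {:ok, String.t()} | {:error, term()}
end

defmodule Gateway.Providers.OpenAI do
  @behaviour Gateway.Provider

  @impl true
  def chat(prompt, _opts), do: {:ok, "openai response to: " <> prompt}
end

defmodule Gateway.Providers.Anthropic do
  @behaviour Gateway.Provider

  @impl true
  def chat(prompt, _opts), do: {:ok, "anthropic response to: " <> prompt}
end

defmodule Gateway do
  # Switching backends is a one-line config change, e.g.
  #   config :gateway, provider: Gateway.Providers.Anthropic
  def chat(prompt, opts \\ []) do
    provider = Application.get_env(:gateway, :provider, Gateway.Providers.OpenAI)
    provider.chat(prompt, opts)
  end
end
```

With this shape, callers only ever see `Gateway.chat/2`, and moving from one provider to another is a configuration change rather than a code change.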
Terraform module to provision VPC peering across multiple VPCs in different accounts using multiple providers
A powerful AI Gateway designed from scratch.
Production-ready Python library for multi-provider LLM orchestration
Multi-cloud control of VM instances across AWS, Azure, GCP, and AliCloud - unified instance management
Easy-to-use multi-provider ASR/speech-to-text and NLP engine
Your Digital Companion. Self-hosted Telegram bot orchestrating multiple AI providers (OpenAI, Anthropic, Google, xAI, DeepSeek, Mistral, Alibaba, MiniMax) with autonomous agent capabilities, MCP integrations, and async task execution. Not a tool. A partner.
Automate DNS updates and rollbacks across multiple providers using DNSControl and GitHub Actions
Python for logic. English for intelligence.
🚀 Intelligent Claude Code status line with multi-provider AI support, real-time token counting, and universal model compatibility. Supports Claude (Sonnet 4: 1M, 3.5: 200K), OpenAI (GPT-4.1: 1M, 4o: 128K), Gemini (1.5 Pro: 2M, 2.x: 1M), and xAI Grok (3: 1M, 4: 256K) with verified 2025 context limits.
Qurio is a high-velocity AI knowledge workspace built for teams that demand more than basic chat. It supports generic providers. Highlights include Deep Research for complex tasks, Custom Agents for specialized workflows, rich tool orchestration (MCP + HTTP), long-term memory, and structured reasoning views with export-ready outputs.
One API, every AI model, instant switching. Change from GPT-4 to Gemini to local models with a single config update. LLMForge is the lightweight, TypeScript-first solution for multi-provider AI applications with zero vendor lock-in
OpenAI-compatible AI proxy: Anthropic Claude, Google Gemini, GPT-5, Cloudflare AI. Free hosting, automatic failover, token rotation. Deploy in 1 minute.
Your Universal AI Coding Agent
Simple file management via a provider like S3
Streaming-first multi-provider LLM client in TypeScript with home-made tool calling
A flexible agent framework for building AI agents with MCP (Model Context Protocol) integration, providing core abstractions for LLM and embedding models on an MCP architecture, designed specifically to make AI agents easier to build.