# Use multiple LLM backends in a single crate, simple builder-based configuration, and built-in prompt

> Clean Markdown view of GeekNews topic #18774. Use the original source for factual precision when an external source URL is present.

## Metadata

- GeekNews HTML: [https://news.hada.io/topic?id=18774](https://news.hada.io/topic?id=18774)
- GeekNews Markdown: [https://news.hada.io/topic/18774.md](https://news.hada.io/topic/18774.md)
- Type: news
- Author: [jester1337](https://news.hada.io/@jester1337)
- Published: 2025-01-17T13:43:51+09:00
- Updated: 2025-01-17T13:43:51+09:00
- Original source: [github.com/graniet/rllm](https://github.com/graniet/rllm)
- Points: 1
- Comments: 0

## Topic Body

RLLM is a Rust library that lets you use multiple LLM backends in a single project: OpenAI, Anthropic (Claude), Ollama, DeepSeek, xAI, Phind, and Google. With a unified API and builder-style configuration (similar to the Stripe developer experience), you can create chat or text-completion requests without multiplying per-provider structures and crates.
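To illustrate the idea of a unified, builder-based API over interchangeable backends, here is a minimal self-contained sketch of the pattern. Note this is not RLLM's actual API; the names (`LlmBuilder`, `Backend`, `chat`) are assumptions for illustration only — consult the repository's README for the real types.

```rust
// Hypothetical sketch of a builder-configured, backend-agnostic LLM client.
// Swapping providers changes only the builder arguments, not the call site.

#[derive(Debug, Clone, Copy, PartialEq)]
enum Backend {
    OpenAi,
    Anthropic,
    Ollama,
}

struct LlmClient {
    backend: Backend,
    model: String,
    temperature: f32,
}

impl LlmClient {
    // Single chat entry point regardless of backend: the per-provider
    // request shape is hidden behind one unified type.
    fn chat(&self, prompt: &str) -> String {
        format!(
            "[{:?} / {} / t={}] reply to: {}",
            self.backend, self.model, self.temperature, prompt
        )
    }
}

#[derive(Default)]
struct LlmBuilder {
    backend: Option<Backend>,
    model: Option<String>,
    temperature: f32,
}

impl LlmBuilder {
    fn new() -> Self {
        // Sensible default; overridable via the builder.
        Self { temperature: 0.7, ..Default::default() }
    }
    fn backend(mut self, b: Backend) -> Self { self.backend = Some(b); self }
    fn model(mut self, m: &str) -> Self { self.model = Some(m.to_string()); self }
    fn temperature(mut self, t: f32) -> Self { self.temperature = t; self }
    // Validation happens once, at build time.
    fn build(self) -> Result<LlmClient, String> {
        Ok(LlmClient {
            backend: self.backend.ok_or("backend is required")?,
            model: self.model.ok_or("model is required")?,
            temperature: self.temperature,
        })
    }
}

fn main() {
    let client = LlmBuilder::new()
        .backend(Backend::Anthropic)
        .model("claude-3-haiku")
        .temperature(0.2)
        .build()
        .expect("missing required field");
    println!("{}", client.chat("Hello"));
}
```

The design choice this mimics is the Stripe-style fluent builder: required fields are checked in `build()` rather than at each call, so adding a new backend only extends the enum and its request mapping.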

## Comments

_No public comments on this page._
