9. OpenRouter

This allows you to use TinyAI with the API of OpenRouter, an American AI company.

Note

Unlike most other AI service providers, OpenRouter does not have its own models and does not run inference (generative AI) on its own infrastructure. It provides a common API which can "talk" to several other companies that do provide inference. You get access to far more models, but the pricing may be higher in some cases, depending on which inference provider your request is routed to.

Good to know:

  • Company: OpenRouter

  • Model Families: Several, including Jamba, Aion, Nova, Magnum, Claude, Arcee, QwQ, Ernie, TARS, Mistral, Command, DeepSeek R1, Gemini, Gemma, Mercury, Inflection, LFM, Weaver, Llama, Phi, MiniMax, Kimi, Hermes, OpenAI GPT, Perplexity R1, Perplexity Sonar, Qwen, Grok, and many more.

  • Free Tier: Yes, for the models marked as “(free)”.

To use TinyAI with OpenRouter, follow these steps:

  1. Log into your OpenRouter account and go to the API keys page.

  2. Click on Create API Key to create a new API key, and note it down.

  3. Edit the System - AITiny plugin.

  4. Under AI Service select OpenRouter.

  5. Enter your API key in the API Key / Token area.

  6. Click on Save.

  7. When the page reloads, select your Model.
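Under the hood, the plugin talks to OpenRouter's OpenAI-compatible API using the key you entered above. If you want to verify your key outside of TinyAI, the sketch below shows roughly what such a request looks like; the model ID and the `sk-or-...` placeholder key are examples only, not values used by the plugin.

```python
import json
import urllib.request

# Build (without sending) a chat-completion request for OpenRouter's
# OpenAI-compatible endpoint. The API key goes in the Authorization header,
# exactly as TinyAI's "API Key / Token" field supplies it.
def build_openrouter_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Example model ID; substitute any model you selected in the Model list.
req = build_openrouter_request("sk-or-...", "mistralai/mistral-7b-instruct", "Hello!")
print(req.full_url)  # → https://openrouter.ai/api/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` and a valid key should return a JSON response; a `401` error usually means the key was mistyped.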

Notes and Caveats

Not all models are suitable for text generation and/or alt text generation. Moreover, some models are offered by only a single inference provider, which may make them unreliable. If you keep getting errors, or seemingly nothing happens when you use AITiny, try a different model.
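The "try a different model" advice above can be sketched as a simple fallback loop: walk a preference-ordered list of model IDs and keep the first one that answers. The model IDs and the `ask()` callable here are hypothetical; in practice you switch models through the plugin's Model setting, not through code.

```python
# Return the first (model, answer) pair from a preference-ordered list,
# skipping models whose provider errors out or times out.
def first_working_model(models, ask):
    errors = {}
    for model in models:
        try:
            return model, ask(model)
        except Exception as exc:  # an unreliable single-provider model may fail
            errors[model] = exc
    raise RuntimeError(f"all models failed: {errors}")

# Stub demonstration: the first model "fails", so the second is used.
def stub_ask(model):
    if model == "example/unreliable-model":
        raise TimeoutError("no provider responded")
    return "generated text"

model, answer = first_working_model(
    ["example/unreliable-model", "example/reliable-model"], stub_ask
)
print(model)  # → example/reliable-model
```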