6. Groq

This integration allows you to use TinyAI with the API of Groq, an American company providing AI inference (model execution) services.
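
For context, Groq's API is OpenAI-compatible. The listing below is a minimal sketch, not part of TinyAI, of what a chat completion request against Groq's OpenAI-compatible endpoint looks like in Python with the requests package; the model id and prompt are illustrative placeholders, so substitute any model id listed in your Groq Cloud dashboard.

  import requests

  GROQ_API_KEY = "YOUR_API_KEY_HERE"  # created in the steps below

  # Groq's OpenAI-compatible chat completions endpoint.
  response = requests.post(
      "https://api.groq.com/openai/v1/chat/completions",
      headers={
          "Authorization": f"Bearer {GROQ_API_KEY}",
          "Content-Type": "application/json",
      },
      json={
          "model": "llama-3.1-8b-instant",  # example model id only
          "messages": [
              {"role": "user", "content": "Say hello in one short sentence."},
          ],
      },
      timeout=30,
  )
  response.raise_for_status()
  print(response.json()["choices"][0]["message"]["content"])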

Note

This integration has ABSOLUTELY NOTHING to do with the Grok AI model developed by xAI.

Good to know:

  • Company: Groq, Inc

  • Model Families: various open-weight models such as Llama, DeepSeek, Qwen, Mistral, Gemma, …

  • Free Tier: Yes

To use TinyAI with Groq, follow these steps:

  1. Log into your Groq Cloud dashboard.

  2. Go to the API Keys page.

  3. Click on Create API Key.

  4. Follow the on-screen prompts to create a new API key, and note it down. (To confirm the key works before you continue, see the verification sketch after this list.)

  5. Edit the System - AITiny plugin.

  6. Under AI Service select Groq.

  7. Enter your API key in the API Key / Token area.

  8. Click on Save.

  9. When the page reloads, select your Model.

  10. Click on Save & Close.
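
If you want to confirm that the API key from step 4 is accepted before configuring the plugin, you can list the models available to your account. This is a small sketch against Groq's OpenAI-compatible models endpoint; it only assumes Python with the requests package and your own API key.

  import requests

  GROQ_API_KEY = "YOUR_API_KEY_HERE"  # the key you noted down in step 4

  # Listing models is a cheap way to check that the key is valid.
  response = requests.get(
      "https://api.groq.com/openai/v1/models",
      headers={"Authorization": f"Bearer {GROQ_API_KEY}"},
      timeout=30,
  )
  response.raise_for_status()  # an HTTP 401 here means the key is wrong or revoked

  for model in response.json()["data"]:
      print(model["id"])

The model ids printed by this request should correspond to the choices offered in the plugin's Model drop-down once the key has been saved.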

Notes and Caveats

Please note that very few models support vision (required for use in the alt text generation feature). As of August 2025, we found that meta-llama/llama-4-maverick-17b-128e-instruct works fairly well for this use case.
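
If you want to try a vision-capable model outside of TinyAI, a request along the following lines can be used. This is only a sketch: it uses Groq's OpenAI-compatible chat endpoint with OpenAI-style image_url content parts, the image URL is a placeholder, and the model id is the one mentioned above.

  import requests

  GROQ_API_KEY = "YOUR_API_KEY_HERE"

  # Vision requests send the image as an OpenAI-style "image_url" content part.
  response = requests.post(
      "https://api.groq.com/openai/v1/chat/completions",
      headers={
          "Authorization": f"Bearer {GROQ_API_KEY}",
          "Content-Type": "application/json",
      },
      json={
          "model": "meta-llama/llama-4-maverick-17b-128e-instruct",
          "messages": [
              {
                  "role": "user",
                  "content": [
                      {
                          "type": "text",
                          "text": "Describe this image in one sentence, suitable as alt text.",
                      },
                      {
                          "type": "image_url",
                          # Placeholder; use any publicly reachable image URL.
                          "image_url": {"url": "https://example.com/photo.jpg"},
                      },
                  ],
              },
          ],
      },
      timeout=60,
  )
  response.raise_for_status()
  print(response.json()["choices"][0]["message"]["content"])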