This integration allows you to use TinyAI with the API of Groq, an American company providing AI inference (model execution) services.
This integration has ABSOLUTELY NOTHING to do with the Grok AI model developed by xAI.
Good to know:
Company: Groq, Inc
Model Families: various open-weight models such as Llama, DeepSeek, Qwen, Mistral, Gemma, …
Free Tier: Yes
To use TinyAI with Groq, follow these steps:
Log into your Groq Cloud dashboard.
Go to the API Keys page.
Click the button to create a new API key, follow the on-screen prompts, and note the key down.
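Before configuring the plugin, you can check that your new key works. Groq's API is OpenAI-compatible, so listing the models available to your key is a one-request sanity check. This is an optional sketch, independent of the plugin itself; the endpoint is Groq's documented base URL, and `GROQ_API_KEY` is an assumed environment variable name.

```python
import json
import os
import urllib.request

# Groq's OpenAI-compatible API base URL.
GROQ_BASE_URL = "https://api.groq.com/openai/v1"

def build_models_request(api_key: str) -> urllib.request.Request:
    """Build an authenticated GET request for the model list."""
    return urllib.request.Request(
        f"{GROQ_BASE_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

def list_models(api_key: str) -> list[str]:
    """Fetch the IDs of the models this key can access."""
    with urllib.request.urlopen(build_models_request(api_key)) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]

if __name__ == "__main__":
    key = os.environ.get("GROQ_API_KEY", "")
    if key:
        print("\n".join(list_models(key)))
    else:
        print("Set GROQ_API_KEY to test your key.")
```

If the key is valid, the request returns the model list; an invalid key produces an HTTP 401 error.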
Edit the System - AITiny plugin.
Under AI Service, select Groq.
Enter your API key in the API Key / Token area.
Save the plugin settings. When the page reloads, select your Model.
Save the plugin settings again. Please note that very few models support vision (required for the alt text generation feature). As of August 2025, we found that meta-llama/llama-4-maverick-17b-128e-instruct works fairly well for this use case.
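For context on why a vision-capable model matters here, the sketch below shows the shape of an OpenAI-compatible multimodal chat request, the kind of payload a vision model on Groq accepts. This is an assumption about the request format based on the OpenAI-compatible API, not the plugin's actual code; the prompt text and image URL are placeholders.

```python
import json

# Hedged sketch: an OpenAI-compatible multimodal chat payload for alt text
# generation, using the model recommended above.
def build_alt_text_payload(image_url: str) -> dict:
    """Build a chat-completions payload asking a vision model for alt text."""
    return {
        "model": "meta-llama/llama-4-maverick-17b-128e-instruct",
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text",
                     "text": "Write concise alt text for this image."},
                    {"type": "image_url",
                     "image_url": {"url": image_url}},
                ],
            }
        ],
    }

if __name__ == "__main__":
    print(json.dumps(build_alt_text_payload("https://example.com/photo.jpg"),
                     indent=2))
```

Text-only models reject the `image_url` content part, which is why a vision-capable model must be selected for the alt text feature.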