If you are developing on a locally hosted web server such as MAMP, WAMPserver, or XAMPP, and you access your development site over plain HTTP (not HTTPS), setting up a locally hosted AI service is very easy. For example, this is the configuration required for LM Studio, assuming the default LM Studio settings (remember to start its API server from the Developer tab!):
AI Service: Custom
Endpoint URL: http://127.0.0.1:1234/v1
Note the /v1 at the end of the endpoint URL. LM Studio gives you an endpoint URL without it, such as http://127.0.0.1:1234. You must add /v1 yourself!
API Key / Token: (leave blank)
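Since forgetting the /v1 suffix is the most common mistake here, a small sketch of the check you can apply to any base URL before saving it. The helper name with_v1 is hypothetical, not part of LM Studio or any plugin:

```python
def with_v1(url: str) -> str:
    """Append the /v1 suffix if the base URL does not already end with it.

    Hypothetical helper for illustration only: trailing slashes are
    stripped first so "http://.../v1/" and "http://.../v1" are treated
    the same.
    """
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url


# The base URL LM Studio displays, normalized to the form the
# configuration above expects:
print(with_v1("http://127.0.0.1:1234"))  # → http://127.0.0.1:1234/v1
```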
If you are using Ollama, the configuration is the same as above except for a slightly different endpoint URL (again assuming the default Ollama settings):
Endpoint URL: http://localhost:11434/v1/
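To verify that either server is actually running before configuring the plugin, you can query its OpenAI-compatible model list endpoint from a terminal. This assumes the default ports shown above and a local server that is already started:

```shell
# Both LM Studio and Ollama expose an OpenAI-compatible /v1/models
# endpoint; a successful GET returns a JSON list of loaded models.
curl http://127.0.0.1:1234/v1/models    # LM Studio (default port 1234)
curl http://localhost:11434/v1/models   # Ollama (default port 11434)
```

If you get a "connection refused" error instead of JSON, the API server is not running or is listening on a different port.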
For other services, please consult their documentation.