outcoldman
Member · Posts: 1

Everything posted by outcoldman

  1. It would be nice to be able to override https://api.openai.com with a local model server. There are many servers that provide the same OpenAI-compatible API but run local models. As an example, I am running LM Studio with Mistral on my Mac, and I was able to modify the chatgpt script to replace https://api.openai.com with http://localhost:1234 and communicate with my local model. It would also be nice to easily switch between pre-configured prompts/models/URLs.