Translate dashes to underscores in API names

so that `--backend openai-chatgpt` works like `--backend openai_chatgpt`
Jeff Epler 2023-11-08 17:29:16 -06:00
parent 715dc2b57a
commit e0786a0fbf
2 changed files with 3 additions and 2 deletions

@@ -98,8 +98,8 @@ an existing session with `-s`. Or, you can continue the last session with
 You can set the "system message" with the `-S` flag.
 You can select the text generating backend with the `-b` flag:
-* openai\_chatgpt: the default, paid API, best quality results
-* llama\_cpp: Works with [llama.cpp's http server](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md) and can run locally with various models,
+* openai-chatgpt: the default, paid API, best quality results
+* llama-cpp: Works with [llama.cpp's http server](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md) and can run locally with various models,
 though it is [optimized for models that use the llama2-style prompting](https://huggingface.co/blog/llama2#how-to-prompt-llama-2).
 Set the server URL with `-B url:...`.
 * textgen: Works with https://github.com/oobabooga/text-generation-webui and can run locally with various models.
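As a usage sketch of the `-b` flag documented in the hunk above (hedged: the `chap` command name and the elided rest of the command line are assumptions not shown in this diff; the dash/underscore equivalence is what this commit adds):

```console
$ chap -b openai_chatgpt ...   # underscore spelling, matching the backend module name
$ chap -b openai-chatgpt ...   # dashed spelling, accepted as of this commit
```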

@@ -55,6 +55,7 @@ def configure_api_from_environment(api_name, api):
 def get_api(name="openai_chatgpt"):
+    name = name.replace("-", "_")
     result = importlib.import_module(f"{__package__}.backends.{name}").factory()
     configure_api_from_environment(name, result)
     return result
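A minimal, self-contained sketch of what the added `replace` call accomplishes; the `normalize_backend_name` helper is hypothetical and introduced only for illustration, while the actual change inlines the call inside `get_api` as shown above:

```python
def normalize_backend_name(name: str) -> str:
    # CLI-facing backend names may be spelled with dashes, but the modules in
    # the backends package use underscores, so map one spelling onto the other.
    return name.replace("-", "_")


# Both spellings resolve to the same module name before
# importlib.import_module() is called.
assert normalize_backend_name("openai-chatgpt") == "openai_chatgpt"
assert normalize_backend_name("llama-cpp") == "llama_cpp"
assert normalize_backend_name("openai_chatgpt") == "openai_chatgpt"
```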