Merge pull request #15 from jepler/issue5

Dynamically load requirements.txt in pyproject.toml
Jeff Epler, 2023-10-07 09:30:25 +01:00 (committed by GitHub)
commit 36a4ebf487
5 changed files with 59 additions and 24 deletions
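
This change uses setuptools' dynamic metadata support: instead of hard-coding the dependency list in `pyproject.toml`, it is read from `requirements.txt` at build time, so one file drives both `pip install -r requirements.txt` and the packaged metadata. As a rough illustration of the effect (a sketch, not setuptools' actual implementation):

```python
from pathlib import Path


def read_requirements(path: str = "requirements.txt") -> list[str]:
    """Collect requirement specifiers, skipping blank lines and comments (e.g. SPDX headers)."""
    specs = []
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            specs.append(line)
    return specs
```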

README.md

@@ -7,7 +7,7 @@ SPDX-License-Identifier: MIT
[![Release chap](https://github.com/jepler/chap/actions/workflows/release.yml/badge.svg?event=release)](https://github.com/jepler/chap/actions/workflows/release.yml)
[![PyPI](https://img.shields.io/pypi/v/chap)](https://pypi.org/project/chap/)
# chap - A Python interface to chatgpt, including a terminal user interface (tui)
# chap - A Python interface to chatgpt and other LLMs, including a terminal user interface (tui)
![Chap screencast](https://github.com/jepler/chap/blob/main/chap.gif)
@@ -15,23 +15,33 @@ SPDX-License-Identifier: MIT
Chap is developed on Linux with Python 3.11. Due to use of the `list[int]` style of type hints, it is known not to work on 3.8 and older; the target minimum Python version is 3.9 (debian oldstable).
## installation
## Installation
Install with e.g., `pipx install chap`
Install with e.g., `pipx install chap`, or `pip install chap` in a virtual environment.
## configuration
## Installation for development
Install in developer mode e.g., with `pip install -e .`.
In this mode, you get the "chap" commandline program installed but can edit the source files in place.
This is the [recommended practice per PyPA](https://setuptools.pypa.io/en/latest/userguide/development_mode.html).
A shim script `chap.py` is included so that the older development style of `pip install -r requirements.txt` + `python chap.py` (or `./chap.py`) functions as well.
## Configuration
Put your OpenAI API key in the platform configuration directory for chap, e.g., on linux/unix systems at `~/.config/chap/openai_api_key`
## commandline usage
## Commandline usage
* `chap ask "What advice would you give a 20th century human visiting the 21st century for the first time?"`
* `chap render --last`
* `chap render --last` / `chap cat --last`
* `chap import chatgpt-style-chatlog.json` (for files from pionxzh/chatgpt-exporter)
## interactive terminal usage
* `chap grep needle`
## Interactive terminal usage
* chap tui
## Sessions & Commandline Parameters
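
As a side note on the Configuration section above: chap already depends on `platformdirs`, and on Linux `platformdirs.user_config_dir("chap")` resolves to the `~/.config/chap` directory the README mentions. A minimal sketch of reading the key from that location (a hypothetical helper, not chap's internal code):

```python
import pathlib

import platformdirs  # already one of chap's runtime dependencies


def read_openai_api_key() -> str:
    """Hypothetical helper: read the key from the directory the README describes."""
    # On Linux/unix this resolves to ~/.config/chap/openai_api_key.
    key_file = pathlib.Path(platformdirs.user_config_dir("chap")) / "openai_api_key"
    return key_file.read_text().strip()
```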
@@ -49,15 +59,18 @@ You can set the "system message" with the `-S` flag.
You can select the text generating backend with the `-b` flag:
* openai\_chatgpt: the default, paid API, best quality results
* llama_cpp: Works with (llama.cpp's http server)[https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md] and can run locally with various models. Set the server URL with `-B url:...`.
* textgen: Works with https://github.com/oobabooga/text-generation-webui and can run locally with various models. Needs the server URL in *$configuration_directory/textgen\_url*.
* llama\_cpp: Works with [llama.cpp's http server](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md) and can run locally with various models,
though it is [optimized for models that use the llama2-style prompting](https://huggingface.co/blog/llama2#how-to-prompt-llama-2).
Set the server URL with `-B url:...`.
* textgen: Works with https://github.com/oobabooga/text-generation-webui and can run locally with various models.
Needs the server URL in *$configuration_directory/textgen\_url*.
* lorem: local non-AI lorem generator for testing
## Environment variables
The backend can be set with `CHAP_BACKEND`.
Backend settings can be set with `CHAP_<backend_name>_<parameter_name>`, with `backend_name` and `parameter_name` all in caps.
For instance, `CHAP_LLAMA_CPP_URL=http://server.local:8080/completion` changes the default server URL for the llama_cpp back-end.
For instance, `CHAP_LLAMA_CPP_URL=http://server.local:8080/completion` changes the default server URL for the llama\_cpp back-end.
## Importing from ChatGPT
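
The environment variables above can also be set from Python before handing control to chap's CLI entry point (`chap.core.main`, the same function the new `chap.py` shim imports). A minimal sketch; driving the CLI by rewriting `sys.argv` is an assumption about its behaviour, not something this commit documents:

```python
import os
import sys

os.environ["CHAP_BACKEND"] = "lorem"  # local non-AI backend, handy for testing
# For the llama_cpp backend you would instead set, e.g.:
# os.environ["CHAP_LLAMA_CPP_URL"] = "http://server.local:8080/completion"

from chap.core import main  # imported after the environment is prepared

if __name__ == "__main__":
    sys.argv = ["chap", "ask", "Summarize this pull request"]
    main()
```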

chap.py (new executable file, 19 lines)

@@ -0,0 +1,19 @@
#!/usr/bin/env python3
# SPDX-FileCopyrightText: 2023 Jeff Epler <jepler@gmail.com>
#
# SPDX-License-Identifier: MIT
import pathlib
import sys
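# Replace sys.path[0] (this script's directory) with the src/ layout directory
# so the in-tree chap package is importable without installing it first.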
sys.path[0] = str(pathlib.Path(__file__).parent / "src")
if __name__ == "__main__":
    # pylint: disable=import-error,no-name-in-module
    from chap.core import main
    main()
else:
    raise ImportError(
        "this script exists to facilitate running 'python -mchap' in the top directory; it should not be imported"
    )

pyproject.toml

@@ -4,7 +4,7 @@
[build-system]
requires = [
"setuptools>=61",
"setuptools>=68.2.2",
"setuptools_scm[toml]>=6.0",
]
build-backend = "setuptools.build_meta"
@@ -19,18 +19,7 @@ where = ["src"]
name="chap"
authors = [{name = "Jeff Epler", email = "jepler@gmail.com"}]
description = "Interact with the OpenAI ChatGPT API (and other text generators)"
dynamic = ["readme","version"]
dependencies = [
"click",
"dataclasses_json",
"httpx",
"lorem-text",
"platformdirs",
"simple_parsing",
"textual>=0.18.0",
"tiktoken",
"websockets",
]
dynamic = ["readme","version","dependencies"]
classifiers = [
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
@@ -50,3 +39,4 @@ chap = "chap.__main__:main"
write_to = "src/chap/__version__.py"
[tool.setuptools.dynamic]
readme = {file = ["README.md"], content-type="text/markdown"}
dependencies = {file = "requirements.txt"}
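
With `dependencies` now declared dynamic, the specifiers from `requirements.txt` end up in the built package's metadata. A quick sanity check (assuming chap is installed, e.g. via `pip install -e .`), using only the standard library:

```python
from importlib.metadata import requires

# Requires-Dist entries recorded for the installed distribution; after this
# change they should mirror requirements.txt (click, httpx, textual>=0.18.0, ...).
for spec in requires("chap") or []:
    print(spec)
```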


@@ -3,6 +3,6 @@
# SPDX-License-Identifier: MIT
build
setuptools>=45
setuptools>=68.2.2
twine
wheel

requirements.txt (new file, 13 lines)

@@ -0,0 +1,13 @@
# SPDX-FileCopyrightText: 2023 Jeff Epler
#
# SPDX-License-Identifier: Unlicense
click
dataclasses_json
httpx
lorem-text
platformdirs
simple_parsing
textual>=0.18.0
tiktoken
websockets