# HandyLLM v0.8.2

Release HandyLLM v0.8.2.
## Added

- `hprompt`:
  - load methods now support the `cls` parameter for prompt type specification
  - `ChatPrompt` and `CompletionsPrompt` support optional request and meta
  - `ChatPrompt`:
    - supports adding a dict
    - add `add_message(...)` method
  - `CompletionsPrompt`:
    - add `add_text(...)` method
- `PromptConverter`: `yaml.dump` uses the `allow_unicode=True` option
- move all type definitions to `_types.py`
- support for package development:
  - add `requirement.txt` for development
  - add `scripts/test.sh` for running tests
  - add test scripts in `tests` folder
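The `cls` parameter lets callers force a prompt type instead of relying on auto-detection. The sketch below illustrates the general pattern only; the class and function names are hypothetical stand-ins, not HandyLLM's actual API.

```python
from dataclasses import dataclass
from typing import Optional, Type

# Hypothetical stand-ins for prompt classes; not HandyLLM's real code.
@dataclass
class BasePrompt:
    text: str

class ChatPrompt(BasePrompt):
    pass

class CompletionsPrompt(BasePrompt):
    pass

def load(text: str, cls: Optional[Type[BasePrompt]] = None) -> BasePrompt:
    """Load a prompt; `cls` forces the prompt type over auto-detection."""
    if cls is not None:
        return cls(text)
    # naive detection stand-in: treat role-marker prompts as chat prompts
    return ChatPrompt(text) if text.startswith("$user$") else CompletionsPrompt(text)

p = load("$user$\nHello", cls=CompletionsPrompt)
print(type(p).__name__)  # → CompletionsPrompt (explicit cls wins over detection)
```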
## Fixed

- `HandyPrompt.eval(...)` should not make directories for output paths
- `CompletionsPrompt._run_with_client(...)`: misplaced `run_config` param
- `PromptConverter`:
  - fix variable replacement for `content_array` message
  - fix wrong return type of `stream_msgs2raw` and `astream_msgs2raw`
- `requestor`:
  - `httpx.Response` should use `reason_phrase` to get error reason
  - `acall()`: fix missing brackets for await
  - `_call_raw()` and `_acall_raw()`: intercept and raise new exception without original one
  - `_acall_raw()`: read the response first to prevent `httpx.ResponseNotRead` before getting error message
- `_utils.exception2err_msg(...)` should append error message instead of printing
- change `io.IOBase` to `IO[str]` for file descriptors (e.g. `RunConfig.output_fd`)
- fix other type hints
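"Raise new exception without original one" most likely maps to Python's `raise ... from None`, which suppresses implicit exception chaining so the traceback shows only the new error. A generic illustration, not HandyLLM's actual code:

```python
class APIError(Exception):
    """Hypothetical stand-in for a client-side error type."""

def call_raw():
    try:
        raise ValueError("low-level failure")
    except ValueError as e:
        # `from None` suppresses the implicit chained context, so the
        # traceback shows only APIError, not the original ValueError.
        raise APIError(f"request failed: {e}") from None

try:
    call_raw()
except APIError as e:
    print(e.__cause__, e.__suppress_context__)  # → None True
```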
## Changed

- move all old files in `tests` folder to `examples` folder
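The `io.IOBase` to `IO[str]` change in the Fixed section matters because `io.IOBase` says nothing about whether a stream yields `str` or `bytes`, while `typing.IO[str]` lets type checkers enforce a text stream. A minimal sketch; the field name mirrors the changelog's `RunConfig.output_fd`, but the class itself is illustrative:

```python
import io
from dataclasses import dataclass
from typing import IO, Optional

@dataclass
class RunConfig:
    # IO[str] tells type checkers this must be a text stream;
    # io.IOBase would have accepted a bytes stream too.
    output_fd: Optional[IO[str]] = None

def write_output(cfg: RunConfig, text: str) -> None:
    if cfg.output_fd is not None:
        cfg.output_fd.write(text)

buf = io.StringIO()  # StringIO satisfies IO[str]
write_output(RunConfig(output_fd=buf), "hello")
print(buf.getvalue())  # → hello
```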