openllm-client 0.5.7


👾 OpenLLM Client
OpenLLM Client: a client for interacting with an OpenLLM HTTP/gRPC server, or any BentoML server.


📖 Introduction
With OpenLLM, you can run inference with any open-source large language model,
deploy to the cloud or on-premises, build powerful AI apps, and more.
To learn more about OpenLLM, please visit OpenLLM's README.md
This package holds the underlying client implementation for OpenLLM. If you are
coming from OpenLLM, the client can be accessed via openllm.client.
import openllm

# Connect to a running OpenLLM server over HTTP and send a query.
client = openllm.client.HTTPClient()

client.query('Explain to me the difference between "further" and "farther"')
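Under the hood, the HTTP client issues JSON requests to the server's REST API. As a rough illustration only (the /v1/generate path and the payload shape here are assumptions for the sketch, not the documented API; consult the running server's OpenAPI docs for the real schema), an equivalent request built with just the standard library might look like:

```python
import json
import urllib.request


def build_generate_request(base_url: str, prompt: str) -> urllib.request.Request:
    """Build a POST request carrying the prompt as a JSON body.

    NOTE: the endpoint path ("/v1/generate") and the payload key ("prompt")
    are illustrative assumptions, not the client's guaranteed wire format.
    """
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/v1/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Build (but do not send) a request against a locally running server.
req = build_generate_request("http://localhost:3000", "What is OpenLLM?")
```

Sending the request (e.g. with urllib.request.urlopen) requires a server to be running; the client classes above handle that round trip, plus error handling, for you.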







📔 Citation
If you use OpenLLM in your research, please use the following citation:
@software{Pham_OpenLLM_Operating_LLMs_2023,
  author  = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month   = jun,
  title   = {{OpenLLM: Operating LLMs in production}},
  url     = {https://github.com/bentoml/OpenLLM},
  year    = {2023}
}



