
[BFCL] proprietary_model deployed locally, what should I do? #862

Open · Marcher-lam opened this issue Dec 31, 2024 · 1 comment
Labels: BFCL-New Model (Add New Model to BFCL)

@Marcher-lam commented:

If I deploy a proprietary model locally and access it the way an OSS model is accessed, do I need to rewrite the handler file, or can I reuse the proprietary_model handler?

@HuanzhiMao (Collaborator) commented Dec 31, 2024:

If your model is already privately hosted somewhere, a handler similar to the functionary handler should be enough: https://github.com/ShishirPatil/gorilla/blob/main/berkeley-function-call-leaderboard/bfcl/model_handler/proprietary_model/functionary.py
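The functionary-style approach above can be sketched as follows. This is a minimal illustration, not the actual BFCL handler interface: it assumes your local server already exposes an OpenAI-compatible `/v1/chat/completions` endpoint (as vLLM and SGLang servers do), and the class and method names are placeholders — a real handler must subclass BFCL's base handler and match its method signatures.

```python
import json
import urllib.request


class LocalProprietaryHandler:
    """Illustrative handler for a privately hosted, OpenAI-compatible model.

    Mirrors the shape of an API-based (proprietary_model) handler: the
    model server is assumed to already be running, so "inference" is just
    an HTTP call to its chat-completions endpoint.
    """

    def __init__(self, model_name: str, base_url: str = "http://localhost:8000/v1"):
        # base_url points at your locally deployed model's API root.
        self.model_name = model_name
        self.base_url = base_url.rstrip("/")

    def _endpoint(self) -> str:
        # OpenAI-compatible servers expose chat completions at this path.
        return f"{self.base_url}/chat/completions"

    def inference(self, messages: list, temperature: float = 0.0) -> dict:
        # Build an OpenAI-style chat-completions request body.
        payload = json.dumps({
            "model": self.model_name,
            "messages": messages,
            "temperature": temperature,
        }).encode("utf-8")
        req = urllib.request.Request(
            self._endpoint(),
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        # Returns the raw JSON response; a real BFCL handler would also
        # decode the tool/function calls out of the response here.
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())


handler = LocalProprietaryHandler("my-private-model")
```

In practice you would adapt an existing handler (e.g. the functionary one, which uses the `openai` client with a custom `base_url`) rather than hand-rolling HTTP calls; the sketch only shows where the local endpoint plugs in.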

If you need the BFCL inference pipeline to spin up the vllm/sglang server to host the model, then you will need to implement an oss_model handler.
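If you take the proprietary_model route instead, you launch the server yourself before running BFCL. A hedged example of what that launch looks like, assuming a recent vLLM with the `vllm serve` entry point; the model path and port are placeholders:

```shell
# Serve a local checkpoint behind an OpenAI-compatible API on port 8000.
# "/path/to/your-model" is a placeholder for your model's location.
vllm serve /path/to/your-model --port 8000
```

With the oss_model route, BFCL's pipeline issues an equivalent launch itself, which is why that route requires implementing a handler in the oss_model directory.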

Don't worry about the naming of the proprietary_model handler vs the oss_model handler; it's a bit confusing and will be addressed in #859. proprietary_model should mean API-based inference, while oss_model means local-hosting-based inference.

@HuanzhiMao added the BFCL-New Model (Add New Model to BFCL) label on Dec 31, 2024.