This repository has been archived by the owner on Aug 25, 2024. It is now read-only.

connect KoboldAI API clients to text-generation-inference


tgi-kai-bridge

Minimal API translation layer that makes text-generation-inference (TGI) accessible to KoboldAI API clients, including KoboldAI, TavernAI, SillyTavern, and AI-Horde-Worker.

The included Dockerfile (untested) bundles TGI and connects it to the AI Horde.

Configuration

Environment Variables:

  • KAI_PORT - port to listen on for KAI clients (default 5000)
  • KAI_HOST - hostname to listen on (default 127.0.0.1)
  • TGI_ENDPOINT - URL of the TGI REST API (default http://127.0.0.1:3000)
  • TGI_MODE - additional information appended to the model name
  • TGI_MODEL - model name override
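A minimal launch sketch using the variables above. The environment variable names come from the table; the entry-point command itself is a placeholder, since the actual script name is not documented here.

```shell
# Point the bridge at a locally running TGI instance and expose it
# to KAI clients on all interfaces.
export KAI_HOST=0.0.0.0
export KAI_PORT=5000
export TGI_ENDPOINT=http://127.0.0.1:3000

python main.py   # hypothetical entry point; the real script may differ
```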

Limitations

  • only the temperature, top_p, top_k, and rep_pen sampler settings are supported
  • no support for token bans (including banning the EOS token)
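To illustrate what a translation layer like this does with the supported samplers, here is a hedged sketch of mapping a KoboldAI-style generate request onto a TGI /generate payload. The field names follow the public KoboldAI and TGI APIs, but this is an illustrative assumption, not the bridge's actual code.

```python
def kai_to_tgi(kai_req: dict) -> dict:
    """Translate a KoboldAI-style generate request into a TGI /generate payload.

    Sketch only: the exact mapping used by tgi-kai-bridge may differ.
    Only the samplers the bridge supports are translated.
    """
    mapping = {
        "temperature": "temperature",
        "top_p": "top_p",
        "top_k": "top_k",
        "rep_pen": "repetition_penalty",  # KAI name -> TGI name
    }
    params = {tgi: kai_req[kai] for kai, tgi in mapping.items() if kai in kai_req}
    if "max_length" in kai_req:
        params["max_new_tokens"] = kai_req["max_length"]
    # Any other KAI sampler settings are dropped, matching the limitation above.
    return {"inputs": kai_req["prompt"], "parameters": params}
```

Settings outside this mapping (e.g. tail-free sampling or sampler order) have no TGI equivalent here, which is why the limitation above exists.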
