This demo shows prompt routing with AIConfig. We use Streamlit to host the app so you can interact with the assistant. Alternatively, you can run the AIConfig in a Jupyter notebook (.ipynb).
The user asks a question. The LLM classifies the topic as math, physics, or general, and based on that topic a different "assistant" is selected to respond. Each assistant has its own system prompt, so the introductions and response styles vary.
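The routing flow above can be sketched in plain Python. In the demo the topic actually comes from an LLM call defined in the AIConfig; the keyword classifier below is just a stand-in for that call, and all names here are illustrative, not the demo's actual functions.

```python
# Illustrative sketch of topic-based prompt routing (names are hypothetical).
SYSTEM_PROMPTS = {
    "math": "You are a math assistant. Introduce yourself as MathBot.",
    "physics": "You are a physics assistant. Introduce yourself as PhysicsBot.",
    "general": "You are a helpful general-purpose assistant.",
}

def classify_topic(question: str) -> str:
    """Stand-in for the LLM router; the real demo asks the model to pick the topic."""
    lowered = question.lower()
    if any(word in lowered for word in ("derivative", "integral", "algebra")):
        return "math"
    if any(word in lowered for word in ("force", "velocity", "quantum")):
        return "physics"
    return "general"

def route(question: str) -> str:
    """Select the system prompt for the chosen assistant."""
    topic = classify_topic(question)
    # Fall back to the general assistant if the router returns an unknown topic.
    return SYSTEM_PROMPTS.get(topic, SYSTEM_PROMPTS["general"])
```

The selected system prompt would then be sent, together with the user's question, to the model that generates the final answer.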
create_config.py
- creates an AIConfig defining the prompts, models, and model parameters used by the different assistants.

assistant_aiconfig.json
- generated automatically by running create_config.py.

assistant_app.py
- implements the prompt-routing logic among the prompts (uses AIConfig) and builds the frontend with Streamlit.
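For orientation, the generated assistant_aiconfig.json has roughly the following shape. The field layout reflects the general AIConfig format; the prompt names, model, and system prompts below are illustrative, not the demo's exact values.

```json
{
  "name": "assistant_config",
  "schema_version": "latest",
  "metadata": {},
  "prompts": [
    {
      "name": "router",
      "input": "Classify this question as math, physics, or general: {{user_question}}",
      "metadata": {
        "model": {
          "name": "gpt-4",
          "settings": {
            "system_prompt": "Reply with exactly one word: math, physics, or general."
          }
        }
      }
    },
    {
      "name": "math_assistant",
      "input": "{{user_question}}",
      "metadata": {
        "model": {
          "name": "gpt-4",
          "settings": {
            "system_prompt": "You are a math assistant. Introduce yourself as MathBot."
          }
        }
      }
    }
  ]
}
```

Each assistant is just another prompt entry with its own system prompt, which is what produces the differing introductions and response styles.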
To run the app:
streamlit run assistant_app.py
Open assistant.ipynb in this folder, or in Colab, to use the AIConfig.