15. Streamlit Chat with SLM
Overview
A chat UI built with Streamlit, backed by a small language model (SLM, here phi3) served locally by Ollama.
# Run Ollama server
$ ollama serve
# Pull the phi3 model
$ ollama pull phi3
# Run a simple chat with Ollama
$ poetry run python apps/15_streamlit_chat_slm/chat.py
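A client like chat.py typically sends the conversation to the local Ollama server over its HTTP API. Below is a minimal sketch of such a client, not the repo's actual code: the function names are hypothetical, and it assumes Ollama is listening on its default port 11434 with the phi3 model pulled.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # default Ollama chat endpoint

def build_payload(model: str, history: list, stream: bool = False) -> dict:
    """Build the request body for Ollama's /api/chat endpoint."""
    return {"model": model, "messages": history, "stream": stream}

def chat(history: list, model: str = "phi3") -> str:
    """Send the chat history to a locally running Ollama server and
    return the assistant's reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, history)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the reply under message.content
        return json.loads(resp.read())["message"]["content"]

if __name__ == "__main__":
    print(chat([{"role": "user", "content": "Hello!"}]))
```

Each turn appends a `{"role": ..., "content": ...}` dict to the history, so the model sees the full conversation on every request.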
# Run summarization with SLM
$ poetry run python apps/15_streamlit_chat_slm/summarize.py
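Summarization with an SLM usually means wrapping the input text in an instruction prompt and calling Ollama's single-turn /api/generate endpoint. A minimal sketch under the same assumptions (local Ollama, phi3 pulled); the prompt wording and function names here are illustrative, not taken from summarize.py:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama completion endpoint

def build_prompt(text: str) -> str:
    """Wrap the input in a summarization instruction (wording assumed)."""
    return f"Summarize the following text in a few sentences:\n\n{text}"

def summarize(text: str, model: str = "phi3") -> str:
    """Ask a locally running Ollama server to summarize `text`."""
    body = {"model": model, "prompt": build_prompt(text), "stream": False}
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the completion under "response"
        return json.loads(resp.read())["response"]
```

With a small model like phi3 this runs entirely on the local machine; no API key or network access beyond localhost is needed.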
# Run streamlit app
$ poetry run python -m streamlit run apps/15_streamlit_chat_slm/main.py