Make a GenAI Web app in less than 40 lines of code.

With Ollama, LangChain & StreamLit. And, again, it runs on my 🥰 Pi5.

In the previous blog post, Prompts and Chains with Ollama and LangChain, I explained how to use the LangChain prompts and chains to develop a script that explains programming concepts for various languages.
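For context, the core of that script boiled down to a prompt piped into a model and a string output parser. Here is a minimal sketch of it (the exact wording and variable names are my approximation, not the original code):

import os
from langchain_community.llms import Ollama
from langchain.prompts import PromptTemplate
from langchain.schema.output_parser import StrOutputParser

# Connect to the Ollama server defined by the environment
model = Ollama(
    base_url=os.getenv("OLLAMA_BASE_URL"),
    model='tinyllama',
)

prompt = PromptTemplate.from_template(
    "Explain the programming concept of {what} in {language}."
)

# Chain: prompt -> model -> plain string
chain = prompt | model | StrOutputParser()
print(chain.invoke({"what": "structs", "language": "golang"}))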

I will reuse the prompt and the chain to transform the script into a "fancy" web application in a flash, thanks to StreamLit.

StreamLit is a Python framework that works like magic Legos for data scientists: it turns Python scripts into beautiful, shareable web apps in minutes, and you don't need to be a web development expert.
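If you have never seen StreamLit before, here is a minimal, self-contained example (not part of the app we build below) just to show how little code a web page takes:

# hello.py - the smallest possible StreamLit app
import streamlit as st

st.title("Hello from StreamLit")
name = st.text_input("Your name:")
if name:
    st.write(f"Nice to meet you, {name} 👋")

Run it with streamlit run hello.py and a web page appears in your browser.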

Let's do it 🚀

Again, I will use the Python dev environment of the Pi GenAI Stack, so there is no need to install anything. All dependencies are provided with the stack.

Once the Web IDE is launched, create a directory (03-streamlit-explain), and add a new file (app.py) in this directory with the following code:

import os
from langchain_community.llms import Ollama
from langchain_community.callbacks import StreamlitCallbackHandler
from langchain.prompts import PromptTemplate

import streamlit as st

# Base URL of the Ollama server, provided by the Pi GenAI Stack
ollama_base_url = os.getenv("OLLAMA_BASE_URL")

model = Ollama(
    base_url=ollama_base_url,
    model='tinyllama',
)

# Add a title and a subtitle to the webapp
st.title("🤓 Tell me more about this language")
st.header("👋 I'm running on a PI5")

# Text input fields
language = st.text_input("Language:")
what = st.text_input("Topic:")

# Prompt template
prompt = PromptTemplate.from_template(
    "Explain the programming concept of {what} in {language}."
)

# Execute the chain once the user
# has entered a language and a topic
if language and what:
    # Stream the answer into a StreamLit container
    st_callback = StreamlitCallbackHandler(st.container())

    # Chain: prompt formatting, then the model
    chain = prompt | model
    # Invoke the chain, delegating the display to the callback
    response = chain.invoke(
        {"what": what, "language": language},
        {"callbacks": [st_callback]},
    )

You can see that the code is pretty straightforward:

  • st.title() and st.header() add a title and a subtitle to the web page

  • st.text_input() adds input fields to the web page

  • We kept the same prompt as in the previous blog post.

  • We kept almost the same chain, but without the StrOutputParser(); instead, we "delegate" the display of the result to a callback. In this case, we use StreamlitCallbackHandler, which is specific to StreamLit... and magic. (If you prefer to keep the output parser, see the sketch after this list.)
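For comparison, here is what the end of the script could look like if we kept the StrOutputParser() and let StreamLit display the final string itself, without a callback (a hypothetical variant, not the code of this post; it reuses the prompt, model, language and what defined above):

from langchain.schema.output_parser import StrOutputParser

if language and what:
    chain = prompt | model | StrOutputParser()
    response = chain.invoke({"what": what, "language": language})
    # Render the whole answer once it is ready (no token streaming)
    st.write(response)

The difference is mostly about user experience: the callback streams the answer as it is generated, while this variant only shows it once the model is done.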

To run the application, use the following command:

streamlit run app.py
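By default, StreamLit listens on port 8501. If the app is not reachable from another machine on your network, you can bind the address and port explicitly (8501 is just the default; adjust to your setup):

streamlit run app.py --server.address 0.0.0.0 --server.port 8501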

Now, open the web app at the following URL: http://<ip address of your Pi>:8501

Type a language (for example, golang) and a specific topic (for example, structs) and watch the magic happen:

🎉 That's it for today. Building GenAI web apps with Ollama, LangChain and StreamLit, even on a Pi, is extremely easy.
