Create a GenAI Rust Teacher

How to learn Rust with Ollama and DeepSeek Coder Instruct

I am continuing my discovery of Ollama and LangChain, and my experiments on my Raspberry Pi. This week, while listening to a podcast about LLMs, I discovered DeepSeek Coder; the presenter mentioned that small versions of the model exist, which suit my Pi experiments perfectly.

On the DeepSeek Coder webpage, it's written:

DeepSeek Coder comprises a series of code language models trained from scratch on both 87% code and 13% natural language

There is an "instruct" version of the model (this version is specifically designed to follow user instructions and produce tailored outputs).

🤔 Does it mean I can create a kind of "Copilot Chat"? Well ... Make it so!

How it begins

First, I added the model download to the Pi GenAI Stack, so you will need to update it if you don't have the latest version. FYI, this is the YAML file I use to download the model: https://github.com/bots-garden/pi-genai-stack/blob/main/models/deepseek-coder.yaml

Then, I started with the code from my previous blog post: Make a GenAI Conversational Chatbot with memory.

✋ there is an addendum to this previous blog post: I made some changes due to the deprecation of the chain.run() method. You can read more about this here: Addendum to the "Run the chain" paragraph (update: 2024-02-22)

How it works

The changes are very simple.

I used a new model.

This is the simplest change:

model = ollama.Ollama(
    temperature=0,
    repeat_penalty=1,
    base_url=ollama_base_url, 
    model='deepseek-coder:instruct',
)

I want to learn Rust, so I changed the prompt:

I need somebody who is a Rust programmer and who can explain various Rust concepts to me simply:

prompt_template = PromptTemplate(
    input_variables=['history', 'input'],
    template="""
    You are a technical writer, a specialist in the Rust programming language;
    you will write an answer to the question for beginners,
    with some source code examples.
    {history}

    Human: {input}
    AI:
    """
)

And that's it!

Ah, wait, I added a fancy title for the web page:

st.title("🤖 I'm Pi-Lot")
st.header("I 💙 programming with 🦀 Rust")
st.header("👋 I'm running on a PI5")

The complete code of the Rust Teacher

You can see that it is pretty similar to the previous one:

import os
from langchain_community.llms import ollama
from langchain_community.callbacks import StreamlitCallbackHandler
from langchain.prompts import PromptTemplate

# All we need for a good chat
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain

import streamlit as st

ollama_base_url = os.getenv("OLLAMA_BASE_URL")

memory = ConversationBufferMemory()
## session state variable
if 'chat_history' not in st.session_state:
    st.session_state.chat_history=[]
else:
    for message in st.session_state.chat_history:
        memory.save_context({'input': message['human']}, {'output': message['AI']})

prompt_template = PromptTemplate(
    input_variables=['history', 'input'],
    template="""
    You are a technical writer, a specialist in the Rust programming language;
    you will write an answer to the question for beginners,
    with some source code examples.
    {history}

    Human: {input}
    AI:
    """
) 

model = ollama.Ollama(
    temperature=0,
    repeat_penalty=1,
    base_url=ollama_base_url, 
    model='deepseek-coder:instruct',
)

conversation_chain = ConversationChain(
    prompt=prompt_template,
    llm=model,
    memory=memory,
    verbose=True, # then you can see the intermediate messages
)

# Add a title and a subtitle to the webapp
st.title("🤖 I'm Pi-Lot")
st.header("I 💙 programming with 🦀 Rust")
st.header("👋 I'm running on a PI5")

# Text input fields
user_input = st.chat_input("Topic:")

# Executing the chain when the user 
# has entered a topic  
if user_input:
    st_callback = StreamlitCallbackHandler(st.container())

    result = conversation_chain.invoke(
        {"input":user_input, "history":st.session_state["chat_history"]}, 
        {"callbacks":[st_callback]}
    )

    message = {'human': user_input, 'AI': result["response"]}
    st.session_state.chat_history.append(message)

    with st.expander(label='Chat history', expanded=False):
        st.write(st.session_state.chat_history)

You can find the code here: https://github.com/bots-garden/pi-genai-stack/blob/main/python-dev-environment/workspace/samples/05-pi-lot-chat/app.py

Let's see if it works 🚀

Disclaimer: remember that everything is running on a Pi 5, so the compute is sometimes slow.

To run the application, use the following command:

streamlit run app.py

Then, I started with this prompt: What is a struct?

The answer was pretty complete, with simple examples (that's what I love 🥰).
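The screenshots of the model's answers aren't reproduced in this version of the post, but here is a sketch of the kind of answer it gave, based on the classic `User` struct example (the `email` field and the values are my assumptions; `sign_in_count` and `active` are the field names used in the next prompts):

```rust
// A struct groups related values under a single name.
struct User {
    username: String,
    email: String,
    sign_in_count: u64,
    active: bool,
}

fn main() {
    // Create an instance by giving every field a value.
    let user = User {
        username: String::from("pi-lot"),
        email: String::from("pi-lot@example.com"),
        sign_in_count: 1,
        active: true,
    };
    // Access fields with dot notation.
    println!(
        "{} <{}> signed in {} time(s), active: {}",
        user.username, user.email, user.sign_in_count, user.active
    );
}
```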

Then, my second prompt was: Can you remove the sign_in_count and active fields?

This is the answer:

And, by the way, it works:
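The expected result here is simply the same struct with the two fields gone; a minimal sketch (still keeping my hypothetical `email` field):

```rust
// The same struct, with the sign_in_count and active fields removed.
struct User {
    username: String,
    email: String,
}

fn main() {
    let user = User {
        username: String::from("pi-lot"),
        email: String::from("pi-lot@example.com"),
    };
    println!("{} <{}>", user.username, user.email);
}
```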

Let's try a last prompt: show me how to add a method to this struct

And it did it! 🙃
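For context: in Rust, methods are added in an `impl` block for the struct. A minimal sketch with a hypothetical `status` method (not necessarily the method the model chose):

```rust
struct User {
    username: String,
    sign_in_count: u64,
    active: bool,
}

// Methods are declared in an `impl` block; `&self` borrows the instance.
impl User {
    // Hypothetical method, for illustration only.
    fn status(&self) -> String {
        format!(
            "{}: {} sign-in(s), active: {}",
            self.username, self.sign_in_count, self.active
        )
    }
}

fn main() {
    let user = User {
        username: String::from("pi-lot"),
        sign_in_count: 1,
        active: true,
    };
    // Call the method with dot notation, like a field.
    println!("{}", user.status());
}
```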

I should have been more specific, so let's try again: Remove the sign_in_count and active fields and add a method hello to print the username value.

And it works very well (even if it is as talkative as me 😆).

And it compiles and runs (with some warnings because of unused code)!
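My reconstruction of that final version looks like this (the `email` field is an assumption on my part; since it is never read, it is exactly the kind of thing that triggers the unused-code warnings):

```rust
// sign_in_count and active removed, hello method added.
struct User {
    username: String,
    email: String, // never read: the compiler emits a dead_code warning
}

impl User {
    // hello prints the username value.
    fn hello(&self) {
        println!("Hello, {}!", self.username);
    }
}

fn main() {
    let user = User {
        username: String::from("pi-lot"),
        email: String::from("pi-lot@example.com"),
    };
    user.hello(); // prints "Hello, pi-lot!"
}
```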

Well, that's it for today. Next time, let's see if we can chat with a document 🫨.