Function Calling with Ollama, Mistral 7B, Bash and Jq

What is "Function Calling"? First, it is not a feature where an LLM calls and executes a function itself. "Function Calling" is the ability of certain LLMs to produce output in a specific, consistent format (we could say: "a predictable output format").

So, the principle is simple:

  1. You (or your GenAI application) will create a prompt containing a delimited list of tools (the functions), each composed of a name, a description, and parameters: SayHello, AddNumbers, etc.

  2. Then, you will add your question ("Hey, say 'hello' to Bob!") to the prompt and send all of this to the LLM.

  3. If the LLM "understands" that the SayHello function can be used to say "hello" to Bob, then the LLM will answer with only the name of the function and its parameter(s). For example: {"name":"SayHello","arguments":{"name":"Bob"}}

Then, it will be up to you to actually call the function.

The LLM is able to find a match thanks to the description of the function.

Mistral 7B & Function Calling

The latest version (v0.3) of Mistral 7B supports function calling and is available for Ollama. So, it's time to test it, and we will do it with Bash.

The documentation says:

Mistral 0.3 supports function calling with Ollama’s raw mode.

And this is an example of a prompt:

[AVAILABLE_TOOLS] [
{"type": "function", 
  "function": {
    "name": "get_current_weather", 
    "description": "Get the current weather", 
    "parameters": {
      "type": "object", 
      "properties": {
        "location": {
          "type": "string", 
          "description": "The city and state, e.g. San Francisco, CA"
        }, 
        "format": {
          "type": "string", 
          "enum": ["celsius", "fahrenheit"], 
          "description": "The temperature unit to use. Infer this from the users location."
        }
      }, 
      "required": ["location", "format"]
    }
   }
 }
][/AVAILABLE_TOOLS]
[INST] What is the weather like today in San Francisco [/INST]

Let's write a GenAI Bash script

First, we need some Bash functions.

I started with some helpers:

By the way, you will need to install Jq (https://jqlang.github.io/jq/).

Helpers

I need a Chat function to send my prompt to the Mistral LLM.

: <<'COMMENT'
Generates a chat completion using the Ollama API.
Args:
    OLLAMA_URL (str): The URL of the Ollama API.
    DATA (str): The JSON data to be sent to the API.
Returns:
    str: The JSON response from the API.
COMMENT

function Chat() {
    OLLAMA_URL="${1}"
    DATA="${2}"

    JSON_RESULT=$(curl --silent "${OLLAMA_URL}/api/chat" \
        -H "Content-Type: application/json" \
        -d "${DATA}"
    )
    echo "${JSON_RESULT}"
}

I need some functions to "format" the strings:

: <<'COMMENT'
Sanitizes the given content by removing any newlines.
It takes one argument, CONTENT, and removes any newline characters (\n) 
from it using the tr command. 
COMMENT

function Sanitize() {
    CONTENT="${1}"
    CONTENT=$(echo "${CONTENT}" | tr -d '\n')
    echo "${CONTENT}"
}

: <<'COMMENT'
Escapes double quotes in the given content by adding a backslash 
before each double quote.
COMMENT

function EscapeDoubleQuotes() {
    CONTENT="${1}"
    CONTENT=$(echo "${CONTENT}" | sed 's/"/\\"/g')
    echo "${CONTENT}"
}
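
To see what these two helpers do together, here is a quick self-contained check (the helpers are reproduced from above, and the sample JSON fragment is my own):

```shell
#!/bin/bash
# Helpers reproduced from above so this snippet runs on its own.
function Sanitize() {
    CONTENT="${1}"
    CONTENT=$(echo "${CONTENT}" | tr -d '\n')
    echo "${CONTENT}"
}

function EscapeDoubleQuotes() {
    CONTENT="${1}"
    CONTENT=$(echo "${CONTENT}" | sed 's/"/\\"/g')
    echo "${CONTENT}"
}

# A multi-line JSON fragment, like the TOOLS_CONTENT heredoc below.
MULTILINE='{"name":
"Bob"}'

# Escape the double quotes first, then remove the newlines.
ESCAPED=$(EscapeDoubleQuotes "${MULTILINE}")
FLAT=$(Sanitize "${ESCAPED}")
echo "${FLAT}"
# → {\"name\":\"Bob\"}
```

The result is a single line with escaped quotes, which can be embedded safely inside the JSON payload sent by curl.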

Now we have (almost) everything we need. Let's define the list of tools.

The tools

I defined two tools (or two functions): hello and addNumbers. Don't forget that the description of the function and of its parameters is important (as well as the function's name): this is what helps the LLM (Mistral) select the appropriate tool.


read -r -d '' TOOLS_CONTENT <<- EOM
[AVAILABLE_TOOLS]
[
    {
        "type": "function", 
        "function": {
            "name": "hello",
            "description": "Say hello to a given person by their name",
            "parameters": {
                "type": "object", 
                "properties": {
                    "name": {
                        "type": "string", 
                        "description": "The name of the person"
                    }
                }, 
                "required": ["name"]
            }
        }
    },
    {
        "type": "function", 
        "function": {
            "name": "addNumbers",
            "description": "Add the two given numbers",
            "parameters": {
                "type": "object", 
                "properties": {
                    "a": {
                        "type": "number", 
                        "description": "first operand"
                    },
                    "b": {
                        "type": "number",
                        "description": "second operand"
                    }
                }, 
                "required": ["a", "b"]
            }
        }
    }
]
[/AVAILABLE_TOOLS]
EOM

TOOLS_CONTENT=$(EscapeDoubleQuotes "${TOOLS_CONTENT}")
TOOLS_CONTENT=$(Sanitize "${TOOLS_CONTENT}")

The last two lines format the content of TOOLS_CONTENT to make it "JSON compliant" when it is used with the curl command of the Chat function.

The "user question"

This is the simplest part:

USER_CONTENT='[INST] say "hello" to Bob [/INST]'
USER_CONTENT=$(EscapeDoubleQuotes "${USER_CONTENT}")

We can now "send" the prompt to Ollama and Mistral.

Give me a tool: say "hello" to Bob

When building the payload to send to Ollama, we need to set the raw field to true: this way, no formatting is applied to the prompt (we override Mistral's prompt template). We also need to set the format field to "json".

Don't forget to set the temperature option to 0.0.

OLLAMA_URL=${OLLAMA_URL:-http://localhost:11434}
MODEL="mistral:7b"

read -r -d '' DATA <<- EOM
{
  "model":"${MODEL}",
  "options": {
    "temperature": 0.0,
    "repeat_last_n": 2
  },
  "messages": [
    {"role":"system", "content": "${TOOLS_CONTENT}"},
    {"role":"user", "content": "${USER_CONTENT}"}
  ],
  "stream": false,
  "format": "json",
  "raw": true
}
EOM

And now, we can query Ollama and Mistral:

jsonResult=$(Chat "${OLLAMA_URL}" "${DATA}")
messageContent=$(echo "${jsonResult}" | jq -r '.message.content')
messageContent=$(Sanitize "${messageContent}")
echo "${messageContent}"

Run the script, and you should obtain this:

{"name":"hello","arguments":{"name":"Bob"}}

With this, you know what to call and which parameters to use.
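
Since implementing the call is up to us, here is a minimal dispatch sketch (my own example, not part of the original script): it extracts the function name from the answer with Jq, then calls the matching Bash function. The hello and addNumbers implementations below are hypothetical.

```shell
#!/bin/bash
# Hypothetical implementations of the two tools.
function hello() {
    echo "Hello, ${1}!"
}

function addNumbers() {
    echo "$((${1} + ${2}))"
}

# The kind of answer returned by Mistral (see above).
messageContent='{"name":"hello","arguments":{"name":"Bob"}}'

# Extract the function name and dispatch to the right Bash function.
FUNCTION_NAME=$(echo "${messageContent}" | jq -r '.name')

case "${FUNCTION_NAME}" in
  hello)
    NAME=$(echo "${messageContent}" | jq -r '.arguments.name')
    hello "${NAME}"
    ;;
  addNumbers)
    A=$(echo "${messageContent}" | jq -r '.arguments.a')
    B=$(echo "${messageContent}" | jq -r '.arguments.b')
    addNumbers "${A}" "${B}"
    ;;
  *)
    echo "unknown function: ${FUNCTION_NAME}"
    ;;
esac
```

The case statement is a deliberately simple design choice: each new tool only needs one more branch mapping the JSON answer to a Bash function call.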

Give me a tool: add 2 and 40

Let's try another tool:

USER_CONTENT='[INST] add 2 and 40 [/INST]'

You should get this:

{"name":"addNumbers","arguments":{"a":2,"b":40}}

That's it. It's pretty simple.


📝 You can find the entire source code of this blog post here: https://github.com/parakeet-nest/blog-post-samples/blob/main/2024-05-31/00-mistral/function-call.sh

In the next blog post, I will explain how to do function calling with LLMs that do not implement this feature natively.