Conversation

Contributor

@bfollington bfollington commented Jun 21, 2024

Wrap an LLM behind a Deno webserver.

Create a thread and append messages to it, with support for tool calling handled cooperatively by the client and server.

Client Input: what is 3+1? add some emojis to the answer using a tool
Client Tool call: calculator { expression: "3+1" }
Client Tool result: 4
Server Tool call: addEmojis { text: "The answer is 4" }
Server Tool result: Here's the text with relevant emojis added:\n\nThe answer is 4️⃣ ✅
Server Response: Here's the text with relevant emojis added:\n\nThe answer is 4️⃣ ✅
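
The transcript above amounts to a cooperative dispatch loop: each side executes only the tools it owns and hands any unrecognized tool call back to its peer. A minimal sketch of that idea (the types and the `dispatch` helper are illustrative assumptions, not the package's actual API):

```typescript
// Minimal sketch of cooperative tool dispatch. A tool call is executed
// locally if this side owns the tool; otherwise null signals that the
// peer (client or server) must run it instead.
type ToolCall = { name: string; input: Record<string, unknown> };

type Tool = {
  name: string;
  implementation: (input: Record<string, unknown>) => Promise<string>;
};

async function dispatch(owned: Tool[], call: ToolCall): Promise<string | null> {
  const tool = owned.find((t) => t.name === call.name);
  if (!tool) return null; // not ours: the peer owns this tool
  return await tool.implementation(call.input);
}

// The client owns the calculator; the server owns addEmojis.
const clientTools: Tool[] = [{
  name: "calculator",
  implementation: async ({ expression }) =>
    String(eval(expression as string)), // demo only: never eval untrusted input
}];
```

With this split, the `calculator` call in the transcript resolves on the client, while `addEmojis` returns `null` locally and is forwarded to the server.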

Example client usage:

import { LLMClient } from "@commontools/llm-client";
const client = new LLMClient({
  serverUrl: "http://localhost:8000",
  tools: [{
      name: "calculator",
      input_schema: {
        type: "object",
        properties: {
          expression: {
            type: "string",
            description: "A mathematical expression to evaluate",
          },
        },
        required: ["expression"],
      },
      implementation: async ({ expression }) => {
        // NOTE: eval is for demo purposes only; never eval untrusted input.
        return String(eval(expression));
      },
    }],
  system: "use your tools to answer the request"
});

client
  .handleConversation("calculate 2+2 and then add emojis")
  .then((response) => {
    console.log(response);
  });
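
On the server side, the PR wraps the LLM behind a Deno webserver that keeps threads and appends messages to them. A hypothetical sketch of what such a thread store could look like (names and shapes are assumptions for illustration; the real server's routes and message format may differ):

```typescript
// Hypothetical in-memory thread store: a thread is an ordered list of
// messages keyed by thread id. Field names here are illustrative only.
type ThreadMessage = { role: "user" | "assistant" | "tool"; content: string };

const threads = new Map<string, ThreadMessage[]>();

// Append a message to a thread, creating the thread on first use;
// returns the updated thread so the caller can hand it to the LLM.
function appendMessage(threadId: string, message: ThreadMessage): ThreadMessage[] {
  const thread = threads.get(threadId) ?? [];
  thread.push(message);
  threads.set(threadId, thread);
  return thread;
}
```

In the real server this store would sit behind an HTTP handler (e.g. `Deno.serve`), with each request appending to the thread before forwarding it to the model.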

@bfollington bfollington marked this pull request as ready for review June 21, 2024 22:12
@bfollington bfollington requested a review from cdata June 21, 2024 23:31
@bfollington bfollington force-pushed the 2024-06-20-ai-server-claude branch from 290a96b to 455339f June 21, 2024 23:32
@bfollington bfollington changed the title Claude support (Deno webserver) LLM co-operative server + client package (Claude support via deno) Jun 21, 2024
@bfollington bfollington merged commit 75a3186 into main Jun 24, 2024
