Unleashing the Potential of a Bolt Slack Bot: A Thorough Guide to Integrating Langchain and Notion


Richard Casemore - @skarard

July 26, 2023
A picture of a cute slack bot and the icons of the services that are integrated: langchain, notion, weaviate and redis

Introduction

In the ever-evolving digital landscape, chatbots have become a key instrument in making interactions more efficient and boosting productivity. Especially in team collaboration tools like Slack, bots are paving the way for smarter, faster, and smoother communications. However, to add a flavour of real-time learning and data handling to these bots, connecting them with powerful APIs is crucial. That's what we'll be doing in this guide - powering up our Slack bot with Langchain and Notion!

In the first part of this series, we embarked on a thrilling journey into the world of bot development. We took our initial leap by crafting a basic Slack bot and explored the realm of bot-environment setup and code configurations. Now, it's time to kick things up a notch!

In this second part of our series, we will delve into the exciting task of transforming our rudimentary bot into an intelligent agent. We're going to teach our Slack bot to interact with Langchain and Notion. The fruits of our labour will be a bot that can swiftly fetch facts from a Notion page and database — a study aid of your own creation, providing answers and citations to queries without missing a beat.

If you're intrigued by the idea of transforming the realm of communication and ready to equip your Slack bot with an added layer of intelligence, then let's dive right in!

Prerequisites

Before we delve into enhancing our bot, there are several prerequisites you should have to get the most out of this article. Here's what you need:

  • Git: We will be cloning the project repository using Git. If Git isn't installed on your system, refer to the Git installation instructions.
  • Node.js: Our bot will be running on a Node.js environment. If Node.js isn't installed yet, you can get it from the Node.js download page.
  • Basic Bot Set-up: It's recommended that you have read the previous post: Building a Bolt Slack Bot: A Straightforward Starter Kit Guide. In that article, I walked through setting up a basic Slack bot; in this article we build on those foundations.
  • A Notion account: We'll be connecting our bot to Notion, so you need a Notion account. If you don't have one, you can easily create a free account on the Notion website.

Whether you're an experienced developer or just starting out on your coding journey, this guide aims to provide you with the knowledge and tools needed to transform a basic Slack bot into an efficient, intelligent aid. Let's get started!

Setting Up Your Workspace

To get started, you first need to clone the repository which contains all the code for our AI driven conversational Slack Bot. You can do this using the following command:

git clone https://github.com/metalumna/bolt-slack-bot-langchain-notion-integration.git

After cloning, move into the project directory:

cd bolt-slack-bot-langchain-notion-integration

Inside the directory, you should find all the essential files for our bot's operation.

The next step is to install all necessary dependencies for our project. We'll do this by running:

npm install

Now that your workspace is all set up, the next thing you need to do is create your .env file to store all your environment variables.

In the root of the project directory, you'll find a file named .env.example. This file contains all the environment variables that are necessary for the operation of the bot. You can simply make a copy of this file and rename the copy to .env like so:

cp .env.example .env

Once you've done this, you're ready to start populating the .env file with your own values. The next section covers all the details you need to fill in the environment variables.

Configuration

OpenAI

OpenAI hosts the Large Language Model that functions as the intelligence in our AI Slack bot. The good news for new users is that OpenAI gives a free trial during which users receive a $5 (USD) credit.

  1. Create an OpenAI account: Sign up for an OpenAI account on OpenAI's website.
  2. Create a new API key: Navigate to the top-right corner of the page, click on your profile icon, and select "View API Keys". To generate your personal API key, simply click "Create New Secret Key". You can name the key "langchain-notion-slack", just so you know what it's for.
  3. Add the API key to .env: The key is only ever shown once, so copy it directly to OPENAI_API_KEY.

Notion

Notion is a great productivity tool for everything, from taking notes to running sprints. We're expecting you to already have a Notion account containing the data you want to converse with. These steps will allow the bot to read all the contents of this page and the contents of any subpages/database pages.

  1. Accessing the Integration Settings: Go to your Notion My integrations page.
  2. Creating a New Integration: Click on the "New Integration" button. Ensure you have selected the correct "Associated Workspace". You can name the integration "langchain-notion-slack", just so you know what it's for.
  3. Add the secret token to .env: Copy the "Internal Integration Secret" directly to NOTION_TOKEN.
  4. Add integration to your page: Load up the page that you want to converse with. Click the three-dot (•••) menu in the top right, go down to "Add connections" and add your newly created integration; it should be called "langchain-notion-slack".

Slack

Slack is the chat platform that will be used to interact with our AI chat bot.

  1. Create a new Slack App: Head to Slack API's app page and click on "Create New App". Select "From Scratch". Here, name your App, select the workspace this bot will work in, and then click "Create App".
  2. Configure Permissions: Under the "Features" option in the side menu, click on "OAuth & Permissions". Scroll down to the "Scopes" section. Add the necessary bot token scopes based on what your bot needs to do. For our bot, adding chat:write and users:read will suffice.
  3. Enable Direct Messages: Under the "Features" option in the side menu, click on "App Home". Scroll down to the "Show Tabs" section. Ensure the Messages Tab toggle is on, then enable the tickbox "Allow users to send Slash commands and messages from the messages tab".
  4. Configure Socket: Under the "Settings" option in the side menu, click on "Socket Mode". Switch the Enable Socket Mode toggle to the on position. You will be asked to add the connections:write scope to an app-level token; you can name this token "Socket". Then press Generate and copy the app-level token into the .env file, under the SLACK_APP_TOKEN variable.
  5. Adding Slash Commands: In the side menu, under "Features", click on "Slash Commands". Now you can "Create New Command". The command is /add-notion-page, the short description is Adds a Notion page to your chat bot, the usage hint is [Notion URL or ID]. You can leave "Escape channels, users, and links sent to your app" unchecked.
  6. Install App to Workspace: Once you've configured your socket, under the "Settings" option in the side menu, click on "Install App". Install the app onto your workspace. Once installed, copy the "Bot User OAuth Token" to the .env file under SLACK_BOT_TOKEN.
  7. Configure Events: Under the "Features" option in the side menu, click on "Event Subscriptions". Toggle Enable Events on. Open up the "Subscribe to bot events" menu, then "Add Bot User Event". Adding the message.im event will be enough for our bot. Make sure to press "Save Changes" at the bottom right of the page. Go back to the "Install App" page (under "Settings" in the side menu, click on "Install App") and this time press "Reinstall to Workspace".
  8. Signing Secret: In the side menu, under "Settings", click on "Basic Information". Scroll down to the "App Credentials" section and locate "Signing Secret". Hit "Show" to reveal the secret and copy it to the .env file under SLACK_SIGNING_SECRET.

Weaviate

Weaviate is a vector database; it stores the Notion data in a manner that allows the AI to grasp the context of the conversation.

  1. Create a free Weaviate account: Sign up to Weaviate.
  2. Create a cluster: On the dashboard, click "Create Cluster" and choose "Free sandbox". Set Enable Authentication to "YES" and name your cluster "langchain-notion-slack".
  3. Add the details to .env: Once your new Weaviate vector database is initialised, click "Details" and copy the details below.
    • WEAVIATE_TOKEN is your authentication token for accessing your Weaviate instance; click the key icon to reveal it.
    • WEAVIATE_SCHEME is the URL scheme of your Weaviate instance, in this guide it is "https".
    • WEAVIATE_HOST is the hostname or IP address of your Weaviate instance without "https://". (something like: "langchain-notion-slack-xxxxxxxx.weaviate.network")
    • WEAVIATE_INDEX is the name of the data index in which to store and look up data; let's set this to "NotionData".

Redis

Redis is a blazing-fast in-memory database; we use it here to store chat history.

  1. Create a Redis account: Sign up to Redis.
  2. Create a free subscription: Once logged in, go to "Subscriptions" on the left sidebar and choose the Fixed plan ($0/month). The cloud vendor is up to you, but I prefer Google Cloud; 30MB of storage will be plenty for now, and it's all free! For the "Subscription name" put in "Free Subscription".
  3. Create a new database: When "Free subscription" is selected, click "New database". Set the "Database name" as "langchain-notion-slack", leave all other options as default then press Activate database.
  4. Add the details to .env: Click on the database as part of your free subscription and copy the details below.
    • REDIS_USER is the username you use to connect to your Redis database (default username is "default")
    • REDIS_PASSWORD is the password associated with the above username. (click the eye icon in the Redis password)
    • REDIS_URL is the hostname or IP address of your Redis instance also known as "Public endpoint". (prefix with redis://)

Now that you have all these tokens, keys, secrets and URLs you are all set to test out the bot.
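
For reference, a fully populated .env ends up looking something like the sketch below. The variable names come from .env.example; every value here is just a placeholder, so substitute your own tokens, hostnames and passwords.

OPENAI_API_KEY=sk-...
NOTION_TOKEN=secret_...
SLACK_SIGNING_SECRET=...
SLACK_BOT_TOKEN=xoxb-...
SLACK_APP_TOKEN=xapp-...
WEAVIATE_SCHEME=https
WEAVIATE_HOST=langchain-notion-slack-xxxxxxxx.weaviate.network
WEAVIATE_TOKEN=...
WEAVIATE_INDEX=NotionData
REDIS_USER=default
REDIS_PASSWORD=...
REDIS_URL=redis://...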

Running and Testing the Bot

Now that we've configured all the settings in our local environment and Slack workspace, and connected the integration to Notion, it's time to finally run our AI Slack bot server!

To get your bot up and running, navigate back to your command line, ensure you're in the correct directory, and type the following command:

npm run dev

This command activates the server in development mode. The server is configured to restart automatically every time there's a change to any of our files, thanks to nodemon.

After running this command, you should see the console print out "[server] Bolt is running" if everything is set up correctly. This means the bot is now live and listening for events!

Now, head over to the Slack workspace you installed your app to.

Our bot doesn't know anything right now. Well, actually it's already pretty smart, but it doesn't know anything that we've specifically asked it to know. The first step in teaching it everything there is to know about your data is to add it to the vector database with the Slack slash command:

/add-notion-page [NotionURL]

It will take a while to ingest the data of the Notion page that's been added. It will send you a message on Slack when it's finished.

Once it's finished ingesting your data, you can talk to your Slack bot and it will answer questions using the data you gave it.

Understanding the Code

/src/bolt.ts - The functional code of the chat bot

Adding Notion Pages: The /add-notion-page slash command is defined using the boltApp.command method. It takes an async callback function as input, which is called when the command is invoked. Inside the callback function, the Notion page ID is extracted from the command text using a regular expression. If the ID is not found, an error message is sent back to the user.
That means when you give it a Notion page URL, it extracts the 32-character page ID: https://www.notion.so/skarard/LangChain-Notion-b34ca03f219c4420a6046fc4bdfdf7b4

Example:

boltApp.command("/add-notion-page", async ({ command, ack, respond }) => {
  const id = /(?<!=)[0-9a-f]{32}/.exec(command.text)?.[0];

  if (!id) {
    await ack({
      response_type: "ephemeral",
      text: `⚠️ The notion URL \`${command.text}\` does not contain a valid notion page id`,
    });
    return;
  }

  // Rest of the code...
});

The addNotionPage function is called to add the Notion page using the extracted ID. If the function returns an empty array, it means the Notion URL was not accessible, and an error message is sent back to the user.

Example:

const docsLoaded = await addNotionPage(id);

if (docsLoaded.length === 0) {
  await respond({
    response_type: "ephemeral",
    text: `⚠️ The notion URL \`${command.text}\` was not accessible`,
  });
  return;
}

Options for the loaded documents are generated and stored in the documentTitles array. These options are used later in a message block.

Example:

const documentTitles = docsLoaded.map(
  (docTitle, i) =>
    ({
      text: {
        type: "plain_text",
        text: docTitle,
        emoji: true,
      },
      value: `value-${i}`,
    } as PlainTextOption)
);

A message is sent to the user with information about the number of loaded documents and the available options. This message is sent using the respond method.

Example:

await respond({
  response_type: "ephemeral",
  blocks: [
    {
      type: "section",
      text: {
        type: "mrkdwn",
        text: `Loading ${docsLoaded.length} Document${
          docsLoaded.length === 1 ? "" : "s"
        } Complete.`,
      },
      accessory: {
        type: "overflow",
        options: documentTitles,
        action_id: "overflow-action",
      },
    },
  ],
  text: `Loading ${docsLoaded.length} page${
    docsLoaded.length === 1 ? "" : "s"
  } complete.`,
});

The blocks array defines the layout of the message, with a section block displaying the information text and an overflow menu block displaying the available options.

Processing Slack Messages: Inside the event listener function boltApp.message, a conditional statement checks whether the message subtype is undefined or "bot_message". This check narrows the handler to plain user messages and bot messages, the subtypes that carry a text payload; other message subtypes (such as edits or channel joins) are not processed.

Example:

boltApp.message(async ({ message, say }) => {
  if (message.subtype === undefined || message.subtype === "bot_message") {
    // Handle the message
  }
});

The code retrieves the user details from the user ID using the Bolt app's client users.info method. This retrieval is important for the ConversationalRetrievalQA chain's chat history.

Example:

const userInfo = await boltApp.client.users.info({
  user: message.user ?? "",
});

The callChain function is called with the message text and user information. It returns either void or an object containing the result and the source documents used to generate the result.

Example:

const result = (await callChain(message.text ?? "", {
  name: userInfo.user?.name ?? "",
  userId: message.user ?? "",
}).catch((e) => {
  console.log(e);
  postEphemeral({
    message,
    text: "An error has occurred. Please try again.",
  });
})) as void | {
  text: string;
  sourceDocuments: Document[];
};

If the result is undefined, the function returns early. Otherwise, the code proceeds to process and send the response.

The sources used for context are turned into Slack-formatted links, de-duplicated using a Set, and converted back into an array using Array.from.

Example:

const sources = Array.from(
  new Set(
    result.sourceDocuments.map(
      (doc) => `<${doc.metadata.url}|[${doc.metadata.properties_title}]>`
    )
  )
).join(" ");

This creates a list of unique source document links, joined together with a space.
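
Assuming the example Notion page from earlier, a single entry in that string would look roughly like this (Slack's mrkdwn link syntax is <url|label>):

<https://www.notion.so/skarard/LangChain-Notion-b34ca03f219c4420a6046fc4bdfdf7b4|[LangChain Notion]>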

The response is sent back to the user using the say method. It includes a message section containing the result text and a context section containing the source document links.

Example:

await say({
  blocks: [
    {
      type: "section",
      text: {
        type: "mrkdwn",
        text: result.text,
      },
    },
    {
      type: "context",
      elements: [
        {
          type: "mrkdwn",
          text: sources,
        },
      ],
    },
  ],
  text: result.text,
});

The blocks array defines the layout of the message, with a section block displaying the result text and a context block displaying the source document links.

/src/langchain.ts - The framework for interacting with LLMs

Creating a Weaviate client: The code starts by creating a Weaviate client using the weaviate-ts-client module and the provided configuration.

Example:

const weaviateClient = weaviate.client({
  scheme: config.WEAVIATE_SCHEME,
  host: config.WEAVIATE_HOST,
  apiKey: new weaviate.ApiKey(config.WEAVIATE_TOKEN),
});

Adding a Notion page: The addNotionPage function takes an id parameter and performs the following tasks:

  • Creates an empty response array to collect the titles of the loaded documents.
  • Creates a new NotionAPILoader instance with the provided configuration.
  • Loads the documents using the loader object and catches any errors.
  • Extracts the title property of each document and pushes it to the response array.
  • Splits the documents using a RecursiveCharacterTextSplitter instance.
  • Calls the fromDocuments method of the WeaviateStore class to create a WeaviateStore from the split documents.
  • Returns the response array if successful, or an empty array if any errors occur.

Example:

export async function addNotionPage(id: string) {
  const response: string[] = [];

  const loader = new NotionAPILoader({
    clientOptions: { auth: config.NOTION_TOKEN },
    id,
    type: "page",
  });

  const docs = await loader.load().catch(console.log);

  if (docs === undefined) return response;

  response.push(...docs.map((doc) => doc.metadata.properties.title));

  // ...
  
  return response;
}
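
The elided portion corresponds to the splitting and storage steps from the list above. Below is a minimal sketch of what they can look like using LangChain's RecursiveCharacterTextSplitter and WeaviateStore; the import paths, chunk sizes and option values are assumptions based on LangChain's JS API at the time of writing, not necessarily the exact code in the repository.

import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { WeaviateStore } from "langchain/vectorstores/weaviate";

// Split the loaded Notion documents into smaller chunks so that each
// retrieved piece of context fits comfortably in the model's prompt.
const splitter = new RecursiveCharacterTextSplitter({
  chunkSize: 1000, // illustrative values, tune to your content
  chunkOverlap: 100,
});
const splitDocs = await splitter.splitDocuments(docs);

// Embed the chunks with OpenAI and store them in the Weaviate index.
await WeaviateStore.fromDocuments(splitDocs, new OpenAIEmbeddings(), {
  client: weaviateClient,
  indexName: config.WEAVIATE_INDEX,
});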

Generating prompt templates: The code defines two string templates - questionGeneratorTemplate and qaTemplate - which serve as predefined templates for generating prompts. The questionGeneratorTemplate collects the chat history and summarises it into a standalone question. qaTemplate then collects the context from the vector database, and its question is replaced by the standalone question generated from questionGeneratorTemplate.

Example:

const questionGeneratorTemplate = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question. Do not invent any new information, only rephrase the question. Do include personally identifiable details from the chat history like name, age, location, etc.

Chat History:
{chat_history}

Follow Up Input: {question}
Standalone question:`;

const qaTemplate = `Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer.

{context}

Question: {question}
Helpful Answer:`;

Calling the conversational retrieval QA chain: The callChain function takes a question parameter and an object parameter with name and userId properties. It performs the following tasks:

  • Initializes the models to use (low temperature and mid temperature models from OpenAI).
  • Initializes the vector store to use to retrieve the answer (using WeaviateStore).
  • Initializes the memory to use to store the chat history (using BufferMemory and RedisChatMessageHistory).
  • Initializes the chain (ConversationalRetrievalQAChain) with the specified models, vector store, memory, and chain options.
  • Calls the call method of the chain to get the answer to the question.

Example:

export async function callChain(
  question: string,
  { name, userId }: { name: string; userId: string }
) {
  // Initialize models, vector store, memory, and chain
  
  return chain.call({ question });
}
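
To make the elided initialization a little more concrete, here is a rough sketch of how the pieces listed above can be wired together with LangChain's JS API. The option and field names below are assumptions (they vary between LangChain versions), so treat this as an illustration rather than the repository's exact code.

import { ChatOpenAI } from "langchain/chat_models/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { WeaviateStore } from "langchain/vectorstores/weaviate";
import { BufferMemory } from "langchain/memory";
import { RedisChatMessageHistory } from "langchain/stores/message/redis";
import { ConversationalRetrievalQAChain } from "langchain/chains";

// Two models: a deterministic one for condensing the follow-up question,
// and a slightly more creative one for answering it.
const questionModel = new ChatOpenAI({ temperature: 0 });
const answerModel = new ChatOpenAI({ temperature: 0.5 });

// Retrieve context from the existing Weaviate index populated by addNotionPage.
const vectorStore = await WeaviateStore.fromExistingIndex(new OpenAIEmbeddings(), {
  client: weaviateClient,
  indexName: config.WEAVIATE_INDEX,
  metadataKeys: ["url", "properties_title"], // the fields bolt.ts reads for citations
});

// Keep per-user chat history in Redis, keyed by the Slack user ID.
const memory = new BufferMemory({
  memoryKey: "chat_history",
  inputKey: "question",
  outputKey: "text",
  returnMessages: true,
  chatHistory: new RedisChatMessageHistory({
    sessionId: userId,
    config: {
      url: config.REDIS_URL,
      username: config.REDIS_USER,
      password: config.REDIS_PASSWORD,
    },
  }),
});

// Wire everything into a conversational retrieval QA chain that also
// returns the source documents used to build the answer.
const chain = ConversationalRetrievalQAChain.fromLLM(
  answerModel,
  vectorStore.asRetriever(),
  {
    memory,
    returnSourceDocuments: true,
    qaTemplate,
    questionGeneratorChainOptions: {
      llm: questionModel,
      template: questionGeneratorTemplate,
    },
  }
);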

/src/config.ts & /src/index.ts - Further Insights

The intricacies of the /src/config.ts and /src/index.ts files are covered in detail in an earlier guide, Building a Bolt Slack Bot: A Straightforward Starter-Kit Guide. I highly recommend referring to that guide for comprehensive information and a deeper understanding of these files.

Conclusion

So, there you have it! We've walked you through a thrilling journey of transforming your simple Bolt Slack bot into an intelligent helper. Powered by the impressive prowess of Langchain and Notion, your bot is now capable of fetching facts from Notion pages and providing comprehensive responses to queries, a feat that might have seemed inaccessible to novices once upon a time.

Yet, this isn't the end. Indeed, it's just the beginning. Consider the knowledge you've gleaned from this guide as a stepping-stone into the dynamic world of bot programming.

Already, you've seen how an ordinary messaging tool can turn into an augmented assistant, revolutionizing the way we communicate and exchange information on Slack. However, these capabilities are not static and continuous innovation is key to staying relevant. Look forward to our next series of articles where we will dive into more advanced features, ensuring your bot remains at the cutting-edge of digital communication technology.

Remember, the future of automation is not far-off in a distant realm, it's happening here and now. So, keep on exploring, keep on creating, and above all, keep on coding. Your journey towards mastery has just begun.

Next Guide: Coming soon...
