What and Why Would You Want To Build a Custom Copilot?


Introduction

In my recent blog series about building a Teams AI Library-based Custom Copilot, I realised that I had not really talked about what a Custom Copilot is and why you might want to build one. Simon Sinek would not be impressed!

So, in this blog post, I start with the why: why would you want to build a Custom Copilot? Plus, I suppose I had better explain what they are.

I talked about these aspects in a recent session I did for Chirag at M365 UK, called Custom Copilots – The Options, so you are welcome to watch that instead.

Session 2 (54:45) : Custom Copilots in Microsoft 365 – The Options – Simon Doy MVP

Anyway, here we go.

Why would you want to build a Custom Copilot?

So, why would you want to build a Custom Copilot and what are they?

So, a Custom Copilot is a Generative AI chatbot that is specific in what it provides: it covers a particular role or task rather than being generic. For example, you might have a Copilot for HR which provides information for employees on HR matters.

Its power is in its specificity. However, that can also be its downfall, because it only knows information about its particular topic.

A Custom Copilot need not only be an information gatherer; it may also be a way to perform actions. For example, the Copilot for HR might allow someone to submit a form to change their personal details.

So, now that we have explained what a Custom Copilot is, let’s explain why you might want to build one. As mentioned, they are great when you need to build something that serves a specific purpose or role. One of the challenges with broader Copilots, for example Copilot for Microsoft 365, is that they can access a lot of data and information, which makes it hard for the LLM behind the Copilot to know what is important. By having a more specific dataset and addressing a particular need, the Copilot can be built with that in mind.

All these capabilities make the Copilot easier to use, and it will likely be better at providing relevant results. Additionally, for organisations looking to have these built, the risk of them failing is reduced. This is because their scope is smaller, so testing and getting feedback on how they perform is quicker and more targeted.

Custom Copilots can be made available in Microsoft Teams and Microsoft SharePoint; in fact, they can be delivered through a huge number of different channels via the Microsoft Bot Service and Microsoft Copilot Studio.

We want to build one, where should we start looking?

Well of course you can come and chat with us at iThink 365.

However, there are lots of resources out there. I would recommend watching the Microsoft 365 Development Community calls and reading these resources found on Microsoft Learn.

Building your custom copilot on Teams with the Teams AI Library

Create copilots with Microsoft Copilot Studio – Training | Microsoft Learn

Of course, have a look at my blog post on Building Custom Copilots with Teams AI Library and Azure AI Search.

What do you need to think about?

To be honest, this area is moving quickly and changing all the time. The technology behind Custom Copilots is very new and uses GPT models such as GPT-3.5 and GPT-4.

The patterns used for knowledge management-based Custom Copilots need a lot of testing and development, as whether they are fit for your purpose depends on how the data is structured, chunked up, and put into the search index. There are a lot of variables that need to be managed and tried out here to give users the results they expect and help them get their jobs done.
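To make one of those variables concrete, here is a minimal sketch of fixed-size chunking with overlap, one of the simplest strategies for preparing documents for a search index. The chunk size and overlap values here are arbitrary illustrations; real pipelines often chunk by tokens or semantic boundaries instead:

```typescript
// Split a document into overlapping fixed-size chunks before indexing.
// The overlap helps a search hit retain enough surrounding context to be useful.
function chunkText(text: string, chunkSize = 500, overlap = 50): string[] {
    const chunks: string[] = [];
    const step = chunkSize - overlap;
    for (let start = 0; start < text.length; start += step) {
        chunks.push(text.slice(start, start + chunkSize));
        if (start + chunkSize >= text.length) {
            break; // the last chunk reached the end of the document
        }
    }
    return chunks;
}

// e.g. chunkText(documentBody, 500, 50) before pushing the chunks into the index
```

Varying the chunk size and overlap, then testing retrieval quality against real user questions, is exactly the kind of trial-and-error tuning described above.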

However, the amount of time this will take, and the tweaking that is required, should not be underestimated.

Conclusion

Thanks for reading this post and I look forward to hearing how you get on with your Custom Copilot journey.

Please feel free to reach out if you need support on your journey.

Teams AI Library Blog Series: Episode #3 – Setup your Teams AI Library Application


Introduction

This blog post is part of a series of posts around delivering a Microsoft Teams app that uses Microsoft Teams Toolkit, Teams AI Library, and Azure AI Services to have an Open AI chatbot reason over Microsoft SharePoint content.

If you have not read the previous posts, I would suggest starting at the first post to get your bearings.

In the previous post, we set up the Azure AI and Azure Open AI infrastructure, so complete that first if you have not already.

Now we are getting to the exciting part which is where we create and set up the Teams AI Library application. This involves the following steps:

  • Create your Teams AI Library project.
  • Set up the Teams AI Library.
  • Clone the Teams AI Library Extension GitHub repository.
  • Configure the environment variables.

As mentioned previously, the solution uses an existing sample, modifying it to use the Teams AI Extension library that I have put together.

Create your Teams AI Library Project

The Microsoft Teams AI Library repository can be found on GitHub, and in this example we will build on top of the Twenty Questions Bot and repurpose it for our needs.

The reason we are using this sample is that it gives us the scaffolding. I also took inspiration from the ChefBot sample, which uses a local data source to augment its output.

I will be explaining how to set up the application using Microsoft Visual Studio Code.

So, to start, clone the Repository from within Microsoft Visual Studio Code.

Wait for the Git repository to be cloned and open the folder in Visual Studio Code.

Clone Teams AI Library Extension GitHub Repository

Next, get the code from the GitHub Repository.

Currently, I have not published the code to an NPM registry. I started to configure the code base to publish to a GitHub NPM package host but have not finished the setup yet.

So, clone the GitHub repository to your local machine. In the following example, we are cloning it into the c:\dev\teams-ai-library-extensions folder.

This can be done in Visual Studio Code using the Git: Clone command from the Command Palette.

For details on how to do this using Git CLI, check out Cloning a repository – GitHub Docs.

Setup your Teams Application

With the Microsoft Teams AI Library sample open in Visual Studio Code, set up the Microsoft Teams application by doing the following:

  1. In the root JavaScript folder, install and build all dependencies:
    • cd teams-ai/js
    • yarn install
    • yarn build
  2. In a terminal, navigate to the sample root:
    • cd teams-ai/js/samples/04.e.twentyQuestions
  3. Duplicate the sample.env in this folder and rename the copy to .env.
  4. Add your bot’s credentials and any other related credentials to that file. We will need to use the Azure Open AI configuration, so keep the AZURE_OPENAI_KEY and AZURE_OPENAI_ENDPOINT variables and fill them in appropriately.
  5. Add the following new environment variables to your .env file:
    • AZURE_SEARCH_ENDPOINT=
    • AZURE_SEARCH_KEY=
    • AZURE_OPENAI_DEPLOYMENTMODEL=
    • AZURE_SEARCH_INDEXNAME=
    • BOT_ID=
    • BOT_PASSWORD=
  6. Update config.json and index.ts with your model deployment name.
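Because every one of these settings is read from process.env at runtime, a missing value typically only surfaces as a confusing failure later on. A small helper like the following (hypothetical, not part of the sample) can fail fast at startup instead; it takes the environment as a parameter so it is easy to test:

```typescript
// Look up a required setting and fail fast with a clear error if it is absent.
// Pass process.env as the first argument in a real application.
function requireEnv(env: Record<string, string | undefined>, name: string): string {
    const value = env[name];
    if (!value) {
        throw new Error(`Missing required environment variable: ${name}`);
    }
    return value;
}

// e.g. const botId = requireEnv(process.env, 'BOT_ID');
```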

Install Teams AI Extension NPM Library

Now that you have the application downloaded and its dependencies set up, we need to install the Teams AI Extension module using npm.

Assuming the Teams AI Extension library has been cloned to c:\dev\teams-ai-extension, install the Teams AI Azure AI Search Extension into your bot application by doing the following:

  • In VS Code, open a Terminal.
  • Run: npm install c:\dev\teams-ai-library-extension\teams-ai-azure-ai-search-datasource --save

This will install the library. Once installed, we can reconfigure the application to use the Azure AI Search Data Source.

Update the index.ts file with the following steps.

Add the import statement.

import {AzureAISearchDataSource, AzureAISearchDataSourceOptions} from 'teams-ai-azure-ai-search-datasource';

At line 120 add the following code:

const dataSourceOptions: AzureAISearchDataSourceOptions = {
    azureOpenAiEndpoint: process.env.AZURE_OPENAI_ENDPOINT!,
    azureOpenAiKey: process.env.AZURE_OPENAI_KEY!,
    azureOpenAiDeploymentId: process.env.AZURE_OPENAI_DEPLOYMENTMODEL!,
    azureSearchEndpoint: process.env.AZURE_SEARCH_ENDPOINT!,
    azureSearchKey: process.env.AZURE_SEARCH_KEY!,
    azureSearchIndexName: process.env.AZURE_SEARCH_INDEXNAME!,
};

At line 130 replace the code with this:

let dataSource: AzureAISearchDataSource = new AzureAISearchDataSource('teams-ai', dataSourceOptions);

planner.prompts.addDataSource(dataSource);

This replaces the existing data source, which referenced a locally created vector database.

The last step is to update the prompts in the Teams AI Library. The system prompt is augmented using prompts, which are held in folders under the ./src/prompts folder.

Each folder within this prompts folder can hold a different configuration. The prompt folder that is being used by the application to configure the AI is managed by the action planner in this example.
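
On disk, that layout looks something like this (an illustrative sketch; chat is the folder this sample uses):

```
src/prompts/
    chat/
        config.json     <- prompt and model configuration
        skprompt.txt    <- the system prompt text
```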

You will see the following code:

const planner = new ActionPlanner({
    model,
    prompts,
    defaultPrompt: 'chat'
});

The defaultPrompt setting in this example uses the chat folder.

Now, let’s change the prompt to get the result that we want.

However, before we make any changes, let’s explain a bit about the files in this folder. There are two files to check out:

  • config.json
  • skprompt.txt

The config.json is used to configure the way that the prompt works and allows you to change how the interaction with the Azure Open AI Service behaves. There is not a huge amount of documentation on this in the Teams AI Library. However, I have found that changing the type to completion improves the results.

Additionally, you may want to update the temperature. The temperature is a number between 0.0 and 1.0 that controls how precise or creative the results are: 0 is precise and 1 is very creative. At low temperatures the responses are more deterministic and focused, and the bot is more likely to say so when it does not know the answer. At high temperatures the responses are more varied and creative, but they are also more likely to include content that is not factually correct.
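As a sketch of how those two settings fit together, a config.json for a knowledge-management bot might look something like the following. The field names and values are illustrative and vary between Teams AI Library versions, so treat this as a starting point rather than a definitive reference; a low temperature suits factual question answering:

```json
{
    "schema": 1.1,
    "description": "Chat prompt configuration",
    "type": "completion",
    "completion": {
        "max_input_tokens": 2800,
        "max_tokens": 1000,
        "temperature": 0.2,
        "top_p": 0.0,
        "presence_penalty": 0.6,
        "frequency_penalty": 0.0
    }
}
```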

The other file is skprompt.txt.

This is used to configure your AI and augment the prompt to describe how you want it to behave. There are a lot of interesting ways to configure the prompt, and I would recommend having a play, but also searching GitHub to see what other people are doing.

I have found it helps to finish with a last sentence that tells the AI to use the supplied information to formulate its response. For example:

The following is a conversation with an AI assistant called Mr Legal
Mr Legal is an expert in Legal and Law matters and is an assistant that is used by legal fee earners within law firms to help them understand how they can tackle their legal matters.
The voice of Mr Legal should be authoritative and professional.

Base your response using the following text and include the citations:

Response Formatter Library

The [responseFormatter.ts](https://github.com/SimonDoy/teams-ai-library-samples/blob/main/teams-ai-app-azure-ai-search/src/responseFormatter.ts) library is from the ChefBot sample and can be found in my [Teams AI library sample repository](https://github.com/SimonDoy/teams-ai-library-samples/blob/main/teams-ai-app-azure-ai-search/src/responseFormatter.ts).

Copy and paste the file into your project, then add the following line at line 25 of index.ts:

import { addResponseFormatter } from './responseFormatter';

Then add the following line at line 145:

addResponseFormatter(app);

Save the file and that should be the last coding change.

Configuration of Environment Variables

The environment variables are set in the .env file, which is found in the root of the project alongside package.json. The environment variables should be updated using the following information:

  • AZURE_OPENAI_KEY – this is the access key found in your Azure Open AI Service settings.
  • AZURE_OPENAI_ENDPOINT – this is the URL that connects to your Azure Open AI Service.
  • AZURE_SEARCH_ENDPOINT – this is the URL that connects to your Azure AI Search Service.
  • AZURE_SEARCH_KEY – this is the access key found in your Azure AI Search Service settings.
  • AZURE_OPENAI_DEPLOYMENTMODEL – this is the name of the GPT deployment that has been created using the Azure Open AI portal.
  • AZURE_SEARCH_INDEXNAME – This is the name of the search index that holds the content that your AI bot is going to use to reason over.
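
Putting that together, the completed .env file might look something like this (all values are placeholders; substitute your own keys, endpoints, and names):

```
AZURE_OPENAI_KEY=<your-azure-openai-key>
AZURE_OPENAI_ENDPOINT=https://<your-openai-resource>.openai.azure.com/
AZURE_OPENAI_DEPLOYMENTMODEL=<your-gpt-deployment-name>
AZURE_SEARCH_ENDPOINT=https://<your-search-service>.search.windows.net
AZURE_SEARCH_KEY=<your-azure-ai-search-key>
AZURE_SEARCH_INDEXNAME=<your-index-name>
BOT_ID=<your-bot-app-id>
BOT_PASSWORD=<your-bot-app-password>
```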

Let’s put it all together

Now, all being well, your application will work!

So, start the application by pressing F5, which gets Visual Studio Code deploying and debugging your code.

Wait for the application to spin up your infrastructure and bot service.

Eventually your browser will start and open up Microsoft Teams and prompt you to add the app.

Add the app to Microsoft Teams.

You should be presented with a chat. Use the chatbot and start asking for information that is related to the content that you have stored in the index.

Final Sample Code

To help you get your code running, I have provided the Teams AI Library Sample GitHub Repository so you can compare your code against the original.

Making tweaks

Now, you will want to try different prompts and tweak them to change the response that comes back from the AI platform.

Conclusion

I hope that you find this useful and can get this set up and working with your own data. The extension is not battle-hardened and I am sure there are plenty of issues that can come up with it, but please raise issues via GitHub and, if I get time, I will take a look.

One thing to be aware of is that the SharePoint Indexer for Azure AI Search Service is still in beta, and there are issues when documents are deleted or renamed, as the indexing process does not pick up these changes, so be careful.

As an alternative, the Azure Blob Storage Indexer has been released and is generally available.

I will look at putting in a bonus episode to explain how to configure the Azure Blob Storage Indexer but if you cannot wait, take a look at the Microsoft article, [Azure AI Search How to Index Azure Blob Storage](https://learn.microsoft.com/en-us/azure/search/search-howto-indexing-azure-blob-storage).

Happy coding, and let me know how you got on.