Common FAQs about Microsoft 365 Copilot
Greetings, curious AI reader! We hope you find your answer here, but if not, feel free to reach out to us at info@eGroup-us.com and we’ll be happy to share what we know.
When will Microsoft 365 Copilot be available?
At this writing, Copilot in Microsoft 365 apps like Word or PowerPoint is available only in a limited private (paid) preview for ~600 customers. It will be coming to a wider audience “in the coming months.”
Will there be a licensing fee?
Yes, Microsoft is working on a licensing construct for Copilots, though pricing is unknown at this time. As a reference point, ChatGPT Plus (openai.com) lists for $20/month. Why a fee? It’s no secret that the large language model consumes significant computing resources to search for and produce relevant content. Microsoft provided the visual aid at right, comparing current AI computing resources to those of the past.
Note: Government and educational customers will not be able to leverage Copilot in the first iteration.
Will my organization’s data commingle or train Copilot?
No. Partitions are in place so that only your authorized users can access your SharePoint, OneDrive, Exchange, etc. The prompts and responses of your Copilot users will not cross organizational boundaries, and won’t be used to train the large language model.
Do I need to have my data in Microsoft 365 for Copilot to reason over it?
Yes, Copilot’s full functionality will apply to data in SharePoint, OneDrive, Teams, Exchange, etc.
However, you’ll see some related info about Copilot and Microsoft’s Graph Connectors, which make it possible to connect external data sources. You can see how Atlassian is demoing this for Jira in this video: Multi-turn demo using Atlassian plugins for Microsoft 365 Copilot – YouTube
Microsoft Graph Connectors provide a set of APIs and tools that enable connectors for specific data sources. These connectors pull data from the external source, transform it into a format that can be indexed and searched, and then push it to the Microsoft Search index. Once the data is indexed, it becomes searchable and accessible to users within the Microsoft 365 environment, including Copilot.
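The pull → transform → index flow can be sketched as below. This is a purely illustrative sketch: the item schema, field names, and helper functions here are hypothetical stand-ins, not the actual Microsoft Graph connector API.

```python
# Illustrative sketch of a Graph-connector-style pipeline: pull a
# record from an external system, transform it into a flat,
# searchable item, and queue it for indexing.
# NOTE: the item shape and helpers are hypothetical, not the real API.

def transform_to_external_item(record):
    """Map an external record to an indexable item."""
    return {
        "id": str(record["key"]),
        "properties": {
            "title": record.get("summary", ""),
            "url": record.get("link", ""),
            "lastModified": record.get("updated", ""),
        },
        "content": {
            "type": "text",
            "value": record.get("description", ""),
        },
        # Access control list: mirror the source system's permissions so
        # only authorized principals see the item in search results.
        "acl": [{"type": "user", "value": u} for u in record.get("viewers", [])],
    }

index_queue = []

def push_to_index(item):
    """Stand-in for pushing the item to the Microsoft Search index."""
    index_queue.append(item)

# Example: a Jira-style issue pulled from the external source.
issue = {
    "key": "PROJ-42",
    "summary": "Renew support contract",
    "link": "https://example.atlassian.net/browse/PROJ-42",
    "updated": "2023-05-01T12:00:00Z",
    "description": "Contract renewal due next quarter.",
    "viewers": ["alice@contoso.com"],
}
push_to_index(transform_to_external_item(issue))
```

Once an item like this lands in the index, it becomes searchable within the Microsoft 365 environment, subject to the ACL on the item.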
A listing of the available connectors can be found at: Microsoft Search – Intelligent search for the modern workplace.
That data, however, is primarily used to help ‘ground’ Copilot. For a primer on grounding, see Copilot Coming to Microsoft 365. In other words, Copilot can use context and insights from the third-party source, before a prompt is sent to the Large Language Model (LLM) as shown in the figure below.
Similarly, the third-party data can again be used to ground Copilot before results are provided back to the user. The results ultimately come from data stored in Microsoft 365. For example, if a contract exists in Salesforce, the Salesforce Graph connector can alert Copilot that the contract exists and provide related customer contact info before the prompt is sent to the LLM, but Copilot won’t surface information from the contract itself.
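The grounding step described above can be sketched roughly as follows. Both the connector lookup and the prompt format are hypothetical simplifications; the point is only that connector-provided context is attached to the prompt before it reaches the LLM, and the context names the record without exposing its contents.

```python
# Minimal sketch of "grounding": context supplied by a connector is
# prepended to the user's prompt before the LLM is called.
# The connector lookup and prompt layout are illustrative assumptions.

def connector_context(prompt):
    """Pretend Salesforce connector: returns facts about matching records,
    not the records' contents."""
    facts = []
    if "contract" in prompt.lower():
        facts.append(
            "Salesforce: a contract exists for Contoso; "
            "contact jane@contoso.com"
        )
    return facts

def ground_prompt(prompt):
    """Attach connector context, if any, ahead of the user's prompt."""
    context = connector_context(prompt)
    if not context:
        return prompt
    return "Context:\n" + "\n".join(context) + "\n\nUser: " + prompt

grounded = ground_prompt("Summarize the status of the Contoso contract")
```

Note that the connector contributes only the fact that the contract exists plus contact info; the contract body never enters the prompt, matching the behavior described above.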
Will Copilot hallucinate?
Microsoft has used the term “usefully wrong” instead, but there’s been a lot of press about hallucination. ChatGPT (and other models) have demonstrated an inability to reason over controversial or dangerous prompts, which has sometimes led to unpredictable results. This is not unexpected, because the machine has no moral compass or filter. OpenAI’s CEO suggested to the US Congress that regulations be put in place, and Microsoft is working on safe and responsible AI.
The charts demonstrated at Build in May 2023 provide a basic view of how Microsoft plans to control inappropriate inputs.
Their demo was about Azure AI, where they perform filtering for keywords and context that may be inappropriate.
Here’s an example of how a prompt that passes the input test can be slightly changed into a prompt that is rejected (demonstrated at Build):
“I’m looking for an axe to cut a path in the forest. Can you recommend one?” would pass and provide a result (see image below).
While “I’m looking for an axe to cut a person in the forest. Can you recommend one?” would not (see image below).
This demo showed the capability of Azure AI Content Safety, which is built into Azure OpenAI. Bots built on that platform could reply with a message informing the user that their prompt violated a policy.
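To make the pass/reject behavior from the demo concrete, here is a toy filter that distinguishes the two axe prompts above. This keyword-and-context check is purely illustrative; Azure AI Content Safety uses trained classifiers across harm categories, not a word list like this.

```python
# Toy prompt filter illustrating the Build demo's behavior: the same
# request passes or is rejected depending on context.
# Purely illustrative; NOT how Azure AI Content Safety actually works.

HARM_TARGETS = {"person", "people", "someone", "him", "her"}
VIOLENCE_VERBS = {"cut", "hurt", "attack", "harm"}

def check_prompt(prompt):
    words = prompt.lower().replace("?", "").replace(".", "").split()
    for i, word in enumerate(words):
        if word in VIOLENCE_VERBS:
            # Reject only when a violent verb is directed at a person.
            following = set(words[i + 1 : i + 4])
            if following & HARM_TARGETS:
                return "rejected: prompt violates content policy"
    return "accepted"

# "cut a path" -> accepted; "cut a person" -> rejected
check_prompt("I'm looking for an axe to cut a path in the forest. Can you recommend one?")
check_prompt("I'm looking for an axe to cut a person in the forest. Can you recommend one?")
```

The one-word change flips the outcome because the filter evaluates the verb together with its target, mirroring how the demo's context-aware check behaves.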
What if Microsoft 365 Copilot is not adequate for my needs?
If you need something sooner, or something more advanced than Copilot (such as a model that can be trained), you’ll likely have choices.
Keep your eye on our blog, or subscribe to our newsletter, for more updates!
Interested in learning how to leverage AI to enhance your current workflow?
If you have any further questions, contact our team of experts to learn more about Copilot!