Copilot for Microsoft 365 has much in common with the popular public ChatGPT service, along with some important differences. Believe it or not, some Copilot for Microsoft 365 users still opt to use ChatGPT on occasion. This blog provides clarity about when and why, along with the similarities and differences.
When it comes to securing access, far more organizations than I would expect still rely only on basic security controls, like passwords and multi-factor authentication, to authorize access from mobile devices, without requiring known devices as an additional security layer. Requiring known devices and approved applications for access to corporate data is an easy and effective way to help prevent both unauthorized access to and inappropriate use of sensitive information.
Ironically, almost none of these same organizations would consider allowing personal laptops or computers to access corporate resources, even though many of the same risks exist with mobile Apple or Android devices. Personal mobile devices are perhaps even more likely to be unpatched or insecurely configured, to harbor malware, and to be shared with people other than employees. We'll discuss how to combat these risks and secure access using Mobile Application Management (MAM) policies below.
Both Copilot for Microsoft 365 and ChatGPT are excellent at text generation, data analysis, and even coding assistance. Both use advanced machine learning techniques to generate human-like text.
They share some architectural similarities in that both run on top of Azure. In fact, Azure OpenAI hosts a variety of large language models (LLMs) developed by OpenAI. These models are trained on vast amounts of text data and can generate human-like text, perform language translation, and more. Examples of models available through Azure OpenAI include GPT-4, DALL-E, and Codex; Copilot for Microsoft 365 is built on top of these same models.
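As an illustration, here is a minimal sketch of calling a GPT-4 deployment hosted on Azure OpenAI with the official `openai` Python package. The endpoint, API version, and deployment name are placeholders you would replace with your own values, not prescribed settings:

```python
import os


def build_messages(system_prompt: str, user_prompt: str) -> list[dict]:
    """Return a chat-completion message list in the format the OpenAI API expects."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


def ask_azure_openai(user_prompt: str) -> str:
    """Send a prompt to a GPT-4 deployment hosted on Azure OpenAI.

    Requires the `openai` package (pip install openai) and real credentials in
    the AZURE_OPENAI_ENDPOINT / AZURE_OPENAI_API_KEY environment variables.
    """
    from openai import AzureOpenAI  # imported lazily so the sketch runs without it

    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-01",
    )
    response = client.chat.completions.create(
        model="my-gpt4-deployment",  # your Azure deployment name (placeholder)
        messages=build_messages("You are a helpful assistant.", user_prompt),
    )
    return response.choices[0].message.content
```

The same client and message format work for other chat-capable models hosted in your Azure OpenAI resource; only the deployment name changes.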
Copilot for M365 is not tied to a specific version of the GPT model, such as GPT-4 or GPT-4 Turbo; rather, it is powered by advanced AI technology from OpenAI that is constantly being updated and improved.
MAM policies protect data, not devices, and provide "sandboxing" that separates corporate data and applications from personal information, apps, photos, browser activity, and so forth. This segregation makes corporate information easy to protect without interfering with the way people use their phones or tablets for everything else. For example, you can require a specific email app for access to company email, or block screenshots of corporate data, while leaving personal apps and data unrestricted.
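To make this concrete, here is a hedged sketch of what such an app protection (MAM) policy payload could look like when created through the Microsoft Graph API. The property names follow the `androidManagedAppProtection` resource type; verify them, and the endpoint, against the current Graph documentation before relying on this:

```python
import json


def build_mam_policy(name: str) -> dict:
    """Build a Graph payload for an Android app protection (MAM) policy.

    Properties are from the androidManagedAppProtection resource type;
    confirm against the current Graph schema before use.
    """
    return {
        "@odata.type": "#microsoft.graph.androidManagedAppProtection",
        "displayName": name,
        "pinRequired": True,           # require an app-level PIN
        "screenCaptureBlocked": True,  # block screenshots of corporate data
        "saveAsBlocked": True,         # block "Save As" to personal storage
        "dataBackupBlocked": True,     # keep corporate data out of device backups
        # Allow copy/paste only between managed apps (with paste-in allowed):
        "allowedOutboundClipboardSharingLevel": "managedAppsWithPasteIn",
    }


# The payload would be POSTed (with a valid access token) to:
#   https://graph.microsoft.com/v1.0/deviceAppManagement/androidManagedAppProtections
policy_json = json.dumps(build_mam_policy("Corporate mobile data protection"), indent=2)
```

In practice most administrators configure these same settings through the Intune admin center UI; the payload above just shows how the screenshot and data-transfer restrictions described here map to individual policy properties.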
Further, conditional access policies can be enforced so that only known devices with device records in Entra ID, running appropriate applications with appropriate application controls, are permitted to access company resources. Again, this adds significant protection: access is granted only if both valid account credentials and an authorized device are used to make the request.
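As a sketch, a conditional access policy like this can also be created through the Microsoft Graph API. The payload below assumes the `conditionalAccessPolicy` schema; the policy name and scoping choices are illustrative, and everything should be verified against current Graph documentation before use:

```python
import json


def build_ca_policy(name: str) -> dict:
    """Build a Graph payload for a conditional access policy that grants
    access to Office 365 only from compliant or hybrid-joined devices.

    Based on the conditionalAccessPolicy resource type; confirm property
    names against the current Graph schema before use.
    """
    return {
        "displayName": name,
        # Start in report-only mode so you can observe impact before enforcing:
        "state": "enabledForReportingButNotEnforced",
        "conditions": {
            "users": {"includeUsers": ["All"]},
            "applications": {"includeApplications": ["Office365"]},
            "clientAppTypes": ["all"],
        },
        "grantControls": {
            "operator": "OR",
            # Require an Intune-compliant device OR a hybrid-joined device:
            "builtInControls": ["compliantDevice", "domainJoinedDevice"],
        },
    }


# The payload would be POSTed (with a valid access token) to:
#   https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies
policy_json = json.dumps(build_ca_policy("Require known devices"), indent=2)
```

Starting in report-only mode is a common rollout choice: the policy evaluates and logs every sign-in without blocking anyone, so you can confirm it will not lock out legitimate users before switching the state to enforced.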
| | ChatGPT | Copilot for M365 |
|---|---|---|
| Data sources | Public web data and OpenAI dataset | Internet + your private Microsoft 365 dataset (Word, Excel, emails, etc.) |
| Where to use it | 1. Website 2. Mobile app 3. Behind the scenes of third-party large language model (LLM) apps and chatbots | 1. Within the Microsoft 365 apps 2. Through the Microsoft 365 Chat app 3. In the Office mobile apps |
| Prompting | Works best with general prompts | Thrives on specific, contextual prompts within M365 applications |
| Privacy & Security | Limited control over data used for training and outputs | Microsoft 365 security tools protect your data and outputs |
| Traceability & Logging | Limited traceability of prompts and outputs | Comprehensive logging of prompts and outputs within the M365 environment |
| Citation of Sources | | 1. Adds hyperlinks when possible 2. Users can query specific files to search/generate content from |
| Interaction | Primarily text-based interaction | Seamless integration with M365 applications for enhanced workflows |
| User Inputs to LLM Training | No direct feedback loop from users to LLM training | Copilot content is not used to train the next LLM release |
| Pricing | Free and paid tiers, based on usage and features | |
Of course, the main difference is that Copilot for Microsoft 365 is specifically designed to assist users with tasks and information related to Microsoft 365, while ChatGPT is a more general-purpose conversational AI model. To that end, Copilot for M365 is segmented from the other OpenAI models: your company data does not traverse the public Internet the way prompts and responses sent to ChatGPT do. Within the walled garden of your M365 ecosystem, it leverages your private documents, emails, and spreadsheets to augment its responses.
For personal use or creative exploration, ChatGPT's open access and diverse dataset make it a perfect playground. Prompts can be quite vague, and it will still willingly produce an answer. Just remember: in a business context, it's important to use the tool responsibly, keeping data security and potential biases in mind.
For businesses prioritizing data security, privacy, and personalized content, Copilot for Microsoft 365 is the better option. Its secure environment, private data foundation, and deep Microsoft 365 integration make it a powerful tool for boosting productivity, generating data-driven insights, and automating tedious tasks.
Ultimately, the decision between the two depends on your use cases, tolerance for risk, and appetite for cost. Both services have unique strengths and can be used to enhance productivity, generate insights, and automate tedious tasks. The future is bright for AI-powered text generation, and the possibilities are endless.
Contact our team of experts today to get started on your journey to becoming a more efficient organization!