Does ChatGPT pose a security issue?

AI tools like ChatGPT and Microsoft 365 Copilot are transforming businesses. To protect your sensitive information, learn how their security and data handling differ.

Generative AI tools like ChatGPT and Microsoft 365 Copilot are revolutionising how businesses interact with technology. However, their approaches to security and data handling differ significantly, and it’s crucial for your organisation to understand these distinctions to protect your sensitive information.

Understanding Security in ChatGPT and Microsoft 365 Copilot

How ChatGPT Works: The Open Model

ChatGPT is powered by a Large Language Model (LLM), trained on vast amounts of publicly available data such as websites, articles, and books. When you input a prompt, ChatGPT generates a response based on patterns it has learned from this training data. While it’s an excellent tool for general queries and brainstorming, it comes with a significant security caveat: anything you type into ChatGPT is sent to and processed by a third party (OpenAI) and, depending on your settings, may be used to improve its models.
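
To make that prompt-and-response flow concrete, here is a minimal sketch of how an application might send a prompt to OpenAI’s public Chat Completions endpoint. It assumes the requests library, an OPENAI_API_KEY environment variable, and an illustrative model name; the point is simply that everything in the prompt travels to OpenAI’s servers for processing.

```python
# Minimal sketch: sending a prompt to OpenAI's Chat Completions endpoint.
# Assumes the OPENAI_API_KEY environment variable is set; the model name is illustrative.
import os
import requests

api_key = os.environ["OPENAI_API_KEY"]

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        "model": "gpt-4o-mini",  # illustrative model name
        "messages": [
            # Anything placed here leaves your network and is processed by OpenAI,
            # which is why confidential material should never appear in a prompt.
            {"role": "user", "content": "Explain multi-factor authentication in two sentences."}
        ],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```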

If you share sensitive or confidential details in a ChatGPT prompt, that information can be retained and, unless you have opted out, used to train future versions of the model, where it could inadvertently influence responses to other users. For example, imagine an organisation asking ChatGPT to summarise a confidential project specification. Details from that specification could later surface, in some form, in responses to unrelated queries from other users.

In short, while ChatGPT is powerful, it’s not secure for handling proprietary or sensitive business data.

Microsoft 365 Copilot: A Secure Bubble for Your Data

Microsoft 365 Copilot operates very differently. It uses AI tailored specifically for your organisation, running securely within your Microsoft 365 tenant environment. Here’s how it ensures data security:

  1. Data Stays Within Your Tenant
    Unlike ChatGPT, Copilot doesn’t train its LLM on your data. All interactions remain within your organisation’s secure boundary—your “security bubble.” Whether you’re accessing documents on SharePoint, retrieving emails, or referencing Teams chats, the data stays private and protected.
  2. No External Training
    Microsoft 365 Copilot doesn’t learn from your data to improve its model. This guarantees that your sensitive information won’t be used in responses to other users.
  3. Integration with Microsoft Graph
    Copilot leverages Microsoft Graph to retrieve data from your organisation’s environment, including SharePoint, OneDrive, and Teams. For instance, if you need a document discussed in a team meeting, Copilot can locate it for you while ensuring that the data retrieval process respects your organisation’s security policies (a short sketch of this kind of retrieval follows this list).
  4. Inherited Governance
    Copilot inherits your organisation’s existing data governance policies. If your data policies and access controls are robust, Copilot will align with them. However, if your governance is lacking, Copilot cannot compensate for those gaps on its own.
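
As a rough illustration of the kind of retrieval described in point 3, the sketch below queries the Microsoft Graph search endpoint for a document held in SharePoint or OneDrive. It is not Copilot’s internal code: it assumes you already hold a Graph access token (for example, obtained via MSAL) with suitable delegated permissions, and the search term is illustrative.

```python
# Minimal sketch: searching SharePoint/OneDrive content via Microsoft Graph.
# Assumes an existing OAuth access token (e.g. obtained via MSAL) with delegated
# Files.Read.All / Sites.Read.All permissions; this is not Copilot's internal code.
import os
import requests

access_token = os.environ["GRAPH_ACCESS_TOKEN"]  # obtained out of band, e.g. via MSAL

search_request = {
    "requests": [
        {
            "entityTypes": ["driveItem"],                      # files in SharePoint/OneDrive
            "query": {"queryString": "project specification"}  # illustrative search term
        }
    ]
}

resp = requests.post(
    "https://graph.microsoft.com/v1.0/search/query",
    headers={"Authorization": f"Bearer {access_token}"},
    json=search_request,
    timeout=30,
)
resp.raise_for_status()

# Print the name and URL of each hit.
for container in resp.json()["value"][0]["hitsContainers"]:
    for hit in container.get("hits", []):
        item = hit["resource"]
        print(item.get("name"), "-", item.get("webUrl"))
```

Because Graph applies security trimming, the results only include items the signed-in user is already permitted to see, which is the same principle that keeps Copilot’s answers inside your security bubble.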

Why Governance and Compliance Matter

Even with a secure tool like Microsoft 365 Copilot, organisations must maintain strong security protocols.

For example:

  • Access Restrictions
    • Ensure employees can only access files and data relevant to their roles.
  • Data Governance
    • Update and enforce policies around data storage, sharing, and retention.
  • Regulatory Compliance
    • Verify that your organisation adheres to industry standards and regulations for data protection.

If your organisation’s governance isn’t up to standard, Copilot won’t know any better. It’s essential to audit and refine your data policies before implementing advanced tools like Copilot.
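
As one small example of what such an audit can look like in practice, the sketch below lists who currently has access to a single sensitive file via Microsoft Graph. The drive and item IDs are hypothetical placeholders, and the access token is assumed to carry sufficient read permissions; a real audit would also cover sharing links, groups, and retention policies.

```python
# Minimal sketch: listing who has access to one file via Microsoft Graph.
# DRIVE_ID and ITEM_ID are hypothetical placeholders; the token is assumed to carry
# sufficient permissions (e.g. Files.Read.All). This is an illustration, not a full audit tool.
import os
import requests

access_token = os.environ["GRAPH_ACCESS_TOKEN"]
drive_id = "DRIVE_ID"   # hypothetical placeholder
item_id = "ITEM_ID"     # hypothetical placeholder

resp = requests.get(
    f"https://graph.microsoft.com/v1.0/drives/{drive_id}/items/{item_id}/permissions",
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=30,
)
resp.raise_for_status()

# Each permission entry lists the roles granted (read, write, owner) and who holds them,
# including any anonymous or organisation-wide sharing links worth reviewing.
for perm in resp.json().get("value", []):
    who = (perm.get("grantedToV2", {}).get("user", {}).get("displayName")
           or perm.get("link", {}).get("scope", "unknown"))
    print(who, "->", perm.get("roles"))
```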

Making the Right Choice

Both ChatGPT and Microsoft 365 Copilot have their strengths. ChatGPT is an excellent tool for general queries and creative tasks where sensitive data isn’t involved. Microsoft 365 Copilot, on the other hand, is ideal for securely leveraging your internal data to enhance productivity and collaboration.

The key takeaway? Understand the tools’ differences and use them accordingly. When security is a priority, Microsoft 365 Copilot is the clear choice—just ensure your organisation’s governance framework is ready to support it.

ChatGPT v Copilot – Key Differences

  • Purpose
    • ChatGPT: A versatile conversational AI, adept at generating creative text formats, answering questions in detail, and engaging in discussions. Think of it as a general-purpose language model.
    • Copilot: A productivity assistant within the Microsoft 365 ecosystem. It excels at tasks inside those applications, such as writing emails in Outlook, drafting documents in Word, or creating presentations in PowerPoint.
  • Training data
    • ChatGPT: Trained on a massive dataset of public text and code. While it can access and process information you provide, this data might be used to further train the model.
    • Copilot: Primarily uses data from within your Microsoft 365 environment (your documents, emails, calendar, etc.). The focus on your private data makes it more suitable for business settings with sensitive information.
  • Access and integration
    • ChatGPT: Typically accessed through a web interface or API. While it can be integrated into other platforms, this usually requires developer effort.
    • Copilot: Deeply integrated into Microsoft 365 apps. It appears as a sidebar assistant or provides inline suggestions, making it a seamless part of your workflow within those applications.
  • Strengths
    • ChatGPT: Excels at creative writing, summarisation and general research; can generate different creative text formats (poems, code, scripts, musical pieces, emails, letters, etc.); and offers a broader range of applications due to its general-purpose nature.
    • Copilot: Streamlines your workflow within Microsoft 365, automates tasks like writing emails, summarising meetings, or creating presentations, and prioritises data privacy and security within the Microsoft ecosystem.
  • Data handling
    • ChatGPT: Learns from the data it processes, including user inputs. While OpenAI states they work to remove personal information, there’s still a risk that sensitive data could be inadvertently included in the model’s training data.
    • Copilot: Primarily uses data from within your Microsoft 365 environment. This focus on your private data, combined with Microsoft’s robust security infrastructure, makes it a strong choice for businesses with sensitive information.
  • Privacy controls and compliance
    • ChatGPT: OpenAI offers options to opt out of having your conversations used for training, and its paid plans provide more granular control. However, compared to Copilot, the privacy controls are less tailored to enterprise-level security needs.
    • Copilot: Adheres to enterprise compliance standards such as GDPR, CCPA and ISO/IEC 27001, ensuring secure data handling and compliance with data protection regulations.
  • Risks
    • ChatGPT: While generally safe, it can sometimes generate outputs that are inaccurate, biased, or even harmful. Relying on it for critical decisions without human oversight could pose risks.
    • Copilot: As part of the Microsoft ecosystem, it can integrate with other security tools and services, enhancing your overall security posture.

If you’d like to introduce Microsoft Copilot into your business, please contact us.
