AI tools in the workplace are no longer a question of if, but of how and what. So what if you could allow them in a safe and controlled way, without compromising security and compliance and without leaking corporate data?
What is BYO-Copilot to Work?
Whether or not your organisation has invested in and deployed AI tools for employees, your people are already experimenting with AI tools (or using them daily) to boost their productivity, search, and get things done – at home but also at work, often outside the guardrails of IT. This is bad for IT, organisational security, and privacy on a few levels.
- Free consumer AI tools such as ChatGPT and Claude do not have the data protections in place that organisations need to ensure corporate data is not used to train the AI models.
- With free tools, the user “is the product”, and many employees (and some organisations) simply don’t know the risks of using unsanctioned AI tools, especially with corporate data.
Microsoft introduces “bring your own Copilot to work”
Microsoft’s latest announcement addresses (some of) these challenges by giving employees who don’t have a Microsoft 365 Copilot licence at work a safe, secure way to use the Copilot from their personal Microsoft 365 subscription (Personal, Family, or Premium) directly at work and on work documents – with the enterprise-grade data protections organisations demand and expect.
This “bring your own Copilot” (BYO-Copilot) model provides a safe way (if permitted) for individuals to use AI at work without compromising organisational security, while giving IT the levers to stay firmly in control.
So how does it work?
How does Bring Your Own Copilot Work?
It’s simple, really:
- Employees can sign in to Microsoft 365 apps with both work and personal accounts (similar to work and personal profiles in Edge).
- All Copilot features from the employee’s personal subscription can be used on work files – even if the employee’s work account doesn’t have a Copilot licence.
- Microsoft’s enterprise data protection remains intact: Copilot only works within the permissions of the user’s work identity, so identity, data protection, and governance stay in place.
- IT remains in control. Admins can enable or disable this capability via policy, and all Copilot interactions are auditable.
In short: BYO-Copilot provides a safe and secure way for employees to get access to Copilot, whilst IT retains governance and organisational data stays protected.
This is a much better approach than just saying no. It does, of course, require the user to be using Copilot rather than other, non-Microsoft AI tools.
Empowering employees
From an employee’s perspective, the process is straightforward (they will, of course, need a personal Microsoft 365 subscription with Copilot first):
- Open Word, Excel, or another Microsoft 365 app and sign in with their work account.
- Use the account switcher to add their personal Microsoft 365 account alongside their work account.
- Open a work file – for example, a document stored in OneDrive for Business or SharePoint that they have access to.
- Invoke Copilot and use it as they normally would – asking it to summarise a document, draft a response, or analyse some Excel data, for example.
Copilot will, of course, only work with the content the employee can access through their work identity, and Microsoft will not use the data to train its models or share anything with the wider Microsoft environment. Organisations remain protected by Microsoft’s Responsible AI principles and enterprise data protection.
What are the limitations of BYO Copilot at work?
It is worth noting that this is not the full Microsoft 365 Copilot, so there are limitations.
- The employee’s personal Copilot subscription only works on the open document or an explicitly referenced file.
- Broader capabilities (like querying across multiple files or your organisational data) still require a corporate Microsoft 365 Copilot licence.
How can organisations enable or block BYO Copilot?
Microsoft has, of course, designed this capability with IT control at the centre – no additional tools or controls are needed. Here are the key considerations:
- Policy Control
- A tenant-level setting called “Multiple account access to Copilot for work documents” determines whether personal Copilot can be used on work files.
- IT can enable or disable this capability for all users or specific groups.
- Identity and Permissions
- Copilot always respects the user’s work account permissions and the data, identity, and compliance settings on your tenant.
- The employee’s personal account provides the Copilot entitlement, but it does not gain access to organisational data. You control this, as always.
- Data Protection
- All interactions remain within the Microsoft 365 service boundary.
- Prompts and responses are encrypted, governed by existing compliance commitments, and never used to train the underlying AI models or shared for advertising.
- Auditing and Monitoring
- Copilot actions are logged and auditable, just like other user activities, and can be surfaced via your SIEM, the audit logs, and the Copilot Control System (see the sketch after this list).
- Existing retention policies and compliance tools (such as Purview and DLP) apply to Copilot interactions.
- Governance of Web Access
- If Copilot’s web search grounding is disabled in your tenant, that restriction still applies – even when using a personal subscription.
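To make the auditing point concrete, here is a minimal sketch of how a security team might pull Copilot interaction events from the unified audit log using the Microsoft Graph Audit Log Query API. Treat the specifics as assumptions to verify against the current Graph documentation – the `copilotInteraction` record type filter, the `AuditLogsQuery.Read.All` application permission, and the `GRAPH_TOKEN` environment variable (a stand-in for your own MSAL-based authentication) are illustrative, not confirmed by this announcement.

```python
# Sketch: query the Microsoft 365 unified audit log for Copilot
# interaction events via the Graph Audit Log Query API.
# Assumptions (verify against current Graph docs): the
# "copilotInteraction" record type value, and an app registration
# granted AuditLogsQuery.Read.All whose access token is supplied
# in the GRAPH_TOKEN environment variable.
import os
import time
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}

# 1. Create an audit log query scoped to Copilot interactions.
query = requests.post(
    f"{GRAPH}/security/auditLog/queries",
    headers=headers,
    json={
        "displayName": "Copilot interactions - last 7 days",
        "filterStartDateTime": "2025-01-01T00:00:00Z",
        "filterEndDateTime": "2025-01-08T00:00:00Z",
        "recordTypeFilters": ["copilotInteraction"],
    },
).json()

# 2. The query runs asynchronously; poll until it finishes.
while True:
    status = requests.get(
        f"{GRAPH}/security/auditLog/queries/{query['id']}",
        headers=headers,
    ).json()
    if status["status"] in ("succeeded", "failed"):
        break
    time.sleep(30)

# 3. Read back the matching records (who, when, what).
records = requests.get(
    f"{GRAPH}/security/auditLog/queries/{query['id']}/records",
    headers=headers,
).json()
for record in records.get("value", []):
    print(record.get("userPrincipalName"), record.get("createdDateTime"))
```

In practice, most organisations would surface the same events through an existing SIEM connector or Purview audit search rather than polling Graph directly; the point is simply that personal-Copilot activity on work files lands in the same audit pipeline as everything else.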
Why does this matter?
For IT leaders, this is a pragmatic response to the reality of AI adoption. Employees want (and often demand) to use AI, and many already have personal subscriptions to Microsoft Copilot at home. Rather than pushing this activity into the shadows, Microsoft has created a safe and protected way for employees to use their own “Microsoft” AI tools that:
- Empowers employees to use AI productively.
- Keeps IT in control with clear policy levers.
- Protects organisational data with enterprise-grade safeguards.
- Ensures IT is in the know, giving IT teams the ability to safely accommodate what would otherwise be shadow IT in their corporate environment.
Of course, IT should take the time to review their policies, communicate the boundaries, and decide whether to enable or restrict BYO-Copilot in their environment.
What about other AI tools like personal ChatGPT and Perplexity?
No (well, not yet anyway). Since Microsoft controls the Copilot experience across its commercial, enterprise, and personal/family offerings, this is something it can do whilst guaranteeing nothing compromises or impacts existing security and data governance.
Whilst this won’t stop a poorly configured and governed Microsoft 365 environment from allowing third-party apps and AI tools onto the network, it provides a safe way to empower many more unlicensed Microsoft 365 Copilot users to bring their own Copilot to work at no organisational expense.
Organisations can then use existing controls to scan for and block the use of unsanctioned AI tools, while giving employees a viable “self-funded” option to bring their own Copilot to work.
What do you think of BYO Copilot?
Personally, I think this is great. Unless your controls are strong, personal AI is already coming to work, whether sanctioned or not.
With BYO-Copilot, at least IT can provide a way of safely allowing it with the right guardrails in place.
Of course, it’s also a clever approach, as it (potentially) gets more people paying for Copilot themselves and then convincing their employer to buy them a full Microsoft 365 Copilot licence.
What do you think?