Frequently asked questions about Microsoft 365 Copilot Notebooks
Last updated: April 2026
Note: A Microsoft 365 Copilot license is required to use Copilot Notebooks. Additionally, accounts must have a SharePoint or OneDrive license (service plan) in order to create notebooks. Copilot Notebooks is also available for Microsoft 365 Personal, Family, and Premium subscribers. Learn more about Microsoft 365 Copilot licensing and Microsoft 365 Copilot plans.
Getting started and basics
What’s the difference between Copilot Notebooks and OneNote?
Copilot Notebooks are AI-powered workspaces designed to gather references and selectively ground on them, using Copilot to answer questions or create context-aware content.
OneNote notebooks focus on collecting and organizing notes, ink, and other information that you can share with others. They serve as long-term knowledge repositories.
To learn more about how Copilot Notebooks and OneNote compare, see Compare Microsoft 365 Copilot Notebooks and Microsoft OneNote.
Do I need Microsoft Loop to use Microsoft 365 Copilot Notebooks?
You don't need the Loop app or Loop components to create or use Copilot Notebooks. Copilot Notebooks are part of Microsoft 365 Copilot and also appear in OneNote, neither of which requires the use of Loop. Learn more about governance of Copilot Pages and Copilot Notebooks (for Compliance Managers and IT administrators).
Licensing and access
Do I need a Microsoft 365 Copilot license to use Copilot Notebooks?
A Microsoft 365 Copilot license is required to use Copilot Notebooks. Additionally, accounts must have a SharePoint or OneDrive license (service plan) to create notebooks. Copilot Notebooks is also available for Microsoft 365 Personal, Family, and Premium subscribers. Learn more about Microsoft 365 Copilot licensing and Microsoft 365 Copilot plans.
Is Copilot Notebooks available for Personal, Family, or Premium plans?
Yes, Copilot Notebooks is available for Microsoft 365 Personal, Family, and Premium subscribers.
Sharing and permissions in Copilot Notebooks
What happens when I share a Copilot Notebook?
Sharing a notebook doesn't just share the notebook itself; it can also give others access to files inside it. Learn more about sharing a Microsoft 365 Copilot Notebook.
Can I have read-only members in my notebook?
Not currently. Anyone you invite to your notebook will have editing access to it.
Can I add a group as a member to a shared notebook?
No. Currently, notebooks are stored in a personal SharePoint Embedded container that belongs to the creator and follows that individual’s lifecycle, so groups can't be added as members. A solution for group-managed notebooks is in progress, and more details will be shared soon. Learn more in an overview of Copilot Pages and Copilot Notebooks storage.
If a notebook is shared with me, can I request access to any references in it that I can't currently view?
Yes. If the automatic access grant fails, you can request access to the notebook's individually linked files today, outside the Microsoft 365 Copilot app. The ability to request access directly within the Notebook application will be added soon.
Storage, ownership, and lifecycle
Where are Copilot Notebooks stored?
View an overview of Copilot Pages and Copilot Notebooks storage for more information about where Copilot Notebooks are stored.
Where is Copilot-generated data stored for Copilot Notebooks?
View an overview of Copilot Pages and Copilot Notebooks storage for more information about where Copilot-generated data is stored for Copilot Notebooks.
What happens to notebooks if the owner leaves the organization?
View a summary of governance, lifecycle, and compliance capabilities for Copilot Pages and Copilot Notebooks for more information about what happens to a notebook if the owner leaves the organization.
Privacy, data use, and trust
Does Copilot Notebooks use my data to train the large language model?
Copilot Notebooks doesn't use user data to train the large language model.
Where can I learn more about privacy with Copilot?
Copilot and Microsoft 365 are built on Microsoft's comprehensive approach to security, compliance, and privacy.
For more information about privacy, see the following information:
- If you're using Microsoft 365 Copilot in your organization (with your work or school account), see Data, Privacy, and Security for Microsoft 365 Copilot.
- If you're using Copilot in Microsoft 365 apps for home (with your personal Microsoft account), see Copilot in Microsoft 365 apps for home: your data and privacy.
Can I trust that the answers are always accurate?
Generative AI features strive to provide accurate and informative responses based on the data available. However, answers may not always be accurate because they are generated from patterns and probabilities in language data. Use your own judgment and double-check the facts before making decisions or taking action based on the responses.
While these features have mitigations in place to avoid sharing unexpected offensive content in results and prevent displaying potentially harmful topics, you may still see unexpected results. We're constantly working to improve our technology to proactively address issues in line with our responsible AI principles.
What should I do if I see inaccurate, harmful, or inappropriate content?
Copilot includes filters to block offensive language in the prompts and to avoid synthesizing suggestions in sensitive contexts. We continue to work on improving the filter system to more intelligently detect and remove offensive outputs. If you see offensive outputs, please submit feedback using the thumbs-up/thumbs-down icons so that we can improve our safeguards. Microsoft takes this challenge very seriously, and we are committed to addressing it.
How is Copilot Notebooks evaluated?
Microsoft 365 Copilot Notebooks was evaluated through extensive manual and automatic testing on top of Microsoft internal usage and public data. Additional evaluation was performed on custom datasets of offensive and malicious prompts (user questions) and responses. In addition, Copilot Notebooks is continuously evaluated with online user feedback. For more information, see Application Card: Microsoft 365 Copilot.
What operational factors and settings allow for effective and responsible use of Copilot?
Copilot Notebooks has been reviewed by our Responsible AI (RAI) team. We follow RAI principles and have implemented:
- A responsible AI handling pipeline to mitigate risks such as harmful or inappropriate content.
- In-product user feedback with which users can report offensive content back to Microsoft.
For more information, please review the Microsoft Responsible AI Transparency Report and our commitment to ISO/IEC 42001:2023 Artificial Intelligence Management System Standards.
More ways to work with Copilot Notebooks
Get started with Microsoft 365 Copilot Notebooks
How Microsoft 365 Copilot Notebooks works