From f003b2db5c31b18b2091bfe0898a7ab1d5a10417 Mon Sep 17 00:00:00 2001 From: judeper Date: Sat, 14 Mar 2026 06:55:19 +0000 Subject: [PATCH] Learn Monitor: Update Microsoft Learn documentation tracking - Updated baseline or detected changes in Microsoft Learn documentation - See reports/monitoring/ for detailed change reports - Automated update from learn-monitor workflow (unified monitoring framework) --- data/monitor-state.json | 514 ++--- data/monitor-state.json.backup | 1654 ++++++++++------- .../monitoring/learn-changes-2026-03-14.md | 555 ++++++ 3 files changed, 1770 insertions(+), 953 deletions(-) create mode 100644 reports/monitoring/learn-changes-2026-03-14.md diff --git a/data/monitor-state.json b/data/monitor-state.json index 16eb9239c..2d460774e 100644 --- a/data/monitor-state.json +++ b/data/monitor-state.json @@ -3,12 +3,12 @@ "sources": { "learn": { "schema_version": 2, - "last_run": "2026-03-11T12:59:29.613756+00:00", + "last_run": "2026-03-14T06:51:10.083468+00:00", "urls": { "https://learn.microsoft.com/en-us/power-platform/admin/managed-environment-overview": { "content_hash": "sha256:61c48d093d6ebf3f304ff8ab9ad527d3fc956be07f436e1a0e219624069df30c", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nManaged Environments overview\nFeedback\nSummarize this article for me\nManaged Environments is a suite of premium capabilities that allows admins to manage Power Platform at scale with more control, less effort, and more insights. Admins can use Managed Environments with any type of environment. Certain features can be configured upon enabling a Managed Environment. 
Once an environment is managed, it unlocks more features across the Power Platform.\nLearn how to use Managed Environments\n.\nA Managed Environment encompasses, but isn't limited to, the following features:\nEnvironment groups\nLimit sharing\nWeekly usage insights\nData policies\nPipelines in Power Platform\nMaker welcome content\nSolution checker\nIP Firewall\nIP cookie binding\nCustomer Managed Key (CMK)\nLockbox\nExtended backup\nData policies for desktop flow\nExport data to Azure Application Insights\nAdminister the catalog\nDefault environment routing\nCreate an app description with Copilot\nVirtual Network support for Power Platform\nConditional access on individual apps\nControl which apps are allowed in your environment\nCreate and manage masking rules\nNote\nManaged Environments is included as an entitlement with standalone Power Apps, Power Automate, Microsoft Copilot Studio, Power Pages, and Dynamics 365 licenses. Trial licenses can be used to license users in Managed Environments, with the restrictions specific to these types of licenses. To learn more about Managed Environment licensing, see\nLicensing\nand\nLicensing overview for Microsoft Power Platform\n.\nManaged Environment isn't included as an entitlement in the Developer Plan when users run their assets. 
For more information about Managed Environments and the Developer Plan, see\nAbout the Power Apps Developer Plan\n.\nRelated content\nEnable Managed Environments\nUsage insights\nLimit sharing\nData policies\nLicensing\nView license consumption (preview)\nTenant settings\nDefault environment routing\nConsiderations for using Managed Environments\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Managed Environments", @@ -17,7 +17,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/managed-environment-enable": { "content_hash": "sha256:34c59858260c34971883a6a71e852fcf29ef14e1bf0048dc625738325b7598a1", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nEnable Managed Environments\nFeedback\nSummarize this article for me\nAdmins enable, disable, and edit Managed Environments in the Power Platform admin center. Admins can also use PowerShell to disable Managed Environments. This article explains the permissions you need to manage environments and the steps to get started in the Microsoft Power Platform admin center or with PowerShell.\nPermissions\nTo enable or edit Managed Environments, you need the Power Platform Administrator or Dynamics 365 Administrator role in Microsoft Entra ID. 
You can learn more about these roles in\nUse service admin roles to manage your tenant\n.\nAny user with permission to view environment details can see the Managed Environments property for an environment.\nUsers with the Delegated Admin role or the Environment Admin security role can't change the Managed Environments property in an environment.\nImportant\nThe Managed Environments property must be the same in the source and destination before you can start to copy and restore environment lifecycle operations.\nDataverse is required to use Managed Environments in an environment type.\nEnable or edit Managed Environments in the admin center\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n, and then in the\nManage\npane, select\nEnvironments\n.\nSelect the ellipsis next to an environment, and then in the menu, select\nEnable Managed Environments\n. If the environment is already managed, select\nEdit Managed Environments\n.\nConfigure the settings, and then select\nEnable\nor\nSave\n.\nEnable Managed Environments using PowerShell\nAdmins can also use PowerShell to enable Managed Environments. The following PowerShell script enables it for a single environment.\n$GovernanceConfiguration = [pscustomobject] @{ \n protectionLevel = \"Standard\" \n settings = [pscustomobject]@{ \n extendedSettings = @{} \n }\n} \n\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $GovernanceConfiguration\nCopy Managed Environment settings using PowerShell\nAdmins can use PowerShell to copy settings from one Managed Environment to another environment. 
If the target environment isn't a Managed Environment, copying settings also enables it as a Managed Environment.\n#Get settings from the source Managed Environment\n$sourceEnvironment = Get-AdminPowerAppEnvironment -EnvironmentName \n\n# Copy the settings from the source Managed Environment above to the target environment\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $sourceEnvironment.Internal.properties.governanceConfiguration\nDisable Managed Environments using PowerShell\nAdmins can use PowerShell to remove the Managed Environments property from an environment. Before you disable Managed Environments, make sure none of the Managed Environments capabilities are in use.\nHere's an example PowerShell script that calls the API to set the Managed Environments property:\n$UpdatedGovernanceConfiguration = [pscustomobject]@{\n protectionLevel = \"Basic\"\n}\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $UpdatedGovernanceConfiguration\nRelated content\nManaged Environments overview\nUsage insights\nLimit sharing\nData policies\nLicensing\nView license consumption (preview)\nTenant settings\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Enable Managed Environments", @@ -26,7 +26,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/managed-environment-sharing-limits": { "content_hash": "sha256:739d51e0173ef1bb7b61603b4076904844b4450777b86e0717423c73d3ffb0b1", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to 
plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nLimit sharing\nFeedback\nSummarize this article for me\nIn Managed Environments, admins can limit how broadly users can share canvas apps, flows, and agents.\nTo configure these rules:\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, select\nEnvironments\n.\nOn the\nEnvironments\npage, select a managed environment.\nIn the command bar,\nEdit Managed Environments\n.\nThe sharing rules are located in the\nManage sharing\nsection.\nChoose the desired settings, then select\nSave\nto apply the changes.\nCanvas app sharing rules\nCanvas app sharing rule\nDescription\nDon't set limits\nSelect to not limit sharing canvas apps.\nExclude sharing with security groups\nSelect if users aren't allowed to share canvas apps with any security groups or with everyone.\nLimit total individuals who can be shared to\nIf\nExclude sharing with security groups\nis selected, you can control the maximum number of users with whom a canvas app can be shared.\nSolution-aware cloud flow sharing rules\nSolution-aware cloud flow sharing rules\nDescription\nLet people share solution-aware cloud flows\nWhen selected:\nUsers can share solution-aware cloud flows and agent flows with any number of individuals or security groups.\nWhen not selected:\nUsers can't share their cloud flows or agent flows with any individual or security group.\nAgent sharing rules\nAgent sharing rule\nDescription\nLet people grant\nEditor\npermissions when agents are shared\nWhen selected:\nOwners and editors can share with any individual as an editor.\nWhen not selected:\nOwners and editors can't share with an individual as an editor. 
This control doesn't affect the ability of owners or editors to share with viewers.\nLet people grant\nViewer\npermissions when agents are shared\nWhen selected:\nOwners and editors can share with any individual as a viewer and any security group.\nWhen not selected:\nOwners and editors can't share with an individual as a viewer, nor can they share with a security group. This control doesn't prevent them from sharing their copilots with individuals as editors.\nOnly share with individuals (no security groups)\nIf this setting is selected, owners and editors can only share with individuals as viewers. They can't share with a security group. This control doesn't affect an owner's or editor's ability to share with individuals as editors.\nLimit number of viewers who can access each agent\nIf\nOnly share with individuals (no security groups)\nis selected, you can control the maximum number of viewers with whom an agent can be shared with.\nTo learn more about\nEditor\nand\nViewer\npermissions on agents, go to\nCopilot Studio security and governance\n.\nNote\nSharing rules are enforced when users try to share an app, flow, or agent. This restriction doesn't impact any existing users who already have access to the app, flow, or agent before the application of the sharing rules. However, if an app, flow, or agent is out of compliance after rules are set, only unsharing is allowed until the app, flow, or agent is compliant with the new rules.\nAfter sharing rules are set in the Power Platform admin center, it may take up to an hour for them to start getting enforced.\nSharing rules in Dataverse for Teams environments don't impact sharing to a Team when you select\nPublish to Teams\n. 
However, when a user attempts to share with individuals or groups in a Team other than the one bound to the environment, the sharing limits are enforced.\nIf a user tries to share a canvas app, solution-aware cloud flow, or agent that contradicts the sharing rules, they're informed as shown.\nUse PowerShell to set sharing limits\nYou can also use PowerShell to set and remove sharing limits.\nSet sharing limits\nHere's a PowerShell script that prevents canvas apps from being shared with security groups and limits the number of individuals that the canvas app can be shared with to 20.\n# Retrieve the environment\n$environment = Get-AdminPowerAppEnvironment -EnvironmentName \n\n# Update the Managed Environment settings\n$governanceConfiguration = $environment.Internal.properties.governanceConfiguration\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType NoteProperty -Name 'limitSharingMode' -Value \"excludeSharingToSecurityGroups\" -Force\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType NoteProperty -Name 'maxLimitUserSharing' -Value \"20\" -Force\n\n# Save the updated Managed Environment settings\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $governanceConfiguration\nHere's a PowerShell script that turns off sharing for solution-aware cloud flows.\n# Retrieve the environment\n$environment = Get-AdminPowerAppEnvironment -EnvironmentName \n\n# Update the Managed Environment settings\n$governanceConfiguration = $environment.Internal.properties.governanceConfiguration\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType NoteProperty -Name 'solutionCloudFlows-limitSharingMode' -Value \"disableSharing\" -Force\n\n# Save the updated Managed Environment settings\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $governanceConfiguration\nHere's a PowerShell script that prevents agents from 
being shared with security groups and limits the number of viewers that can access an agent to 20.\n# Retrieve the environment\n$environment = Get-AdminPowerAppEnvironment -EnvironmentName \n\n# Update the Managed Environment settings\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType NoteProperty -Name 'bot-limitSharingMode' -Value \"ExcludeSharingToSecurityGroups\" -Force\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType NoteProperty -Name 'bot-maxLimitUserSharing' -Value \"20\" -Force\n\n# Save the updated Managed Environment settings\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $governanceConfiguration\nHere's a PowerShell script that turns off the ability to share your agents with individuals as Editors.\n# Retrieve the environment\n$environment = Get-AdminPowerAppEnvironment -EnvironmentName \n\n# Update the Managed Environment settings\n$governanceConfiguration = $environment.Internal.properties.governanceConfiguration\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType NoteProperty -Name 'bot-authoringSharingDisabled' -Value True -Force\n\n# Save the updated Managed Environment settings\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $governanceConfiguration\nSet 'bot-authoringSharingDisabled' to False to enable sharing with individuals as Editors.\nRemove sharing limits\nHere's a PowerShell script that removes the canvas app sharing limits.\n# Retrieve the environment\n$environment = Get-AdminPowerAppEnvironment -EnvironmentName \n\n# Update the Managed Environment settings\n$governanceConfiguration = $environment.Internal.properties.governanceConfiguration\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType NoteProperty -Name 'limitSharingMode' -Value \"noLimit\" -Force\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType 
NoteProperty -Name 'maxLimitUserSharing' -Value \"-1\" -Force\n\n# Save the updated Managed Environment settings\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $governanceConfiguration\nTo remove sharing limits for solution-aware cloud flows, run the following script.\n# Retrieve the environment\n$environment = Get-AdminPowerAppEnvironment -EnvironmentName \n\n# Update the Managed Environment settings\n$governanceConfiguration = $environment.Internal.properties.governanceConfiguration\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType NoteProperty -Name 'solutionCloudFlows-limitSharingMode' -Value \"noLimit\" -Force\n\n# Save the updated Managed Environment settings\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $governanceConfiguration\nTo remove limits on sharing your agent with security groups or individuals as Viewers, run the following script.\n# Retrieve the environment\n$environment = Get-AdminPowerAppEnvironment -EnvironmentName \n\n# Update the Managed Environment settings\n$governanceConfiguration = $environment.Internal.properties.governanceConfiguration\n$governanceConfiguration.settings.extendedSettings | Add-Member -MemberType NoteProperty -Name 'bot-limitSharingMode' -Value \"noLimit\" -Force\n\n# Save the updated Managed Environment settings\nSet-AdminPowerAppEnvironmentGovernanceConfiguration -EnvironmentName -UpdatedGovernanceConfiguration $governanceConfiguration\nSurface your organization’s governance error content\nIf you specify governance, error message content to appear in error messages, it's included in the error message displayed to users. 
Learn more in\nPowerShell governance error message content commands\n.\nRelated content\nManaged Environments overview\nEnable Managed Environments\nUsage insights\nData policies\nLicensing\nView license consumption (preview)\nTenant settings\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Managed Environment Sharing Limits", @@ -35,7 +35,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/managed-environment-solution-checker": { "content_hash": "sha256:2a84d465d6c1c16b979dd5c188060cbdf2c323723c727a3acc2a1cbdb9543282", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nSolution checker enforcement in Managed Environments\nFeedback\nSummarize this article for me\nThe solution checker is a powerful tool that performs a comprehensive static analysis of your solution objects against a set of best practice rules. 
By using solution checker, you can quickly identify problematic patterns in solution components and receive detailed reports that highlight issues, affected components, and provide links to documentation on how to resolve each issue.\nAdministrators can use solution checker to enforce checks to identify problematic patterns on solutions when the solution is imported in the Managed Environment.\nSolution checker settings\nWhen you turn on solution checker for a Managed Environment, there are different levels to choose from that are enforced during solution import.\nSetting\nDescription\nNone\nTurns off the automatic solution validations during solution import. There aren't any experience or behavioral changes to solution authoring, exports, or imports.\nWarn\nAll custom solutions are automatically verified during solution import. When a solution with highly-critical issues is being imported, you're warned about the action but the import itself continues, and if everything else with the import is fine, the solution is imported into the environment. After a successful import, a message stating that the imported solution had validation issues is shown. Additionally, a summary email is sent with details of the solution validation.\nBlock\nAll custom solutions are automatically verified during solution import. When a solution has highly-critical issues, the import process is canceled, and a message stating that the imported solution had validation issues is shown. This happens before the actual import, so there aren't any changes to the environment due to the import failure. 
Additionally, a summary email is sent with details of the solution validation.\nFor more information on what to do when encountering a warn or block, see the\ntroubleshooting guide\n.\nFor more information about solution checker and the list of rules used, go to\nSolution checker overview\n.\nTurn on solution checker in a Managed Environment\nTo turn on solution checker enforcement for your Managed Environment:\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, select\nEnvironments\n.\nSelect a managed environment.\nOn the command bar, select\nEdit Managed Environments\n, and then select the appropriate\nenforcement setting\nunder\nSolution checker enforcement\n.\nNote\nSolution checker enforcement is\nnot available\nwhen the environment is in the\nAdministration mode\n.\nEmail messages to the admin\nWhen the validation mode is set to\nWarn\nor\nBlock\n, a summary email is sent when a solution is imported or blocked. When the solution is imported into an environment, the summary email shows the count of issues by severity in the solution. The contents of the email may include a link to the solution analysis results. In some instances, the link to the results may have expired. To get new results, submit the solution to solution checker.\nSolutions checked from Power Apps\nmake.powerapps.com\nhave the results stored in the source environment. Solutions imported to an environment with solution checker enforcement turned on may have results stored in the target, import environment.\nThe email is sent to all users with the roles of\nPower Platform administrator\nand\nDynamics 365 service administrator\n. It's also sent to recipients of the\nweekly digest emails\n.\nSuppress validation emails\nBy default, emails are sent when a solution contains medium and above severities. When the checkbox is selected, emails aren't sent in warn mode. 
Emails aren't sent in block mode, as well, except for critical violations which block solution import.\nRule exclusions\nYou can select to exclude solution checker rules from enforcement. For example, a particular rule might take significant time and effort to fix across the solution, but you would still like the rest of the rules to be enforced. Use the\nExcluded Rules\ndropdown list to select the rules to exclude from enforcement.\nThe list contains rule names and descriptions grouped by category and sorted by severity. As a reminder, only critical severity rules block a solution from being imported.\nUse PowerShell to turn on solution checker enforcement\nYou can use PowerShell to turn on solution checker enforcement. These functions are defined in the\nPowerApps-Samples repo\n, which must be imported before invoking.\nTurn on solution checker enforcement in block mode\nHere's an example PowerShell script that turns on solution checker enforcement in block mode. After you run it, the slider shows block mode in the\nSolution checker\nsection of the Managed Environments settings.\nSetManagedEnvironmentSolutionCheckerEnforcementLevel -EnvironmentId 8d996ece-8558-4c4e-b459-a51b3beafdb4 -Level block\nTurn on solution checker enforcement in warn mode\nHere's an example PowerShell script that turns on solution checker enforcement in warn mode. After you run it, the slider shows warn mode in the\nSolution checker\nsection of the Managed Environments settings.\nSetManagedEnvironmentSolutionCheckerEnforcementLevel -EnvironmentId 8d996ece-8558-4c4e-b459-a51b3beafdb4 -Level warn\nTurn off solution checker enforcement\nHere's an example PowerShell script that turns off solution checker enforcement. 
After you run it, the slider shows\nOff\nin the\nSolution checker\nsection of the Managed Environments settings.\nSetManagedEnvironmentSolutionCheckerEnforcementLevel -EnvironmentId 8d996ece-8558-4c4e-b459-a51b3beafdb4 -Level none\nSet rule exclusions\nHere's an example PowerShell script that turns on solution checker enforcement in block mode and adds rule exclusions. After you run it, the slider shows block mode in the\nSolution checker\nsection of the Managed Environments settings, and the rule exclusions are set.\nSetManagedEnvironmentSolutionCheckerEnforcementLevel -EnvironmentId 8d996ece-8558-4c4e-b459-a51b3beafdb4 -Level none -RuleExclusions \"web-use-async,web-use-offline\"\nRelated content\nManaged Environments overview\nImport solutions\nSolution checker enforcement in Managed Environments blocks or warns on import\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Solution Checker Enforcement", @@ -44,7 +44,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/managed-environment-usage-insights": { "content_hash": "sha256:722c99f183b565b54127d9fbc4bcfc775edb236cd36f8549dd3b0a335bba3418", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nUsage insights\nFeedback\nSummarize this article for me\nStay informed about what’s happening in your managed environments with Power Platform’s weekly admin digest. Analytics about your top apps, your most impactful makers, and inactive resources you can safely clean up are distilled and delivered to your mailbox once a week.\nTo enable a weekly email digest, do the following.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, select\nEnvironments\n.\nSelect a managed environment.\nOn the command bar, select\nEdit Managed Environments\n, select the settings under\nUsage insights\n, and then select\nInclude insights for this environment in the weekly email digest\n.\nNote\nYou must\nturn on tenant-level analytics\nto get usage insights.\nCurrently, usage insights aren’t available in sovereign clouds, such as Government Community Cloud (GCC), Government Community Cloud – High (GCC High), Department of Defense (DoD), and Power Platform and Dynamics 365 services in China.\nWhat information is provided in the weekly digest?\nThe first section of the weekly digest shows the number of apps used and active users in your managed environments in the past month.\nThe second section lists apps and flows that haven't been launched in a while. The\nLast launch\ncolumn shows the last date a user launched the application or flow. If the application or flow has never been launched, the column contains “None.\" If an app or flow isn't being used, we recommend that you work with its owner to update or remove it.\nThe third section shows the most popular apps and flows in your managed environments in the past month, indicated by the number of sessions and runs. When a user launches and interacts with an application, that's considered a session. 
It also shows the top makers over the past month, as measured by total sessions of apps they own.\nWhich environments are included in the weekly digest?\nThe weekly digest provides insights into all managed environments in your tenant that you haven't excluded from reporting.\nTo include a managed environment in the weekly digest, select\nInclude insights for this environment in the weekly email digest\nin the\nUsage insights\nsection of the Managed Environment settings. If you exclude all your managed environments, Power Platform won't send a weekly digest.\nNote\nClear the check box to exclude a managed environment. If you exclude all your managed environments, Power Platform won't send a weekly digest.\nWho can receive the weekly digest?\nThe weekly digest is sent to all users with the roles of\nPower Platform administrator\nand\nDynamics 365 service administrator\n.\nTo add more recipients, select\nAdd additional recipients for the weekly email digest\n, and then select\nWeekly digest\n. Enter email addresses in the Additional recipients box.\nYou can also select\nSettings\nfrom the left-side menu, and then select\nWeekly digest\nto add additional recipients.\nUse PowerShell to add and remove recipients\nYou can also use PowerShell to add and unsubscribe email addresses.\nAdd email recipients\nHere's an example PowerShell script that adds two recipients. 
After you run it, the new addresses appear in the\nAdditional recipients\nbox in the\nUsage insights\nsection of the Managed Environments settings.\n$tenantSettings = Get-TenantSettings \n($tenantSettings.powerPlatform.governance) | Add-Member -MemberType NoteProperty -Name additionalAdminDigestEmailRecipients -Value 'fakeEmail@contoso.com;otherFakeEmail@contoso.com' \nSet-TenantSettings -RequestBody $tenantSettings\nRemove email recipients\nHere's an example PowerShell script that unsubscribes your entire organization from the weekly digest.\n$tenantSettings = Get-TenantSettings \n$tenantSettings.powerPlatform.governance.disableAdminDigest = $True \nSet-TenantSettings -RequestBody $tenantSettings\nTo resubscribe everyone, set the value for\n$tenantSettings.powerPlatform.governance.disableAdminDigest\nto\n$False\n.\nSee also\nManaged Environments overview\nEnable Managed Environments\nLimit sharing\nData policies\nLicensing\nView license consumption (preview)\nTenant settings\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Usage Insights", @@ -53,7 +53,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/environment-groups": { "content_hash": "sha256:f4b5163efc431db14d4238cd1d602252b68d005d4f683621d0ef2a3539d144f8", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nEnvironment groups\nFeedback\nSummarize this article for me\nManaging the Power Platform on a large scale across numerous environments, ranging from hundreds to tens of thousands, poses a significant challenge for both startup and enterprise IT teams. To address these complexities, environment groups offer a premium governance solution designed to streamline management tasks by organizing environments into logical collections and enforcing uniform policies and configurations.\nThink of an environment group as a \"folder\" for your environments. Administrators can cluster a flat list of environments into structured groups based on criteria such as business unit, project, geographic region, or purpose. By creating these logical collections, IT teams gain the ability to manage multiple environments simultaneously and efficiently implement security, governance, and compliance policies on a large scale through centrally managed rules. This centralized approach eliminates the need to configure each environment one-by-one, ensures consistency, significantly reduces administrative overhead, and prevents issues such as configuration drift and chaotic management practices common in extensive deployments.\nNote\nEnvironment groups can only contain Managed Environments.\nEach environment can belong to only one group, and groups can't overlap or be nested.\nEnvironments in a group can span different regions and types as long as each is managed.\nEnvironments can be transferred between groups by removing them from one and adding them to another.\nRules\nA key advantage of environment groups is their ability to enforce governance at scale through\nrules\n. Environment groups allow tenant administrators to define rules that automatically apply standardized settings or policies across all member environments. 
These rules span critical areas of environment management, such as security and sharing, AI feature enablement, data retention policies, and application lifecycle management (ALM).\nWhen a rule is published at the environment group level, it's enforced across every environment within that group. This means the corresponding setting or policy becomes locked (read-only) within individual environments, ensuring that local system administrators can't modify or override these centrally defined rules. Any subsequent changes can only be made by a tenant administrator with appropriate edit rights at the environment group level.\nLearn more about the rules available in\nRules for environment groups\n.\nNote\nPer-environment exceptions aren't currently supported.\nWhen an environment is added to the group, it inherits the group's published rules.\nWhen an environment is removed, it retains the last applied configuration from the group's rules but becomes unlocked, allowing a local admin to modify it going forward.\nUse cases and scenarios for environment groups\nEnvironment groups are flexible. Whether you need to enforce compliance by region, provide personal sandbox spaces for makers, roll out AI features selectively, or standardize development and testing vs. production practices, environment groups can be adapted to fit. Some common use cases and scenarios where environment groups add value include:\nPersonal productivity environments\nWhen using default environment routing, each maker can automatically get their own personal developer environment. It's best practice to place these environments into a dedicated group as they're created. For example, you may want a group named\nPersonal Productivity\n. Within this group, apply rules that treat each environment as a safe, individual sandbox. For instance, restrict agent sharing to prevent accidental exposure of in-progress work, and include productivity aids like the\nmaker welcome content\n. 
This approach isolates each user's work, similar to each person having their own OneDrive, and helps keep the default environment clean and secure.\nAI feature management\nOrganizations exploring AI capabilities can use environment groups to roll out features in a controlled and intentional way. For example, an enterprise might create a\nCopilot Pilot\ngroup with sandbox environments where AI features are turned on for early testing and feedback. At the same time, production or sensitive environments can remain in a separate group with a more gradual rollout timeline. This setup supports safe, phased adoption while giving teams space to experiment and build readiness. As confidence grows, admins can update the rules to expand Copilot access to more groups or move environments between them. This ensures a clear and manageable path toward broader AI use.\nGlobal environment strategy\nLarge organizations with many environments can group the environments by organizational units, such as by department, region, or subsidiary. For example, a global enterprise might have separate groups for North America, Europe, and APAC environments to enforce region-specific compliance and data residency rules. Each region’s group can have rules aligning with local regulations or business policies, like turning on certain features only where allowed. This structure brings order to a sprawling environment landscape and makes it easier to apply updates or policy changes en masse.\nDevelopment vs. production environments\nIn an ALM strategy, you might separate environments by lifecycle such as development, test, and production. Using environment groups, an admin can create a\nDev/Test Group\nwith relaxed policies such as one that allows some preview features or unmanaged customizations for agility, and a\nProduction Group\nwith stricter rules such as one that forces solution checker, blocks previews or unmanaged changes, or that has a longer backup retention for safety. 
This approach maintains high standards in production environments while giving development teams the flexibility they need to innovate. It helps strike a strong balance between governance and productivity.\nCreate an environment group\nComplete the following steps to create a new environment group in the Power Platform admin center.\nSign in to the\nPower Platform Admin center\nas a\nPower Platform tenant administrator\n.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironment groups\n.\nOn the\nEnvironment groups\npage, select\nNew group\n.\nIn the\nCreate group\npane that appears:\nAdd a name for your group in the\nName\nfield such as\nPersonal Productivity\n.\nAdd a brief description of the group in the\nDescription\nfield.\nSelect\nCreate\n.\nAfter a few moments, the new group appears in your Environment groups list. At this point, the group is empty (contains no environments) and none of its rules are configured. You can now add environments and configure rules, as needed.\nNote\nIf you prefer to operate outside of the Power Platform admin center, the\nPower Platform for Admins V2 (Preview) connector\noffers an alternative solution. It allows the creation and deletion of environment groups and the ability to add or remove environments from these environment groups, facilitating opportunities for automation.\nConfigure the rules for your environment group\nAfter you create the environment group, Power Platform tenant administrators can immediately add Managed Environments or configure the group's rules. Both approaches work, but keep in mind that only published rules are enforced across environments.\nSign in to\nPower Platform Admin center\nas a\nPower Platform tenant administrator\n.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironment groups\n.\nOn the\nEnvironment groups\npage, select the group you created.\nSelect the\nRules\ntab for that group. 
You see a list of available rules.\nSelect a rule to open its configuration panel. Adjust it as needed, then\nSave\nthe rule.\nRepeat this step for all the rules you want to configure in this group.\nSelect the\nPublish rules\nbutton in the command bar.\nThe following screenshot shows an environment-level setting that is locked by an environment group rule.\nNote\nConfigure only the rules relevant to your scenario.\nUntouched rules are managed at the environment level.\nUpdated rules appear in bold with an asterisk (*) until published. Remember to republish rules to apply changes across environments.\nRoute environments to your environment group\nOne powerful way to use environment groups is in combination with default environment routing. Instead of having new makers build in the shared Default environment, environment routing provisions a dedicated developer environment for each maker and optionally assigns it to an environment group of your choice. If you want all new developer environments to be automatically placed under a specific group—and thus immediately governed by its rules—set up environment routing to point to that group.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironment groups\n.\nSelect the\nEnvironment Routing\nbutton in the command bar.\nUnder the\nEnvironment group\nsection, choose the group you want your new developer environments to be created in.\nSelect\nSave\n.\nGoing forward, whenever a new maker triggers the creation of a personal developer environment, the platform automatically creates their environment inside the specified group. The environment comes preconfigured as a Managed Environment with all the group’s rules already applied from the start. The maker doesn't need to choose an environment or set anything up. The maker is routed directly into a governed space that IT has predefined. 
Admins gain peace of mind knowing that even automatically created environments follow organizational policies, and makers get a ready-to-use environment without needing to worry about configuration.\nNote\nIf an environment group is selected for routing but later you decide to change it, you can update the Environment routing settings to point new environments to a different group. Existing developer environments remain in whichever group they were originally placed, unless moved manually.\nAdd environments to your environment group\nIn addition to using routing for new environments, you can manually add existing environments to a group at any time.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironment groups\n.\nSelect the target group (the group you want to add environments into).\nSelect the\nAdd environments\nbutton in the command bar.\nSelect one or more environments from the list.\nSelect\nAdd\n.\nNote\nEnvironments without Dataverse can't be selected in the picker.\nIf you select an environment that has Dataverse, but it's not managed, you can upgrade it automatically as part of adding it to the group.\nManually create environments in the group\nWhen manually creating a new environment, you can choose to place it into a group at creation time.\nSelect\nManage\nin the navigation pane.\nGo to the\nEnvironments\npage.\nSelect\nNew\nin the command bar.\nSelect a\ngroup\nfor your created environment.\nEnter the other details.\nSelect\nSave\n.\nBy selecting a group here, the environment is created as a Managed Environment within that group, automatically inheriting the group's rules upon creation. If no group is selected, the environment is created outside of any group. 
You can always add it to a group later.\nRemove an environment from your environment group\nYou can remove an environment from a group if it needs unique governance or if you created it by accident.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironment groups\n.\nSelect the group.\nSelect the environment you wish to remove.\nSelect\nRemove from group\nin the command bar.\nAfter removal, the environment retains the configuration previously applied by the group. However, its settings and policies are now unlocked, allowing the local environment admin to manage them directly. The environment remembers the last known state from the group, but is now free to evolve independently.\nDelete an environment group\nIf an environment group is no longer needed, administrators can delete it to avoid clutter.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironment groups\n.\nSelect the environment group that you wish to delete.\nSelect\nDelete group\nin the command bar.\nImportant\nWhen you delete a group, first remove all of its environments and ensure no developer environments are routed to it. If a group still has environments, you see a warning that prevents you from deleting the group.\nKnown limitation\nIf you've published any of the following rules within your environment group, the corresponding settings at the environment level are overridden when an environment is added to the group: sharing limits, maker welcome content, solution checker, usage insights, backup retention, and generative AI settings. 
For example, if you've published sharing limits in your environment group, but already had maker welcome content and sharing limits set at the environment level, upon adding the environment to the group, the sharing limits are updated to match the group's sharing limits and the maker welcome content is reset.\nRelated content\nManaged Environments overview\nUsage insights\nLimit sharing\nData policies\nLicensing\nView license consumption (preview)\nTenant settings\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Environment Groups", @@ -62,7 +62,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/environment-groups-rules": { "content_hash": "sha256:55df45453c5596e76e2a0c4ba66fe6d80f61903365c99167eab8876e7841eb94", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nRules for environment groups\nFeedback\nSummarize this article for me\nThe following\nrules\ncan be applied to\nenvironment groups\n.\n#\nRules (in alphabetical order)\n1\nAccessing transcripts from conversations in Copilot Studio agents\n2\nAdvanced connector policy (preview)\n3\nAI prompts\n4\nAI-generated descriptions (preview)\n5\nAI-powered Copilot features\n6\nBack-up retention\n7\nDefault deployment pipeline (preview)\n8\nGenerative AI settings\n9\nExternal models\n10\nMaker welcome content\n11\nPower Apps component framework for canvas apps\n12\nPreview and experimental AI models\n13\nRelease channel\n14\nSharing agents with Editor permissions\n15\nSharing agents with Viewer permissions\n16\nSharing controls for canvas apps\n17\nSharing controls for solution-aware cloud flows\n18\nSharing data between Copilot Studio and Viva Insights\n19\nSolution checker enforcement\n20\nUnmanaged customizations\n21\nUsage insights\nNote\nThe rules that have \"(preview)\" in their name are in public preview, while rules without it are considered generally available.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Environment Group Rules", @@ -71,7 +71,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/default-environment-routing": { "content_hash": "sha256:7ad69a80ff08c5496392cac3a7ed8f5ba85c4c816f35031e12eb3fb9cf709aee", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires 
authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nEnvironment routing\nFeedback\nSummarize this article for me\nEnvironment routing is a premium governance feature. This feature allows Power Platform admins to automatically direct new or existing makers into their own personal developer environments when they visit\nCopilot Studio\n,\nPower Apps\n,\nPower Automate\n, or Power Automate for desktop. Environment routing offers makers a personal, safe space to build with Microsoft Dataverse without the fear of others accessing their apps or data.\nIn this video, check out what's new with environment routing in the Power Platform admin center.\nWhen the\nEnvironment routing\nsetting is enabled in\nPower Platform admin center\n, the maker lands in their own personal developer environment instead of the default environment. Personal developer environments are the makers' own spaces, like OneDrive, for personal productivity where they can start building apps and solutions in their own workspace. Makers don't need to know which environment to work in, since the personal developer environment appears automatically.\nWhen the feature is turned on, the selected maker type (that is, new or existing makers), are directed into their own, personal developer environment. If the maker has access to one or more existing developer environments that aren't owned by them, they're routed to a new developer environment.\nDataverse is available in developer environments, and these environments are\nManaged Environments\nwith the admin settings preconfigured according to the assigned environment group rules. 
Admins no longer need to worry that their makers are working in the default environment, where their work can conflict with others.\nImportant\nBy default, all developer environments created through environment routing are managed.\nManaged Environments isn't included as an entitlement in the Developer Plan when users run their assets. For more information about Managed Environments and the Developer Plan, see\nAbout the Power Apps Developer Plan\n.\nNon-managed\ndeveloper environments are\nunaffected\nby this feature. Learn more about the developer environment and developer plan in\nAbout the Power Apps Developer Plan\n.\nMulti-rule environment routing\nMulti-rule environment routing is an advanced governance feature in Power Platform that allows tenant administrators to define multiple routing rules to control how makers are directed to development environments across various portals, such as Power Apps, Power Automate, and Copilot Studio.\nThis capability builds on the original environment routing feature, which routed makers to a single environment group. The multi-rule enhancement introduces flexibility by allowing routing to multiple environment groups based on rule logic. This feature is especially useful for organizations where governance, security, and scalability are critical. It allows:\nFine-grained control over where makers build.\nConsistent policy enforcement across environments.\nReduced risk of conflicts in shared or default environments.\nAll routed environments are Managed Environments, meaning they inherit standardized policies like data retention, AI features, and application lifecycle management (ALM) settings defined by the admin through environment groups.\nPrerequisites\nEnvironment routing is a tenant-level admin setting. Understand that:\nOnly Power Platform admins can enable environment routing.\nIt requires the use of Managed Environments, since all of the newly created environments are managed. 
Users in a\nmanaged\ndeveloper environment must have premium licenses to run Power Platform assets.\nA personal developer environment is automatically created for new or existing makers (depending on the configured user type) when accessing a supported product's maker portal.\nRouted makers land in their existing developer environment if they already have a developer environment that they own.\nMakers are assigned the admin role in their newly created developer environments.\nBy default, all developer environments created through environment routing are managed.\nTurn on environment routing in the admin center\nThe\nEnvironment routing\nsetting is turned off by default and must be turned on using the Power Platform admin center.\nGo to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, select\nTenant settings\n.\nIn the\nTenant settings\npage, select\nEnvironment routing\n. The\nCreate and manage environment routing rules\npane is displayed.\nIn the\nTurn on environment routing for\nsection, select the product portals for which you want to allow routing.\nSelect\nNew rule\nto define a new rule. The\nCreate a new routing rule\npane appears. Take the following action:\nIn the\nName\nfield, enter a name for the rule.\nApply the routing rule to\nEveryone\nor specific security groups.\nSelecting\nEveryone\nroutes all makers into existing or new personal developer environments. Selecting a security group limits routing to the member makers of the configured security group.\nSelect an environment group to which the newly created developer environments are automatically assigned. This environment group inherits all the defined environment group rules. Learn more in\nEnvironment groups\n.\nSelect\nSave\n. 
The\nCreate and manage environment routing rules\npane is displayed again.\nUse the arrow icons to change the priority of the rules.\nWhen a maker accesses a portal, the system evaluates the rules in order and applies the first matching rule.\nIf a matching rule is found, the maker is routed to an existing or newly provisioned developer environment.\nIf no rule matches, or if environment routing isn't turned on, the maker is routed to the default environment.\nSelect\nSave\n.\nTurn on environment routing using PowerShell\nSign in to your tenant account.\nAdd-PowerAppsAccount -Endpoint \"prod\" -TenantID \nRetrieve and store your tenant settings in\nTenantSettings\n.\n$tenantSettings = Get-TenantSettings\nSet the\nenableDefaultEnvironmentRouting\nflag to\nTrue\n.\n$tenantSettings.powerPlatform.governance.enableDefaultEnvironmentRouting = $True \nSet-TenantSettings -RequestBody $tenantSettings\nSet the\nenvironmentRoutingAllMakers\nflag to\nTrue\nto allow routing for all makers or\nFalse\nto limit routing to new makers.\n$tenantSettings = Get-TenantSettings\n$tenantSettings.powerPlatform.governance | Add-Member -MemberType NoteProperty -Name 'environmentRoutingAllMakers' -Value $True -Force\n(Optional) Set the\nenvironmentRoutingTargetEnvironmentGroupId\nto the desired Environment Group ID.\n$tenantSettings.powerPlatform.governance | Add-Member -MemberType NoteProperty -Name 'environmentRoutingTargetEnvironmentGroupId' -Value \"\" -Force\n(Optional) Set the\nenvironmentRoutingTargetSecurityGroupId\nto the desired Security Group.\n$tenantSettings.powerPlatform.governance | Add-Member -MemberType NoteProperty -Name 'environmentRoutingTargetSecurityGroupId' -Value \"\" -Force\nSave\nTenantSettings\n.\nSet-TenantSettings -RequestBody $tenantSettings\nTurn off environment routing using PowerShell\n$tenantSettings = Get-TenantSettings  \n\n$tenantSettings.powerPlatform.governance.enableDefaultEnvironmentRouting = $False\n\nSet-TenantSettings -RequestBody $tenantSettings\nFor more 
information about using PowerShell in Power Apps, see the\nOverview\n.\nFrequently asked questions (FAQs)\nAre the developer environments managed?\nYes, all the newly created developer environments are Managed Environments by default.\nWhat environment types are created when environment routing is enabled?\nThe created environments are developer environments.\nWhat roles do the makers get assigned in the developer environments?\nThe makers get assigned the admin security role in the developer environments.\nCan new makers switch to the default environment or other environments after launching their own developer environment?\nYes, makers can always switch to other environments.\nDoes the developer environment affect my tenant Dataverse quota?\nNo, the developer environments don't affect your tenant Dataverse quota.\nWhat happens if the developer environment creation fails?\nIf the creation of the developer environment fails, makers are automatically routed to the default environment.\nWhat data policies are applied for the developer environment?\nNo specific data policies are assigned to the developer environment. 
The developer environment inherits existing, tenant-level data policies.\nWhat are the preconfigured Managed Environments settings for the newly created developer environments?\nAll developer environments have the following Managed Environments settings preconfigured:\nSharing limits\n: Set to exclude sharing with security groups, and preconfigured to share with five individuals.\nSolution Checker\n: Set to\nWarn\n.\nUsage insights\n: Is selected.\nMaker welcome message\n: Not established.\nIs the environment routing also available for Power Pages?\nEnvironment routing is currently available for Microsoft Copilot Studio, Power Apps, and Power Automate cloud and desktop workflows.\nDo I need to be a Power Platform tenant admin to enable this feature?\nYes, you need to have a Power Platform tenant admin privilege to enable this feature in your tenant, or you can ask your tenant admin to turn it on for you.\nDoes creating an app or flow in a managed developer environment require a premium license?\nA premium license isn't required for the creation or preview of an app or flow in a managed developer environment. However, a user or maker needs a premium license to\nrun\nan app or flow in a managed developer environment.\nDoes the default environment need to be managed to enable environment routing?\nNo, the default environment doesn't need to be managed to enable environment routing.\nWhich development environment is the maker routed to if they have more than one developer environment?\nThe maker is always routed to their own existing personal developer environment, such as the developer environment created by them or on their behalf. 
If they created multiple developer environments, they're routed to the first one in alphabetical order.\nWhat happens if the Power Platform admin changes the developer environment assignments setting from \"Everyone\" to \"Only specific admins\" while environment routing is on?\nChanging the developer environment assignments setting has no impact on environment routing.\nWhere are makers routed to if they don’t have an existing developer environment?\nIf new or existing makers don’t have their own developer environment, they're routed to a new developer environment.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Environment Routing", @@ -80,7 +80,7 @@ "https://learn.microsoft.com/en-us/power-platform/developer/create-developer-environment": { "content_hash": "sha256:08011fd849421a100feabe95740a07d6430cdfa1da8e7f90addc98c75f6b7ad4", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nCreate a developer environment with the Power Apps Developer Plan\nFeedback\nSummarize this article for me\nTo fully use the\nPower Apps Developer Plan\nas a developer, you need an Azure account and a work account. 
This article guides you through the process for creating a Power Platform environment and a test tenant if needed.\nWhere do I start?\nIf you have a\nwork account\n, and want to use it to learn Power Platform, go to the\nnext section\n.\nIf you don't have a work account or prefer a Sandbox tenant to learn Power Platform, read information in the\ncreate a test tenant\nsection later in this article before signing up for the developer environment.\nSign up for the Power Apps Developer Plan\nThe Power Apps Developer Plan gives you a free development environment to build and test with Power Apps, Power Automate, and Microsoft Dataverse.\nIt's simple to sign up for the Power Apps Developer Plan:\nEnsure that you have a work account. If you don't,\ncreate a test tenant\nfirst.\nSign up on the\nPower Apps Developer Plan website\n.\nAfter signing up for the Developer Plan, you'll be redirected to\nPower Apps\n. The environment uses your name, for example\nJohn Doe's environment\n. If there's already an environment with that name, the new developer environment is named\nJohn Doe's (1) environment\n.\nImportant\nUse the developer environment instead of your tenant's default environment to work with certain capabilities such as premium and custom connectors.\nYou might need to select your developer environment from the top-right corner of the screen in Power Apps.\nIt might take a couple of minutes for the new environment to be provisioned and become available in the list of the environments. You can see the progress of the environment creation in the\nPower Platform admin center\n.\nIn some situations, your admin might have turned off the sign-up process. 
In this case, please contact your administrator, or create a test tenant.\nFor detailed information about the developer plan, go to\nSign up for the Power Apps Developer Plan\n.\nHow to create a test tenant?\nIf you don't already have a dedicated test tenant, you might qualify for one through the\nMicrosoft 365 Developer Program\n; for details, see the\nFAQ\n. Alternatively, you can\nsign up for a one-month free trial or purchase a Microsoft 365 plan\n.\nYou can also\nmanually create a test tenant\n.\nNow that you have your test tenant, sign up for the Power Apps Developer Plan as explained earlier in this article.\nSee also\nPower Platform for developers\nFusion Development\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Developer Environments", @@ -89,7 +89,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/advanced-connector-policies": { "content_hash": "sha256:1bea3fda31b17b1c5654ee2dbb5c97e18facb456d57a9ab448e8edf1346aaf43", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nAdvanced connector policies (preview)\nFeedback\nSummarize this article for me\n[This article is prerelease documentation and is subject to change.]\nOverview\nAdvanced connector policies (ACP) represent the next generation of securing connector usage within the Power Platform. 
This feature provides a modern and flexible approach to managing all\ncertified connectors\n, aligning with the broader governance strategy of per-environment security controls paired with\nenvironment group support\n.\nBy adopting advanced connector policies, administrators gain greater control and granularity in securing and managing connector usage while enhancing the overall governance of their Power Platform environments.\nImportant\nThis is a preview feature.\nPreview features aren’t meant for production use and might have restricted functionality. These features are subject to\nsupplemental terms of use\n, and are available before an official release so that customers can get early access and provide feedback.\nKnown limitations\nWhile advanced connector policies (ACP) offer robust capabilities, there are a few limitations to consider:\nEnvironment group dependencies\n: Per-environment support isn't yet available. When it becomes available, we'll update this article.\nEndpoint filtering\n: Endpoint filtering will be replaced by a broader connection parameter filtering capability, which isn't yet available.\nManaged Environments\n: This feature requires Managed Environments to be enabled. In the future, you'll be able to use it on non-Managed Environments if you're not limiting the nonblockable connectors.\nConfigure an advanced connector policy\nTo configure an advanced connector policy, complete the following steps.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, select\nEnvironment groups\n.\nOn the\nEnvironment groups\npage, select the environment group where you want the policy applied.\nThe environment group's page is displayed. Select the\nRules\ntab.\nSelect\nAdvanced connector policies (preview)\n. The\nAdvanced connector policies (preview)\npane is displayed.\nDefine the policy. 
Keep the following points in mind:\nBy default, the nonblockable connectors are preloaded as\nallowed\n.\nTo add new connectors, select\nAdd connectors\nto choose from all certified connectors.\nTo remove connectors, select them and then select\nRemove connector\n. You can remove any connector to block it.\nWhen all connectors are set as you require, select\nSave\n.\nThe environment group's page is redisplayed. After all rules are updated to your requirements, select\nPublish rules\nin the command bar.\nDuring publishing, an environment lifecycle operation is performed on every environment that's part of the group, or on the individual environment, depending on where you're configuring the policy. This operation is available in environment history as\nUpdate Managed Environment Settings\nand cascades the new connector policy to the design-time and runtime infrastructure.\nMore visibility and control\nIn\ndata policies\n, customers couldn't see triggers, internal actions, or whether an action is deprecated. By adding these tags across all certified connectors, administrators can quickly decide to block specific triggers from use or turn off actions that are deprecated and no longer supported by the connector publisher.\nEasier management experience\nBased on customer feedback, the management experience is drastically simplified by making the policy a strict\nallowlist\n. When configured, all new connectors are blocked. If you configure the allowed actions on a given connector, then no new actions, triggers, or internal actions are allowed. The concept of the business and nonbusiness categories in data policies isn't brought forward, as it wasn't deemed effective in policy management.\nProactive policy management\nAdvanced connector policies are available as part of environment groups and rules. 
The\nPower Platform API\nprovides publicly documented APIs so you can build automated scenarios such as creating new policies, updating policies, and moving environments into groups for management at scale.\nModel Context Protocol (MCP) server management\nAdvanced connector policies now support visibility and management of Model Context Protocol (MCP) servers. MCP servers are special connector endpoints that expose MCP-enabled APIs and tooling capabilities within Power Platform.\nWithin advanced connector policies, administrators can now see MCP servers listed alongside other connector types and can choose to block an entire MCP server. As of now, granular control over individual MCP tools (endpoints and actions) within an MCP server isn't available. Blocking the entire MCP server is supported.\nData policy mixed mode\nUse advanced connector policies (ACP) in mixed mode with classic data policies. This approach allows you to complement configurations so that data policies can achieve action control and endpoint filtering until such time as those features are native to ACP. In addition, you can use ACP to block any connector, which isn't possible in classic data policies.\nAt runtime, when a connector operation is invoked, it queries the effective policy for the current hosting environment. This query includes a combined policy that merges the most restrictive settings from both classic data policies and ACP to provide full enforcement.\nIn the future, a separate rule will become available to allow you to skip data policy evaluation in favor of only relying upon connector policy.\nProvide feedback\nTrying out the new advanced connector policies? The product team would love your feedback! 
Join the Viva Engage network to keep the conversation going under a non-disclosure agreement:\nPublic Preview - Advanced Connector Policies\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources",
Data policies in Power Platform admin center allow administrators to control access to these connectors in various ways to help reduce risk in your organization.\nThis overview describes some high-level concepts related to connectors and several important considerations to take into account when setting up your policies or making policy changes.\nConnectors\nConnectors, at their most basic level, are strongly typed representations of RESTful application programming interfaces, also known as APIs. For example, the Power Platform API provides several operations related to functionality in Power Platform admin center.\nWhen wrapping the Power Platform API into a connector, it becomes easier for makers and citizen developers to utilize the API in their low-code apps, workflows, and chatbots. For example, the Power Platform for Admins V2 connector is the representation of the Power Platform API and we see the 'Get Recommendations' action simply dragged and dropped onto the flow:\nThere are several types of connectors mentioned in this article, and each has capabilities within data policies.\nCertified connectors\nCertified connectors refer to connectors that have undergone rigorous testing and certification processes to ensure they meet Microsoft's standards for security, reliability, and compliance. These connectors provide users with a reliable means of integrating with other Microsoft services and external services, all while maintaining data integrity and security.\nFor more information on certified connectors, see\nCertification Submission Guidelines\n.\nCustom connectors\nCustom connectors allow makers to create their own connectors to integrate with external systems or services not covered by the standard set of certified connectors. 
While offering flexibility and customization options, custom connectors require careful consideration to ensure that they comply with data policies and don't compromise data security.\nLearn more about\ncreating and managing custom connectors\n.\nVirtual connectors\nVirtual connectors are connectors that are shown in data policies for administrators to control; however, they're not based on a RESTful API. The proliferation of virtual connectors has stemmed from data policies being one of the most popular governance controls in Power Platform. More of these types of \"on/off\" capabilities are expected to surface as rules within\nEnvironment groups\n.\nSeveral virtual connectors are provided for governing Microsoft Copilot Studio. These connectors facilitate the ability to turn off various features of Copilots and chatbots.\nExplore virtual connectors and their role in\ndata loss prevention in Microsoft Copilot Studio\n.\nModel Context Protocol (MCP) connectors\nModel Context Protocol (MCP) connectors are a class of connectors that provide more metadata to expose MCP-enabled API endpoints, known as\ntools\n. MCP connectors extend typical connector functionality and enable richer experiences for generative AI in Microsoft Copilot Studio.\nMany of the nonblockable connectors in Microsoft Power Platform now support MCP. These connectors and their MCP servers can be managed and restricted through\nadvanced connector policies\n.\nConnections\nWhen a maker is building an app or a flow and needs to connect to data, they can use one of the above connector types. When a connector is first added to an app, a connection is established using the authentication protocols supported by that particular connector. These connections represent saved credentials and are stored within the environment that is hosting the app or flow. 
For more information about authenticating to connectors, see\nConnecting and authenticating to data sources\n.\nDesign-time versus runtime\nWhen an administrator chooses to limit access to either a whole connector or specific actions of a connector, there are impacts both to the maker experience and to the execution of previously created apps, flows, and chatbots.\nMaker experiences, often referred to as\ndesign-time\nexperiences, limit what connectors makers can interact with. If a data policy blocks the use of the MSN Weather connector, then a maker can't save a flow or app that uses it, and instead receives an error message that the connector has been blocked by policy.\nExperiences where an app is being run or a flow is executing on a predefined schedule, such as every day at 3:00 AM, are often referred to as\nruntime\nexperiences. Continuing with the example earlier, if the connection was inactivated by the background process outlined below, then the result is that the app or flow provides an error message that the MSN Weather connection is broken and needs resolution. When the maker attempts to update their connection to fix it, they get an error in the design-time experience that the connector is blocked by policy.\nProcess for policy changes\nAs new data policies are created, or when existing policies are updated, there's a specific process that's triggered within the Power Platform ecosystem of services that helps to get those policies enforced across the entire set of resources a customer has in their tenant. 
This process involves the following steps.\nData policy configuration is saved at the customer management level.\nConfigurations are cascaded down to each environment in the customer tenant.\nResources in each environment (such as apps, flows, and chatbots) periodically check for updated policy configurations.\nWhen a configuration change is detected, each app, flow, and chatbot is evaluated to see if it violates the policy.\nIf a violation occurs, the app, flow, or chatbot is put into a\nsuspended\nor\nquarantine\nstate so that it can't operate.\nConnections are scanned. If the policy blocks the whole connector, then the connection is set to a\ndisabled\nstate so that it can't operate.\nAny resources that are running and attempting to use an inactive connection, action, trigger, or MCP server that is blocked fail at runtime.\nLatency considerations\nThe time it takes to effectively implement data policies varies from customer to customer based on their volume of environments and resources within those environments. The more apps, flows, and chatbots a customer has, the longer it takes for policy changes to take full effect. For the most extreme cases, the latency for full enforcement is 24 hours. 
In most cases, it is within an hour.\nRelated content\nManage data policies\nData policies for Power Automate\nAdvanced connector policies\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "DLP Policies (Power Platform)", @@ -107,7 +107,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/dlp-connector-classification": { "content_hash": "sha256:31b178a4b5057456fd430d3bd564844e50c0904c38d40be5a711210513063d88", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nConnector classification\nFeedback\nSummarize this article for me\nData groups are a simple way to categorize connectors within a data policy. The three data groups available are the\nBusiness\ndata group, the\nNon-Business\ndata group, and the\nBlocked\ndata group.\nA good way to categorize connectors is to place them in groups based on the business-centered or personal-use-centered services that they connect to in the context of your organization. Connectors that host business-use data should be classified as\nBusiness\n, and connectors that host personal-use data should be classified as\nNon-Business\n. 
Any connectors that you want to keep from being used at all across one or more environments should be classified as\nBlocked\n.\nWhen a new policy is created, by default all connectors are placed in the\nNon-Business\ngroup. From there they can be moved to\nBusiness\nor\nBlocked\nbased on your preference. You manage the connectors in a data group when you create or modify the properties of a data policy from the admin center. See\nManage data policies\n. You can also change the initial classification of connectors by editing your data policy. More information:\nEdit a data policy\nNote\nUntil recently, some HTTP connectors weren't readily available for data policy configuration by using the data policy UI or PowerShell. As of May 2020, the following HTTP connectors can now be classified by using the data policy UI and PowerShell, like any other Power Platform connector:\nHTTP\n,\nHTTP Webhook\n, and\nWhen an HTTP request is received\n. If legacy data policies are being updated by using the new data policy UI, a warning message is displayed to admins indicating that these three HTTP connectors are now being added to the data policies purview and that they should ensure that these connectors are placed in the right data policies grouping.\nBecause child flows share an internal dependency with the HTTP connector, the grouping that admins choose for HTTP connectors in a data policy might affect the ability to run child flows in that environment or tenant. Make sure your HTTP connectors are classified in the appropriate group for your child flows to function. If there are any concerns in classifying the connector as\nBusiness\nin shared environments such as the default environment, our advice is to classify it as\nNon-Business\nor to block it. 
Then, create dedicated environments where makers can use HTTP connectors, but restrict the maker list so that approved makers can build child flows.\nThe\nContent Conversion\nconnector is an integral feature of Microsoft Power Platform, used to convert an HTML document to plain text. It applies to both\nBusiness\nand\nNon-Business\nscenarios and doesn't store any data context of the content converted through it; therefore, it's not available for classification through data policies.\nHow data is shared among data groups\nData can't be shared among connectors that are located in different groups. For example, if you place SharePoint and Salesforce connectors in the\nBusiness\ngroup and you place Gmail in the\nNon-Business\ngroup, makers can't create an app or flow that uses both the SharePoint and Gmail connectors. This in turn restricts data flows between these two services in Microsoft Power Platform.\nAlthough data can't be shared among services in different groups, it can be shared among services within a specific group. From the earlier example, because SharePoint and Salesforce were placed in the same data group, makers can create an app or flow that uses both SharePoint and Salesforce connectors together. This in turn allows data flows between these two services in Microsoft Power Platform.\nThe key point is that connectors in the same group can share data in Microsoft Power Platform, whereas connectors in different groups can't share data.\nThe effect of the Blocked data group\nData flow to a specific service can be blocked altogether by marking that connector as\nBlocked\n. For example, if you place Facebook in the\nBlocked\ngroup, makers can't create an app or flow that uses the Facebook connector. This in turn restricts data flows to this service in Microsoft Power Platform.\nAll third-party connectors can be blocked. 
All Microsoft-owned premium connectors (except Microsoft Dataverse) can be blocked.\nList of connectors that can't be blocked\nAll connectors driving core Microsoft Power Platform functionality (like Dataverse, Approvals, and Notifications), in addition to connectors that enable core Office customization scenarios like Microsoft Enterprise Plan standard connectors, remain nonblockable to ensure that core user scenarios remain fully functional.\nNote\nThese connectors can be limited or blocked using\nadvanced connector policies\n.\nHowever, these nonblockable connectors can be classified into\nBusiness\nor\nNon-Business\ndata groups. These connectors broadly fall into the following categories:\nMicrosoft Enterprise Plan standard connectors (with no other licensing implications).\nMicrosoft Power Platform–specific connectors that are part of the base platform capabilities. Within this, Dataverse connectors are the only premium connectors that can't be blocked because Dataverse is an integral part of Microsoft Power Platform.\nThe following connectors can't be blocked by using data policies.\nMicrosoft Enterprise Plan standard connectors\nCore Power Platform connectors\nDefender for Cloud Apps\nApprovals\nDynamics 365 Customer Voice\nNotifications\nExcel Online (Business)\nDataverse (legacy)\nKaizala\nDataverse\nMicrosoft 365 Groups\nPower Apps Notifications (\nv1\nand\nv2\n)\nMicrosoft 365 Groups Mail (Preview)\nMicrosoft Copilot Studio\nMicrosoft 365 Outlook\nMicrosoft 365 Users\nMicrosoft Teams\nMicrosoft To-Do (Business)\nOneDrive for Business\nOneNote (Business)\nPlanner\nPower BI\nSharePoint\nShifts\nSkype for Business Online\nYammer\nNote\nIf a currently unblockable connector is already in the\nBlocked\ngroup (for example, because it was blocked when restrictions were different), it remains in the same group until you edit the policy. 
You get an error message stopping you from saving the policy until you move the unblockable connector to a\nBusiness\nor\nNon-Business\ngroup.\nViewing the classification of connectors\nWhen editing data policies in the Power Platform admin center, all available and visible connectors are shown, regardless of whether they have been classified in a policy. However, when viewing a data policy in PowerShell or through the Power Platform for Admins connector, you see only the connectors that have been explicitly classified in the Business, Non-business, or Blocked categories. Data policies viewed from PowerShell or the Power Platform for Admins connector may include stale references to connectors that are no longer available or visible.\nIn general, the list of Power Platform connectors can differ depending on where you're viewing them, and there are several reasons for this. Some connectors may require specific licensing, and if your license doesn't include them, they're not visible. Different environments can also have different connectors available due to compliance and regulatory requirements. Microsoft may release updates to connectors, which may not be immediately available across all Power Platform components. Some connectors may only be available in Power Automate and not in Power Apps. Depending on your role and permissions, you may not have access to all connectors.\nCustom connector classification\nEnvironment-level data policies\nEnvironment admins can now find all the custom connectors in their environments alongside prebuilt connectors on the\nConnectors\npage in\nData Policies\n. Similar to prebuilt connectors, you can classify custom connectors into\nBlocked\n,\nBusiness\n, or\nNon-Business\ncategories. 
Custom connectors that aren't explicitly classified will be put under the default group (or\nNon-Business\n, if no default group setting is explicitly chosen by admins).\nYou can also use data policy PowerShell commands to set custom connectors into\nBusiness\n,\nNon-Business\n, and\nBlocked\ngroups. More information:\nData policy commands\nTenant-level data policies\nThe Power Platform admin center also has support for tenant admins to classify custom connectors by their Host URL endpoints by using a pattern-matching construct for tenant-level data policies. Because the scope of custom connectors is environment-specific, these connectors don't show up on the\nConnectors\npage for you to classify. Instead, you see a new page in\nData Policies\nnamed\nCustom connectors\n, which you can use to specify an ordered list of Allow and Deny URL patterns for custom connectors.\nThe rule for the wildcard character (\n*\n) is the last entry in the list, which applies to all custom connectors. Admins can tag the\n*\npattern to\nBlocked\n,\nBusiness\n,\nNon-business\n, or\nIgnore\n. By default, the pattern is set as\nIgnore\nfor new data policies.\nIgnore\nignores data policy classification for all connectors in this tenant-level policy, and defers evaluation of a pattern to other environments or tenant-level policies to attribute them into the\nBusiness\n,\nNon-Business\n, or\nBlocked\ngrouping as appropriate. If no specific rule exists for the custom connectors, an\nIgnore *\nrule allows custom connectors to be used with both\nBusiness\nand\nNon-Business\nconnector groupings. Except for the last entry in the list,\nIgnore\nas an action isn't supported for any other URL pattern added to the custom connector pattern rules.\nYou can further add new rules by selecting\nAdd connector pattern\non the\nCustom connectors\npage.\nThis opens a side panel where you can add custom connector URL patterns and classify them. 
New rules are added to the end of the pattern list (as the second-to-the-last rule, because\n*\nis the last entry in the list). However, you can update the order while adding a new pattern.\nYou can also update the order of the patterns by using the\nOrder\ndropdown list or selecting\nMove up\nor\nMove down\n.\nAfter a pattern has been added, you can edit or delete these patterns by selecting a specific row and selecting\nEdit\nor\nDelete\n.\nDefault data group for new connectors\nOne data group must be designated as the default group to automatically classify any new connectors added to Microsoft Power Platform after your policy has been created. Initially, the\nNon-Business\ngroup is the default group for new connectors and all services. You can\nchange the default data group\nto the\nBusiness\nor\nBlocked\ndata group, but we don't recommend that you do so.\nAny new services that are added to apps are placed in the designated default group. For this reason, we recommend that you keep\nNon-Business\nas the default group, and manually add services into the\nBusiness\nor\nBlocked\ngroup after your organization has evaluated the impact of allowing business data to be shared with the new service.\nNote\nMicrosoft 365 Enterprise license connectors and a few core Microsoft Power Platform connectors are exempt from being marked as\nBlocked\n, and can only be classified as\nBusiness\nor\nNon-Business\n. 
If Microsoft adds any new connectors that can't be blocked and you've set the default group for the data policy as\nBlocked\n, these connectors will be automatically marked as\nNon-Business\ninstead of\nBlocked\n.\nRelated information\nPower Platform data policies\nData policies for Power Automate\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Connector Classification", @@ -123,18 +123,18 @@ "section": "Power Platform Administration" }, "https://learn.microsoft.com/en-us/connectors/connector-reference/": { - "content_hash": "sha256:9ac2bdaedaeb377e7622899e7e465719d26df2ca39eb5ea39082e1471ea681ca", - "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nConnector reference overview\nFeedback\nSummarize this article for me\nThis page summarizes key information of all connectors currently provided for Microsoft Power Automate, Microsoft Power Apps, and Azure Logic Apps. 
In addition to the connector icon and name, the following information is provided:\nAvailable in Azure Logic Apps.\nAvailable in Power Automate.\nAvailable in Power Apps.\nThis is an MCP Server connector.\nThis is a Preview connector.\nThis is a Premium connector for Power Automate and Power Apps or a Standard connector for Azure Logic Apps.\nYou can select a connector to view more detailed connector-specific documentation including its functionality and region availability. You can also filter all connectors by a certain category. Note that filters do not stack and each link will take you to another page within the documentation site.\nFilter all connectors by:\nTier\nRelease Status\nProduct\nPublisher\nStandard\nPreview\nPower Apps\nMicrosoft\nPremium\nProduction\nPower Automate\nNon-Microsoft\nLogic Apps\nMCP Server\nList of Connectors\n}exghts gen. Document & more\nBy: }exghts\n10to8 Appointment Scheduling\nBy: 10to8 Ltd\n1DocStop\nBy: 1DocStop\n1Me Corporate\nBy: 1Me\n1pt (Independent Publisher)\nBy: Troy Taylor\n24 pull request (Independent Publisher)\nBy: Bernard Karaba\n365 Training\nBy: We Speak You Learn, LLC\n3E Events\nBy: Elite Technology\n9A Raptor Document Warehouse\nBy: 9altitudes\nAbbreviations\nBy: Troy Taylor\nAbortion Policy (Independent Publisher)\nBy: That API Guy\nabsentify\nBy: BrainCore Solutions\nAbstract Company Enrichment (Independent Publisher)\nBy: Fördős András\nAbstract Email Validator (Independent Publisher)\nBy: Fördős András\nAbstract Exchange Rates (Independent Publisher)\nBy: Fördős András\nAbstract Holidays (Independent Publisher)\nBy: Fördős András\nAbstract IBAN Validator (Independent Publisher)\nBy: Fördős András\nAbstract IP Geolocation (Independent Publisher)\nBy: Fördős András\nAbstract Phone Validator (Independent Publisher)\nBy: Fördős András\nAbstract Timezones (Independent Publisher)\nBy: System Administrator\nAbstract VAT Validator (Independent Publisher)\nBy: Fördős András\nAccuWeather (Independent Publisher)\nBy: 
troystaylor\nAct!\nBy: Swiftpage ACT!\nActivityInfo\nBy: ActivityInfo\nAcumatica\nBy: Acumatica\nAddress Labs (Independent Publisher)\nBy: Richard Wilson\nAdobe Acrobat Sign\nBy: ADOBE INC.\nAdobe Acrobat Sign Sandbox\nBy: Adobe Inc.\nAdobe Creative Cloud\nBy: Adobe Inc\nAdobe Experience Manager\nBy: Adobe\nAdobe PDF Services\nBy: Adobe Inc.\nAdvanced Data Operations\nBy: State Solutions\nAdvanced Scraper (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nAffirmations (Independent Publisher)\nBy: Troy Taylor\nAfrica's Talking Airtime\nBy: Africa's Talking\nAfrica's Talking SMS\nBy: Africa's Talking\nAfrica's Talking Voice\nBy: Africa's Talking\nAfterShip (Independent Publisher)\nBy: Taiki Yoshida\nAgilePoint NX\nBy: AgilePoint Inc\nAgilite\nBy: Agilit-e\nAhead\nBy: ahead AG\nAhead (Intranet)\nBy: ahead AG\nAI or Not (Independent Publisher)\nBy: Fördős András\nAIForged\nBy: Larc AI (PTY) Ltd\nAIHW MyHospitals (Independent Publisher)\nBy: Paul Culmsee\nAikiDocs\nBy: Aiki-Mind Services Inc.\nAirlabs\nBy: Fördős András\nAirly (Independent Publisher)\nBy: Tomasz Poszytek\nAirmeet\nBy: Airmeet\nairSlate\nBy: airSlate Inc.\nAirtable (Independent Publisher) [DEPRECATED]\nBy: Woong Choi\nAlemba ITSM\nBy: Alemba Ltd\nAletheia\nBy: Aletheia\nAlisQI\nBy: AlisQI BV\nAlkymi\nBy: Alkymi\nallGeo\nBy: Abaqus\nAlly\nBy: Aliru\nAlmabase\nBy: Almabase, Inc.\nAlmanac (Independent Publisher)\nBy: Troy Taylor\nALVAO\nBy: ALVAO\nAmazon Redshift\nBy: Microsoft\nAmazon S3\nBy: Microsoft\nAmazon S3 Bucket (Independent Publisher)\nBy: Michael Megel\nAmazon SQS\nBy: Microsoft\nAmbee (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nAMEE Open Business (Independent Publisher)\nBy: Paul Culmsee\nAnnature (Independent Publisher)\nBy: Dr Adrian Colquhoun (Strategik)\nAnt Text Automation\nBy: Insight Office\nAnthropic (Independent Publisher)\nBy: Troy Taylor\nANY.RUN Threat Intelligence\nBy: ANYRUN FZCO\nApache Impala\nBy: Microsoft\nAPITemplate (Independent Publisher)\nBy: Troy 
Taylor\nAPlace.io (Independent Publisher)\nBy: Troy Taylor\nApp Power Forms\nBy: App Power Solutions LLC\nApp Store Connect - App Store (Independent Publisher)\nBy: Farhan Latif\nAppfigures\nBy: Microsoft\nAppsForOps Timeline\nBy: AppsForOps\nApptigent PowerTools\nBy: Apptigent\nApptigent PowerTools LITE\nBy: Apptigent Limited\nApyHub (Independent Publisher)\nBy: Troy Taylor\nApyHub Document Readability (Independent Publisher)\nBy: Troy Taylor\nApyHub Generate iCal (Independent Publisher)\nBy: Troy Taylor\nAquaforest PDF\nBy: Aquaforest Limited\nAranda Service Management\nBy: Aranda Software Corporation\nArcGIS\nBy: Esri, Inc.\nArcGIS Enterprise\nBy: Esri, Inc.\nArcGIS PaaS\nBy: Esri, Inc.\nAS2\nBy: Microsoft\nAsana\nBy: Microsoft\nAsite\nBy: Asite Solutions Pvt Ltd\nAsite (Canada)\nBy: Asite Solutions Limited\nAsite (Hong Kong)\nBy: Asite Solutions Limited\nAsite (KSA)\nBy: Asite Solutions Limited\nAsite (UAE)\nBy: Asite Solutions Limited\nAsite (US Gov.)\nBy: Asite Solutions Limited\nASPSMS\nBy: Vadian .Net AG\nAssemblyAI\nBy: AssemblyAI\nAssently E-Sign\nBy: Assently AB\nAtBot Admin\nBy: H3 Solutions Inc.\nAtBot Logic\nBy: H3 Solutions Inc.\nAutenti E-Signature Workflow\nBy: Autenti sp. 
z o.o.\nAutodesk Data Exchange\nBy: Autodesk, Inc.\nAutoReview\nBy: Power DevBox\nAutoSeller\nBy: Microsoft Corporation\nAvePoint Cloud Governance\nBy: AvePoint, inc.\nAviationstack (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nAWeber\nBy: Microsoft\nAzure AD Identity and Access\nBy: Microsoft, Daniel Laskewitz\nAzure AI Document Intelligence (form recognizer)\nBy: Microsoft\nAzure AI Foundry Agent Service\nBy: Microsoft\nAzure AI Foundry Inference\nBy: Microsoft\nAzure AI Search\nBy: Microsoft\nAzure App Service\nBy: Microsoft\nAzure Application Insights\nBy: Microsoft\nAzure Automation\nBy: Microsoft\nAzure Batch Speech-to-text\nBy: Microsoft\nAzure Blob Storage\nBy: Microsoft\nAzure Cognitive Service for Language\nBy: Microsoft\nAzure Communication Chat\nBy: Microsoft\nAzure Communication Email\nBy: Microsoft\nAzure Communication Services Identity\nBy: Microsoft\nAzure Communication Services SMS\nBy: Microsoft\nAzure Communication Services SMS Events\nBy: Microsoft\nAzure Confidential Ledger\nBy: Microsoft Corporation\nAzure Container Instance\nBy: Microsoft\nAzure Cosmos DB\nBy: Microsoft\nAzure Data Explorer\nBy: Microsoft\nAzure Data Factory\nBy: Microsoft\nAzure Data Lake\nBy: Microsoft\nAzure Database for MySQL\nBy: Microsoft\nAzure Databricks\nBy: Databricks Inc.\nAzure DevOps\nBy: Microsoft\nAzure Digital Twins\nBy: Microsoft Corporation\nAzure Event Grid\nBy: Microsoft\nAzure Event Grid Publish\nBy: Microsoft\nAzure File Storage\nBy: Microsoft\nAzure IoT Central V2\nBy: Microsoft Corporation\nAzure IoT Central V3\nBy: Microsoft Corporation\nAzure Key Vault\nBy: Microsoft\nAzure Log Analytics [DEPRECATED]\nBy: Microsoft\nAzure Log Analytics Data Collector\nBy: Microsoft\nAzure Monitor Logs\nBy: Microsoft\nAzure OpenAI\nBy: Microsoft\nAzure Queues\nBy: Microsoft\nAzure Resource Manager\nBy: Microsoft\nAzure Speech Pronunciation Assessment\nBy: Microsoft\nAzure SQL Data Warehouse\nBy: Microsoft\nAzure Table Storage\nBy: Microsoft\nAzure Text 
to speech\nBy: Microsoft\nAzure VM\nBy: Microsoft\nBadgr (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nBasecamp 2\nBy: Microsoft\nBasecamp 3\nBy: Microsoft\nBBC News (Independent Publisher)\nBy: krautrocker\nBeauhurst (Independent Publisher)\nBy: Axazure\nBenchmark Email\nBy: Microsoft\nBenifex\nBy: Benefex Ltd\nBigdata-com\nBy: RAVENPACK INTERNATIONAL SL.\nBillsPLS\nBy: IN-D by Intain\nBIN Checker (Independent Publisher)\nBy: Troy Taylor\nBinance.us (Independent Publisher)\nBy: Roy Paar\nBing Maps\nBy: Microsoft\nBing Search\nBy: Microsoft\nBitbucket\nBy: Microsoft\nBitly\nBy: Microsoft\nBitlyIP (Independent Publisher)\nBy: Troy Taylor\nBitskout\nBy: Bitskout\nBitvore Cellenus\nBy: Bitvore Corp.\nBizTalkServer\nBy: Microsoft\nBKK Futar (Independent Publisher)\nBy: Fördős András\nBlackbaud Altru Constituent\nBy: Blackbaud, Inc.\nBlackbaud Church Management [DEPRECATED]\nBy: Blackbaud, Inc.\nBlackbaud CRM Constituent\nBy: Blackbaud, Inc.\nBlackbaud CRM Prospect\nBy: Blackbaud, Inc.\nBlackbaud FENXT General Ledger\nBy: Blackbaud, Inc.\nBlackbaud FENXT Payable\nBy: Blackbaud, Inc.\nBlackbaud FENXT Query\nBy: Blackbaud. Inc\nBlackbaud Raisers Edge NXT\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Constituents\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Documents\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Events\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Fundraising\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Interactions\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Lists\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Prospects\nBy: Blackbaud, Inc.\nBlackbaud RENXT Gifts\nBy: Blackbaud, Inc.\nBlackbaud RENXT Query\nBy: Blackbaud. Inc\nBlackbaud RENXT Reports\nBy: Blackbaud, Inc.\nBlackbaud SKY Add-ins\nBy: Blackbaud. 
Inc\nBlogger\nBy: Microsoft\nBloomflow\nBy: Bloomflow\nBlueInk\nBy: Blueink\nBluesky Social (Independent Publisher)\nBy: krautrocker\nBoldSign\nBy: Syncfusion-Inc\nboomapp connect\nBy: Boomerang I-Comms Ltd\nBox\nBy: Microsoft\nBox MCP Server\nBy: Box.\nBrave Search (Independent Publisher)\nBy: Troy Taylor\nbttn\nBy: Microsoft\nBttn ONE\nBy: Bttn\nBuffer\nBy: Microsoft\nBuildingMinds DigitalTwin Core\nBy: BuildingMinds\nBulkSMS\nBy: BulkSMS.com\nBureau of Labor Statistics (Independent Publisher)\nBy: krautrocker\nBusiness Assist [DEPRECATED]\nBy: Microsoft\nBusinessmap\nBy: Businessmap\nBuy Me A Coffee (Independent Publisher)\nBy: Troy Taylor\nBuzz\nBy: Skyscape\nByword (Independent Publisher)\nBy: Troy Taylor\nCalculate Working Day\nBy: Tweed Technology Ltd\nCalendar Pro\nBy: Witivio\nCalendarific (Independent Publisher)\nBy: Fordos Andras\nCalendly\nBy: Calendly\nCalendly (legacy)\nBy: Microsoft\nCampfire\nBy: Microsoft\nCandidateZip Resume/Job Parser\nBy: CandidateZip CV/Job Parser\nCapsule CRM\nBy: Microsoft\nCaptisa Forms\nBy: Connect Captisa\nCarbon Intensity (Independent Publisher)\nBy: Hasan Unlu\nCarbonFootprint (Independent Publisher)\nBy: Troy Taylor\nCardPlatform Adaptive Cards\nBy: CardPlatform\nCarsXE (Independent Publisher)\nBy: Troy Taylor\nCascade\nBy: Cascade\nCascade Strategy New\nBy: Nicolas Durik-Ha\nCasper365 for Education\nBy: Microsoft\nCB Blockchain Seal\nBy: Connecting Software s.r.o. & Co. 
KG\nCData Connect AI\nBy: CData Software Inc\nCDC Content Services (Independent Publisher)\nBy: Troy Taylor\nCDK Drive Customer\nBy: CDK Global\nCDK Drive Service Vehicles\nBy: CDK Global\nCelonis\nBy: Celonis\nCelonis MCP Server\nBy: Celonis GmbH\nCentrical\nBy: Centrical\nCertinal eSign\nBy: Certinal Inc.\nCertopus\nBy: DevSquirrel Technologies Private Limited\nCGTrader\nBy: Microsoft\nChainpoint [DEPRECATED]\nBy: Chainpoint\nChatter\nBy: Microsoft\nCheckly (Independent Publisher)\nBy: Troy Taylor\nChuck Norris IO (Independent Publisher)\nBy: Daniel Laskewitz\ncioplenu\nBy: cioplenu GmbH\nCireson Service Manager Portal\nBy: Cireson\nCisco Webex Meetings\nBy: Cisco\nCitymapper (Independent Publisher)\nBy: Troy Taylor\nCivicPlus Transform\nBy: OneBlink\nClearbit (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nCleverTap\nBy: CleverTap Pvt. ltd.\nClickSend\nBy: Sinch Sweden AB\nClickSend Postcards\nBy: ClickSend Postcards\nClickUp Team Manager (Independent Publisher)\nBy: Duke DeVan\nClimatiq (Independent Publisher)\nBy: Troy Taylor\nClinical Trials (Independent Publisher)\nBy: Troy Taylor\nClockify (Independent Publisher)\nBy: Dr Adrian Colquhoun (Strategik)\nCloud BOT\nBy: C-RISE Ltd.\nCloud Connect Studio\nBy: Fuji Xerox\nCloud PKI Management\nBy: 509 Solutions Pty Ltd\nCloudConvert\nBy: Lunaweb GmbH\nCloudmersive Barcode\nBy: Cloudmersive, LLC\nCloudmersive CDR\nBy: Cloudmersive, LLC\nCloudmersive Currency\nBy: Cloudmersive, LLC\nCloudmersive Data Validation\nBy: Cloudmersive, LLC\nCloudmersive Document Conversion\nBy: Cloudmersive, LLC\nCloudmersive File Processing\nBy: Cloudmersive, LLC\nCloudmersive Image Processing\nBy: Cloudmersive, LLC\nCloudmersive NLP\nBy: Cloudmersive, LLC\nCloudmersive PDF\nBy: Cloudmersive, LLC\nCloudmersive Security\nBy: Cloudmersive, LLC\nCloudmersive Video and Media\nBy: Cloudmersive, LLC\nCloudmersive Virus Scan\nBy: Cloudmersive, LLC\nCloudTools for Salesforce\nBy: Apptigent\nCloverly (Independent Publisher)\nBy: Troy 
Taylor\nCluedIn\nBy: CluedIn Official\nCMI\nBy: CM Informatik AG\nCO2 Signal (Independent Publisher)\nBy: Paul Culmsee\nCobbleStone - Contract Insight\nBy: Cobblestone Software\nCognito Forms\nBy: Cognito Forms\nCognizant Automation Center\nBy: Cognizant\nCohere (Independent Publisher)\nBy: Troy Taylor\nCohesity Gaia\nBy: Cohesity, Inc.\nCoinbase (Independent Publisher)\nBy: Roy Paar\nCommercient\nBy: Commercient LLC\nCompanies House (Independent Publisher)\nBy: Matt Collins\nCompany Connect\nBy: InSpark\nComposer by Tachytelic\nBy: Accendo Solutions Ltd\nComputer Vision API\nBy: Microsoft\nConfluence\nBy: Microsoft\nConnect2All\nBy: GAC Business Solutions\nConnect2All on-premises\nBy: GAC Business Solutions\nConnective eSignatures\nBy: Connective\nConnectWise PSA (Independent Publisher)\nBy: howellchrisj\nconnpass (Independent Publisher)\nBy: Miyake Hideo\nConsenSys Ethereum (Deprecated) [DEPRECATED]\nBy: ConsenSys\nContacts Pro\nBy: Witivio\nContent Conversion\nBy: Microsoft\nContent Gate\nBy: SignUp Software Netherlands B.V\nContent Manager Power Connect\nBy: Kapish Services Pty Ltd\nContent Moderator\nBy: Microsoft\nContoso Hub\nBy: Microsoft\nConverter by Power2Apps\nBy: Power2Apps P2A GmbH\nConvertKit (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nCopilot for Finance\nBy: Microsoft\nCopilot for Sales\nBy: Microsoft Corporation\nCopilot for Service extension (preview)\nBy: Microsoft Corporation\nCopy.ai\nBy: Troy Taylor\nCorda Blockchain [DEPRECATED]\nBy: Microsoft\nCornerstone Learning vILT\nBy: Cornerstone On Demand\nCorporate Buzzword Generator (Independent Publisher)\nBy: Troy Taylor\nCOSMO Bot\nBy: COSMO CONSULT GmbH\nCoupa (Independent Publisher)\nBy: NovaGL\nCourier (Independent Publisher)\nBy: Troy Taylor\nCOVID-19 JHU CSSE (Independent Publisher)\nBy: Woong Choi\nCPQSync\nBy: Cincom Systems\nCPSC Recalls Retrieval (Independent Publisher)\nBy: Troy Taylor\nCQC Data (Independent Publisher)\nBy: Martyn Lesbirel\nCradl AI\nBy: Cradl 
AI\nCraftMyPDF (Independent Publisher)\nBy: Troy Taylor\nCRM Bot\nBy: CRM Bot Ltd\nCronofy MCP\nBy: Cronofy Ltd\nCrossbeam\nBy: Crossbeam\nCSC Corptax\nBy: Corptax\nCSV Converter by Power2Apps\nBy: Power2Apps P2A GmbH\nCustom Vision\nBy: Microsoft\nCustomJS\nBy: TechnologyCircle GmbH\nCX Cards by Surveyapp\nBy: VOC Metrics Limited\nCyberday\nBy: Agendium Ltd\nCyberProof\nBy: CyberProof Inc.\nD&B Optimizer [DEPRECATED]\nBy: Dun & Bradstreet\nd.velop\nBy: d.velop AG\nD365 Contact Center Admin MCP\nBy: Microsoft\nD7Messaging\nBy: Signtaper Technologies FZCO\nD7SMS\nBy: Signtaper Technologies FZCO\nDad Jokes (Independent Publisher)\nBy: Troy Taylor\nDadJokesIO (Independent Publisher)\nBy: Troy Taylor\nDaffy (Independent Publisher)\nBy: Troy Taylor\nDailyMed (Independent Publisher)\nBy: Troy Taylor\nDandelion (Independent Publisher)\nBy: Troy Taylor\nData Activator\nBy: Microsoft, Data Activator\nData Activator Early Access\nBy: Microsoft\nData8 Data Enrichment\nBy: Data8 Limited\nDatablend\nBy: DataBlend\nDatabook C4S\nBy: Databook Labs, Inc.\nDatabox (Independent Publisher)\nBy: Troy Taylor\nDatabricks\nBy: Databricks Inc.\nDataMotion\nBy: DataMotion, Inc.\nDatamuse (Independent Publisher)\nBy: Troy Taylor\nDataScope Forms\nBy: DataScope\nDB2\nBy: Microsoft\nDBF2XML\nBy: SMART\nDe Lijn (Independent Publisher)\nBy: Lenard Schockaert\nDecentraland (Independent Publisher)\nBy: Roy Paar\nDeck of Cards (Independent Publisher)\nBy: Troy Taylor\nDeepgram (Independent Publisher)\nBy: Troy Taylor\nDeepL\nBy: DeepL\nDeepLIP (Independent Publisher)\nBy: Michal Romiszewski\nDeepSign\nBy: DeepCloud\nDefault title\nBy: WordLift\nDefender for Cloud Apps\nBy: Microsoft\nDeprecated Integration [DEPRECATED]\nBy: Maximizer\nDerdack SIGNL4\nBy: Derdack GmbH\nDesk365\nBy: Kani Technologies Inc\nDeskDirector\nBy: DeskDirector\nDesktop flows\nBy: Microsoft\nDexcom (Independent Publisher)\nBy: FlowJoe\nDHL Tracking (DEPRECATED) (Independent Publisher) [DEPRECATED]\nBy: Rapid Circle\nDiceBear 
(Independent Publisher)\nBy: Troy Taylor\nDid You Mean This (Independent Publisher)\nBy: Troy Taylor\nDiffchecker\nBy: Fördős András\nDigiDates (Independent Publisher)\nBy: Troy Taylor\nDigiLEAN Connect\nBy: DigiLEAN AS\nDigitalHumani (Independent Publisher)\nBy: Troy Taylor\nDime.Scheduler\nBy: Dime Software\nDime.Scheduler (on-prem)\nBy: Dime Software\nDiscord (Independent Publisher)\nBy: Daniel Laskewitz | Microsoft & Michael Guzowski | Developico\nDisqus\nBy: Microsoft\nDo Not Call Reported Calls (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nDoc To PDF\nBy: Spot Solutions, Inc\nDocFusion365 – SP\nBy: Assimilated Information Systems\nDocJuris\nBy: DocJuris\nDocparser\nBy: Docparser\nDocugami\nBy: Docugami.com\nDocuGenerate\nBy: DocuGenerate\nDocument AI\nBy: Cloudmersive, LLC\nDocument AI Konfuzio\nBy: Helm & Nagel GmbH\nDocument Drafter\nBy: Document Drafter\nDocument Merge\nBy: CIRRUS SOFT LTD\nDocumentero\nBy: Documentero\nDocumentsCorePack\nBy: mscrm-addons.com ( PTM EDV Systeme )\nDocuMotor\nBy: Omnidocs\nDocurain\nBy: root42 Inc.\nDocusign\nBy: DocuSign, Inc.\nDocusign Demo\nBy: DocuSign, Inc.\nDocuWare\nBy: DocuWare\nDokobit Portal\nBy: Dokobit\nDokobit Universal API\nBy: Dokobit\nDomainTools Iris Enrich\nBy: DomainTools, LLC\nDomainTools Iris Investigate\nBy: DomainTools, LLC\nDoppler Farhan Latif (Independent Publisher)\nBy: Farhan Latif\ndox42\nBy: dox42\nDPIRD Radar - West Australia (Independent Publisher)\nBy: Paul Culmsee\nDPIRD Science - West Australia (Independent Publisher)\nBy: Paul Culmsee\nDPIRD Weather - West Australia (Independent Publisher)\nBy: Paul Culmsee\nDQ on Demand\nBy: DQ Global\nDraup\nBy: Draup\nDraup MCP Server\nBy: Draup\nDropbox\nBy: Microsoft\nDuration Calculator (Independent Publisher)\nBy: Troy Taylor\nDVLA Vehicle Enquiry Service (Independent Publisher)\nBy: Gulshan Khurana and Pranav Khurana\nDynamic Signal\nBy: Dynamic Signal\nDynamicDocs (Independent Publisher)\nBy: Troy Taylor\nDynamics 365 
(deprecated)\nBy: Microsoft\nDynamics 365 Business Central\nBy: Microsoft\nDynamics 365 Business Central (on-premises)\nBy: Microsoft\nDynamics 365 Commerce - Ratings and Reviews\nBy: Microsoft\nDynamics 365 Commerce Merchandising [DEPRECATED]\nBy: Microsoft\nDynamics 365 Customer Insights\nBy: Microsoft\nDynamics 365 Customer Voice\nBy: Microsoft\nDynamics 365 Fraud Protection\nBy: Microsoft\nDynamics 365 Sales Insights\nBy: Microsoft\nDynamics NAV\nBy: Microsoft\nDynamics Translation Service\nBy: Microsoft Corporation\nDynatrace\nBy: Dynatrace\nEasy Redmine\nBy: Microsoft\nEasyPost Mail\nBy: Bing Technologies\nEasyship (Independent Publisher)\nBy: Troy Taylor\nEasyvista Self Help\nBy: Easyvista\nEasyVista Service Manager\nBy: Easyvista\neBay (Independent Publisher)\nBy: Artesian Software Technologies LLP\nEBMS\nBy: Eagle Business Software\neCFR (Independent Publisher)\nBy: Dan Romano\nEcologi (Independent Publisher)\nBy: Troy Taylor\nedatalia Sign Online (Independent Publisher)\nBy: Victor Sanchez Olaya\nEden AI\nBy: Eden AI\nEdgility\nBy: Edgility\nEdifact\nBy: Microsoft\nEduframe\nBy: Microsoft\nEgain\nBy: eGain Corporation\nEgnyte\nBy: Egnyte\nE-goi\nBy: E-goi\nEigen Events\nBy: Eigen Ltd\nElastic Forms\nBy: Workai\nElasticOCR [DEPRECATED]\nBy: ElasticOCR\nElead Product Reference Data\nBy: CDK Global\nElead Sales Customers\nBy: CDK Global\nElead Sales Opportunities\nBy: CDK Global\nElectricity Maps (Independent Publisher)\nBy: Vitalii Sorokin\nElfsquad Data\nBy: Elfsquad B.V.\nElfsquad Product Configurator\nBy: Elfsquad\nEmail Domain Checker\nBy: Mightora.io\nEmail Veritas – URL Checker\nBy: eVeritas\nemfluence Marketing Platform\nBy: emfluence, llc\nEmigo\nBy: Sagra Technology Sp. 
z o.o.\nEmojiHub (Independent Publisher)\nBy: Troy Taylor\nEMT ATLAS AIMS\nBy: Enable My Team\nEnadoc\nBy: Enadoc Pte Ltd\nEncodian - Barcode\nBy: Encodian\nEncodian - Convert\nBy: Encodian\nEncodian - Excel\nBy: Encodian\nEncodian - General\nBy: Encodian\nEncodian - Image\nBy: Encodian\nEncodian - PDF\nBy: Encodian\nEncodian - PowerPoint\nBy: Encodian\nEncodian - Utilities\nBy: Encodian\nEncodian - Word\nBy: Encodian\nEncodian [DEPRECATED]\nBy: Encodian\nEncodian Filer\nBy: Encodian\nEncodian Trigr\nBy: Encodian\nEngagement Cloud\nBy: dotdigital\nEnlyft Insights\nBy: Enlyft.\nEnlyft MCP\nBy: Enlyft.\nEntegrations.io\nBy: entegrations.io inc\nEntersoft\nBy: Entersoft SA\nEnveloop (Independent Publisher)\nBy: Troy Taylor\nEnvoy\nBy: Envoy, Inc.\nEONET by NASA (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nEphesoft Semantik For Invoices\nBy: Ephesoft Inc.\nE-Sign\nBy: E-Sign\nEthereum Blockchain [DEPRECATED]\nBy: Microsoft\nEtsy (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nEvent Hubs\nBy: Microsoft\nEvent Tickets\nBy: The Events Calendar\nEventbrite\nBy: Microsoft\nEvery (Independent Publisher)\nBy: Troy Taylor\nEvocom\nBy: Evocom Informationssysteme GmbH\neWay-CRM\nBy: eWay-CRM\nExact Online Premium [DEPRECATED]\nBy: Exact MKB Software BV\nExact Time & Billing (Independent Publisher)\nBy: Indocs\nExasol\nBy: Exasol AG\nExcel [DEPRECATED]\nBy: Microsoft\nExcel Online (Business)\nBy: Microsoft\nExcel Online (OneDrive)\nBy: Microsoft\nExchange Rate (Independent Publisher)\nBy: Fördős András\nExpensya\nBy: EXPENSYA SA\nExperlogix CPQ\nBy: Experlogix US\nExperlogix Smart Flows\nBy: Experlogix US\nExpiration Reminder\nBy: SkyXoft Technologies, Inc.\nEXPOCAD\nBy: EXPOCAD\nEzekia-MCP\nBy: Ezekia\nFace API\nBy: Microsoft\nFactSet\nBy: FactSet Research Systems\nFantasy Premier League (Independent Publisher)\nBy: Joe Unwin (FlowJoe)\nFarsight DNSDB\nBy: Farsight Security\nFBI Most Wanted (Independent Publisher)\nBy: Richard Wilson\nFCA 
(Independent Publisher)\nBy: Gulshan Khurana\nFeathery\nBy: Troy Taylor\nFeathery Forms\nBy: Feathery\nFederal Reserve Economic Data (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nFederal Reserve Markets (Independent Publisher)\nBy: Dan Romano\nFEMA (Independent Publisher)\nBy: Troy Taylor\nFestivo (Independent Publisher)\nBy: Troy Taylor\nFHIRBase\nBy: Microsoft\nFHIRClinical\nBy: Microsoft\nFHIRlink\nBy: Microsoft Cloud for Healthcare\nFieldEquip\nBy: FieldEquip\nFile System\nBy: Microsoft\nFile.io (Independent Publisher)\nBy: Troy Taylor\nFiles.com\nBy: Files.com\nFin & Ops Apps (Dynamics 365)\nBy: Microsoft\nFinalcad One Connect 4.0\nBy: FINALCAD\nFinancial Edge NXT Query [DEPRECATED]\nBy: Blackbaud. Inc\nFinnish BIS (Independent Publisher)\nBy: Timo Pertila\nFinnish Railway Traffic (Independent Publisher)\nBy: Timo Pertilä\nFINRA (Independent Publisher)\nBy: Dan Romano\nFireText\nBy: FireText\nFiscal Data Service (Independent Publisher)\nBy: Dan Romano\nFishWatch (Independent Publisher)\nBy: Fordos Andras\nFitbit (Independent Publisher)\nBy: Ashwin Ganesh Kumar\nFlic\nBy: Microsoft\nFliplet\nBy: Fliplet\nFlotiq headless CMS\nBy: CodeWave LLC\nFlowForma\nBy: FlowForma Limited\nFlowForma V2\nBy: FlowForma Limited\nFocusmate (Independent Publisher)\nBy: Phil Cole\nFORCAM FORCE Bridge\nBy: FORCAM GmbH\nForceManager CRM\nBy: Tritium Software S.L.\nForem (Independent Publisher)\nBy: Daniel Laskewitz\nFormstack Documents\nBy: Formstack LLC\nFormstack Forms\nBy: Formstack LLC\nFraudLabs Pro (Independent Publisher)\nBy: Troy Taylor\nFreeAgent (Independent Publisher)\nBy: Nirmal Kumar\nFreshBooks\nBy: Microsoft\nFreshdesk\nBy: Microsoft\nFreshservice\nBy: Microsoft\nFTP\nBy: Microsoft\nFulcrum\nBy: Spatial Networks, Inc.\nFun Translations (Independent Publisher)\nBy: Troy Taylor\nFuseLagNotam1.1 (Independent Publisher)\nBy: Falana Kidd\nFuxsy-ADSKFusionManagePaid (Independent Publisher)\nBy: Fuxsy.eu\nGenerative actions\nBy: Microsoft\nGeoDB (Independent 
Publisher)\nBy: Troy Taylor\nGerman Federal Parliament (Independent Publisher)\nBy: Dan Romano\nGetAccept\nBy: GetAccept, Inc.\nGetMyInvoices\nBy: GetMyInvoices\nGieni TS Server MCP\nBy: Orderfox-Gieni\nGIPHY (Independent Publisher)\nBy: Priyanshu Srivastav\nGIS Cloud\nBy: HandyGeo Solutions\nGitHub\nBy: Microsoft\nGithub Data (Independent Publisher)\nBy: Nathalie-Leenders\nGitHub Gists (Independent Publisher)\nBy: Troy Taylor\nGitHub Utils (Independent Publisher)\nBy: Daniel Laskewitz\nGitLab (Independent Publisher)\nBy: Roy Paar\nGivebutter (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nGlaass Pro\nBy: Glaass Pty Ltd\nGlobal Exchange Rates\nBy: MEMENTO SRL\nGlobalGiving Project (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nGmail\nBy: Microsoft\nGMO Sign\nBy: GMO GlobalSign Holdings K.K.\nGoFileRoom\nBy: Thomson Reuters\nGoogle BigQuery - Dev (Independent Publisher)\nBy: Ashwani Kumar\nGoogle Books (Independent Publisher)\nBy: Fördős András\nGoogle Calendar\nBy: Microsoft\nGoogle Cloud Translation (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nGoogle Contacts\nBy: Microsoft\nGoogle Drive\nBy: Microsoft\nGoogle Gemini (Independent Publisher)\nBy: Priyaranjan KS , Vidya Sagar Alti [Tata Consultancy Services]\nGoogle PaLM (Independent Publisher)\nBy: Priyaranjan KS , Vidya Sagar Alti [Tata Consultancy Services]\nGoogle Photos (Independent Publisher)\nBy: Julia Muiruri\nGoogle Sheets\nBy: Microsoft\nGoogle Tasks\nBy: Microsoft\nGoQR (Independent Publisher)\nBy: Rui Santos\nGoToMeeting\nBy: LogMeIn Inc\nGoToTraining\nBy: Microsoft\nGoToWebinar\nBy: Microsoft\nGovee (Independent Publisher)\nBy: Richard Wilson\nGratavid\nBy: Gratavid\nGravity Forms by reenhanced\nBy: Reenhanced LLC\nGravity Forms Professional\nBy: Reenhanced, LLC\nGroopit\nBy: Groopit\nGroupMgr\nBy: GroupMgr\nGSA Analytics (Independent Publisher)\nBy: Richard Wilson\nGSA Per Diem (Independent Publisher)\nBy: Richard Wilson\nGSA Public Comment (Independent 
Publisher)\nBy: Dan Romano\nGSA Site Scanning (Independent Publisher)\nBy: Richard Wilson\nHarness PDFx\nBy: Harness Data Intelligence Ltd.\nHarvest\nBy: Microsoft\nHash Generator (Independent Publisher)\nBy: Troy Taylor, Jeffrey Irwin, Ramiro Melgoza\nHashify (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nHashtag API (Independent Publisher)\nBy: Troy Taylor\nHave I Been Pwned (Independent Publisher)\nBy: Troy Taylor\nHelloSign\nBy: Microsoft\nHHS Media Services (Independent Publisher)\nBy: Troy Taylor\nHighGear Workflow\nBy: HighGear Software, Inc.\nHighQ\nBy: Thomson Reuters Incorporated\nHighspot\nBy: Highspot\nHighspot MCP\nBy: Highspot\nHipChat\nBy: Microsoft\nHitHorizons\nBy: FinStat, s. r. o.\nHive CPQ Product Configurator\nBy: NimbleOps NV\nHolopin\nBy: Troy Taylor\nHolopin (Independent Publisher)\nBy: troystaylor\nHoneywell Forge\nBy: Honeywell International\nHost.io (Independent Publisher)\nBy: Troy Taylor\nHotProfile\nBy: Hammock corporation\nHoudin.io\nBy: Houdin.io\nHouseRater QA\nBy: HouseRater, LLC\nHR Cloud\nBy: HR Cloud\nHrFlow.ai\nBy: HrFlow.ai\nHTML to PDF by Pascalcase\nBy: Pascalcase\nhttp garden (Independent Publisher)\nBy: Troy Taylor\nHTTP With Microsoft Entra ID\nBy: Microsoft\nHTTP with Microsoft Entra ID (preauthorized)\nBy: Microsoft\nHubSpot CMS (Independent Publisher)\nBy: Hitachi Solutions\nHubSpot CMS V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Conversations V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot CRM (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nHubSpot CRM V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Engagements V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Files V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Marketing (Independent Publisher)\nBy: Hitachi Solutions\nHubSpot Marketing V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Settings V2 (Independent Publisher)\nBy: Troy Taylor\nHuddle\nBy: Huddle\nHuddle for US Gov & Healthcare\nBy: Huddle\nHuddo 
Boards\nBy: Huddo by ISW Development Pty Ltd\nHUE Datagate\nBy: Works Applications Co., Ltd.\nHugging Face (Independent Publisher)\nBy: Troy Taylor\nHume (Independent Publisher)\nBy: Troy Taylor\nHunter (Independent Publisher)\nBy: Troy Taylor\nHVI Vehicle Inspection V1.2\nBy: JRS Innovation/Ram Upadhayay\nHYAS Insight\nBy: HYAS Infosec\nIA-Connect Dynamic Code\nBy: Ultima Business\nIA-Connect Java\nBy: Ultima Labs\nIA-Connect JML\nBy: Ultima Business\nIA-Connect Mainframe\nBy: Ultima Labs\nIA-Connect SAP GUI\nBy: Ultima Business\nIA-Connect Session\nBy: Ultima Business\nIA-Connect to Microsoft Office\nBy: Ultima Business\nIA-Connect UI\nBy: Ultima Business\nIA-Connect Web Browser\nBy: Ultima Business\niAuditor\nBy: SafetyCulture Pty Ltd\nIBM 3270\nBy: Microsoft\nIBM Watson Assistant (Independent Publisher)\nBy: Lucas Titus\nIBM Watson Text to Speech (Independent Publisher)\nBy: Lucas Titus\nicanhazdadjoke (Independent Publisher)\nBy: Daniel Laskewitz\nIce and Fire (Game of Thrones) (Independent Publisher)\nBy: Troy Taylor\nIcon Horse (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nID Analyzer\nBy: Evith Technology\nIdeanote\nBy: Ideanote ApS\niFacto Proof Of Delivery\nBy: iFacto Business Solutions NV\niLovePDF\nBy: i Love PDF\niLoveSign\nBy: i Love PDF\niManage AI\nBy: iManage Power Platform Connector\niManage Data Marts\nBy: iManage Power Platform Connector\niManage Insight Plus\nBy: iManage LLC\niManage Tracker\nBy: iManage LLC\niManage Work\nBy: iManage Power Platform Connector\niManage Work for Admins\nBy: iManage Power Platform Connector\niMIS\nBy: Computer System Innovations, Inc.\nImpexium\nBy: Impexium Corporation\nImpower ERP\nBy: Impower GmbH\nImprezian360-CRM\nBy: KnowTia Concepts Corporation\nIN-D Aadhaar Number Masking\nBy: IN-D by Intain\nIN-D Face Match\nBy: IN-D by Intain\nIN-D Insurance (ICD10 & CPT)\nBy: IN-D by Intain\nIN-D Invoice Data Capture\nBy: IN-D AI\nIN-D KYC India\nBy: IN-D by Intain\nIN-D Payables\nBy: IN-D by 
Intain\nIndustrial App Store\nBy: Intelligent Plant\nInEight\nBy: InEight\nInfluenza and Covid-19 (Independent Publisher)\nBy: Kevin Comba Gatimu, Denis Wachira Kathuri\nInfobip\nBy: Infobip\nInfoQuery\nBy: InfoQuery LLC\nInformix\nBy: Microsoft\nInfoShare\nBy: Kendox AG\nInfoVetted\nBy: InfoVetted\nInfura Ethereum (Independent Publisher)\nBy: Sebastian Zolg\nInfusionsoft\nBy: Microsoft\nInLoox\nBy: InLoox\nInoreader\nBy: Microsoft\ninQuba Journey\nBy: Inquba Customer Intelligence Pty Ltd\nInsightly\nBy: Microsoft\nInstagram Basic Display (Independent Publisher)\nBy: Reshmee Auckloo\nInstapaper\nBy: Microsoft\nInstatus (Independent Publisher)\nBy: Troy Taylor\nIntegrable PDF\nBy: Integrable, LLC\nIntegration Toolbox [DEPRECATED]\nBy: LF Software Engineering\nIntelix IOC Analysis MCP\nBy: Sophos Ltd.\nintelliHR\nBy: intelliHR\nIntentional Data Sources\nBy: Microsoft\nInterAction\nBy: LexisNexis Legal and Professional\nIntercom\nBy: Microsoft\niObeya\nBy: iObeya\nIP2LOCATION (Independent Publisher)\nBy: Fördős András\nIP2WHOIS (Independent Publisher)\nBy: Fordos Andras\nIPQS Fraud and Risk Scoring\nBy: IPQualityScore\nIQAir (Independent Publisher)\nBy: Fordos Andras\nISOPlanner\nBy: REDLAB\nITautomate\nBy: ITautomate LTD\nITGlue (Independent Publisher)\nBy: Nirmal Kumar\nJasper (Independent Publisher)\nBy: Troy Taylor\nJBHunt\nBy: Microsoft\nJedox OData Hub\nBy: Jedox\nJG Integrations\nBy: JG Software Solutions Limited\nJira\nBy: Microsoft\nJIRA Search (Independent Publisher)\nBy: Paul Culmsee\nJotForm\nBy: JotForm Inc.\nJotform Enterprise\nBy: JotForm Admin\nJservice (Independent Publisher) [DEPRECATED]\nBy: Troy Taylor\nJungleMail 365\nBy: EnovaPoint, UAB\nJupyrest\nBy: Microsoft\nK2 Workflow\nBy: K2\nKagi (Independent Publisher)\nBy: Troy Taylor\nKanban Tool\nBy: Shore Labs\nKhalibre LMS Test\nBy: Khalibre\nkintone\nBy: Kintone\nKnowledgeLake\nBy: KnowledgeLake\nKnowledgeone RecFind6\nBy: Knowledgeone Corporation\nKORTO V2\nBy: Korto\nKroki\nBy: Troy Taylor\nKrozu 
PM (Independent Publisher)\nBy: Osazee Odigie\nKyndryl mainframe\nBy: Ryan Treacy\nLang.ai\nBy: Lang.ai\nLanguage - Question Answering\nBy: Microsoft\nLanguageTool (Independent Publisher) (deprecated) [DEPRECATED]\nBy: Fordos Andras\nLansweeper App For Sentinel\nBy: Lansweeper\nLasso X\nBy: Lasso X A/S\nLatinShare Documents\nBy: LatinShare\nLatinShare SHP Management\nBy: LatinShare\nLatinShare SHP Permissions\nBy: LatinShare\nLaunch Library 2 (Independent Publisher)\nBy: Troy Taylor\nLawlift\nBy: Lawlift GmbH\nLawVu\nBy: LAWVU LIMITED\nLCP - iCordis\nBy: LCP nv\nLeadDesk\nBy: LeadDesk\nLeanKit\nBy: Microsoft\nLeap (Independent Publisher)\nBy: Chandra Sekhar Malla, Troy Taylor\nLeave Dates (Independent Publisher)\nBy: Tiago Ramos (novalogica)\nLegalBot AI Tools\nBy: LegalBot.io\nLegalesign\nBy: Legalesign\nLegiScan (Independent Publisher)\nBy: krautrocker\nLetterdrop (Independent Publisher)\nBy: Troy Taylor\nLettria (Independent Publisher)\nBy: Troy Taylor\nLettria GDPR Compliance\nBy: lettria\nLex Power Sign\nBy: Lex Persona\nLexica (Independent Publisher)\nBy: Troy Taylor\nLexoffice (Independent Publisher)\nBy: LowCodeInvestigator\nLibrary of Congress\nBy: Troy Taylor\nLibreBor (Independent Publisher)\nBy: Mario Trueba and Marco Amoedo\nLIFX\nBy: Microsoft\nLine Message (Independent Publisher)\nBy: Felaray Ho\nLINK Mobility\nBy: LINK Mobility\nLinkedIn [DEPRECATED]\nBy: Microsoft\nLinkedIn V2\nBy: Microsoft\nLit Ipsum (Independent Publisher)\nBy: Troy Taylor\nLitera Search\nBy: Litera\nLiveChat\nBy: Microsoft\nLiveTiles Bots\nBy: LiveTiles Pty Ltd.\nLMS365\nBy: Zensai International Aps\nLnk.Bio\nBy: Lnk.Bio\nLoginLlama\nBy: Troy Taylor\nLoopio\nBy: Loopio\nLoopio-EU\nBy: Loopio\nLoopio-Int01\nBy: Loopio\nLoripsum (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nLSEG\nBy: LSEG Financial Analytics\nLSEG Financial Analytics\nBy: LSEG Financial Analytics\nLUIS\nBy: Microsoft\nLuware Nimbus\nBy: Luware\nM365 Search (Deprecated) [DEPRECATED]\nBy: 
Microsoft\nMaersk (Independent Publisher)\nBy: Dan Romano\nMail\nBy: Microsoft\nMailboxValidator (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nMailChimp\nBy: Microsoft\nMailform\nBy: Mailform, Inc.\nMailinator\nBy: Troy Taylor\nMailJet (Independent Publisher)\nBy: Clement Olivier\nMailParser\nBy: SureSwift Capital, Inc.\nMaintenance Request - Oxmaint (Independent Publisher)\nBy: JRS Innovation/Ram Upadhayay\nMandrill\nBy: Microsoft\nMap Pro\nBy: Witivio\nMapbox (Independent Publisher)\nBy: Simone Lin\nMarkdown Converter (Independent Publisher)\nBy: troystaylor\nMarketing Content Hub\nBy: Stylelabs\nMarketo MA\nBy: Microsoft Inc.\nMavim-iMprove\nBy: Mavim\nMaximizer CRM\nBy: Maximizer\nMCP Hive.T Integration\nBy: Tesselate\nMeaningCloud (Independent Publisher)\nBy: Clement Olivier\nMedallia\nBy: Medallia, Inc.\nMediastack (Independent Publisher)\nBy: Fördős András\nMedium\nBy: Microsoft\nMeekou Share (Independent Publisher)\nBy: Meekou\nMeetingRoomMap\nBy: TNS Holding ApS\nMeisterplan\nBy: itdesign GmbH\nMeme (Independent Publisher)\nBy: Troy Taylor\nMensagia\nBy: Mensagia\nMensagia (Independent Publisher)\nBy: Sistemas Informaticos ICON, S.L.\nMessageBird SMS (Independent Publisher)\nBy: Troy Taylor\nMetatask\nBy: Build My Team LLC\nMichael Scott Quotes (Independent Publisher) [DEPRECATED]\nBy: Troy Taylor\nMicrosoft 365 Admin Center MCP\nBy: Microsoft\nMicrosoft 365 compliance\nBy: Microsoft\nMicrosoft 365 message center\nBy: Microsoft\nMicrosoft 365 Self-Help\nBy: Microsoft\nMicrosoft Acronyms\nBy: Troy Taylor\nMicrosoft Bookings\nBy: Microsoft Corporation\nMicrosoft Copilot Studio\nBy: Microsoft\nMicrosoft D365CE v9 OnPrem (Independent Publisher)\nBy: Roy Paar\nMicrosoft Dataverse\nBy: Microsoft\nMicrosoft Dataverse [DEPRECATED]\nBy: Microsoft\nMicrosoft Defender ATP\nBy: Microsoft\nMicrosoft Defender for Cloud Alert\nBy: Microsoft\nMicrosoft Defender for Cloud Recommendation\nBy: Microsoft\nMicrosoft Defender for Cloud Regulatory Compliance\nBy: 
Microsoft\nMicrosoft Entra ID\nBy: Microsoft\nMicrosoft Entra ID App Registrations\nBy: Paul Culmsee (Rapid Circle) and Microsoft\nMicrosoft Entra ID Protection\nBy: Microsoft\nMicrosoft Forms\nBy: Microsoft\nMicrosoft Graph Add Users (Independent Publisher)\nBy: Troy Taylor\nMicrosoft Graph Security (deprecated) [DEPRECATED]\nBy: Microsoft\nMicrosoft Kaizala\nBy: Microsoft\nMicrosoft Learn Catalog (Independent Publisher)\nBy: Sean Kelly\nMicrosoft Learn Docs MCP\nBy: Microsoft\nMicrosoft Loop [DEPRECATED]\nBy: Microsoft\nMicrosoft Partner Center [DEPRECATED]\nBy: Microsoft\nMicrosoft School Data Sync V2\nBy: Microsoft\nMicrosoft Security Copilot\nBy: Microsoft\nMicrosoft Sentinel\nBy: Microsoft\nMicrosoft Sentinel MCP\nBy: Microsoft\nMicrosoft Teams\nBy: Microsoft\nMicrosoft Teams Virtual Events (deprecated) [DEPRECATED]\nBy: Microsoft\nMicrosoft To-Do (Business)\nBy: Microsoft\nMicrosoft To-Do (Consumer)\nBy: Microsoft\nMicrosoft Translator [DEPRECATED]\nBy: Microsoft\nMicrosoft Translator V2\nBy: Microsoft\nMicrosoft Translator V3\nBy: Microsoft Translator\nMime Automation (Independent Publisher)\nBy: Andreas Cieslik\nMiniSoup HTML Parser (Independent Publisher)\nBy: Shogo Shindo\nMintlify (Independent Publisher)\nBy: Troy Taylor\nMintNFT (Independent Publisher)\nBy: Shreyan J D Fernandes\nMiro (Independent Publisher)\nBy: Michal Romiszewski\nMistral (Independent Publisher)\nBy: Troy Taylor\nMitto\nBy: Mitto AG\nMobile Text Alerts MCP Server\nBy: Mobile Text Alerts\nMobili Stotele\nBy: Tele2\nMobilyWS\nBy: MobilyWS\nMOBSIM Send SMS\nBy: MOBSIM Comunicacao Mobile SMS\nMockaroo (Independent Publisher)\nBy: Richard Wilson\nMockster\nBy: Mockster\nModuleQ [DEPRECATED]\nBy: ModuleQ, Inc.\nmonday\nBy: Plugin Genie\nmonday.com\nBy: monday.com ltd\nmondaycom (Independent Publisher)\nBy: Woong Choi\nMongoDB\nBy: MongoDB Corp\nMonster API (Independent Publisher)\nBy: Troy Taylor\nMoosend (Independent Publisher)\nBy: Troy Taylor\nMoreApp Forms\nBy: MoreApp 
International\nMorf\nBy: AFTIA Solutions\nMorningstar\nBy: Morningstar Test\nMorta\nBy: Morta\nMotaWord Translations\nBy: MotaWord\nMotimate\nBy: Motimate AS\nMQ\nBy: Microsoft\nMS Graph Groups and Users\nBy: Jay Jani\nMSN Weather\nBy: Microsoft\nMtarget SMS\nBy: Mtarget SAS\nMuhimbi PDF\nBy: Muhimbi trading as Nutrient\nMURAL\nBy: MURAL\nMy Acclaro\nBy: Acclaro Inc.\nMy Hours\nBy: Spica International\nMySQL\nBy: Microsoft\nmyStrom (Independent Publisher)\nBy: Tomasz Poszytek\nN-able Cloud Commander\nBy: N-Able Technologies Ltd.\nN-able Cloud User Hub\nBy: N-able Cloud User Hub B.V.\nNameAPI (Independent Publisher)\nBy: Fördős András\nNarvar\nBy: Microsoft\nNASA FIRMS (Independent Publisher)\nBy: Fördős András\nNASA Image and Video Library (Independent Publisher)\nBy: Paul Culmsee, Seven Sigma\nNational Park Service (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nNational Weather Service (Independent Publisher)\nBy: Troy Taylor\nNationalize_io (Independent Publisher)\nBy: Tomasz Poszytek\nnBold\nBy: nBold\nNCEI Climate Data (Independent Publisher)\nBy: Troy Taylor\nNederlandse Spoorwegen (Independent Publisher)\nBy: Miguel Verweij\nNEOWs (Independent Publisher)\nBy: Troy Taylor\nNetDocuments\nBy: NetDocuments Software, Inc.\nNetvolution\nBy: Atcom S.A\nNeum (Independent Publisher)\nBy: Troy Taylor\nNew York Times (Independent Publisher)\nBy: Roy Paar\nNewsData.io (Independent Publisher)\nBy: Troy Taylor\nNexmo\nBy: Microsoft\nNextcom\nBy: Nextcom Evolution AS\nNH360 Portfolio Insights\nBy: UMT Software\nNHTSA vPIC (Independent Publisher)\nBy: Troy Taylor\nNifty Gateway (Independent Publisher)\nBy: Roy Paar\nNimflow\nBy: Nimflow LLC\nNintex Workflow\nBy: Nintex USA LLC.\nNIST NVD (Independent Publisher)\nBy: Paul Culmsee\nNitro\nBy: Nitro Software, Inc.\nNitro Sign Enterprise Verified\nBy: Nitro Software Belgium NV\nNodefusion Portal\nBy: Nodefusion d.o.o\nNosco\nBy: Nosco ApS\nNotifications\nBy: Microsoft\nNotiivy Browser Notifications\nBy: 
Notiivy\nNotion (Independent Publisher)\nBy: Chandra Sekhar & Harshini Varma\nNoxtua AI\nBy: Xayn\nNozbe\nBy: NOZBE SP Z O O\nnps.today\nBy: nps.today\nNREL (Independent Publisher)\nBy: Troy Taylor\nNumlookupAPI (Independent Publisher)\nBy: Troy Taylor\nnunify\nBy: nunify\nNutrient - Convert to PDF\nBy: Muhimbi trading as Nutrient\nNutrient - Extract from PDF\nBy: Muhimbi trading as Nutrient\nNutrient - PDF OCR\nBy: Muhimbi trading as Nutrient\nNutrient - Watermark to PDF\nBy: Muhimbi trading as Nutrient\nNutrient Document Converter\nBy: Muhimbi trading as Nutrient\nNutrient Workflow Automation\nBy: Muhimbi trading as Nutrient\nObjective Connect\nBy: Objective Corporation\nOccuspace\nBy: Occuspace Inc\nOffice 365 Groups\nBy: Microsoft\nOffice 365 Groups Mail\nBy: Microsoft\nOffice 365 Outlook\nBy: Microsoft\nOffice 365 Users\nBy: Microsoft\nOffice 365 Video [DEPRECATED]\nBy: Microsoft\nOfficeAi Agent\nBy: Microsoft\nOK dokument (Independent Publisher)\nBy: Seyfor Slovensko, a.s.\nOMDb (Independent Publisher)\nBy: Aaryan Arora\noncehub\nBy: OnceHub\nOneBlink\nBy: OneBlink\nOneDrive\nBy: Microsoft\nOneDrive for Business\nBy: Microsoft\nOneflow\nBy: Oneflow\nOneNote (Business)\nBy: Microsoft\nOneNote Consumer (Independent Publisher)\nBy: Troy Taylor\nOnePlan\nBy: OnePlan, LLC\nOne-Time Secret (Independent Publisher)\nBy: Aldo Gillone\nOodrive Sign\nBy: Oodrive Sign\nOpen Brewery DB (Independent Publisher)\nBy: Fördős András\nOpen Charge Map (Independent Publisher)\nBy: Troy Taylor\nOpen Experience\nBy: Open Experience GmbH\nOpenAI (Independent Publisher)\nBy: Robin Rosengrün\nOpenAI Assistants (Independent Publisher)\nBy: Troy Taylor\nOpenAI GPT (Independent Publisher)\nBy: Troy Taylor\nOpenCage Geocoding (Independent Publisher)\nBy: Ahmad Najjar\nOpen-Elevation (Independent Publisher)\nBy: Fördős András\nopenFDA Drug (Independent Publisher)\nBy: Woong Choi\nOpenFEC (Independent Publisher)\nBy: krautrocker\nOpenLegacy IBM I (AS400)\nBy: OpenLegacy Technologies 
Inc.\nOpenLegacy IBM Mainframe\nBy: OpenLegacy Technologies Inc.\nOpenNEM (Independent Publisher)\nBy: Paul Culmsee\nOpenPLZ (Independent Publisher)\nBy: LowCodeInvestigator\nopenpm (Independent Publisher)\nBy: Troy Taylor\nOpenQR (Independent Publisher)\nBy: Troy Taylor\nOpenRouter (Independent Publisher)\nBy: Fördős András\nOpenSanctions (Independent Publisher)\nBy: krautrocker\nOpenText Core Share\nBy: One Fox\nOpenText Documentum\nBy: One Fox\nOpenText eDOCS\nBy: One Fox\nOpenText Extended ECM\nBy: One Fox\nOpenTrivaDatabase (Independent Publisher)\nBy: Kiveshan Naidoo\nOptiAPI\nBy: Busk\nOQSHA\nBy: Osmosys Software Solutions UK Limited\nOracle Database\nBy: Microsoft\nORB Intelligence (Independent Publisher)\nBy: Aaryan Arora, Ankita Singh\nOrbusInfinity\nBy: Orbus Software\nOrdnance Survey Places\nBy: Ordnance Survey\nOriginality.AI (Independent Publisher)\nBy: Osazee Odigie\nOtto.bot\nBy: Otto.bot, LLC\nOutlook Tasks [DEPRECATED]\nBy: Microsoft\nOutlook.com\nBy: Microsoft\nOutreach Insights\nBy: Outreach\nOwlbot (Independent Publisher)\nBy: Troy Taylor\nPagePixels Screenshots\nBy: PagePixels: Screenshots\nPagerDuty\nBy: Microsoft\nPantry (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nPanviva\nBy: Panviva\nPappers (Independent Publisher)\nBy: Troy Taylor\nParishSoft Family Suite\nBy: Ministry Brands ParishSOFT\nParserr\nBy: Parserr LLC\nParseur\nBy: Parseur\nPartner Center Events\nBy: Microsoft\nPartner Center Referrals\nBy: Microsoft Corporation\nPartnerLinq\nBy: Visionet Systems Inc.\nPassage by 1Password - Auth (Independent Publisher)\nBy: Troy Taylor\nPassage by 1Password - Manage (Independent Publisher)\nBy: Troy Taylor\nPaylocity\nBy: Paylocity\nPaySpace (Independent Publisher)\nBy: Mint Management Technologies\nPDF Blocks\nBy: Integrable, LLC\nPDF Tools\nBy: ConvertAPI\nPDF Tools by Tachytelic (Independent Publisher)\nBy: tachytelic\nPDF4me\nBy: Ynoox GmbH\nPDF4me AI\nBy: Ynoox GmbH\nPDF4me Barcode\nBy: Ynoox GmbH\nPDF4me Connect\nBy: 
Ynoox GmbH\nPDF4me Convert\nBy: Ynoox GmbH\nPDF4me Excel\nBy: Ynoox GmbH\nPDF4me Image\nBy: Ynoox GmbH\nPDF4me PDF\nBy: Ynoox GmbH\nPDF4me SwissQR\nBy: Ynoox GmbH\nPDF4me Word\nBy: Ynoox GmbH\nPDFco\nBy: PDF.co\nPDFcross\nBy: PotCross GK.\nPdfless\nBy: Synapsium\nPeakboard\nBy: Peakboard GmbH\nPeltarion AI\nBy: Peltarion\nPerfect Wiki\nBy: OOO RD17\nPerplexity AI (Independent Publisher)\nBy: Troy Taylor\nPersonr\nBy: Microsoft\nPexels (Independent Publisher)\nBy: That API Guy\nPhilips HUE (Independent Publisher)\nBy: Tomasz Poszytek\nPilot Things\nBy: Pilot Things\nPinecone\nBy: Troy Taylor\nPinterest\nBy: Microsoft\nPipedrive\nBy: Microsoft\nPipeliner CRM\nBy: Pipelinersales Corporation\nPIPware KPIs\nBy: PIPware Solutions\nPitney Bowes Data Validation [DEPRECATED]\nBy: Not Available\nPitney Bowes Tax Calculator [DEPRECATED]\nBy: Not Available\nPivotal Tracker\nBy: Microsoft\nPixel Encounter (Independent Publisher)\nBy: Fordos Andras\nPixela (Independent Publisher)\nBy: Troy Taylor\nPixelMe\nBy: Troy Taylor\nPKIsigning\nBy: SBRS B.V.\nPlacedog (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nPlanful\nBy: Planful\nPlanner\nBy: Microsoft\nPling\nBy: Fellowmind Denmark\nPlivo\nBy: Plivo Inc\nPlumsail Actions\nBy: Plumsail\nPlumsail Documents\nBy: Plumsail\nPlumsail Forms\nBy: Plumsail Inc.\nPlumsail HelpDesk\nBy: Plumsail Inc.\nPoka\nBy: Poka Inc\nPokeAPI Core (Independent Publisher)\nBy: Fördős András\nPokeAPI World (Independent Publisher)\nBy: Fördős András\nPolaris PSA\nBy: Replicon Inc\nPoliteMail\nBy: PoliteMail Software\nPolygon (Independent Publisher)\nBy: Itransition Group Ltd\nPortfolio and Roadmap\nBy: Microsoft\nPostgreSQL\nBy: Microsoft\nPostman (Independent Publisher)\nBy: Fördős András\nPowell Teams\nBy: Powell Software\nPower Apps for Admins\nBy: Microsoft\nPower Apps for Makers\nBy: Microsoft\nPower Apps Notification\nBy: Microsoft\nPower Apps Notification V2\nBy: Microsoft\nPower Assist\nBy: Elevate Digital\nPower Automate for Admins\nBy: 
Microsoft\nPower Automate Management\nBy: Microsoft\nPower BI\nBy: Microsoft\nPower Form 7\nBy: Reenhanced LLC\nPower Platform for Admins\nBy: Microsoft\nPower Platform for Admins V2\nBy: Microsoft\nPower Query Dataflows\nBy: Microsoft\nPower Textor\nBy: Imperium Dynamics\nPower Virtual Agents\nBy: Microsoft\nPPM Express\nBy: PPM Express Corporation\nPreserve365\nBy: Preservica\nPrexView (Independent Publisher)\nBy: Troy Taylor\nPriority Matrix\nBy: Appfluence Inc\nPriority Matrix HIPAA\nBy: Appfluence Inc\nPriva\nBy: Microsoft, Purview Privacy\nProcess Mining\nBy: Microsoft\nProcess Street\nBy: Process Street\nProcess Street MCP Server\nBy: Process Street\nProfisee\nBy: Profisee\nProgressus Advanced Projects\nBy: Plumbline Consulting\nProject Online\nBy: Microsoft\nProjectPlace\nBy: Planview inc.\nProjectum Present It\nBy: Projectum\nProjectWise Design Integration\nBy: Bentley Systems, Incorporated\nProjectwise Share [DEPRECATED]\nBy: Bentley Systems, Inc.\nProPublica Campaign Finance (Independent Publisher)\nBy: Troy Taylor\nProPublica Congress (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nProPublica Nonprofit Explorer (Independent Publisher)\nBy: Troy Taylor\nPROS AI\nBy: PROS Inc.\nPublic 360\nBy: Tietoevry Norway\nPUG Gamified Engagement\nBy: Pug Interactive Inc\nPure Leads\nBy: Pure Digital Pte Ltd\nPushcut\nBy: Pushcut\nPushover (Independent Publisher)\nBy: Glen Hutson\nQdrant (Independent Publisher)\nBy: Anush\nQnA Maker\nBy: Microsoft\nQPP NextGen\nBy: Quark Software Inc.\nQuickbase (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nQuickBooks Time (Independent Publisher)\nBy: Artesian Software Technologies LLP\nQuickChart (Independent Publisher)\nBy: Troy Taylor\nr/SpaceX (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nRainbird\nBy: Rainbird Technologies ltd\nRamQuest Actions\nBy: Ramquest Software, Inc\nRamQuest Events\nBy: RamQuest Software, Inc\nRAPID Platform\nBy: RAPID Platform\nRarible (Independent 
Publisher)\nBy: Roy Paar\nReachability (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nReadwise (Independent Publisher)\nBy: Troy Taylor\nRealFaviconGenerator (Independent Publisher)\nBy: Troy Taylor\nRebrandly (Independent Publisher)\nBy: That API Guy\nRebrickable (Independent Publisher)\nBy: Troy Taylor\nReceptful\nBy: Los Trigos, Inc\nRecorded Future [DEPRECATED]\nBy: Recorded Future\nRecorded Future Identity\nBy: Recorded Future\nRecorded Future Sandbox\nBy: Recorded Future\nRecorded Future V2\nBy: Recorded Future\nRedmine\nBy: Microsoft\nRedque\nBy: Redque s.r.o.\nReflect\nBy: Troy Taylor\nRefuge Restrooms (Independent Publisher)\nBy: Troy Taylor\nRegEx Matching (Independent Publisher) [DEPRECATED]\nBy: Mitanshu Garg\nRegexFlow ExecutePython\nBy: Epicycle\nRegexFlow Regular Expression\nBy: Epicycle\nRegoLink for Clarity PPM\nBy: Rego Consulting Corporation\nReliefWeb (Independent Publisher)\nBy: Troy Taylor\nRencore Code\nBy: Rencore GmbH\nRencore Governance\nBy: Rencore GmbH\nRepfabric\nBy: Repfabric\nRepfabric Job Loader\nBy: Repfabric LLC\nRepfabric Lead Loader\nBy: Repfabric LLC\nReplicate (Independent Publisher)\nBy: Troy Taylor\nReplicon\nBy: Replicon Inc\nRequestor\nBy: Requestor\nResco Cloud\nBy: Resco\nResco Reports\nBy: Resco\nRescueGroups (Independent Publisher)\nBy: Troy Taylor\nResend (Independent Publisher)\nBy: Troy Taylor\nREST Countries (Independent Publisher)\nBy: Siddharth Vaghasia\nRetarus SMS\nBy: retarus GmbH\nRev AI (Independent Publisher)\nBy: Troy Taylor\nRevelation helpdesk\nBy: Yellowfish Software\nReversingLabs A1000\nBy: ReversingLabs\nReversingLabs TitaniumCloud\nBy: ReversingLabs\nRevue (Independent Publisher)\nBy: Daniel Laskewitz\nRijksmuseum (Independent Publisher)\nBy: Ashwin Ganesh Kumar\nRijksoverheid (Independent Publisher)\nBy: Dennis Goedegebuure\nRiskIQ\nBy: Microsoft\nRiskIQ Digital Footprint\nBy: RiskIQ\nRiskIQ Illuminate\nBy: RiskIQ\nRobohash (Independent Publisher)\nBy: Troy Taylor, Hitachi 
Solutions\nRobolytix\nBy: Robolytix\nRobots for Power BI\nBy: DevScope S.A.\nRon Swanson Quotes (Independent Publisher)\nBy: Troy Taylor\nRowShare\nBy: ROWSHARE\nRSign\nBy: RPost US Inc\nRSS\nBy: Microsoft\nSalesforce\nBy: Microsoft\nSAP\nBy: Microsoft\nSAP ERP\nBy: Microsoft\nSAP OData\nBy: Microsoft\nSapling.ai (Independent Publisher)\nBy: Fördős András\nSAS Decisioning\nBy: SAS Institute, Inc.\nScanCloud\nBy: Scancloud\nSchiphol Airport (Independent Publisher)\nBy: Michel Gueli\nSchoolDigger (Independent Publisher)\nBy: Troy Taylor\nScrapingBee (Independent Publisher)\nBy: Troy Taylor\nScreenshot One (Independent Publisher)\nBy: Troy Taylor\nScrive eSign\nBy: Scrive\nScryfall (Independent Publisher)\nBy: Troy Taylor\nSearchAPI - Google Search (Independent Publisher)\nBy: Troy Taylor\nSECIB\nBy: SECIB\nSecret Server\nBy: Delinea, Inc.\nSecure Code Warrior (Independent Publisher)\nBy: Hitachi Solutions\nSeeBotRun - Link\nBy: SeeBotRun\nSeekTable\nBy: SeekTable.com\nSeismic\nBy: Seismic Software, Inc.\nSeismic Configuration\nBy: Seismic\nSeismic Content Discovery\nBy: Seismic\nSeismic Engagement\nBy: Seismic\nSeismic for Copilot for Sales\nBy: Seismic Software\nSeismic Library\nBy: Seismic\nSeismic Livedoc\nBy: Seismic\nSeismic Planner\nBy: Seismic\nSeismic Programs\nBy: Seismic Software\nSeismic Workspace\nBy: Seismic\nSendFox (Independent Publisher)\nBy: Troy Taylor\nSendGrid\nBy: Microsoft\nSendmode\nBy: SendMode\nServerless360 BAM & Tracking\nBy: Kovai Limited\nService Bus\nBy: Microsoft\nService Objects\nBy: Service Objects\nServiceDesk Plus Cloud\nBy: ManageEngine (A division of Zoho Corporation)\nServiceNow\nBy: Microsoft\nSerwerSMS\nBy: SerwerSMS\nSessionize (Independent Publisher)\nBy: Nanddeep Nachan, Smita Nachan\nSFTP - SSH\nBy: Microsoft\nSFTP [DEPRECATED]\nBy: Microsoft\nShadify (Independent Publisher)\nBy: Troy Taylor\nShare-Effect\nBy: ShareEffect\nSharePoint\nBy: Microsoft\nSharePoint Embedded\nBy: Microsoft\nSherpa Digital\nBy: Sherpa 
Digital\nShields.io (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nShifts for Microsoft Teams\nBy: Microsoft\nShipStation IP (Independent Publisher)\nBy: Kristian Matthews\nShop (Independent Publisher)\nBy: Microsoft\nShopify (Independent Publisher)\nBy: Ray Bennett (MSFT)\nShopranos\nBy: SoftOne Technologies S.A\nShort URL\nBy: APPS 365 LTD\nShortySMS (Independent Publisher)\nBy: Troy Taylor\nShowcase Workshop\nBy: Showcase Software Ltd\nShowpad eOS\nBy: Showpad\nSHRTCODE (Independent Publisher)\nBy: Chandra Sekhar Malla\nSigma Conso CR\nBy: Sigma Conso\nSignatureAPI\nBy: SignatureAPI\nSignhost\nBy: Signhost\nSigni.com\nBy: NETWORG\nSigningHub\nBy: Ascertia\nSIGNL4 - Mobile Alerting\nBy: Derdack\nSignNow\nBy: DaDaDocs\nSignNow EU\nBy: DaDaDocs\nSignRequest\nBy: SignRequest B.V.\nSignUpGenius (Independent Publisher)\nBy: Troy Taylor\nSimple EDI\nBy: Weavo Liquid Loom\nSimpleSurvey\nBy: SimpleSurvey\nSinch\nBy: Sinch Sweden AB\nSirva Relocating Employee\nBy: Sirva Relocation\nSkribble Sign\nBy: busitec GmbH\nSkype for Business Online [DEPRECATED]\nBy: Microsoft\nSkyPoint Cloud\nBy: SkyPoint Cloud\nSlack\nBy: Microsoft\nSlascone\nBy: SLASCONE GmbH\nsmapOne\nBy: smapOne AG\nSmarp\nBy: Smarp\nSmartCOMM DocGen\nBy: Smart Communications\nSmartDialog\nBy: Arena Interactive Oy\nSmarter Drafter\nBy: Tensis Group\nSmartsheet\nBy: Microsoft\nSmileBack\nBy: ConnectWise SmileBack\nSMS Wireless Services (Independent Publisher)\nBy: ViaData\nsms77io\nBy: sms77 e.K.\nSMSAPI\nBy: LINK Mobility Poland\nSMSLink\nBy: ASTINVEST COM SRL (SMSLink)\nSMTP\nBy: Microsoft\nSnowflake\nBy: Snowflake\nSociabble\nBy: Sociabble\nSocialinsider\nBy: Socialinsider\nSoft1\nBy: SoftOne Technologies S.A\nSoftone Web CRM\nBy: Softone Technologies\nSoftools\nBy: Softools Limited\nSolarEdge (Independent Publisher)\nBy: Richard Wierenga\nSoloSign HMAC Hash Creator\nBy: Solort\nSOS Inventory (Independent Publisher)\nBy: Harold Anderson\nSparkPost\nBy: Microsoft\nSparse Power Box Tools\nBy: 
Sparse Development\nSpinpanel\nBy: Spinpanel B.V.\nSpoonacular Food (Independent Publisher)\nBy: Amjed Ayoub\nSpoonacular Meal Planner (Independent Publisher)\nBy: Amjed Ayoub\nSpoonacular Recipe (Independent Publisher)\nBy: Amjed Ayoub\nSpotify (Independent Publisher)\nBy: Daniel Laskewitz\nSpring Global\nBy: Enavate\nSQL Server\nBy: Microsoft\nSquare Business (Independent Publisher)\nBy: Troy Taylor\nSquare Payments (Independent Publisher)\nBy: Troy Taylor\nStability.ai (Independent Publisher)\nBy: Troy Taylor\nStaffbase\nBy: Staffbase GmbH\nStaffCircle\nBy: StaffCircle\nStandard approvals\nBy: Microsoft\nStar Wars (Independent Publisher)\nBy: Paul Culmsee\nStarmind\nBy: Starmind (inc)\nStarRez REST v1\nBy: StarRez, Inc.\nStorm Glass (Independent Publisher)\nBy: Paul Culmsee\nStormboard\nBy: Stormboard\nStraker Verify\nBy: Straker Group\nStrava (Independent Publisher)\nBy: Richard Wierenga\nStripe\nBy: Microsoft\nStudio Ghibli (Independent Publisher)\nBy: Troy Taylor\nSunrise-Sunset (Independent Publisher)\nBy: Fördős András\nSuperMCP\nBy: Supermetrics\nSupportivekoala (Independent Publisher)\nBy: Troy Taylor\nSureXeroLite (Independent Publisher)\nBy: The 848 Group\nSurvalyzer EU\nBy: Survalyzer AG\nSurvalyzer Swiss\nBy: Survalyzer AG\nSurvey123\nBy: ArcGIS Survey123\nSurveyMonkey\nBy: Microsoft\nSurveyMonkey Canada\nBy: SurveyMonkey\nSwagger Converter (Independent Publisher)\nBy: Fordos Andras\nSynthesia (Independent Publisher)\nBy: Troy Taylor\nT.LY (Independent Publisher)\nBy: Troy Taylor\nTabscanner Receipt OCR (Independent Publisher)\nBy: Ben Smith\nTAGGUN Receipt OCR Scanning (Independent Publisher)\nBy: Amjed Ayoub\nTago\nBy: Tago LLC\nTaktikal Core\nBy: Taktikal\nTalkdesk\nBy: Talkdesk\nTallyfy\nBy: Tallyfy, Inc\nTALXIS Data Feed\nBy: TALXIS\nTaqnyat\nBy: Taqnyat Network Operation\nTavily (Independent Publisher)\nBy: Troy Taylor\nTax ID Pro (Independent Publisher)\nBy: Fördős András\nTDox\nBy: Seltris srl\nTeam Forms\nBy: VP LABS PTY LTD\nTeamflect\nBy: 
Teamflect\nTeams-Spirit\nBy: D.F.K. Digitalteamwork GmbH\nTeamWherx\nBy: Actsoft\nTeamwork Projects\nBy: Microsoft\ntegolySIGN\nBy: tegoly GmbH\nTelegram Bot (Independent Publisher)\nBy: Woong Choi\nTelephony Xtended Serv Interf\nBy: BluIP, Inc.\nTeleSign SMS\nBy: TeleSign Corporation\nTemplafy\nBy: Templafy\nTendocs Documents\nBy: Deepdale BV\nTeradata\nBy: Microsoft\nTesseron Asset Management\nBy: Tesseron by Luithle + Luithle GmbH\nTesseron Basic Data\nBy: Tesseron by Luithle + Luithle GmbH\nTesseron Invoice\nBy: Tesseron by Luithle + Luithle GmbH\nTesseron Ticket\nBy: Tesseron by Luithle + Luithle GmbH\nText Analytics\nBy: MAQ Software\nText Request\nBy: Text Request\nThe Bot Platform\nBy: The Bot Platform\nThe Brønnøysund Registries (Independent Publisher)\nBy: Ahmad Najjar\nThe Color (Independent Publisher)\nBy: Troy Taylor\nThe Events Calendar\nBy: The Events Calendar\nThe Guardian (Independent Publisher)\nBy: Troy Taylor\nThe IT Tipster\nBy: The IT Tipster\nThe Lord of the Rings (Independent Publisher)\nBy: Troy Taylor\nThe SMS Works (Independent Publisher)\nBy: Troy Taylor\nThe Weather Channel (Independent Publisher)\nBy: Roy Paar\nTheGoodAPI (Independent Publisher)\nBy: Troy Taylor\nTheMealDB (Independent Publisher)\nBy: John Muchiri\nThreads (Independent Publisher)\nBy: Troy Taylor\nTicketing.events\nBy: Ventipix\nTicketmaster (Independent Publisher)\nBy: Troy Taylor\nTikit\nBy: Cireson\nTiliter Vision Agents\nBy: Tiliter Pty Ltd\nTilkee\nBy: Microsoft\nTimeAPI (Independent Publisher)\nBy: Fördős András\ntimeghost\nBy: timeghost.io\nTimeneye\nBy: DM Digital Software SRL\nTLDR\nBy: Troy Taylor\nToday in History (Independent Publisher)\nBy: Troy Taylor\nTodoist\nBy: Microsoft\nToggl Plan (Independent Publisher)\nBy: Daniel Laskewitz\nToggl Track (Independent Publisher)\nBy: troystaylor\nTomorrow.io (Independent Publisher)\nBy: Troy Taylor\nToodledo\nBy: Microsoft\nTophhie Cloud\nBy: Chris Greenacre\nTopMessage\nBy: TOP X\ntouchSMS\nBy: Edgility\nTPC 
Portal\nBy: The Portal Connector\nTraction Guest\nBy: Traction Guest\nTrade.Gov (Independent Publisher)\nBy: Dan Romano\nTransform2All\nBy: GAC Business Solutions\nTree-Nation (Independent Publisher)\nBy: Troy Taylor\nTrello\nBy: Microsoft\nTribal - Maytas\nBy: Tribal Group\nTribal - Platform\nBy: Tribal Group\nTribal - SITS\nBy: Tribal Group\nTRIGGERcmd\nBy: VanderMey Consulting, LLC\nTrovve\nBy: Trovve Inc\nTrueDialog SMS\nBy: TrueDialog Dynamics\nTrustual\nBy: Practical Crypto SpA\nTulip\nBy: Tulip Interfaces\nTumblr (Independent Publisher)\nBy: Troy Taylor\nTuxMailer\nBy: TuxMailer\nTwilio\nBy: Microsoft\nTxtSync\nBy: TxtSync Limited\ntyntec 2FA\nBy: tyntec GmbH\ntyntec Phone Verification\nBy: tyntec GmbH\ntyntec SMS Business\nBy: tyntec GmbH\ntyntec Viber Business\nBy: tyntec GmbH\ntyntec WhatsApp Business\nBy: tyntec GmbH\nTypeform\nBy: Microsoft\nU.S. Bank Treasury Management\nBy: U.S. Bank\nUber Freight\nBy: Microsoft\nUbiqod by Skiply\nBy: Skiply\nUbiqod by Taqt\nBy: Skiply\nUdemy (Independent Publisher)\nBy: Nanddeep Nachan, Smita Nachan\nUiPath\nBy: UiPath Incorporated\nUiPath Orchestrator\nBy: UiPath\nUK Bank Holidays (Independent Publisher)\nBy: Martyn Lesbirel, Troy Taylor\nUK Check VAT (Independent Publisher)\nBy: Fördős András\nUKG Pro HCM\nBy: Dilip Chenani\nUKG PRO WFM Authentication\nBy: UKG, Inc.\nUKG Pro WFM Employee\nBy: Ria Gupta\nUKG Pro WFM People\nBy: Dilip Chenani\nUKG Pro WFM Timekeeping\nBy: Ria Gupta\nUnix Timestamp (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nUnofficial Netflix Search (Independent Publisher)\nBy: Troy Taylor\nUnsplash (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nUpdates App (Microsoft 365)\nBy: Microsoft\nUpdown (Independent Publisher)\nBy: Fordos Andras\nUpland Panviva US\nBy: Upland Software Inc.\nURL.dev (Independent Publisher)\nBy: Troy Taylor\nUrLBae (Independent Publisher)\nBy: Troy Taylor\nUS Congress CRS (Independent Publisher)\nBy: Dan Romano\nUS Patent & Trademark Office 
(Independent Publisher)\nBy: krautrocker\nUSAJOBS (Independent Publisher)\nBy: Richard Wilson\nUSB4SAP\nBy: Ecoservity\nUserVoice\nBy: Microsoft\nUSGS Earthquake Hazards (Independent Publisher)\nBy: Troy Taylor\nVantage 365 Imaging\nBy: Vantage 365 LTD\nVaruna\nBy: Univera Computer Systems Industry and Trade Inc.\nvatcheckapi\nBy: Fördős András\nVena Solutions\nBy: Vena Solutions\nVentipix Asset and Inventory\nBy: Ventipix\nVerified\nBy: Crm - Konsulterna i Sverige AB\nVeteran Confirmation (Independent Publisher)\nBy: Troy Taylor\nVeterans Affairs Facilities (Independent Publisher)\nBy: Richard Wilson\nVeterans Affairs Forms (Independent Publisher)\nBy: Richard Wilson\nVeterans Affairs Providers (Independent Publisher)\nBy: Richard Wilson\nViafirma\nBy: Viafirma\nVideo Indexer (V2)\nBy: Microsoft\nVIES (Independent Publisher)\nBy: Tomasz Poszytek\nVimeo\nBy: Microsoft\nVirtual Data Platform\nBy: Virtual_Data_Platform_GmbH\nVirus Total\nBy: Microsoft\nViva Engage\nBy: Microsoft\nVocean\nBy: Vocean AB\nVoice Monkey (Independent Publisher)\nBy: Richard Wilson\nVoiceRSS (Independent Publisher)\nBy: Fördős András\nVome\nBy: Vome Volunteer\nVonage\nBy: Vonage\nWaaila\nBy: Cross Masters s.r.o.\nWay We Do\nBy: Way We Do\nWayback Machine (Independent Publisher)\nBy: Fördős András\nWeather Forecast (Independent Publisher)\nBy: Haimantika Mitra\nWeavo Liquid Loom\nBy: Weavo Liquid Loom\nWebex\nBy: Cisco\nWebex Integration (Independent Publisher)\nBy: University College London, Oscar Hui\nWebhood URL Scanner\nBy: Webhood\nWebsite Carbon (Independent Publisher)\nBy: Clement Olivier\nWenDocs Linker\nBy: WenDocs Ltd\nWhat3Words (Independent Publisher)\nBy: Matt Beard\nWhatIsMyBrowser (Independent Publisher)\nBy: Troy Taylor\nWhatsApp (Independent Publisher)\nBy: Zakariya Fakira\nWindows 365\nBy: Microsoft\nWithoutWire Inventory Platform\nBy: Enavate\nWitivio\nBy: Witivio\nWMATA (Independent Publisher)\nBy: Richard Wilson, Daniel Cox\nWooCommerce\nBy: Reenhanced, LLC\nWoodpecker 
(Independent Publisher)\nBy: Troy Taylor\nWord Cloud by Textvis (Independent Publisher)\nBy: Troy Taylor\nWord Online (Business)\nBy: Microsoft\nWordPress\nBy: Microsoft\nWork IQ Calendar MCP\nBy: Microsoft\nWork IQ Copilot MCP\nBy: Microsoft\nWork IQ Mail MCP\nBy: Microsoft\nWork IQ Teams MCP\nBy: Microsoft\nWork IQ User MCP\nBy: Microsoft\nWork IQ Word MCP\nBy: Microsoft\nWorkable (Independent Publisher)\nBy: David Kjell\nWorkday HCM\nBy: Microsoft\nWorkday SOAP\nBy: Microsoft\nWorking days (Independent Publisher)\nBy: Tomasz Poszytek\nWorkMobile\nBy: eSAY Solutions Ltd\nWorkPoint\nBy: WorkPoint\nWorkPoint 365\nBy: WorkPoint 365\nWorkSpan\nBy: WorkSpan\nWorkstem AU\nBy: OneJob Group Limited\nWorkstem HK\nBy: OneJob Group Limited\nWorld Academia\nBy: Kelcho Tech\nWorldTime (Independent Publisher)\nBy: Fördős András\nWorldwide Bank Holidays (Independent Publisher)\nBy: Reshmee Auckloo\nWP Connectr for WordPress\nBy: Reenhanced, LLC\nWPForms by Reenhanced LLC\nBy: Reenhanced, LLC\nWQRM Risk Forecast Services\nBy: Western QRM\nWritesonic (Independent Publisher)\nBy: Troy Taylor\nwttr.in (Independent Publisher)\nBy: Troy Taylor\nX\nBy: Microsoft\nX12\nBy: Microsoft\nXbridger Document Manager\nBy: Xbridger Solutions\nXC-Gate\nBy: TECHNOTREE CO., LTD.\nXero Accounting - Magnetism\nBy: Magnetism\nxkcd (Independent Publisher)\nBy: Troy Taylor\nXooa Blockchain Database\nBy: Xooa Inc\nXooa Blockchain Smart Contract\nBy: Xooa Inc\nXpertdoc (Deprecated) [DEPRECATED]\nBy: Xpertdoc Technologies Inc.\nXSOAR (Independent Publisher)\nBy: Landon Chelf\nXSS PDF Solutions Integrations\nBy: Cross-Service-Solutions\nXSS QR Code Solutions\nBy: Cross-Service-Solutions\nYakChat\nBy: YakChat Ltd.\nYarado\nBy: Yarado\nYeeflow\nBy: YEEFLOW SINGAPORE PTE LTD\nYeelight\nBy: Qingdao Yeelink Information Technology Co., Ltd.\nYelp (Independent Publisher)\nBy: Ahmad Najjar\nYou Need A Budget (Independent Publisher)\nBy: Troy Taylor\nYouTube\nBy: Microsoft\nYouTube Transcript (Independent 
Publisher)\nBy: troystaylor\nZahara\nBy: Zahara Systems Ltd\nZanran Scaffolder\nBy: Zanran Ltd\nZapier MCP\nBy: Zapier Inc\nZapier NLA (Independent Publisher)\nBy: Troy Taylor\nZellis\nBy: Zellis\nZendesk\nBy: Microsoft\nZenkraft\nBy: Zenkraft\nZenler (Independent Publisher)\nBy: Troy Taylor\nZenlogin (Independent Publisher)\nBy: Troy Taylor\nZippopotamus (Independent Publisher)\nBy: Tomasz Poszytek\nZIPPYDOC\nBy: ZippyDoc GmbH\nZoho Mail\nBy: Zoho Corporation Private Limited\nZoho Calendar\nBy: Zoho Mail\nZoho Forms\nBy: Zoho Corporation Private Limited\nZoho Invoice Basic (Independent Publisher)\nBy: Troy Taylor\nZoho Sign\nBy: Zoho Corporation\nZoho TeamInbox\nBy: Zoho Corporation Private Limited\nZoho ZeptoMail\nBy: Zoho Corporation Private Limited\nZoom Meetings (Independent Publisher)\nBy: Akuthota Deekshith\nzReports\nBy: zReports Software s.r.o.\nZuva DocAI\nBy: Zuva Inc.\nZvanu Parvaldnieks\nBy: Latvijas Mobilais Telefons\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:a48d2b996bb47fed744475c2402573c1dd8958e82a36114fcebca3ceb556d91f", + "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nConnector reference overview\nFeedback\nSummarize this article for me\nThis page summarizes key information of all connectors currently provided for Microsoft Power Automate, Microsoft Power Apps, and Azure Logic Apps. 
In addition to the connector icon and name, the following information is provided:\nAvailable in Azure Logic Apps.\nAvailable in Power Automate.\nAvailable in Power Apps.\nThis is an MCP Server connector.\nThis is a Preview connector.\nThis is a Premium connector for Power Automate and Power Apps or a Standard connector for Azure Logic Apps.\nYou can select a connector to view more detailed connector-specific documentation including its functionality and region availability. You can also filter all connectors by a certain category. Note that filters do not stack and each link will take you to another page within the documentation site.\nFilter all connectors by:\nTier\nRelease Status\nProduct\nPublisher\nStandard\nPreview\nPower Apps\nMicrosoft\nPremium\nProduction\nPower Automate\nNon-Microsoft\nLogic Apps\nMCP Server\nList of Connectors\n}exghts gen. Document & more\nBy: }exghts\n10to8 Appointment Scheduling\nBy: 10to8 Ltd\n1DocStop\nBy: 1DocStop\n1Me Corporate\nBy: 1Me\n1pt (Independent Publisher)\nBy: Troy Taylor\n24 pull request (Independent Publisher)\nBy: Bernard Karaba\n365 Training\nBy: We Speak You Learn, LLC\n3E Events\nBy: Elite Technology\n9A Raptor Document Warehouse\nBy: 9altitudes\nAbbreviations\nBy: Troy Taylor\nAbortion Policy (Independent Publisher)\nBy: That API Guy\nabsentify\nBy: BrainCore Solutions\nAbstract Company Enrichment (Independent Publisher)\nBy: Fördős András\nAbstract Email Validator (Independent Publisher)\nBy: Fördős András\nAbstract Exchange Rates (Independent Publisher)\nBy: Fördős András\nAbstract Holidays (Independent Publisher)\nBy: Fördős András\nAbstract IBAN Validator (Independent Publisher)\nBy: Fördős András\nAbstract IP Geolocation (Independent Publisher)\nBy: Fördős András\nAbstract Phone Validator (Independent Publisher)\nBy: Fördős András\nAbstract Timezones (Independent Publisher)\nBy: System Administrator\nAbstract VAT Validator (Independent Publisher)\nBy: Fördős András\nAccuWeather (Independent Publisher)\nBy: 
troystaylor\nAct!\nBy: Swiftpage ACT!\nActivityInfo\nBy: ActivityInfo\nAcumatica\nBy: Acumatica\nAddress Labs (Independent Publisher)\nBy: Richard Wilson\nAdobe Acrobat Sign\nBy: ADOBE INC.\nAdobe Acrobat Sign Sandbox\nBy: Adobe Inc.\nAdobe Creative Cloud\nBy: Adobe Inc\nAdobe Experience Manager\nBy: Adobe\nAdobe PDF Services\nBy: Adobe Acrobat Services\nAdvanced Data Operations\nBy: State Solutions\nAdvanced Scraper (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nAexum\nBy: Nodefusion d.o.o\nAffirmations (Independent Publisher)\nBy: Troy Taylor\nAfrica's Talking Airtime\nBy: Africa's Talking\nAfrica's Talking SMS\nBy: Africa's Talking\nAfrica's Talking Voice\nBy: Africa's Talking\nAfterShip (Independent Publisher)\nBy: Taiki Yoshida\nAgilePoint NX\nBy: AgilePoint Inc\nAgilite\nBy: Agilit-e\nAhead\nBy: ahead AG\nAhead (Intranet)\nBy: ahead AG\nAI or Not (Independent Publisher)\nBy: Fördős András\nAIForged\nBy: Larc AI (PTY) Ltd\nAIHW MyHospitals (Independent Publisher)\nBy: Paul Culmsee\nAikiDocs\nBy: Aiki-Mind Services Inc.\nAirlabs\nBy: Fördős András\nAirly (Independent Publisher)\nBy: Tomasz Poszytek\nAirmeet\nBy: Airmeet\nairSlate\nBy: airSlate Inc.\nAirtable (Independent Publisher) [DEPRECATED]\nBy: Woong Choi\nAlemba ITSM\nBy: Alemba Ltd\nAletheia\nBy: Aletheia\nAlisQI\nBy: AlisQI BV\nAlkymi\nBy: Alkymi\nallGeo\nBy: Abaqus\nAlly\nBy: Aliru\nAlmabase\nBy: Almabase, Inc.\nAlmanac (Independent Publisher)\nBy: Troy Taylor\nALVAO\nBy: ALVAO\nAmazon Redshift\nBy: Microsoft\nAmazon S3\nBy: Microsoft\nAmazon S3 Bucket (Independent Publisher)\nBy: Michael Megel\nAmazon SQS\nBy: Microsoft\nAmbee (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nAMEE Open Business (Independent Publisher)\nBy: Paul Culmsee\nAnnature (Independent Publisher)\nBy: Dr Adrian Colquhoun (Strategik)\nAnt Text Automation\nBy: Insight Office\nAnthropic (Independent Publisher)\nBy: Troy Taylor\nANY.RUN Threat Intelligence\nBy: ANYRUN FZCO\nApache Impala\nBy: 
Microsoft\nAPITemplate (Independent Publisher)\nBy: Troy Taylor\nAPlace.io (Independent Publisher)\nBy: Troy Taylor\nApp Power Forms\nBy: App Power Solutions LLC\nApp Store Connect - App Store (Independent Publisher)\nBy: Farhan Latif\nAppfigures\nBy: Microsoft\nAppsForOps Timeline\nBy: AppsForOps\nApptigent PowerTools\nBy: Apptigent\nApptigent PowerTools LITE\nBy: Apptigent Limited\nApyHub (Independent Publisher)\nBy: Troy Taylor\nApyHub Document Readability (Independent Publisher)\nBy: Troy Taylor\nApyHub Generate iCal (Independent Publisher)\nBy: Troy Taylor\nAquaforest PDF\nBy: Aquaforest Limited\nAranda Service Management\nBy: Aranda Software Corporation\nArcGIS\nBy: Esri, Inc.\nArcGIS Enterprise\nBy: Esri, Inc.\nArcGIS PaaS\nBy: Esri, Inc.\nAS2\nBy: Microsoft\nAsana\nBy: Microsoft\nAsite\nBy: Asite Solutions Pvt Ltd\nAsite (Canada)\nBy: Asite Solutions Limited\nAsite (Hong Kong)\nBy: Asite Solutions Limited\nAsite (KSA)\nBy: Asite Solutions Limited\nAsite (UAE)\nBy: Asite Solutions Limited\nAsite (US Gov.)\nBy: Asite Solutions Limited\nASPSMS\nBy: Vadian .Net AG\nAssemblyAI\nBy: AssemblyAI\nAssently E-Sign\nBy: Assently AB\nAtBot Admin\nBy: H3 Solutions Inc.\nAtBot Logic\nBy: H3 Solutions Inc.\nAutenti E-Signature Workflow\nBy: Autenti sp. 
z o.o.\nAutodesk Data Exchange\nBy: Autodesk, Inc.\nAutoReview\nBy: Power DevBox\nAutoSeller\nBy: Microsoft Corporation\nAvePoint Cloud Governance\nBy: AvePoint, inc.\nAviationstack (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nAWeber\nBy: Microsoft\nAzure AD Identity and Access\nBy: Microsoft, Daniel Laskewitz\nAzure AI Document Intelligence (form recognizer)\nBy: Microsoft\nAzure AI Foundry Agent Service\nBy: Microsoft\nAzure AI Foundry Inference\nBy: Microsoft\nAzure AI Search\nBy: Microsoft\nAzure App Service\nBy: Microsoft\nAzure Application Insights\nBy: Microsoft\nAzure Automation\nBy: Microsoft\nAzure Batch Speech-to-text\nBy: Microsoft\nAzure Blob Storage\nBy: Microsoft\nAzure Cognitive Service for Language\nBy: Microsoft\nAzure Communication Chat\nBy: Microsoft\nAzure Communication Email\nBy: Microsoft\nAzure Communication Services Identity\nBy: Microsoft\nAzure Communication Services SMS\nBy: Microsoft\nAzure Communication Services SMS Events\nBy: Microsoft\nAzure Confidential Ledger\nBy: Microsoft Corporation\nAzure Container Instance\nBy: Microsoft\nAzure Cosmos DB\nBy: Microsoft\nAzure Data Explorer\nBy: Microsoft\nAzure Data Factory\nBy: Microsoft\nAzure Data Lake\nBy: Microsoft\nAzure Database for MySQL\nBy: Microsoft\nAzure Databricks\nBy: Databricks Inc.\nAzure DevOps\nBy: Microsoft\nAzure Digital Twins\nBy: Microsoft Corporation\nAzure Event Grid\nBy: Microsoft\nAzure Event Grid Publish\nBy: Microsoft\nAzure File Storage\nBy: Microsoft\nAzure IoT Central V2\nBy: Microsoft Corporation\nAzure IoT Central V3\nBy: Microsoft Corporation\nAzure Key Vault\nBy: Microsoft\nAzure Log Analytics [DEPRECATED]\nBy: Microsoft\nAzure Log Analytics Data Collector\nBy: Microsoft\nAzure Monitor Logs\nBy: Microsoft\nAzure OpenAI\nBy: Microsoft\nAzure Queues\nBy: Microsoft\nAzure Resource Manager\nBy: Microsoft\nAzure Speech Pronunciation Assessment\nBy: Microsoft\nAzure SQL Data Warehouse\nBy: Microsoft\nAzure Table Storage\nBy: Microsoft\nAzure Text 
to speech\nBy: Microsoft\nAzure VM\nBy: Microsoft\nBadgr (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nBasecamp 2\nBy: Microsoft\nBasecamp 3\nBy: Microsoft\nBBC News (Independent Publisher)\nBy: krautrocker\nBeauhurst (Independent Publisher)\nBy: Axazure\nBenchmark Email\nBy: Microsoft\nBenifex\nBy: Benefex Ltd\nBigdata-com\nBy: RAVENPACK INTERNATIONAL SL.\nBillsPLS\nBy: IN-D by Intain\nBIN Checker (Independent Publisher)\nBy: Troy Taylor\nBinance.us (Independent Publisher)\nBy: Roy Paar\nBing Maps\nBy: Microsoft\nBing Search\nBy: Microsoft\nBitbucket\nBy: Microsoft\nBitly\nBy: Microsoft\nBitlyIP (Independent Publisher)\nBy: Troy Taylor\nBitskout\nBy: Bitskout\nBitvore Cellenus\nBy: Bitvore Corp.\nBizTalkServer\nBy: Microsoft\nBKK Futar (Independent Publisher)\nBy: Fördős András\nBlackbaud Altru Constituent\nBy: Blackbaud, Inc.\nBlackbaud Church Management [DEPRECATED]\nBy: Blackbaud, Inc.\nBlackbaud CRM Constituent\nBy: Blackbaud, Inc.\nBlackbaud CRM Prospect\nBy: Blackbaud, Inc.\nBlackbaud FENXT General Ledger\nBy: Blackbaud, Inc.\nBlackbaud FENXT Payable\nBy: Blackbaud, Inc.\nBlackbaud FENXT Query\nBy: Blackbaud. Inc\nBlackbaud Raisers Edge NXT\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Constituents\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Documents\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Events\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Fundraising\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Interactions\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Lists\nBy: Blackbaud, Inc.\nBlackbaud Raisers Edge NXT Prospects\nBy: Blackbaud, Inc.\nBlackbaud RENXT Gifts\nBy: Blackbaud, Inc.\nBlackbaud RENXT Query\nBy: Blackbaud. Inc\nBlackbaud RENXT Reports\nBy: Blackbaud, Inc.\nBlackbaud SKY Add-ins\nBy: Blackbaud. 
Inc\nBlogger\nBy: Microsoft\nBloomflow\nBy: Bloomflow\nBlueInk\nBy: Blueink\nBluesky Social (Independent Publisher)\nBy: krautrocker\nBoldSign\nBy: Syncfusion-Inc\nboomapp connect\nBy: Boomerang I-Comms Ltd\nBox\nBy: Microsoft\nBox MCP Server\nBy: Box.\nBrave Search (Independent Publisher)\nBy: Troy Taylor\nbttn\nBy: Microsoft\nBttn ONE\nBy: Bttn\nBuffer\nBy: Microsoft\nBuildingMinds DigitalTwin Core\nBy: BuildingMinds\nBulkSMS\nBy: BulkSMS.com\nBureau of Labor Statistics (Independent Publisher)\nBy: krautrocker\nBusiness Assist [DEPRECATED]\nBy: Microsoft\nBusinessmap\nBy: Businessmap\nBuy Me A Coffee (Independent Publisher)\nBy: Troy Taylor\nBuzz\nBy: Skyscape\nByword (Independent Publisher)\nBy: Troy Taylor\nCalculate Working Day\nBy: Tweed Technology Ltd\nCalendar Pro\nBy: Witivio\nCalendarific (Independent Publisher)\nBy: Fordos Andras\nCalendly\nBy: Calendly\nCalendly (legacy)\nBy: Microsoft\nCampfire\nBy: Microsoft\nCandidateZip Resume/Job Parser\nBy: CandidateZip CV/Job Parser\nCapsule CRM\nBy: Microsoft\nCaptisa Forms\nBy: Connect Captisa\nCarbon Intensity (Independent Publisher)\nBy: Hasan Unlu\nCarbonFootprint (Independent Publisher)\nBy: Troy Taylor\nCardPlatform Adaptive Cards\nBy: CardPlatform\nCarsXE (Independent Publisher)\nBy: Troy Taylor\nCascade\nBy: Cascade\nCascade Strategy New\nBy: Nicolas Durik-Ha\nCasper365 for Education\nBy: Microsoft\nCB Blockchain Seal\nBy: Connecting Software s.r.o. & Co. 
KG\nCData Connect AI\nBy: CData Software Inc\nCDC Content Services (Independent Publisher)\nBy: Troy Taylor\nCDK Drive Customer\nBy: CDK Global\nCDK Drive Service Vehicles\nBy: CDK Global\nCelonis\nBy: Celonis\nCelonis MCP Server\nBy: Celonis GmbH\nCentrical\nBy: Centrical\nCertinal eSign\nBy: Certinal Inc.\nCertopus\nBy: DevSquirrel Technologies Private Limited\nCGTrader\nBy: Microsoft\nChainpoint [DEPRECATED]\nBy: Chainpoint\nChatter\nBy: Microsoft\nCheckly (Independent Publisher)\nBy: Troy Taylor\nChuck Norris IO (Independent Publisher)\nBy: Daniel Laskewitz\ncioplenu\nBy: cioplenu GmbH\nCireson Service Manager Portal\nBy: Cireson\nCisco Webex Meetings\nBy: Cisco\nCitymapper (Independent Publisher)\nBy: Troy Taylor\nCivicPlus Transform\nBy: OneBlink\nClearbit (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nCleverTap\nBy: CleverTap Pvt. ltd.\nClickSend\nBy: Sinch Sweden AB\nClickSend Postcards\nBy: ClickSend Postcards\nClickUp Team Manager (Independent Publisher)\nBy: Duke DeVan\nClimatiq (Independent Publisher)\nBy: Troy Taylor\nClinical Trials (Independent Publisher)\nBy: Troy Taylor\nClockify (Independent Publisher)\nBy: Dr Adrian Colquhoun (Strategik)\nCloud BOT\nBy: C-RISE Ltd.\nCloud Connect Studio\nBy: Fuji Xerox\nCloud PKI Management\nBy: 509 Solutions Pty Ltd\nCloudConvert\nBy: Lunaweb GmbH\nCloudmersive Barcode\nBy: Cloudmersive, LLC\nCloudmersive CDR\nBy: Cloudmersive, LLC\nCloudmersive Currency\nBy: Cloudmersive, LLC\nCloudmersive Data Validation\nBy: Cloudmersive, LLC\nCloudmersive Document Conversion\nBy: Cloudmersive, LLC\nCloudmersive File Processing\nBy: Cloudmersive, LLC\nCloudmersive Image Processing\nBy: Cloudmersive, LLC\nCloudmersive NLP\nBy: Cloudmersive, LLC\nCloudmersive PDF\nBy: Cloudmersive, LLC\nCloudmersive Security\nBy: Cloudmersive, LLC\nCloudmersive Video and Media\nBy: Cloudmersive, LLC\nCloudmersive Virus Scan\nBy: Cloudmersive, LLC\nCloudTools for Salesforce\nBy: Apptigent\nCloverly (Independent Publisher)\nBy: Troy 
Taylor\nCluedIn\nBy: CluedIn Official\nCMI\nBy: CM Informatik AG\nCO2 Signal (Independent Publisher)\nBy: Paul Culmsee\nCobbleStone - Contract Insight\nBy: Cobblestone Software\nCognito Forms\nBy: Cognito Forms\nCognizant Automation Center\nBy: Cognizant\nCohere (Independent Publisher)\nBy: Troy Taylor\nCohesity Gaia\nBy: Cohesity, Inc.\nCoinbase (Independent Publisher)\nBy: Roy Paar\nCommercient\nBy: Commercient LLC\nCompanies House (Independent Publisher)\nBy: Matt Collins\nCompany Connect\nBy: InSpark\nComposer by Tachytelic\nBy: Accendo Solutions Ltd\nComputer Vision API\nBy: Microsoft\nConfluence\nBy: Microsoft\nConnect2All\nBy: GAC Business Solutions\nConnect2All on-premises\nBy: GAC Business Solutions\nConnective eSignatures\nBy: Connective\nConnectWise PSA (Independent Publisher)\nBy: howellchrisj\nconnpass (Independent Publisher)\nBy: Miyake Hideo\nConsenSys Ethereum (Deprecated) [DEPRECATED]\nBy: ConsenSys\nContacts Pro\nBy: Witivio\nContent Conversion\nBy: Microsoft\nContent Gate\nBy: SignUp Software Netherlands B.V\nContent Manager Power Connect\nBy: Kapish Services Pty Ltd\nContent Moderator\nBy: Microsoft\nContoso Hub\nBy: Microsoft\nConverter by Power2Apps\nBy: Power2Apps P2A GmbH\nConvertKit (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nCopilot for Finance\nBy: Microsoft\nCopilot for Sales\nBy: Microsoft Corporation\nCopilot for Service extension (preview)\nBy: Microsoft Corporation\nCopy.ai\nBy: Troy Taylor\nCorda Blockchain [DEPRECATED]\nBy: Microsoft\nCornerstone Learning vILT\nBy: Cornerstone On Demand\nCorporate Buzzword Generator (Independent Publisher)\nBy: Troy Taylor\nCOSMO Bot\nBy: COSMO CONSULT GmbH\nCoupa (Independent Publisher)\nBy: NovaGL\nCourier (Independent Publisher)\nBy: Troy Taylor\nCOVID-19 JHU CSSE (Independent Publisher)\nBy: Woong Choi\nCPQSync\nBy: Cincom Systems\nCPSC Recalls Retrieval (Independent Publisher)\nBy: Troy Taylor\nCQC Data (Independent Publisher)\nBy: Martyn Lesbirel\nCradl AI\nBy: Cradl 
AI\nCraftMyPDF (Independent Publisher)\nBy: Troy Taylor\nCRM Bot\nBy: CRM Bot Ltd\nCronofy MCP\nBy: Cronofy Ltd\nCrossbeam\nBy: Crossbeam\nCSC Corptax\nBy: Corptax\nCSV Converter by Power2Apps\nBy: Power2Apps P2A GmbH\nCustom Vision\nBy: Microsoft\nCustomJS\nBy: TechnologyCircle GmbH\nCX Cards by Surveyapp\nBy: VOC Metrics Limited\nCyberday\nBy: Agendium Ltd\nCyberProof\nBy: CyberProof Inc.\nD&B Optimizer [DEPRECATED]\nBy: Dun & Bradstreet\nd.velop\nBy: d.velop AG\nD365 Contact Center Admin MCP\nBy: Microsoft\nD7Messaging\nBy: Signtaper Technologies FZCO\nD7SMS\nBy: Signtaper Technologies FZCO\nDad Jokes (Independent Publisher)\nBy: Troy Taylor\nDadJokesIO (Independent Publisher)\nBy: Troy Taylor\nDaffy (Independent Publisher)\nBy: Troy Taylor\nDailyMed (Independent Publisher)\nBy: Troy Taylor\nDandelion (Independent Publisher)\nBy: Troy Taylor\nData Activator\nBy: Microsoft, Data Activator\nData Activator Early Access\nBy: Microsoft\nData8 Data Enrichment\nBy: Data8 Limited\nDatablend\nBy: DataBlend\nDatabook C4S\nBy: Databook Labs, Inc.\nDatabox (Independent Publisher)\nBy: Troy Taylor\nDatabricks\nBy: Databricks Inc.\nDataFlows SMS\nBy: DATAFLOWS SMS\nDataMotion\nBy: DataMotion, Inc.\nDatamuse (Independent Publisher)\nBy: Troy Taylor\nDataScope Forms\nBy: DataScope\nDB2\nBy: Microsoft\nDBF2XML\nBy: SMART\nDe Lijn (Independent Publisher)\nBy: Lenard Schockaert\nDecentraland (Independent Publisher)\nBy: Roy Paar\nDeck of Cards (Independent Publisher)\nBy: Troy Taylor\nDeepgram (Independent Publisher)\nBy: Troy Taylor\nDeepL\nBy: DeepL\nDeepLIP (Independent Publisher)\nBy: Michal Romiszewski\nDeepSign\nBy: DeepCloud\nDefault title\nBy: WordLift\nDefender for Cloud Apps\nBy: Microsoft\nDeprecated Integration [DEPRECATED]\nBy: Maximizer\nDerdack SIGNL4\nBy: Derdack GmbH\nDesk365\nBy: Kani Technologies Inc\nDeskDirector\nBy: DeskDirector\nDesktop flows\nBy: Microsoft\nDexcom (Independent Publisher)\nBy: FlowJoe\nDHL Tracking (DEPRECATED) (Independent Publisher) 
[DEPRECATED]\nBy: Rapid Circle\nDiceBear (Independent Publisher)\nBy: Troy Taylor\nDid You Mean This (Independent Publisher)\nBy: Troy Taylor\nDiffchecker\nBy: Fördős András\nDigiDates (Independent Publisher)\nBy: Troy Taylor\nDigiLEAN Connect\nBy: DigiLEAN AS\nDigitalHumani (Independent Publisher)\nBy: Troy Taylor\nDime.Scheduler\nBy: Dime Software\nDime.Scheduler (on-prem)\nBy: Dime Software\nDiscord (Independent Publisher)\nBy: Daniel Laskewitz | Microsoft & Michael Guzowski | Developico\nDisqus\nBy: Microsoft\nDo Not Call Reported Calls (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nDoc To PDF\nBy: Spot Solutions, Inc\nDocFusion365 – SP\nBy: Assimilated Information Systems\nDocJuris\nBy: DocJuris\nDocparser\nBy: Docparser\nDocugami\nBy: Docugami.com\nDocuGenerate\nBy: DocuGenerate\nDocument AI\nBy: Cloudmersive, LLC\nDocument AI Konfuzio\nBy: Helm & Nagel GmbH\nDocument Drafter\nBy: Document Drafter\nDocument Merge\nBy: CIRRUS SOFT LTD\nDocumentero\nBy: Documentero\nDocumentsCorePack\nBy: mscrm-addons.com ( PTM EDV Systeme )\nDocuMotor\nBy: Omnidocs\nDocurain\nBy: root42 Inc.\nDocusign\nBy: DocuSign, Inc.\nDocusign Demo\nBy: DocuSign, Inc.\nDocuWare\nBy: DocuWare\nDokobit Portal\nBy: Dokobit\nDokobit Universal API\nBy: Dokobit\nDomainTools Iris Enrich\nBy: DomainTools, LLC\nDomainTools Iris Investigate\nBy: DomainTools, LLC\nDoppler Farhan Latif (Independent Publisher)\nBy: Farhan Latif\ndox42\nBy: dox42\nDPIRD Radar - West Australia (Independent Publisher)\nBy: Paul Culmsee\nDPIRD Science - West Australia (Independent Publisher)\nBy: Paul Culmsee\nDPIRD Weather - West Australia (Independent Publisher)\nBy: Paul Culmsee\nDQ on Demand\nBy: DQ Global\nDraup\nBy: Draup\nDraup MCP Server\nBy: Draup\nDropbox\nBy: Microsoft\nDuration Calculator (Independent Publisher)\nBy: Troy Taylor\nDVLA Vehicle Enquiry Service (Independent Publisher)\nBy: Gulshan Khurana and Pranav Khurana\nDynamic Signal\nBy: Dynamic Signal\nDynamicDocs (Independent Publisher)\nBy: 
Troy Taylor\nDynamics 365 (deprecated)\nBy: Microsoft\nDynamics 365 Business Central\nBy: Microsoft\nDynamics 365 Business Central (on-premises)\nBy: Microsoft\nDynamics 365 Commerce - Ratings and Reviews\nBy: Microsoft\nDynamics 365 Commerce Merchandising [DEPRECATED]\nBy: Microsoft\nDynamics 365 Customer Insights\nBy: Microsoft\nDynamics 365 Customer Voice\nBy: Microsoft\nDynamics 365 Fraud Protection\nBy: Microsoft\nDynamics 365 Sales Insights\nBy: Microsoft\nDynamics NAV\nBy: Microsoft\nDynamics Translation Service\nBy: Microsoft Corporation\nDynatrace\nBy: Dynatrace\nEasy Redmine\nBy: Microsoft\nEasyPost Mail\nBy: Bing Technologies\nEasyship (Independent Publisher)\nBy: Troy Taylor\nEasyvista Self Help\nBy: Easyvista\nEasyVista Service Manager\nBy: Easyvista\neBay (Independent Publisher)\nBy: Artesian Software Technologies LLP\nEBMS\nBy: Eagle Business Software\neCFR (Independent Publisher)\nBy: Dan Romano\nEcologi (Independent Publisher)\nBy: Troy Taylor\nedatalia Sign Online (Independent Publisher)\nBy: Victor Sanchez Olaya\nEden AI\nBy: Eden AI\nEdgility\nBy: Edgility\nEdifact\nBy: Microsoft\nEduframe\nBy: Microsoft\nEgain\nBy: eGain Corporation\nEgnyte\nBy: Egnyte\nE-goi\nBy: E-goi\nEigen Events\nBy: Eigen Ltd\nElastic Forms\nBy: Workai\nElasticOCR [DEPRECATED]\nBy: ElasticOCR\nElead Product Reference Data\nBy: CDK Global\nElead Sales Customers\nBy: CDK Global\nElead Sales Opportunities\nBy: CDK Global\nElectricity Maps (Independent Publisher)\nBy: Vitalii Sorokin\nElfsquad Data\nBy: Elfsquad B.V.\nElfsquad Product Configurator\nBy: Elfsquad\nEmail Domain Checker\nBy: Mightora.io\nEmail Veritas – URL Checker\nBy: eVeritas\nemfluence Marketing Platform\nBy: emfluence, llc\nEmigo\nBy: Sagra Technology Sp. 
z o.o.\nEmojiHub (Independent Publisher)\nBy: Troy Taylor\nEMT ATLAS AIMS\nBy: Enable My Team\nEnadoc\nBy: Enadoc Pte Ltd\nEncodian - Barcode\nBy: Encodian\nEncodian - Convert\nBy: Encodian\nEncodian - Excel\nBy: Encodian\nEncodian - General\nBy: Encodian\nEncodian - Image\nBy: Encodian\nEncodian - PDF\nBy: Encodian\nEncodian - PowerPoint\nBy: Encodian\nEncodian - Sign\nBy: Encodian\nEncodian - Utilities\nBy: Encodian\nEncodian - Word\nBy: Encodian\nEncodian [DEPRECATED]\nBy: Encodian\nEncodian Filer\nBy: Encodian\nEngagement Cloud\nBy: dotdigital\nEnlyft Insights\nBy: Enlyft.\nEnlyft MCP\nBy: Enlyft.\nEntegrations.io\nBy: entegrations.io inc\nEntersoft\nBy: Entersoft SA\nEnveloop (Independent Publisher)\nBy: Troy Taylor\nEnvoy\nBy: Envoy, Inc.\nEONET by NASA (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nEphesoft Semantik For Invoices\nBy: Ephesoft Inc.\nE-Sign\nBy: E-Sign\nEthereum Blockchain [DEPRECATED]\nBy: Microsoft\nEtsy (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nEvent Hubs\nBy: Microsoft\nEvent Tickets\nBy: The Events Calendar\nEventbrite\nBy: Microsoft\nEvery (Independent Publisher)\nBy: Troy Taylor\nEvocom\nBy: Evocom Informationssysteme GmbH\neWay-CRM\nBy: eWay-CRM\nExact Online Premium [DEPRECATED]\nBy: Exact MKB Software BV\nExact Time & Billing (Independent Publisher)\nBy: Indocs\nExasol\nBy: Exasol AG\nExcel [DEPRECATED]\nBy: Microsoft\nExcel Online (Business)\nBy: Microsoft\nExcel Online (OneDrive)\nBy: Microsoft\nExchange Rate (Independent Publisher)\nBy: Fördős András\nExpensya\nBy: EXPENSYA SA\nExperlogix CPQ\nBy: Experlogix US\nExperlogix Smart Flows\nBy: Experlogix US\nExpiration Reminder\nBy: SkyXoft Technologies, Inc.\nEXPOCAD\nBy: EXPOCAD\nEzekia-MCP\nBy: Ezekia\nFabric MCP\nBy: Microsoft\nFace API\nBy: Microsoft\nFactSet\nBy: FactSet Research Systems\nFantasy Premier League (Independent Publisher)\nBy: Joe Unwin (FlowJoe)\nFarsight DNSDB\nBy: Farsight Security\nFBI Most Wanted (Independent Publisher)\nBy: 
Richard Wilson\nFCA (Independent Publisher)\nBy: Gulshan Khurana\nFeathery\nBy: Troy Taylor\nFeathery Forms\nBy: Feathery\nFederal Reserve Economic Data (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nFederal Reserve Markets (Independent Publisher)\nBy: Dan Romano\nFEMA (Independent Publisher)\nBy: Troy Taylor\nFestivo (Independent Publisher)\nBy: Troy Taylor\nFHIRBase\nBy: Microsoft\nFHIRClinical\nBy: Microsoft\nFHIRlink\nBy: Microsoft Cloud for Healthcare\nFieldEquip\nBy: FieldEquip\nFile System\nBy: Microsoft\nFile.io (Independent Publisher)\nBy: Troy Taylor\nFiles.com\nBy: Files.com\nFin & Ops Apps (Dynamics 365)\nBy: Microsoft\nFinalcad One Connect 4.0\nBy: FINALCAD\nFinancial Edge NXT Query [DEPRECATED]\nBy: Blackbaud. Inc\nFinnish BIS (Independent Publisher)\nBy: Timo Pertila\nFinnish Railway Traffic (Independent Publisher)\nBy: Timo Pertilä\nFINRA (Independent Publisher)\nBy: Dan Romano\nFireText\nBy: FireText\nFiscal Data Service (Independent Publisher)\nBy: Dan Romano\nFishWatch (Independent Publisher)\nBy: Fordos Andras\nFitbit (Independent Publisher)\nBy: Ashwin Ganesh Kumar\nFlic\nBy: Microsoft\nFliplet\nBy: Fliplet\nFlotiq headless CMS\nBy: CodeWave LLC\nFlowForma\nBy: FlowForma Limited\nFlowForma V2\nBy: FlowForma Limited\nFocusmate (Independent Publisher)\nBy: Phil Cole\nFORCAM FORCE Bridge\nBy: FORCAM GmbH\nForceManager CRM\nBy: Tritium Software S.L.\nForem (Independent Publisher)\nBy: Daniel Laskewitz\nFormstack Documents\nBy: Formstack LLC\nFormstack Forms\nBy: Formstack LLC\nFraudLabs Pro (Independent Publisher)\nBy: Troy Taylor\nFreeAgent (Independent Publisher)\nBy: Nirmal Kumar\nFreshBooks\nBy: Microsoft\nFreshdesk\nBy: Microsoft\nFreshservice\nBy: Microsoft\nFTP\nBy: Microsoft\nFulcrum\nBy: Spatial Networks, Inc.\nFun Translations (Independent Publisher)\nBy: Troy Taylor\nFuseLagNotam1.1 (Independent Publisher)\nBy: Falana Kidd\nFuxsy-ADSKFusionManagePaid (Independent Publisher)\nBy: Fuxsy.eu\nGenerative actions\nBy: 
Microsoft\nGeoDB (Independent Publisher)\nBy: Troy Taylor\nGerman Federal Parliament (Independent Publisher)\nBy: Dan Romano\nGetAccept\nBy: GetAccept, Inc.\nGetMyInvoices\nBy: GetMyInvoices\nGieni TS Server MCP\nBy: Orderfox-Gieni\nGIPHY (Independent Publisher)\nBy: Priyanshu Srivastav\nGIS Cloud\nBy: HandyGeo Solutions\nGitHub\nBy: Microsoft\nGithub Data (Independent Publisher)\nBy: Nathalie-Leenders\nGitHub Gists (Independent Publisher)\nBy: Troy Taylor\nGitHub Utils (Independent Publisher)\nBy: Daniel Laskewitz\nGitLab (Independent Publisher)\nBy: Roy Paar\nGivebutter (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nGlaass Pro\nBy: Glaass Pty Ltd\nGlobal Exchange Rates\nBy: MEMENTO SRL\nGlobalGiving Project (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nGmail\nBy: Microsoft\nGMO Sign\nBy: GMO GlobalSign Holdings K.K.\nGoFileRoom\nBy: Thomson Reuters\nGoogle BigQuery - Dev (Independent Publisher)\nBy: Ashwani Kumar\nGoogle Books (Independent Publisher)\nBy: Fördős András\nGoogle Calendar\nBy: Microsoft\nGoogle Cloud Translation (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nGoogle Contacts\nBy: Microsoft\nGoogle Drive\nBy: Microsoft\nGoogle Gemini (Independent Publisher)\nBy: Priyaranjan KS , Vidya Sagar Alti [Tata Consultancy Services]\nGoogle PaLM (Independent Publisher)\nBy: Priyaranjan KS , Vidya Sagar Alti [Tata Consultancy Services]\nGoogle Photos (Independent Publisher)\nBy: Julia Muiruri\nGoogle Sheets\nBy: Microsoft\nGoogle Tasks\nBy: Microsoft\nGoQR (Independent Publisher)\nBy: Rui Santos\nGoToMeeting\nBy: LogMeIn Inc\nGoToTraining\nBy: Microsoft\nGoToWebinar\nBy: Microsoft\nGovee (Independent Publisher)\nBy: Richard Wilson\nGratavid\nBy: Gratavid\nGravity Forms by reenhanced\nBy: Reenhanced LLC\nGravity Forms Professional\nBy: Reenhanced, LLC\nGroopit\nBy: Groopit\nGroupMgr\nBy: GroupMgr\nGSA Analytics (Independent Publisher)\nBy: Richard Wilson\nGSA Per Diem (Independent Publisher)\nBy: Richard Wilson\nGSA 
Public Comment (Independent Publisher)\nBy: Dan Romano\nGSA Site Scanning (Independent Publisher)\nBy: Richard Wilson\nHarness PDFx\nBy: Harness Data Intelligence Ltd.\nHarvest\nBy: Microsoft\nHash Generator (Independent Publisher)\nBy: Troy Taylor, Jeffrey Irwin, Ramiro Melgoza\nHashify (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nHashtag API (Independent Publisher)\nBy: Troy Taylor\nHave I Been Pwned (Independent Publisher)\nBy: Troy Taylor\nHelloSign\nBy: Microsoft\nHHS Media Services (Independent Publisher)\nBy: Troy Taylor\nHighGear Workflow\nBy: HighGear Software, Inc.\nHighQ\nBy: Thomson Reuters Incorporated\nHighspot\nBy: Highspot\nHighspot MCP\nBy: Highspot\nHipChat\nBy: Microsoft\nHitHorizons\nBy: FinStat, s. r. o.\nHive CPQ Product Configurator\nBy: Hive CPQ\nHolopin\nBy: Troy Taylor\nHolopin (Independent Publisher)\nBy: troystaylor\nHoneywell Forge\nBy: Honeywell International\nHost.io (Independent Publisher)\nBy: Troy Taylor\nHotProfile\nBy: Hammock corporation\nHoudin.io\nBy: Houdin.io\nHouseRater QA\nBy: HouseRater, LLC\nHR Cloud\nBy: HR Cloud\nHrFlow.ai\nBy: HrFlow.ai\nHTML to PDF by Pascalcase\nBy: Pascalcase\nhttp garden (Independent Publisher)\nBy: Troy Taylor\nHTTP With Microsoft Entra ID\nBy: Microsoft\nHTTP with Microsoft Entra ID (preauthorized)\nBy: Microsoft\nHubSpot CMS (Independent Publisher)\nBy: Hitachi Solutions\nHubSpot CMS V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Conversations V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot CRM (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nHubSpot CRM V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Engagements V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Files V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Marketing (Independent Publisher)\nBy: Hitachi Solutions\nHubSpot Marketing V2 (Independent Publisher)\nBy: Troy Taylor\nHubSpot Settings V2 (Independent Publisher)\nBy: Troy Taylor\nHuddle\nBy: Huddle\nHuddle for US Gov & 
Healthcare\nBy: Huddle\nHuddo Boards\nBy: Huddo by ISW Development Pty Ltd\nHUE Datagate\nBy: Works Applications Co., Ltd.\nHugging Face (Independent Publisher)\nBy: Troy Taylor\nHume (Independent Publisher)\nBy: Troy Taylor\nHunter (Independent Publisher)\nBy: Troy Taylor\nHVI Vehicle Inspection V1.2\nBy: JRS Innovation/Ram Upadhayay\nHYAS Insight\nBy: HYAS Infosec\nIA-Connect Dynamic Code\nBy: Ultima Business\nIA-Connect Java\nBy: Ultima Labs\nIA-Connect JML\nBy: Ultima Business\nIA-Connect Mainframe\nBy: Ultima Labs\nIA-Connect SAP GUI\nBy: Ultima Business\nIA-Connect Session\nBy: Ultima Business\nIA-Connect to Microsoft Office\nBy: Ultima Business\nIA-Connect UI\nBy: Ultima Business\nIA-Connect Web Browser\nBy: Ultima Business\niAuditor\nBy: SafetyCulture Pty Ltd\nIBM 3270\nBy: Microsoft\nIBM Watson Assistant (Independent Publisher)\nBy: Lucas Titus\nIBM Watson Text to Speech (Independent Publisher)\nBy: Lucas Titus\nicanhazdadjoke (Independent Publisher)\nBy: Daniel Laskewitz\nIce and Fire (Game of Thrones) (Independent Publisher)\nBy: Troy Taylor\nIcon Horse (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nID Analyzer\nBy: Evith Techology\nIdeanote\nBy: Ideanote ApS\niFacto Proof Of Delivery\nBy: iFacto Business Solutions NV\niLovePDF\nBy: i Love PDF\niLoveSign\nBy: i Love PDF\niManage AI\nBy: iManage Power Platform Connector\niManage Data Marts\nBy: iManage Power Platform Connector\niManage Insight Plus\nBy: iManage LLC\niManage Tracker\nBy: iManage LLC\niManage Work\nBy: iManage Power Platform Connector\niManage Work for Admins\nBy: iManage Power Platform Connector\niMIS\nBy: Computer System Innovations, Inc.\nImpexium\nBy: Impexium Corporation\nImpower ERP\nBy: Impower GmbH\nImprezian360-CRM\nBy: KnowTia Concepts Corporation\nIN-D Aadhaar Number Masking\nBy: IN-D by Intain\nIN-D Face Match\nBy: IN-D by Intain\nIN-D Insurance (ICD10 & CPT)\nBy: IN-D by Intain\nIN-D Invoice Data Capture\nBy: IN-D AI\nIN-D KYC India\nBy: IN-D by Intain\nIN-D 
Payables\nBy: IN-D by Intain\nIndustrial App Store\nBy: Intelligent Plant\nInEight\nBy: InEight\nInfluenza and Covid-19 (Independent Publisher)\nBy: Kevin Comba Gatimu, Denis Wachira Kathuri\nInfobip\nBy: Infobip\nInfoQuery\nBy: InfoQuery LLC\nInformix\nBy: Microsoft\nInfoShare\nBy: Kendox AG\nInfoVetted\nBy: InfoVetted\nInfura Ethereum (Independent Publisher)\nBy: Sebastian Zolg\nInfusionsoft\nBy: Microsoft\nInLoox\nBy: InLoox\nInoreader\nBy: Microsoft\ninQuba Journey\nBy: Inquba Customer Intelligence Pty Ltd\nInsightly\nBy: Microsoft\nInstagram Basic Display (Independent Publisher)\nBy: Reshmee Auckloo\nInstapaper\nBy: Microsoft\nInstatus (Independent Publisher)\nBy: Troy Taylor\nIntegrable PDF\nBy: Integrable, LLC\nIntegration Toolbox [DEPRECATED]\nBy: LF Software Engineering\nIntelix IOC Analysis MCP\nBy: Sophos Ltd.\nintelliHR\nBy: intelliHR\nIntentional Data Sources\nBy: Microsoft\nInterAction\nBy: LexisNexis Legal and Professional\nIntercom\nBy: Microsoft\niObeya\nBy: iObeya\nIP2LOCATION (Independent Publisher)\nBy: Fördős András\nIP2WHOIS (Independent Publisher)\nBy: Fordos Andras\nIPQS Fraud and Risk Scoring\nBy: IPQualityScore\nIQAir (Independent Publisher)\nBy: Fordos Andras\nISOPlanner\nBy: REDLAB\nITautomate\nBy: ITautomate LTD\nITGlue (Independent Publisher)\nBy: Nirmal Kumar\nJasper (Independent Publisher)\nBy: Troy Taylor\nJBHunt\nBy: Microsoft\nJedox OData Hub\nBy: Jedox\nJG Integrations\nBy: JG Software Solutions Limited\nJira\nBy: Microsoft\nJIRA Search (Independent Publisher)\nBy: Paul Culmsee\nJotForm\nBy: JotForm Inc.\nJotform Enterprise\nBy: JotForm Admin\nJservice (Independent Publisher) [DEPRECATED]\nBy: Troy Taylor\nJungleMail 365\nBy: EnovaPoint, UAB\nJupyrest\nBy: Microsoft\nK2 Workflow\nBy: K2\nKagi (Independent Publisher)\nBy: Troy Taylor\nKanban Tool\nBy: Shore Labs\nKhalibre LMS Test\nBy: Khalibre\nkintone\nBy: Kintone\nKnowledgeLake\nBy: KnowledgeLake\nKnowledgeone RecFind6\nBy: Knowledgeone Corporation\nKORTO V2\nBy: 
Korto\nKroki\nBy: Troy Taylor\nKrozu PM (Independent Publisher)\nBy: Osazee Odigie\nKyndryl mainframe\nBy: Ryan Treacy\nLang.ai\nBy: Lang.ai\nLanguage - Question Answering\nBy: Microsoft\nLanguageTool (Independent Publisher) (deprecated) [DEPRECATED]\nBy: Fordos Andras\nLansweeper App For Sentinel\nBy: Lansweeper\nLasso X\nBy: Lasso X A/S\nLatinShare Documents\nBy: LatinShare\nLatinShare SHP Management\nBy: LatinShare\nLatinShare SHP Permissions\nBy: LatinShare\nLaunch Library 2 (Independent Publisher)\nBy: Troy Taylor\nLawlift\nBy: Lawlift GmbH\nLawVu\nBy: LAWVU LIMITED\nLCP - iCordis\nBy: LCP nv\nLeadDesk\nBy: LeadDesk\nLeanKit\nBy: Microsoft\nLeap (Independent Publisher)\nBy: Chandra Sekhar Malla, Troy Taylor\nLeave Dates (Independent Publisher)\nBy: Tiago Ramos (novalogica)\nLegalBot AI Tools\nBy: LegalBot.io\nLegalesign\nBy: Legalesign\nLegiScan (Independent Publisher)\nBy: krautrocker\nLetterdrop (Independent Publisher)\nBy: Troy Taylor\nLettria (Independent Publisher)\nBy: Troy Taylor\nLettria GDPR Compliance\nBy: lettria\nLex Power Sign\nBy: Lex Persona\nLexica (Independent Publisher)\nBy: Troy Taylor\nLexoffice (Independent Publisher)\nBy: LowCodeInvestigator\nLibrary of Congress\nBy: Troy Taylor\nLibreBor (Independent Publisher)\nBy: Mario Trueba and Marco Amoedo\nLIFX\nBy: Microsoft\nLine Message (Independent Publisher)\nBy: Felaray Ho\nLINK Mobility\nBy: LINK Mobility\nLinkedIn [DEPRECATED]\nBy: Microsoft\nLinkedIn V2\nBy: Microsoft\nLit Ipsum (Independent Publisher)\nBy: Troy Taylor\nLitera Search\nBy: Litera\nLiveChat\nBy: Microsoft\nLiveTiles Bots\nBy: LiveTiles Pty Ltd.\nLMS365\nBy: Zensai International Aps\nLnk.Bio\nBy: Lnk.Bio\nLoginLlama\nBy: Troy Taylor\nLoopio\nBy: Loopio\nLoopio-EU\nBy: Loopio\nLoopio-Int01\nBy: Loopio\nLoripsum (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nLSEG\nBy: LSEG Financial Analytics\nLSEG Financial Analytics\nBy: LSEG Financial Analytics\nLUIS\nBy: Microsoft\nLuware Nimbus\nBy: Luware\nM365 Search 
(Deprecated) [DEPRECATED]\nBy: Microsoft\nMaersk (Independent Publisher)\nBy: Dan Romano\nMail\nBy: Microsoft\nMailboxValidator (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nMailChimp\nBy: Microsoft\nMailform\nBy: Mailform, Inc.\nMailinator\nBy: Troy Taylor\nMailJet (Independent Publisher)\nBy: Clement Olivier\nMailParser\nBy: SureSwift Capital, Inc.\nMaintenance Request - Oxmaint (Independent Publisher)\nBy: JRS Innovation/Ram Upadhayay\nMandrill\nBy: Microsoft\nMap Pro\nBy: Witivio\nMapbox (Independent Publisher)\nBy: Simone Lin\nMarkdown Converter (Independent Publisher)\nBy: troystaylor\nMarketing Content Hub\nBy: Stylelabs\nMarketo MA\nBy: Microsoft Inc.\nMavim-iMprove\nBy: Mavim\nMaximizer CRM\nBy: Maximizer\nMCP Hive.T Integration\nBy: Tesselate\nMeaningCloud (Independent Publisher)\nBy: Clement Olivier\nMedallia\nBy: Medallia, Inc.\nMediastack (Independent Publisher)\nBy: Fördős András\nMedium\nBy: Microsoft\nMeekou Share (Independent Publisher)\nBy: Meekou\nMeetingRoomMap\nBy: TNS Holding ApS\nMeisterplan\nBy: itdesign GmbH\nMeme (Independent Publisher)\nBy: Troy Taylor\nMensagia\nBy: Mensagia\nMensagia (Independent Publisher)\nBy: Sistemas Informaticos ICON, S.L.\nMessageBird SMS (Independent Publisher)\nBy: Troy Taylor\nMetatask\nBy: Build My Team LLC\nMichael Scott Quotes (Independent Publisher) [DEPRECATED]\nBy: Troy Taylor\nMicrosoft 365 compliance\nBy: Microsoft\nMicrosoft 365 message center\nBy: Microsoft\nMicrosoft 365 Self-Help\nBy: Microsoft\nMicrosoft Acronyms\nBy: Troy Taylor\nMicrosoft Bookings\nBy: Microsoft Corporation\nMicrosoft Copilot Studio\nBy: Microsoft\nMicrosoft D365CE v9 OnPrem (Independent Publisher)\nBy: Roy Paar\nMicrosoft Dataverse\nBy: Microsoft\nMicrosoft Dataverse [DEPRECATED]\nBy: Microsoft\nMicrosoft Defender ATP\nBy: Microsoft\nMicrosoft Defender for Cloud Alert\nBy: Microsoft\nMicrosoft Defender for Cloud Recommendation\nBy: Microsoft\nMicrosoft Defender for Cloud Regulatory Compliance\nBy: 
Microsoft\nMicrosoft Entra ID\nBy: Microsoft\nMicrosoft Entra ID App Registrations\nBy: Paul Culmsee (Rapid Circle) and Microsoft\nMicrosoft Entra ID Protection\nBy: Microsoft\nMicrosoft Forms\nBy: Microsoft\nMicrosoft Graph Add Users (Independent Publisher)\nBy: Troy Taylor\nMicrosoft Graph Security (deprecated) [DEPRECATED]\nBy: Microsoft\nMicrosoft Kaizala\nBy: Microsoft\nMicrosoft Learn Catalog (Independent Publisher)\nBy: Sean Kelly\nMicrosoft Learn Docs MCP\nBy: Microsoft\nMicrosoft Loop [DEPRECATED]\nBy: Microsoft\nMicrosoft Partner Center [DEPRECATED]\nBy: Microsoft\nMicrosoft School Data Sync V2\nBy: Microsoft\nMicrosoft Security Copilot\nBy: Microsoft\nMicrosoft Sentinel\nBy: Microsoft\nMicrosoft Sentinel MCP\nBy: Microsoft\nMicrosoft Teams\nBy: Microsoft\nMicrosoft Teams Virtual Events (deprecated) [DEPRECATED]\nBy: Microsoft\nMicrosoft To-Do (Business)\nBy: Microsoft\nMicrosoft To-Do (Consumer)\nBy: Microsoft\nMicrosoft Translator [DEPRECATED]\nBy: Microsoft\nMicrosoft Translator V2\nBy: Microsoft\nMicrosoft Translator V3\nBy: Microsoft Translator\nMime Automation (Independent Publisher)\nBy: Andreas Cieslik\nMiniSoup HTML Parser (Independent Publisher)\nBy: Shogo Shindo\nMintlify (Independent Publisher)\nBy: Troy Taylor\nMintNFT (Independent Publisher)\nBy: Shreyan J D Fernandes\nMiro (Independent Publisher)\nBy: Michal Romiszewski\nMistral (Independent Publisher)\nBy: Troy Taylor\nMitto\nBy: Mitto AG\nMobile Text Alerts MCP Server\nBy: Mobile Text Alerts\nMobili Stotele\nBy: Tele2\nMobilyWS\nBy: MobilyWS\nMOBSIM Send SMS\nBy: MOBSIM Comunicacao Mobile SMS\nMockaroo (Independent Publisher)\nBy: Richard Wilson\nMockster\nBy: Mockster\nModuleQ [DEPRECATED]\nBy: ModuleQ, Inc.\nmonday\nBy: Plugin Genie\nmonday.com\nBy: monday.com ltd\nmondaycom (Independent Publisher)\nBy: Woong Choi\nMongoDB\nBy: MongoDB Corp\nMonster API (Independent Publisher)\nBy: Troy Taylor\nMoosend (Independent Publisher)\nBy: Troy Taylor\nMoreApp Forms\nBy: MoreApp 
International\nMorf\nBy: AFTIA Solutions\nMorningstar\nBy: Morningstar Test\nMorta\nBy: Morta\nMotaWord Translations\nBy: MotaWord\nMotimate\nBy: Motimate AS\nMQ\nBy: Microsoft\nMS Graph Groups and Users\nBy: Jay Jani\nMSN Weather\nBy: Microsoft\nMtarget SMS\nBy: Mtarget SAS\nMuhimbi PDF\nBy: Muhimbi trading as Nutrient\nMURAL\nBy: MURAL\nMy Acclaro\nBy: Acclaro Inc.\nMy Hours\nBy: Spica International\nMySQL\nBy: Microsoft\nmyStrom (Independent Publisher)\nBy: Tomasz Poszytek\nN-able Cloud Commander\nBy: N-Able Technologies Ltd.\nN-able Cloud User Hub\nBy: N-able Cloud User Hub B.V.\nNameAPI (Independent Publisher)\nBy: Fördős András\nNarvar\nBy: Microsoft\nNASA FIRMS (Independent Publisher)\nBy: Fördős András\nNASA Image and Video Library (Independent Publisher)\nBy: Paul Culmsee, Seven Sigma\nNational Park Service (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nNational Weather Service (Independent Publisher)\nBy: Troy Taylor\nNationalize_io (Independent Publisher)\nBy: Tomasz Poszytek\nnBold\nBy: nBold\nNCEI Climate Data (Independent Publisher)\nBy: Troy Taylor\nNederlandse Spoorwegen (Independent Publisher)\nBy: Miguel Verweij\nNEOWs (Independent Publisher)\nBy: Troy Taylor\nNetDocuments\nBy: NetDocuments Software, Inc.\nNetvolution\nBy: Atcom S.A\nNeum (Independent Publisher)\nBy: Troy Taylor\nNew York Times (Independent Publisher)\nBy: Roy Paar\nNewsData.io (Independent Publisher)\nBy: Troy Taylor\nNexmo\nBy: Microsoft\nNextcom\nBy: Nextcom Evolution AS\nNH360 Portfolio Insights\nBy: UMT Software\nNHTSA vPIC (Independent Publisher)\nBy: Troy Taylor\nNifty Gateway (Independent Publisher)\nBy: Roy Paar\nNimflow\nBy: Nimflow LLC\nNintex Workflow\nBy: Nintex USA LLC.\nNIST NVD (Independent Publisher)\nBy: Paul Culmsee\nNitro\nBy: Nitro Software, Inc.\nNitro Sign Enterprise Verified\nBy: Nitro Software Belgium NV\nNodefusion Portal\nBy: Nodefusion d.o.o\nNosco\nBy: Nosco ApS\nNotifications\nBy: Microsoft\nNotiivy Browser Notifications\nBy: 
Notiivy\nNotion (Independent Publisher)\nBy: Chandra Sekhar & Harshini Varma\nNoxtua AI\nBy: Xayn\nNozbe\nBy: NOZBE SP Z O O\nnps.today\nBy: nps.today\nNREL (Independent Publisher)\nBy: Troy Taylor\nNumlookupAPI (Independent Publisher)\nBy: Troy Taylor\nnunify\nBy: nunify\nNutrient - Convert to PDF\nBy: Muhimbi trading as Nutrient\nNutrient - Extract from PDF\nBy: Muhimbi trading as Nutrient\nNutrient - PDF OCR\nBy: Muhimbi trading as Nutrient\nNutrient - Watermark to PDF\nBy: Muhimbi trading as Nutrient\nNutrient Document Converter\nBy: Muhimbi trading as Nutrient\nNutrient Workflow Automation\nBy: Muhimbi trading as Nutrient\nObjective Connect\nBy: Objective Corporation\nOccuspace\nBy: Occuspace Inc\nOffice 365 Groups\nBy: Microsoft\nOffice 365 Groups Mail\nBy: Microsoft\nOffice 365 Outlook\nBy: Microsoft\nOffice 365 Users\nBy: Microsoft\nOffice 365 Video [DEPRECATED]\nBy: Microsoft\nOfficeAi Agent\nBy: Microsoft\nOK dokument (Independent Publisher)\nBy: Seyfor Slovensko, a.s.\nOMDb (Independent Publisher)\nBy: Aaryan Arora\noncehub\nBy: OnceHub\nOneBlink\nBy: OneBlink\nOneDrive\nBy: Microsoft\nOneDrive for Business\nBy: Microsoft\nOneflow\nBy: Oneflow\nOneNote (Business)\nBy: Microsoft\nOneNote Consumer (Independent Publisher)\nBy: Troy Taylor\nOnePlan\nBy: OnePlan, LLC\nOne-Time Secret (Independent Publisher)\nBy: Aldo Gillone\nOodrive Sign\nBy: Oodrive Sign\nOpen Brewery DB (Independent Publisher)\nBy: Fördős András\nOpen Charge Map (Independent Publisher)\nBy: Troy Taylor\nOpen Experience\nBy: Open Experience GmbH\nOpenAI (Independent Publisher)\nBy: Robin Rosengrün\nOpenAI Assistants (Independent Publisher)\nBy: Troy Taylor\nOpenAI GPT (Independent Publisher)\nBy: Troy Taylor\nOpenCage Geocoding (Independent Publisher)\nBy: Ahmad Najjar\nOpen-Elevation (Independent Publisher)\nBy: Fördős András\nopenFDA Drug (Independent Publisher)\nBy: Woong Choi\nOpenFEC (Independent Publisher)\nBy: krautrocker\nOpenLegacy IBM I (AS400)\nBy: OpenLegacy Technologies 
Inc.\nOpenLegacy IBM Mainframe\nBy: OpenLegacy Technologies Inc.\nOpenNEM (Independent Publisher)\nBy: Paul Culmsee\nOpenPLZ (Independent Publisher)\nBy: LowCodeInvestigator\nopenpm (Independent Publisher)\nBy: Troy Taylor\nOpenQR (Independent Publisher)\nBy: Troy Taylor\nOpenRouter (Independent Publisher)\nBy: Fördős András\nOpenSanctions (Independent Publisher)\nBy: krautrocker\nOpenText Core Share\nBy: One Fox\nOpenText Documentum\nBy: One Fox\nOpenText eDOCS\nBy: One Fox\nOpenText Extended ECM\nBy: One Fox\nOpenTrivaDatabase (Independent Publisher)\nBy: Kiveshan Naidoo\nOptiAPI\nBy: Busk\nOQSHA\nBy: Osmosys Software Solutions UK Limited\nOracle Database\nBy: Microsoft\nORB Intelligence (Independent Publisher)\nBy: Aaryan Arora, Ankita Singh\nOrbusInfinity\nBy: Orbus Software\nOrdnance Survey Places\nBy: Ordnance Survey\nOriginality.AI (Independent Publisher)\nBy: Osazee Odigie\nOtto.bot\nBy: Otto.bot, LLC\nOutlook Tasks [DEPRECATED]\nBy: Microsoft\nOutlook.com\nBy: Microsoft\nOutreach Insights\nBy: Outreach\nOwlbot (Independent Publisher)\nBy: Troy Taylor\nPagePixels Screenshots\nBy: PagePixels: Screenshots\nPagerDuty\nBy: Microsoft\nPantry (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nPanviva\nBy: Panviva\nPappers (Independent Publisher)\nBy: Troy Taylor\nParishSoft Family Suite\nBy: Ministry Brands ParishSOFT\nParserr\nBy: Parserr LLC\nParseur\nBy: Parseur\nPartner Center Events\nBy: Microsoft\nPartner Center Referrals\nBy: Microsoft Corporation\nPartnerLinq\nBy: Visionet Systems Inc.\nPassage by 1Password - Auth (Independent Publisher)\nBy: Troy Taylor\nPassage by 1Password - Manage (Independent Publisher)\nBy: Troy Taylor\nPaylocity\nBy: Paylocity\nPaySpace (Independent Publisher)\nBy: Mint Management Technologies\nPDF Blocks\nBy: Integrable, LLC\nPDF Tools\nBy: ConvertAPI\nPDF Tools by Tachytelic (Independent Publisher)\nBy: tachytelic\nPDF4me\nBy: Ynoox GmbH\nPDF4me AI\nBy: Ynoox GmbH\nPDF4me Barcode\nBy: Ynoox GmbH\nPDF4me Connect\nBy: 
Ynoox GmbH\nPDF4me Convert\nBy: Ynoox GmbH\nPDF4me Excel\nBy: Ynoox GmbH\nPDF4me Image\nBy: Ynoox GmbH\nPDF4me PDF\nBy: Ynoox GmbH\nPDF4me SwissQR\nBy: Ynoox GmbH\nPDF4me Word\nBy: Ynoox GmbH\nPDFco\nBy: PDF.co\nPDFcross\nBy: PotCross GK.\nPdfless\nBy: Synapsium\nPeakboard\nBy: Peakboard GmbH\nPeltarion AI\nBy: Peltarion\nPerfect Wiki\nBy: OOO RD17\nPerplexity AI (Independent Publisher)\nBy: Troy Taylor\nPersonr\nBy: Microsoft\nPexels (Independent Publisher)\nBy: That API Guy\nPhilips HUE (Independent Publisher)\nBy: Tomasz Poszytek\nPilot Things\nBy: Pilot Things\nPinecone\nBy: Troy Taylor\nPinterest\nBy: Microsoft\nPipedrive\nBy: Microsoft\nPipeliner CRM\nBy: Pipelinersales Corporation\nPIPware KPIs\nBy: PIPware Solutions\nPitney Bowes Data Validation [DEPRECATED]\nBy: Not Available\nPitney Bowes Tax Calculator [DEPRECATED]\nBy: Not Available\nPivotal Tracker\nBy: Microsoft\nPixel Encounter (Independent Publisher)\nBy: Fordos Andras\nPixela (Independent Publisher)\nBy: Troy Taylor\nPixelMe\nBy: Troy Taylor\nPKIsigning\nBy: SBRS B.V.\nPlacedog (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nPlanful\nBy: Planful\nPlanner\nBy: Microsoft\nPling\nBy: Fellowmind Denmark\nPlivo\nBy: Plivo Inc\nPlumsail Actions\nBy: Plumsail\nPlumsail Documents\nBy: Plumsail\nPlumsail Forms\nBy: Plumsail Inc.\nPlumsail HelpDesk\nBy: Plumsail Inc.\nPoka\nBy: Poka Inc\nPokeAPI Core (Independent Publisher)\nBy: Fördős András\nPokeAPI World (Independent Publisher)\nBy: Fördős András\nPolaris PSA\nBy: Replicon Inc\nPoliteMail\nBy: PoliteMail Software\nPolygon (Independent Publisher)\nBy: Itransition Group Ltd\nPortfolio and Roadmap\nBy: Microsoft\nPostgreSQL\nBy: Microsoft\nPostman (Independent Publisher)\nBy: Fördős András\nPowell Teams\nBy: Powell Software\nPower Apps for Admins\nBy: Microsoft\nPower Apps for Makers\nBy: Microsoft\nPower Apps Notification\nBy: Microsoft\nPower Apps Notification V2\nBy: Microsoft\nPower Assist\nBy: Elevate Digital\nPower Automate for Admins\nBy: 
Microsoft\nPower Automate Management\nBy: Microsoft\nPower BI\nBy: Microsoft\nPower Form 7\nBy: Reenhanced LLC\nPower Platform for Admins\nBy: Microsoft\nPower Platform for Admins V2\nBy: Microsoft\nPower Query Dataflows\nBy: Microsoft\nPower Textor\nBy: Imperium Dynamics\nPower Virtual Agents\nBy: Microsoft\nPPM Express\nBy: PPM Express Corporation\nPreserve365\nBy: Preservica\nPrexView (Independent Publisher)\nBy: Troy Taylor\nPriority Matrix\nBy: Appfluence Inc\nPriority Matrix HIPAA\nBy: Appfluence Inc\nPriva\nBy: Microsoft, Purview Privacy\nProcess Mining\nBy: Microsoft\nProcess Street\nBy: Process Street\nProcess Street MCP Server\nBy: Process Street\nProfisee\nBy: Profisee\nProgressus Advanced Projects\nBy: Plumbline Consulting\nProject Online\nBy: Microsoft\nProjectPlace\nBy: Planview inc.\nProjectum Present It\nBy: Projectum\nProjectWise Design Integration\nBy: Bentley Systems, Incorporated\nProjectwise Share [DEPRECATED]\nBy: Bently Systems, Inc.\nProPublica Campaign Finance (Independent Publisher)\nBy: Troy Taylor\nProPublica Congress (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nProPublica Nonprofit Explorer (Independent Publisher)\nBy: Troy Taylor\nPROS AI\nBy: PROS Inc.\nPublic 360\nBy: Tietoevry Norway\nPUG Gamified Engagement\nBy: Pug Interactive Inc\nPure Leads\nBy: Pure Digital Pte Ltd\nPushcut\nBy: Pushcut\nPushover (Independent Publisher)\nBy: Glen Hutson\nQdrant (Independent Publisher)\nBy: Anush\nQnA Maker\nBy: Microsoft\nQPP NextGen\nBy: Quark Software Inc.\nQuickbase (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nQuickBooks Time (Independent Publisher)\nBy: Artesian Software Technologies LLP\nQuickChart (Independent Publisher)\nBy: Troy Taylor\nr/SpaceX (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nRainbird\nBy: Rainbird Technologies ltd\nRamQuest Actions\nBy: Ramquest Software, Inc\nRamQuest Events\nBy: RamQuest Software, Inc\nRAPID Platform\nBy: RAPID Platform\nRarible (Independent 
Publisher)\nBy: Roy Paar\nReachability (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nReadwise (Independent Publisher)\nBy: Troy Taylor\nRealFaviconGenerator (Independent Publisher)\nBy: Troy Taylor\nRebrandly (Independent Publisher)\nBy: That API Guy\nRebrickable (Independent Publisher)\nBy: Troy Taylor\nReceptful\nBy: Los Trigos, Inc\nRecorded Future [DEPRECATED]\nBy: Recorded Future\nRecorded Future Identity\nBy: Recorded Future\nRecorded Future Sandbox\nBy: Recorded Future\nRecorded Future V2\nBy: Recorded Future\nRedmine\nBy: Microsoft\nRedque\nBy: Redque s.r.o.\nReflect\nBy: Troy Taylor\nRefuge Restrooms (Independent Publisher)\nBy: Troy Taylor\nRegEx Matching (Independent Publisher) [DEPRECATED]\nBy: Mitanshu Garg\nRegexFlow ExecutePython\nBy: Epicycle\nRegexFlow Regular Expression\nBy: Epicycle\nRegoLink for Clarity PPM\nBy: Rego Consulting Corporation\nReliefWeb (Independent Publisher)\nBy: Troy Taylor\nRencore Code\nBy: Rencore GmbH\nRencore Governance\nBy: Rencore GmbH\nRepfabric\nBy: Repfbaric\nRepfabric Job Loader\nBy: Repfabric LLC\nRepfabric Lead Loader\nBy: Repfabric LLC\nReplicate (Independent Publisher)\nBy: Troy Taylor\nReplicon\nBy: Replicon Inc\nRequestor\nBy: Requestor\nResco Cloud\nBy: Resco\nResco Reports\nBy: Resco\nRescueGroups (Independent Publisher)\nBy: Troy Taylor\nResend (Independent Publisher)\nBy: Troy Taylor\nREST Countries (Independent Publisher)\nBy: Siddharth Vaghasia\nRetarus SMS\nBy: retarus GmbH\nRev AI (Independent Publisher)\nBy: Troy Taylor\nRevelation helpdesk\nBy: Yellowfish Software\nReversingLabs A1000\nBy: ReversingLabs\nReversingLabs TitaniumCloud\nBy: ReversingLabs\nRevue (Independent Publisher)\nBy: Daniel Laskewitz\nRijksmuseum (Independent Publisher)\nBy: Ashwin Ganesh Kumar\nRijksoverheid (Independent Publisher)\nBy: Dennis Goedegebuure\nRiskIQ\nBy: Microsoft\nRiskIQ Digital Footprint\nBy: RiskIQ\nRiskIQ Illuminate\nBy: RiskIQ\nRobohash (Independent Publisher)\nBy: Troy Taylor, Hitachi 
Solutions\nRobolytix\nBy: Robolytix\nRobots for Power BI\nBy: DevScope S.A.\nRon Swanson Quotes (Independent Publisher)\nBy: Troy Taylor\nRowShare\nBy: ROWSHARE\nRSign\nBy: RPost US Inc\nRSS\nBy: Microsoft\nSalesforce\nBy: Microsoft\nSAP\nBy: Microsoft\nSAP ERP\nBy: Microsoft\nSAP OData\nBy: Microsoft\nSapling.ai (Independent Publisher)\nBy: Fördős András\nSAS Decisioning\nBy: SAS Institute, Inc.\nScanCloud\nBy: Scancloud\nSchiphol Airport (Independent Publisher)\nBy: Michel Gueli\nSchoolDigger (Independent Publisher)\nBy: Troy Taylor\nScrapingBee (Independent Publisher)\nBy: Troy Taylor\nScreenshot One (Independent Publisher)\nBy: Troy Taylor\nScrive eSign\nBy: Scrive\nScryfall (Independent Publisher)\nBy: Troy Taylor\nSearchAPI - Google Search (Independent Publisher)\nBy: Troy Taylor\nSECIB\nBy: SECIB\nSecret Server\nBy: Delinea, Inc.\nSecure Code Warrior (Independent Publisher)\nBy: Hitachi Solutions\nSeeBotRun - Link\nBy: SeeBotRun\nSeekTable\nBy: SeekTable.com\nSeismic\nBy: Seismic Software, Inc.\nSeismic Configuration\nBy: Seismic\nSeismic Content Discovery\nBy: Seismic\nSeismic Engagement\nBy: Seismic\nSeismic for Copilot for Sales\nBy: Seismic Software\nSeismic Library\nBy: Seismic\nSeismic Livedoc\nBy: Seismic\nSeismic Planner\nBy: Seismic\nSeismic Programs\nBy: Seismic Software\nSeismic Workspace\nBy: Seismic\nSendFox (Independent Publisher)\nBy: Troy Taylor\nSendGrid\nBy: Microsoft\nSendmode\nBy: SendMode\nServerless360 BAM & Tracking\nBy: Kovai Limited\nService Bus\nBy: Microsoft\nService Objects\nBy: Service Objects\nServiceDesk Plus Cloud\nBy: ManageEngine (A division of Zoho Corporation)\nServiceNow\nBy: Microsoft\nSerwerSMS\nBy: SerwerSMS\nSessionize (Independent Publisher)\nBy: Nanddeep Nachan, Smita Nachan\nSFTP - SSH\nBy: Microsoft\nSFTP [DEPRECATED]\nBy: Microsoft\nShadify (Independent Publisher)\nBy: Troy Taylor\nShare-Effect\nBy: ShareEffect\nSharePoint\nBy: Microsoft\nSharePoint Embedded\nBy: Microsoft\nSherpa Digital\nBy: Sherpa 
Digital\nShields.io (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nShifts for Microsoft Teams\nBy: Microsoft\nShipStation IP (Independent Publisher)\nBy: Kristian Matthews\nShop (Independent Publisher)\nBy: Microsoft\nShopify (Independent Publisher)\nBy: Ray Bennett (MSFT)\nShopranos\nBy: SoftOne Technologies S.A\nShort URL\nBy: APPS 365 LTD\nShortySMS (Independent Publisher)\nBy: Troy Taylor\nShowcase Workshop\nBy: Showcase Software Ltd\nShowpad eOS\nBy: Showpad\nSHRTCODE (Independent Publisher)\nBy: Chandra Sekhar Malla\nSigma Conso CR\nBy: Sigma Conso\nSignatureAPI\nBy: SignatureAPI\nSignhost\nBy: Signhost\nSigni.com\nBy: NETWORG\nSigningHub\nBy: Ascertia\nSIGNL4 - Mobile Alerting\nBy: Derdack\nSignNow\nBy: DaDaDocs\nSignNow EU\nBy: DaDaDocs\nSignRequest\nBy: SignRequest B.V.\nSignUpGenius (Independent Publisher)\nBy: Troy Taylor\nSimple EDI\nBy: Weavo Liquid Loom\nSimpleSurvey\nBy: SimpleSurvey\nSinch\nBy: Sinch Sweden AB\nSirva Relocating Employee\nBy: Sirva Relocation\nSkribble Sign\nBy: busitec GmbH\nSkype for Business Online [DEPRECATED]\nBy: Microsoft\nSkyPoint Cloud\nBy: SkyPoint Cloud\nSlack\nBy: Microsoft\nSlascone\nBy: SLASCONE GmbH\nsmapOne\nBy: smapOne AG\nSmarp\nBy: Smarp\nSmartCOMM DocGen\nBy: Smart Communications\nSmartDialog\nBy: Arena Interactive Oy\nSmarter Drafter\nBy: Tensis Group\nSmartsheet\nBy: Microsoft\nSmileBack\nBy: ConnectWise SmileBack\nSMS Wireless Services (Independent Publisher)\nBy: ViaData\nsms77io\nBy: sms77 e.K.\nSMSAPI\nBy: LINK Mobility Poland\nSMSLink\nBy: ASTINVEST COM SRL (SMSLink)\nSMTP\nBy: Microsoft\nSnowflake\nBy: Snowflake\nSociabble\nBy: Sociabble\nSocialinsider\nBy: Socialinsider\nSoft1\nBy: SoftOne Technologies S.A\nSoftone Web CRM\nBy: Softone Technologies\nSoftools\nBy: Softools Limited\nSolarEdge (Independent Publisher)\nBy: Richard Wierenga\nSoloSign HMAC Hash Creator\nBy: Solort\nSOS Inventory (Independent Publisher)\nBy: Harold Anderson\nSparkPost\nBy: Microsoft\nSparse Power Box Tools\nBy: 
Sparse Development\nSpinpanel\nBy: Spinpanel B.V.\nSpoonacular Food (Independent Publisher)\nBy: Amjed Ayoub\nSpoonacular Meal Planner (Independent Publisher)\nBy: Amjed Ayoub\nSpoonacular Recipe (Independent Publisher)\nBy: Amjed Ayoub\nSpotify (Independent Publisher)\nBy: Daniel Laskewitz\nSpring Global\nBy: Enavate\nSQL Server\nBy: Microsoft\nSquare Business (Independent Publisher)\nBy: Troy Taylor\nSquare Payments (Independent Publisher)\nBy: Troy Taylor\nStability.ai (Independent Publisher)\nBy: Troy Taylor\nStaffbase\nBy: Staffbase GmbH\nStaffCircle\nBy: StaffCircle\nStandard approvals\nBy: Microsoft\nStar Wars (Independent Publisher)\nBy: Paul Culmsee\nStarmind\nBy: Starmind (inc)\nStarRez REST v1\nBy: StarRez, Inc.\nStorm Glass (Independent Publisher)\nBy: Paul Culmsee\nStormboard\nBy: Stormboard\nStraker Verify\nBy: Straker Group\nStrava (Independent Publisher)\nBy: Richard Wierenga\nStripe\nBy: Microsoft\nStudio Ghibli (Independent Publisher)\nBy: Troy Taylor\nSunrise-Sunset (Independent Publisher)\nBy: Fördős András\nSuperMCP\nBy: Supermetrics\nSupportivekoala (Independent Publisher)\nBy: Troy Taylor\nSureXeroLite (Independent Publisher)\nBy: The 848 Group\nSurvalyzer EU\nBy: Survalyzer AG\nSurvalyzer Swiss\nBy: Survalyzer AG\nSurvey123\nBy: ArcGIS Survey123\nSurveyMonkey\nBy: Microsoft\nSurveyMonkey Canada\nBy: SurveyMonkey\nSwagger Converter (Independent Publisher)\nBy: Fordos Andras\nSynthesia (Independent Publisher)\nBy: Troy Taylor\nT.LY (Independent Publisher)\nBy: Troy Taylor\nTabscanner Receipt OCR (Independent Publisher)\nBy: Ben Smith\nTAGGUN Receipt OCR Scanning (Independent Publisher)\nBy: Amjed Ayoub\nTago\nBy: Tago LLC\nTaktikal Core\nBy: Taktikal\nTalkdesk\nBy: Talkdesk\nTallyfy\nBy: Tallyfy, Inc\nTALXIS Data Feed\nBy: TALXIS\nTaqnyat\nBy: Taqnyat Network Operation\nTavily (Independent Publisher)\nBy: Troy Taylor\nTax ID Pro (Independent Publisher)\nBy: Fördős András\nTDox\nBy: Seltris srl\nTeam Forms\nBy: VP LABS PTY LTD\nTeamflect\nBy: 
Teamflect\nTeams-Spirit\nBy: D.F.K. Digitalteamwork GmbH\nTeamWherx\nBy: Actsoft\nTeamwork Projects\nBy: Microsoft\ntegolySIGN\nBy: tegoly GmbH\nTelegram Bot (Independent Publisher)\nBy: Woong Choi\nTelephony Xtended Serv Interf\nBy: BluIP, Inc.\nTeleSign SMS\nBy: TeleSign Corporation\nTemplafy\nBy: Templafy\nTendocs Documents\nBy: Deepdale BV\nTeradata\nBy: Microsoft\nTesseron Asset Management\nBy: Tesseron by Luithle + Luithle GmbH\nTesseron Basic Data\nBy: Tesseron by Luithle + Luithle GmbH\nTesseron Invoice\nBy: Tesseron by Luithle + Luithle GmbH\nTesseron Ticket\nBy: Tesseron by Luithle + Luithle GmbH\nText Analytics\nBy: MAQ Software\nText Request\nBy: Text Request\nThe Bot Platform\nBy: The Bot Platform\nThe Brønnøysund Registries (Independent Publisher)\nBy: Ahmad Najjar\nThe Color (Independent Publisher)\nBy: Troy Taylor\nThe Events Calendar\nBy: The Events Calendar\nThe Guardian (Independent Publisher)\nBy: Troy Taylor\nThe IT Tipster\nBy: The IT Tipster\nThe Lord of the Rings (Independent Publisher)\nBy: Troy Taylor\nThe SMS Works (Independent Publisher)\nBy: Troy Taylor\nThe Weather Channel (Independent Publisher)\nBy: Roy Paar\nTheGoodAPI (Independent Publisher)\nBy: Troy Taylor\nTheMealDB (Independent Publisher)\nBy: John Muchiri\nThreads (Independent Publisher)\nBy: Troy Taylor\nTicketing.events\nBy: Ventipix\nTicketmaster (Independent Publisher)\nBy: Troy Taylor\nTikit\nBy: Cireson\nTiliter Vision Agents\nBy: Tiliter Pty Ltd\nTilkee\nBy: Microsoft\nTimeAPI (Independent Publisher)\nBy: Fördős András\ntimeghost\nBy: timeghost.io\nTimeneye\nBy: DM Digital Software SRL\nTLDR\nBy: Troy Taylor\nToday in History (Independent Publisher)\nBy: Troy Taylor\nTodoist\nBy: Microsoft\nToggl Plan (Independent Publisher)\nBy: Daniel Laskewitz\nToggl Track (Independent Publisher)\nBy: troystaylor\nTomorrow.io (Independent Publisher)\nBy: Troy Taylor\nToodledo\nBy: Microsoft\nTophhie Cloud\nBy: Chris Greenacre\nTopMessage\nBy: TOP X\ntouchSMS\nBy: Edgility\nTPC 
Portal\nBy: The Portal Connector\nTraction Guest\nBy: Traction Guest\nTrade.Gov (Independent Publisher)\nBy: Dan Romano\nTransform2All\nBy: GAC Business Solutions\nTree-Nation (Independent Publisher)\nBy: Troy Taylor\nTrello\nBy: Microsoft\nTribal - Maytas\nBy: Tribal Group\nTribal - Platform\nBy: Tribal Group\nTribal - SITS\nBy: Tribal Group\nTRIGGERcmd\nBy: VanderMey Consulting, LLC\nTrovve\nBy: Trovve Inc\nTrueDialog SMS\nBy: TrueDialog Dynamics\nTrustual\nBy: Practical Crypto SpA\nTulip\nBy: Tulip Interfaces\nTumblr (Independent Publisher)\nBy: Troy Taylor\nTuxMailer\nBy: TuxMailer\nTwilio\nBy: Microsoft\nTxtSync\nBy: TxtSync Limited\ntyntec 2FA\nBy: tyntec GmbH\ntyntec Phone Verification\nBy: tyntec GmbH\ntyntec SMS Business\nBy: tyntec GmbH\ntyntec Viber Business\nBy: tyntec GmbH\ntyntec WhatsApp Business\nBy: tyntec GmbH\nTypeform\nBy: Microsoft\nU.S. Bank Treasury Management\nBy: U.S. Bank\nUber Freight\nBy: Microsoft\nUbiqod by Skiply\nBy: Skiply\nUbiqod by Taqt\nBy: Skiply\nUdemy (Independent Publisher)\nBy: Nanddeep Nachan, Smita Nachan\nUiPath\nBy: UiPath Incorporated\nUiPath Orchestrator\nBy: UiPath\nUK Bank Holidays (Independent Publisher)\nBy: Martyn Lesbirel, Troy Taylor\nUK Check VAT (Independent Publisher)\nBy: Fördős András\nUKG Pro HCM\nBy: Dilip Chenani\nUKG PRO WFM Authentication\nBy: UKG, Inc.\nUKG Pro WFM Employee\nBy: Ria Gupta\nUKG Pro WFM People\nBy: Dilip Chenani\nUKG Pro WFM Timekeeping\nBy: Ria Gupta\nUnix Timestamp (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nUnofficial Netflix Search (Independent Publisher)\nBy: Troy Taylor\nUnsplash (Independent Publisher)\nBy: Troy Taylor, Hitachi Solutions\nUpdates App (Microsoft 365)\nBy: Microsoft\nUpdown (Independent Publisher)\nBy: Fordos Andras\nUpland Panviva US\nBy: Upland Software Inc..\nURL.dev (Independent Publisher)\nBy: Troy Taylor\nUrLBae (Independent Publisher)\nBy: Troy Taylor\nUS Congress CRS (Independent Publisher)\nBy: Dan Romano\nUS Patent & Trademark Office 
(Independent Publisher)\nBy: krautrocker\nUSAJOBS (Independent Publisher)\nBy: Richard Wilson\nUSB4SAP\nBy: Ecoservity\nUserVoice\nBy: Microsoft\nUSGS Earthquake Hazards (Independent Publisher)\nBy: Troy Taylor\nVantage 365 Imaging\nBy: Vantage 365 LTD\nVaruna\nBy: Univera Computer Systems Industry and Trade Inc.\nvatcheckapi\nBy: Fördős András\nVena Solutions\nBy: Vena Solutions\nVentipix Asset and Inventory\nBy: Ventipix\nVerified\nBy: Crm - Konsulterna i Sverige AB\nVeteran Confirmation (Independent Publisher)\nBy: Troy Taylor\nVeterans Affairs Facilities (Independent Publisher)\nBy: Richard Wilson\nVeterans Affairs Forms (Independent Publisher)\nBy: Richard Wilson\nVeterans Affairs Providers (Independent Publisher)\nBy: Richard Wilson\nViafirma\nBy: Viafirma\nVideo Indexer (V2)\nBy: Microsoft\nVIES (Independent Publisher)\nBy: Tomasz Poszytek\nVimeo\nBy: Microsoft\nVirtual Data Platform\nBy: Virtual_Data_Platform_GmbH\nVirus Total\nBy: Microsoft\nViva Engage\nBy: Microsoft\nVocean\nBy: Vocean AB\nVoice Monkey (Independent Publisher)\nBy: Richard Wilson\nVoiceRSS (Independent Publisher)\nBy: Fördős András\nVome\nBy: Vome Volunteer\nVonage\nBy: Vonage\nWaaila\nBy: Cross Masters s.r.o.\nWay We Do\nBy: Way We Do\nWayback Machine (Independent Publisher)\nBy: Fördős András\nWeather Forecast (Independent Publisher)\nBy: Haimantika Mitra\nWeavo Liquid Loom\nBy: Weavo Liquid Loom\nWebex\nBy: Cisco\nWebex Integration (Independent Publisher)\nBy: University College London, Oscar Hui\nWebhood URL Scanner\nBy: Webhood\nWebsite Carbon (Independent Publisher)\nBy: Clement Olivier\nWenDocs Linker\nBy: WenDocs Ltd\nWhat3Words (Independent Publisher)\nBy: Matt Beard\nWhatIsMyBrowser (Independent Publisher)\nBy: Troy Taylor\nWhatsApp (Independent Publisher)\nBy: Zakariya Fakira\nWindows 365\nBy: Microsoft\nWithoutWire Inventory Platform\nBy: Enavate\nWitivio\nBy: Witivio\nWMATA (Independent Publisher)\nBy: Richard Wilson, Daniel Cox\nWooCommerce\nBy: Reenhanced, LLC\nWoodpecker 
(Independent Publisher)\nBy: Troy Taylor\nWord Cloud by Textvis (Independent Publisher)\nBy: Troy Taylor\nWord Online (Business)\nBy: Microsoft\nWordPress\nBy: Microsoft\nWorkable (Independent Publisher)\nBy: David Kjell\nWorkday HCM\nBy: Microsoft\nWorkday SOAP\nBy: Microsoft\nWorking days (Independent Publisher)\nBy: Tomasz Poszytek\nWorkMobile\nBy: eSAY Solutions Ltd\nWorkPoint\nBy: WorkPoint\nWorkPoint 365\nBy: WorkPoint 365\nWorkSpan\nBy: WorkSpan\nWorkstem AU\nBy: OneJob Group Limited\nWorkstem HK\nBy: OneJob Group Limited\nWorld Academia\nBy: Kelcho Tech\nWorldTime (Independent Publisher)\nBy: Fördős András\nWorldwide Bank Holidays (Independent Publisher)\nBy: Reshmee Auckloo\nWP Connectr for WordPress\nBy: Reenhanced, LLC\nWPForms by Reenhanced LLC\nBy: Reenhanced, LLC\nWQRM Risk Forecast Services\nBy: Western QRM\nWritesonic (Independent Publisher)\nBy: Troy Taylor\nwttr.in (Independent Publisher)\nBy: Troy Taylor\nX\nBy: Microsoft\nX12\nBy: Microsoft\nXbridger Document Manager\nBy: Xbridger Solutions\nXC-Gate\nBy: TECHNOTREE CO., LTD.\nXero Accounting - Magnetism\nBy: Magnetism\nxkcd (Independent Publisher)\nBy: Troy Taylor\nXooa Blockchain Database\nBy: Xooa Inc\nXooa Blockchain Smart Contract\nBy: Xooa Inc\nXpertdoc (Deprecated) [DEPRECATED]\nBy: Xpertdoc Technologies Inc.\nXSOAR (Independent Publisher)\nBy: Landon Chelf\nXSS PDF Solutions Integrations\nBy: Cross-Service-Solutions\nXSS QR Code Solutions\nBy: Cross-Service-Solutions\nYakChat\nBy: YakChat Ltd.\nYarado\nBy: Yarado\nYeeflow\nBy: YEEFLOW SINGAPORE PTE LTD\nYeelight\nBy: Qingdao Yeelink Information Technology Co., Ltd.\nYelp (Independent Publisher)\nBy: Ahmad Najjar\nYou Need A Budget (Independent Publisher)\nBy: Troy Taylor\nYouTube\nBy: Microsoft\nYouTube Transcript (Independent Publisher)\nBy: troystaylor\nZahara\nBy: Zahara Systems Ltd\nZanran Scaffolder\nBy: Zanran Ltd\nZapier MCP\nBy: Zapier Inc\nZapier NLA (Independent Publisher)\nBy: Troy Taylor\nZellis\nBy: Zellis\nZendesk\nBy: 
Microsoft\nZenkraft\nBy: Zenkraft\nZenler (Independent Publisher)\nBy: Troy Taylor\nZenlogin (Independent Publisher)\nBy: Troy Taylor\nZeroTrain AI Core\nBy: Leonard Gambrell - DBA Gambrell Software\nZippopotamus (Independent Publisher)\nBy: Tomasz Poszytek\nZIPPYDOC\nBy: ZippyDoc GmbH\nZoho Mail\nBy: Zoho Corporation Private Limited\nZoho Calendar\nBy: Zoho Mail\nZoho Forms\nBy: Zoho Corporation Private Limited\nZoho Invoice Basic (Independent Publisher)\nBy: Troy Taylor\nZoho Sign\nBy: Zoho Corporation\nZoho TeamInbox\nBy: Zoho Corporation Private Limited\nZoho ZeptoMail\nBy: Zoho Corporation Private Limited\nZoom Meetings (Independent Publisher)\nBy: Akuthota Deekshith\nzReports\nBy: zReports Software s.r.o.\nZuva DocAI\nBy: Zuva Inc.\nZvanu Parvaldnieks\nBy: Latvijas Mobilais Telefons\nAdditional resources", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Connector Reference", "section": "Power Platform Administration" }, "https://learn.microsoft.com/en-us/connectors/custom-connectors/": { "content_hash": "sha256:379fdf8d0faff9bccec89b5efacf2e1bdf6d46d4f3a666418ac73aa102bceb31", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nCustom connectors overview\nFeedback\nSummarize this article for me\nAzure Logic Apps\n,\nMicrosoft Power Automate\n,\nMicrosoft Power Apps\n, and\nMicrosoft Copilot Studio\noffer over\n1,000 connectors\nto connect to Microsoft and verified services, but you might want to communicate with services that aren't available as prebuilt connectors. 
Custom connectors address this scenario by allowing you to create (and even share) a connector with its own triggers and actions.\nLifecycle\n1. Build your API\nA custom connector is a wrapper around a REST API that allows Logic Apps,\nPower Automate, Power Apps, or Copilot Studio to communicate with that REST or SOAP API. These APIs can be:\nPublic (visible on the public internet) such as\nSpotify\n,\nSlack\n,\nRackspace\n, or an API you manage.\nPrivate (visible only to your network).\nLogic Apps also supports SOAP APIs.\nFor public APIs that you plan to create and manage, consider using one of these Microsoft Azure products:\nAzure Functions\nAzure Web Apps\nAzure API Apps\nFor private APIs, Microsoft offers on-premises data connectivity through an\non-premises data gateway\n.\n2. Secure your API\nUse one of these standard authentication methods for your APIs and connectors (\nMicrosoft Entra ID\nis recommended):\nGeneric OAuth 2.0\nOAuth 2.0 for specific services, including Microsoft Entra ID, Dropbox, GitHub, and SalesForce\nBasic authentication\nAPI Key\nYou can set up Microsoft Entra ID authentication for your API in the Azure portal so you don't have to implement authentication. Or, you can require and enforce authentication in your API's code. For more information about Microsoft Entra ID for custom connectors, see\nSecure your API and connector with Microsoft Entra ID\n.\n2.1. OAuth 2.0\nNewly created custom connectors that use OAuth 2.0 to authenticate automatically have a per connector redirect URI. 
Existing OAuth 2.0 connectors must be updated to use a per-connector redirect URI before February 17, 2024.\nIf you created your custom connectors with the web interface, edit your custom connectors, go to the\nSecurity\ntab and check the box,\nUpdate to unique redirect URL\n, and then save to enable the per-connector redirect URI.\nIf you created your custom connectors with\nmulti-auth using the command line interface (CLI) tool\n, you need to update your connector using the CLI tool to set\n\"redirectMode\": \"GlobalPerConnector\"\n.\nOnce custom connectors are updated to use the per-connector redirect URI either through the setting in the\nSecurity\ntab or the CLI tool, remove the global redirect URI from your OAuth 2.0 apps. You should add the newly generated unique redirect URL to your OAuth 2.0 apps.\nWe'll enforce this update for existing OAuth 2.0 custom connectors starting on February 17, 2024. Any custom connector not updated to use a per-connector redirect URI stops working for new connections and shows an error message to the user.\nTo find out which custom connectors need to migrate to the per-connector redirect URL, you can create a flow that uses the\nGet Custom Connectors as Admin\naction of the Power Apps for Admins connector and parse its result. The flow attached later in this article fetches all the custom connectors using this action. It then applies a filter condition on the connection parameter's property to filter out non-OAuth custom connectors, followed by another filter to select only connectors that don't use the per-connector unique redirect URL. Finally, it puts the selected custom connectors into an array variable initialized at the beginning of the flow and generates an HTML table showing the name and creator of those connectors. You can import this flow into your environment by importing\nthis solution\n. You can extend the flow further to send the HTML table as an email to yourself. 
Or you can extend it to send emails to the connector creators directly and provide them with the names of the connectors that need to be updated.\n3. Describe the API and define the custom connector\nOnce you have an API with authenticated access, the next step is to describe your API so that Logic Apps, Power Automate, Power Apps, or Copilot Studio can communicate with it. The following approaches are supported:\nAn OpenAPI definition (formerly known as a Swagger file)\nCreate a custom connector from an OpenAPI definition\nOpenAPI documentation\nA Postman collection\nCreate a Postman collection\nCreate a custom connector from a Postman collection\nPostman documentation\nStart from scratch using the custom connector portal (Power Automate and Power Apps only)\nCreate a custom connector from scratch\nOpenAPI definitions and Postman collections use different formats, but both are language-agnostic, machine-readable documents that describe your API. You can generate these documents from various tools based on the language and platform used by your API. Behind the scenes, Logic Apps, Power Automate, Power Apps, and Copilot Studio use OpenAPI to define connectors.\n4. Use your connector in Copilot Studio, Logic Apps, Power Automate, or a Power Apps app\nCustom connectors are used the same way prebuilt connectors are used. You need to create a connection to your API in order to use that connection to call any operations that you expose in your custom connector.\nConnectors created in Power Automate are available in Power Apps and Copilot Studio, and connectors created in Power Apps are available in Power Automate and Copilot Studio. This availability isn't true for connectors created in Logic Apps. However, you can reuse the OpenAPI definition or Postman collection to recreate the connector in any of these services. 
For more information, see the appropriate tutorial:\nUse a custom connector from a flow\nUse a custom connector from an app\nUse a custom connector from a logic app\nUse connector actions in Copilot Studio\nTip\nIf you update (remove, add, or change) a field in the API, perform these steps:\nRepublish the connector so it looks at the updated Swagger for the API.\nRemove any connection / data source in any app that used that connector.\nRe-add the connection / data source for that connector back into the apps.\n5. Share your connector\nYou can share your connector with users in your organization the same way that you share resources in Copilot Studio, Logic Apps, Power Automate, or Power Apps. Sharing is optional, but you might have scenarios where you want to share your connectors with other users.\nLearn more in\nShare custom connectors in your organization\n.\n6. Certify your connector\nIf you want to share your connector with all users of Copilot Studio, Logic Apps, Power Automate, and Power Apps, you need to\nsubmit your connector for Microsoft certification\n. Microsoft reviews your connector, checks for technical and content compliance, and validates functionality.\nVirtual Network support\nWhen the connector is used in a\nPower Platform environment linked to a Virtual Network\n, limitations apply:\nWhen custom code is used, limitations are explained in\nWrite code in a custom connector\n.\nCustom connectors created before the environment was associated with a Virtual Network need to be resaved.\nTriggers that return a location header that doesn't call back into the custom connector aren't supported.\nProvide feedback\nWe greatly appreciate feedback on issues with our connector platform or new feature ideas. 
To provide feedback, go to\nSubmit issues or get help with connectors\nand select your feedback type.\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Custom Connectors", @@ -143,7 +143,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/security/security-overview": { "content_hash": "sha256:9f3b970311c5ac0923298f989fdcbcda3af603b2961a3fc0d763c9dbd973ae37", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nSecurity overview\nFeedback\nSummarize this article for me\nThe\nSecurity\n>\nOverview\npage in the Power Platform admin center is designed to enhance your organization's security and streamline management. It provides a centralized location where you can view and manage security recommendations, assess your security score, and implement proactive policies to safeguard your organization.\nAdministrators can complete these tasks:\nAssess your security score\n: Use the security score to understand and improve your organization's security policies. The security score is shown on a qualitative scale (\nLow\n,\nMedium\n, or\nHigh\n). It helps you measure your organizational security position for Microsoft Power Platform and Dynamics 365 workloads.\nAct on recommendations\n: Identify and implement impactful recommendations that the system generates. 
These recommendations are based on best practices for improving a tenant's security score.\nManage proactive policies\n: Manage proactive policies for governance and security.\nPrerequisite\nTo view your security score, you must turn on tenant-wide analytics. You can find instructions in\nHow do I turn on tenant-level analytics?\nNote\nAfter you turn on tenant-wide analytics, it might take up to 24 hours for the\nSecurity\n>\nOverview\npage to be populated with data. Until then, most sections of the page show the message \"Calculating security score.\"\nAccess the Security > Overview page\nTo access the\nSecurity\n>\nOverview\npage, you must have Microsoft Entra ID roles such as Power Platform administrator or Dynamics 365 administrator. Learn more about these roles in\nUse service admin roles to manage your tenant\n. Environment administrators can manage security and compliance features for owned environments by opening the\nSecurity\npage as explained in the following procedure.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nSecurity\n.\nIn the\nSecurity\npane, select the page that you want to open. You can open pages for the overview,\ndata protection and privacy\n,\nidentity and access management\n, and\ncompliance\n.\nNote\nOnly tenant administrators can access the scorecard and recommendations on the\nSecurity\n>\nOverview\npage.\nOnly tenant administrators can convert an environment to a managed type.\nOn every security page, features that apply to Managed Environments are marked with the following meter symbol:\nSecurity score (preview)\n[This section is prerelease documentation and is subject to change.]\nImportant\nThis is a preview feature.\nPreview features aren’t meant for production use and might have restricted functionality. 
These features are subject to\nsupplemental terms of use\n, and are available before an official release so that customers can get early access and provide feedback.\nThe security score is calculated based on the security features that are turned on in your environment. It provides a measurement of your organizational security position for Microsoft Power Platform and Dynamics 365 workloads.\nQualitative scale\n: The security score is shown on a qualitative scale that uses three assessment labels:\nLow\n: For scores from 0 through 50\nMedium\n: For scores from 51 through 80\nHigh\n: For scores from 81 through 100\nThe more security features are turned on in your environment, the higher your security score. The\nMedium\nand\nHigh\nassessment labels indicate that more recommended actions were taken and led to an improvement in the security position of the tenant.\nFeature impact\n: Each security feature is assigned a score, based on the feature's scope and the number of resources that are affected by turning it on or off. As new security features are added, the total possible score might change. Therefore, your overall score might be affected even if your settings remain the same.\nScore calculation formula\n: The security score is expressed as a percentage and is calculated by using the following formula:\n(\nYour score\n÷\nTotal possible score\n) × 100\nFor example, your tenant has 10 environments, five Managed Environments and five non-Managed Environments. The following features are configured:\nIP firewall\n: Turned on in two of the 10 environments (2 points).\nTenant isolation\n: Turned on in all 10 environments (10 points).\nEnvironment security group\n: Turned on in five of the 10 environments (5 points).\nIn this case, your total score is 2 + 10 + 5 = 17, and the total possible score is 30. Therefore, your security score is (17 ÷ 30) × 100 = 56.66%.\nImportant\nThe security score is updated every 24 hours. 
Therefore, any action that is taken might take up to 24 hours to reflect the updated score.\nThe score calculation considers all environments, both Managed Environments and non-Managed Environments.\nIf there are no Managed Environments that you can take action on in the recommendation pane, no environments are listed.\nTurn on environment management to unlock full security benefits\nNote\nThis feature is in the process of rolling out and might not be available in your region yet.\nTo ensure your organization benefits from the complete suite of managed security features, each environment must be configured as a managed environment.\nAs an admin, you can now view the percentage of environments in your tenant that are currently unmanaged. This new experience allows you to convert environments from unmanaged to managed at scale—with just a few clicks.\nSelect\nGet started\nto begin the conversion process. The\nGet enhanced security features\npane appears.\nSelect environments from the\nRecommended environments\ntab, which prioritizes environments based on data volume. Alternatively, switch to the\nAll eligible environments\ntab to manually select environments you want to convert.\nReview and accept the terms and conditions.\nSelect\nTurn on environment management\nto complete the conversion.\nIf you prefer to turn on environment management later, select\nNot now\nto dismiss the prompt and revisit when ready.\nBy using\nenvironment management\n, you’re taking a proactive step toward stronger, more consistent security across your organization.\nReactive governance through recommendations\nThe system generates various recommendations, based on common best practices that improve the security score of your tenant. 
Recommendations refer to actions or measures that the administrator can take to enhance the overall security status.\nAdministrators are guided through an intuitive experience where they take relevant actions on environments, based on specific recommendations.\nEach recommendation shows the potential increase to the overall security score.\nAlthough the recommendations span all environments, you can act on them only in Managed Environments. For non-Managed Environments, you can turn on recommended features by opening the\nSettings\npage, finding the required feature, and turning it on for those environments.\nConditions that trigger feature recommendations\nThe following table outlines the conditions that trigger specific feature recommendations.\nFeature\nScope\nCondition that triggers recommendations\nAdministrator privileges\nEnvironment\nEnvironments that have more than 10 administrators\nAuditing\nEnvironment\nEnvironments where auditing is turned off\nCustomer Lockbox\nTenant\nTenants where Customer Lockbox is turned on, but that have no Managed Environments\nClient application access control\nEnvironment\nEnvironments where auditing is turned on and client application access control isn't configured\nData policy\nTenant\nNo tenant-level policy is set.\nEnvironments Azure Virtual Network\nEnvironment\nEnvironments that have no Virtual Network policy\nEnvironment security group\nEnvironment\nEnvironments that have no security group\nGuest access\nEnvironment\nEnvironments where restricted guest access is turned off\nIP firewall\nEnvironment\nEnvironments where IP firewall isn't configured\nIP address-based cookie binding\nEnvironment\nEnvironments where IP address-based cookie binding isn't configured\nSharing\nEnvironment\nEnvironments that have no sharing limit\nTenant isolation\nTenant\nThe tenant isolation setting is turned off.\nManage proactive policies for governance and security\nSeveral security features are available to help secure your tenant. 
For some of these features, a Managed Environment is a prerequisite. Therefore, before you can configure such a feature, you're asked to convert the environment to a managed type if it isn't one.\nUse the following links to view and manage proactive policies for governance and security:\nData protection and privacy\n: Ensure that personal information is securely handled, stored, and protected; prevent unauthorized access to data; and protect apps and cloud workloads from network-based cyberattacks through features such as\ncustomer-managed keys\n, data policies, and Azure Virtual Network.\nIdentity and access management\n: Ensure that authorized users are the only people who can access sensitive data in items across the tenant, through features such as IP firewall, IP address-based cookie binding, tenant isolation, environment security groups, sharing controls, and guest access.\nCompliance\n: Implement robust compliance measures to safeguard organizational data and ensure adherence to industry regulations, through features such as\nCustomer Lockbox\nand auditing.\nDismiss recommendations\nAdministrators now have the ability to dismiss security recommendations that have been mitigated through alternative solutions. Previously, unaddressed recommendations could result in a stagnant security score, despite proactive measures taken outside the recommended solutions.\nDismissed recommendations no longer negatively impact the security score, ensuring an accurate reflection of the organization's security posture.\nYour dismissed recommendations are always accessible, meaning that you can review their history at any time. 
If circumstances change or you wish to revisit a previously dismissed recommendation, you can easily reactivate it to ensure continuous security optimization.\nTo dismiss a recommendation, complete the following steps.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nSecurity\n.\nIn the\nSecurity\npane, select\nOverview\n.\nThe\nOverview\npage appears. Scroll down to the\nTake action to increase your security score\nsection.\nIn the\nActive\ntab, select the recommendations that you want to dismiss.\nSelect the\nX\nicon to dismiss the recommendation.\nThe\nDismiss\nwindow is displayed. Select a reason for dismissing the recommendation from the dropdown list. Then select\nDismiss\n.\nThe recommendation moves to the\nDismissed\ntab.\nTo make a recommendation active again, complete the following steps.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nSecurity\n.\nIn the\nSecurity\npane, select\nOverview\n.\nThe\nOverview\npage appears. Scroll down to the\nTake action to increase your security score\nsection.\nSelect the\nDismissed\ntab.\nSelect the recommendation that you want to make active.\nSelect the\nArrows\nicon to make the recommendation active.\nThe recommendation moves to the\nActive\ntab.\nManage security settings at the environment group level\nManaging Power Platform at scale presents challenges for IT teams overseeing numerous environments. To streamline security governance, administrators can configure security settings at the\nenvironment group\nlevel, ensuring uniform enforcement of policies across all environments within a group.\nCurrently, security management at the environment group level is available for the\nSharing\n,\nIP Firewall\n, and\nIP address-based cookie binding\nfeatures, with plans to extend support to other security capabilities soon. 
This structured approach simplifies administration, enhances security, and optimizes large-scale environment management for both startups and enterprises.\nTo configure security settings at the environment group level, complete the following steps.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nSecurity\n.\nIn the\nSecurity\npane, select\nOverview\n.\nThe\nOverview\npage appears. Scroll down to the\nTake action to increase your security score\nsection.\nSelect a recommendation.\nIn the pane that is displayed, select the\nEnvironment groups\ntab and the\nEnvironments\ntab to select the environment groups or environments to which you want the security setting applied.\nSelect the\nManage sharing\nbutton.\nNote\nThe name of the button is determined by the security setting you're applying. In this specific example, we're applying a\nSharing\nsecurity setting, which is why\nManage sharing\nis the name of the button in this step.\nSelected settings are applied to all the environments in that environment group.\nProvide feedback\nEvery security page includes a\nFeedback\nbutton in the lower-right corner. Select this button to open a Microsoft Form where you can submit feedback and suggestions about the\nSecurity\npage and related features.\nFrequently asked questions (FAQ)\nHow is the security score calculated?\nThe security score is calculated based on the security features that are turned on in your environment. Each security feature is assigned a score, based on the feature's scope and the number of resources that are affected by turning it on or off. It's important to note that the total possible score might change as new security features are added. Therefore, your overall security score might be affected even if your settings remain the same.\nWhy don't all environments appear in the recommended action?\nAlthough the recommendations span all environments, you can act on them only in Managed Environments. 
For non-Managed Environments, you can turn on recommended features by opening the\nSettings\npage, finding the required feature, and turning it on for those environments.\nCan customers modify the recommendations based on their needs?\nNo. The recommendations are system-generated and are based on Microsoft best practices and guidance.\nWhen is the security score updated after I take recommended actions?\nAfter you take action to turn on the feature, it might take up to 24 hours for the change to be reflected in the overall security score. The security score isn't updated in real time.\nWhy don't administrator privileges work for environment administrators, such as the System Administrator role?\nThis issue is a known limitation. Only tenant administrators can manage the administrator privileges.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources",
- "last_checked": "2026-03-11T12:59:29.613756+00:00",
+ "last_checked": "2026-03-14T06:51:10.083468+00:00",
"last_status": 200,
"last_changed": "2026-03-11T12:59:29.613756+00:00",
"topic": "Security",
@@ -152,7 +152,7 @@
"https://learn.microsoft.com/en-us/power-platform/admin/security-roles-privileges": {
"content_hash": "sha256:535732ea1924b96efd26b91885b91b15349f8053f624ee3cae8c16081cab0425",
"normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nSecurity roles and privileges for Dataverse\nFeedback\nSummarize this article for me\nTo control who can access restricted or sensitive data and resources and what they can do with them, assign users to security roles. This article provides an overview of security roles and their associated privileges.\nSecurity roles for users\nSecurity roles define how different users access different types of records. To control access to data and resources, you can create or modify security roles and change the security roles that are assigned to users.\nA user can have multiple security roles. Security role privileges are cumulative. Users are granted the privileges that are available in each role assigned to them.\nView a list of security roles in an environment\nTo view a list of security roles for an environment, take the following steps:\nSign in to the\nPower Platform admin center\n.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. Then select an environment.\nSelect\nSettings\non the command bar. The\nSettings\npage for that environment is displayed.\nSelect\nUsers + Permissions\n>\nSecurity roles\n.\nRole name and description of a security role\nGive the Dataverse security role a descriptive name, include a brief statement of its purpose, define the\nApplies To\nscope (such as the service or application where the role is enforced), and summarize the key business tables for which the role grants permissions.\nView and update the security role description\nNote\nThe description, applies to, and summary are protected and cannot be updated for system security roles.\nSign in to the\nPower Platform admin center\n.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. Then select an environment.\nSelect\nSettings\nin the command bar. 
The\nSettings\npage for that environment is displayed.\nSelect\nUsers + Permissions\n>\nSecurity roles\n.\nSelect a security role and select\nSettings\non the action bar, or select a security role and then select the\nMore actions (...)\nicon, and select\nSettings\n.\nDefine the privileges and properties of a security role\nAfter you\ncreated a security role\nor while you're\nediting one\n, set the\nMember's privilege inheritance\noption:\nTeam privileges only\n: A user is granted these privileges as a member of a team. Team members who don't have user privileges of their own can create records with the team as the owner. They can access records that the team owns if they're given the\nUser\naccess level for Create and Read privileges.\nDirect User (Basic) access level and Team privileges\n: A user is granted these privileges directly when the security role is assigned. Users can create records with themselves as the owner. They can access records that they created or owned when the\nUser\naccess level for Create and Read privileges was given to them. 
This setting is the default for new security roles.\nThen, configure the privileges associated with the security role.\nA security role consists of record-level privileges and task-based privileges of the following three types:\nTables:\nTable privileges define which tasks a user with access to a table record can do, such as Read, Create, Delete, Write, Assign, Share, Append, and Append To.\nAppend\nmeans to attach another record, such as an activity or note, to a record.\nAppend to\nmeans to be attached to a record.\nSet table privileges\n.\nMiscellaneous privileges:\nThese task-based privileges give a user permission to perform specific, miscellaneous (nonrecord) tasks, such as publishing articles or activating business rules.\nLearn more about miscellaneous privileges\n.\nPrivacy-related privileges\n: These privileges give a user permission to perform tasks that involve data that's integrated, downloaded, or exported outside of Dataverse, such as exporting data to Microsoft Excel or printing.\nLearn more about privacy-related privileges\n.\nEach set of privilege types has its own tab. For each tab, you can filter the view by all privileges, assigned privileges, or unassigned privileges for the selected security role.\nTable privileges\nThe\nTables\ntab lists the Dataverse tables in the environment. 
The following table describes the attributes that are shown in the security role editor when the\nCompact Grid View\noption is off.\nProperty\nDescription\nTable\nThe name of the Dataverse table\nName\nThe logical name of the Dataverse table; helpful for developers\nRecord ownership\nWhether records are owned by the organization or business unit or can be owned by a user or team\nPermission Settings\nWhich predefined set of permissions the table is using, or custom permissions\nTables are grouped into the following categories:\nBusiness Management\nBusiness Process Flows\nCore Records\nCustom Tables\nCustomization\nMissing Tables\nSales\nService\nService Management\nTo quickly find a specific table or privilege, enter its name in the search box at the upper-right corner of the page, and then select the magnifying glass icon or press\nEnter\n. To clear your search, select the\nX\nicon.\nYou can only edit one table at a time, but you can copy settings from one table to multiple tables in a single action.\nWhen you configure a security role, you need to determine the privileges it should grant for each table related to the application.\nThe following table describes the table privileges you can grant in a security role. 
In all cases, which records a privilege applies to depends on the access level of the permission defined in the security role.\nPrivilege\nDescription\nCreate\nRequired to make a new record\nRead\nRequired to open a record to view the contents\nWrite\nRequired to make changes to a record\nDelete\nRequired to permanently remove a record\nAppend\nRequired to associate the current record with another record; for example, if users have Append rights on a note, they can attach the note to an opportunity\nFor many-to-many relationships, a user must have Append privilege for both tables being associated or disassociated.\nAppend to\nRequired to associate a record with the current record; for example, if users have Append To rights on an opportunity, they can add a note to the opportunity\nAssign\nRequired to give ownership of a record to another user\nShare\nRequired to give access to a record to another user while keeping your own access\nAccess levels\nEach privilege has a menu that allows you to define its\naccess level\n. Access levels determine how deep in the business unit hierarchy the user can perform the privilege.\nThe following table describes the levels of access. For organization-owned tables, miscellaneous privileges and privacy-related privileges only have access levels of\nOrganization\nor\nNone\n.\nType\nDescription\nOrganization\nUsers can access all records in the organization, regardless of the business unit hierarchical level they or the environment belong to. Users with organization access automatically have all other types of access as well.\nBecause this level gives access to information throughout the organization, it should be restricted to match the organization's data security plan. 
This level of access is reserved for managers with authority over the organization.\nParent: Child Business Unit\nUsers can access records in their business unit and all business units subordinate to it.\nUsers with this access automatically have business unit and user access.\nBecause this level gives access to information throughout the business unit and subordinate business units, it should be restricted to match the organization's data security plan. This level of access is reserved for managers with authority over the business units.\nBusiness Unit\nUsers can access records in their business unit.\nUsers with business unit access automatically have user access.\nBecause this access level gives access to information throughout the business unit, it should be restricted to match the organization's data security plan. This level of access is reserved for managers with authority over the business unit.\nUser\nUsers can access records they own, objects that are shared with the organization, objects that are shared with them, and objects that are shared with a team that they're a member of.\nThis level of access is typical for sales and service representatives.\nNone\nNo access is allowed.\nFor each table, select the appropriate type for each privilege. Select\nSave\nwhen you're finished.\nCopy table permissions\nSetting the privileges for each table in your app can be time-consuming and tedious. 
To make it easier, you can copy the permissions from one table to one or more other tables.\nTip\nCreate your new security roles by copying the\npredefined template security roles\nin an environment.\nUse the\nApp Opener\nrole, which has the minimum privileges to run an app.\nUse the\nBasic User\nrole for the minimum privileges, plus privileges to the core business tables.\nSelect a table, and then select\nCopy table permissions\n.\nSearch for and select the table or tables you want to copy the permissions to.\nRemember, the new configuration overwrites any previous settings.\nSelect\nSave\n.\nLet's take a closer look at how copying table permissions works with privileges and access levels.\nFor permissions that exist in both the source table and the target tables:\nIf the source permission settings depth exists in the target, then the copy is successful.\nIf the source permission settings depth\ndoesn't\nexist in the target, then the copy fails and an error message is displayed.\nFor permissions that only exist in either the source table or the target tables:\nIf the permission exists in the source but not in the target, then the permission is ignored in the target. The copy for the remaining permissions is successful.\nIf the permission\ndoesn't\nexist in the source but does exist in the target, then the depth of the permission is retained in the target. 
The copy for the remaining permissions is successful.\nPermission settings\nAnother way to speed up the configuration of table permissions is to use predefined groups of permissions and assign them to tables.\nThe following table describes the permission setting groups that you can assign.\nPermission setting\nDetails\nNo Access\nNo users can access the table.\nFull Access\nUsers can view and edit all records in the table.\nCollaborate\nUsers can view all records, but they can only edit their own.\nPrivate\nUsers can only view and edit their own records.\nReference\nUsers can only view records, not edit them.\nCustom\nIndicates that permission settings have been changed from the default value.\nSelect a table, and then select\nPermission Settings\nin the command bar or select\nMore Actions\n(\n…\n) >\nPermission Settings\n.\nSelect the appropriate setting.\nRemember, the new configuration overwrites any previous settings.\nSelect\nSave\n.\nAdd users to a security role\nFollow these steps to add users to a security role.\nSign in to the\nPower Platform admin center\n.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. Then select an environment.\nSelect\nSettings\nin the command bar. The\nSettings\npage for that environment is displayed.\nSelect\nUsers + permissions\n>\nSecurity roles\n.\nSelect a security role and then select the\nMore actions\n(\n...\n) icon.\nSelect\nMembers\nin the menu that appears.\nIn the\nMembers\npage, select\n+ Add people\n.\nIn the\nAdd people\npane, enter a name, email address, or team name to search for the users you want to add to the security role.\nSelect\nAdd\nto add those users to the security role.\nRemove users from a security role\nYou can remove users from a security role through the modern UI. Follow these steps to remove users from a security role.\nSign in to the\nPower Platform admin center\n.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. 
Then select an environment.\nSelect\nSettings\nin the command bar. The\nSettings\npage for that environment is displayed.\nSelect\nUsers + permissions\n>\nSecurity roles\n.\nSelect a security role and then select the\nMore actions\n(\n...\n) icon.\nSelect\nMembers\nin the menu that appears.\nIn the\nMembers\npage, select the users you want to remove from the security role.\nSelect\nRemove\nat the top of the page.\nThe\nRemove from role?\nwindow appears, asking you to confirm that you want privileges associated with that role removed for the selected user. Select\nRemove\n.\nRelated information\nVideo: Administer application users, security roles, teams, and users in the Power Platform admin center\nVideo: Check Access feature\nPredefined security roles\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Security Roles", @@ -161,7 +161,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/create-edit-security-role": { "content_hash": "sha256:d35d0bd828bacc80b551fbc21bd448187950fa9a13ec21d667587a4560c8fe9a", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nCreate or edit a security role to manage access\nFeedback\nSummarize this article for me\nCreate security roles or edit the privileges associated with an existing security role to accommodate changes in your business requirements. You can\nexport your changes as a solution\nto make a backup or for use in a different implementation.\nThis article also helps you make sure that your users have a security role with the minimum privileges that are needed for common tasks like opening model-driven apps. Be sure to watch the video in\nMinimum privileges for common tasks\n.\nPrerequisites\nMake sure you have the System Administrator permission\n. If you don't, contact your system administrator.\nCreate a security role\nTo create a security role, take these steps:\nSign in to the\nPower Platform admin center\n.\nSelect\nManage\non the navigation pane.\nOn the\nManage\npane, select\nEnvironments\nand then select an environment.\nOn the command bar, select\nSettings\n.\nExpand\nUsers + permissions\nand select\nSecurity roles\n.\nOn the command bar, select\n+ New role\nto access the\nCreate New Role\npanel.\nEnter a\nrole name\n.\nSelect a business unit from the dropdown.\nEnter a\ndescription\n. For example, a brief statement of its purpose.\nEnter an\napplies to\n. For example, identify the service or application where this role is used.\nEnter a\nsummary of core table privileges\n. For example, the key business tables for which the role grants permissions.\nTo allow team members to inherit the privileges of this role when you assign it to a team, accept the default\nMember's privilege inheritance\nsetting, which is\nDirect User (Basic) access level and Team privileges\n. 
Learn more about the\nMember's privilege inheritance\nsetting in\nSecurity roles and privileges\n.\nTo use the new role to run model-driven apps, accept the default\nInclude App Opener privileges for running Model-Driven apps\nsetting, which is set to\nOn\n.\nSelect\nSave\n. The new role's properties are displayed.\nGrant table privileges\nTo grant table privileges, follow these steps:\nYou need to grant your app's table privileges to this newly created security role. Review and update the default privileges copied from the\nApp Opener security role's minimum privileges for common tasks\n. Some privileges grant organization-level read access, such as process (flows), that allow the user to run system-supplied flows. If your app or user doesn't need to run system-supplied flows, you can change this privilege to\nUser\n(basic) level.\nEnter your table name in the\nSearch\ninput field to find your app's table.\nSelect your table and set the permission settings.\nOn the command bar, select\nSave\n.\nRepeat the steps to grant table privileges to each table in your app.\nCreate a security role by copying an existing role\nTo create a security role by copying an existing role, follow these steps:\nSign in to the\nPower Platform admin center\n.\nSelect\nManage\non the navigation pane.\nOn the\nManage\npane, select\nEnvironments\nand then select an environment.\nOn the command bar, select\nSettings\n.\nExpand\nUsers + permissions\nand select\nSecurity roles\n.\nChoose the security role you want to copy.\nOn the command bar, select\nCopy security role\nto display the\nCopy role\ndialog box.\nEnter a\nname\nfor the new role. Select\nCopy\n.\nEnter a\ndescription\n. For example, a brief statement of its purpose.\nEnter an\napplies to\n. For example, identify the service or application where this role is used.\nEnter a\nsummary of core table privileges\n. 
For example, the key business tables for which the role grants permissions.\nGo back to the\nSecurity roles\npage and select the new role you created.\nSpecify privileges for the security role. For more information, see\nSecurity roles and privileges\n.\nSelect\nSave + close\n.\nEdit settings of a security role\nTo edit settings, like name, description, applies to, and summary, of a security role, take these steps:\nNote\nYou can't edit the settings of system security roles.\nSign in to the\nPower Platform admin center\n.\nSelect\nManage\non the navigation pane.\nOn the\nManage\npane, select\nEnvironments\nand then select an environment.\nOn the command bar, select\nSettings\n.\nExpand\nUsers + permissions\nand select\nSecurity roles\n.\nChoose the security role you want to edit.\nOn the command bar, select\nSettings\n.\nUpdate the\nname\n.\nUpdate the\ndescription\n. For example, a brief statement of its purpose.\nUpdate the\napplies to\n. For example, identify the service or application where this role is used.\nUpdate the\nsummary of core table privileges\n. For example, the key business tables for which the role grants permissions.\nSelect\nSave + close\n.\nEdit privileges of a security role\nBefore you edit a security role, make sure you understand the principles of\ncontrolling data access\n. To edit privileges of a security role, take these steps:\nNote\nYou can't edit the System Administrator security role. 
Instead, copy the System Administrator security role and make changes to the new role.\nSign in to the\nPower Platform admin center\n.\nSelect\nManage\non the navigation pane.\nOn the\nManage\npane, select\nEnvironments\nand then select an environment.\nOn the command bar, select\nSettings\n.\nExpand\nUsers + permissions\nand select\nSecurity roles\n.\nChoose the security role you want to edit.\nSpecify privileges for the security role.\nSelect\nSave + close\n.\nMinimum privileges for common tasks\nMake sure that your users have a security role with the minimum privileges that they need for common tasks like opening model-driven apps.\nDon't use the\nmin prv apps use\nrole\nthat's available in the Microsoft Download Center. It's retiring soon. Instead, use or\ncopy the predefined security role App Opener\n, and then set the appropriate privileges.\nTo allow users to open a model-driven app or any Dynamics 365 customer engagement app, assign the\nApp Opener\nrole.\nTo allow users to view tables, assign the following privileges:\nCore Records:\nRead privilege on the table, Read Saved View, Create/Read/Write User Entity UI Settings\nand assign the following privilege on the Business Management tab: Read User.\nWhen signing in to Dynamics 365 for Outlook:\nTo render navigation for customer engagement apps and all buttons: assign the min prv apps use security role or a copy of this security role to your user\nTo render a table grid: assign Read privilege on the table\nTo render tables: assign Read privilege on the table\nPrivacy notices\nLicensed Dynamics 365 Online users with specific security roles are automatically authorized to access the service by using Dynamics 365 for phones, and other clients. 
Examples of authorized roles include: CEO, Business Manager, Sales Manager, Salesperson, System Administrator, System Customizer, and Vice President of Sales.\nAn admin has full control, at the user's security role or entity level, over the ability to access and the level of authorized access associated with the phone client. Users can then access Dynamics 365 Online by using Dynamics 365 for phones. Customer data will be cached on the device running the specific client.\nBased on the specific settings at the user security and entity levels, the types of customer data that can be exported from Dynamics 365 Online and cached on an end user’s device include record data, record metadata, entity data, entity metadata, and business logic.\nThe Dynamics 365 for tablets and phones, and Project Finder for Dynamics 365 (the \"App\") enables users to access their Microsoft Dynamics CRM or Dynamics 365 instance from their tablet and phone device. In order to provide this service, the App processes and stores information, such as user's credentials and the data the user processes in Microsoft Dynamics CRM or Dynamics 365. The App is provided for use only by end users of Microsoft customers who are authorized users of Microsoft Dynamics CRM or Dynamics 365. The App processes user's information on behalf of the applicable Microsoft customer, and Microsoft may disclose information processed by the App at the direction of the organization that provides users access to Microsoft Dynamics CRM or Dynamics 365. 
Microsoft does not use information users process via the App for any other purpose.\nIf users use the App to connect to Microsoft Dynamics CRM (online) or Dynamics 365, by installing the App, users consent to transmission of their organization's assigned ID and assigned end user ID, and device ID to Microsoft for purposes of enabling connections across multiple devices, or improving Microsoft Dynamics CRM (online), Dynamics 365 or the App.\nLocation data.\nIf users request and enable location-based services or features in the App, the App may collect and use precise data about their location. Precise location data can be Global Positioning System (GPS) data, as well as data identifying nearby cell towers and Wi-Fi hotspots. The App may send location data to Microsoft Dynamics CRM or Dynamics 365. The App may send the location data to Bing Maps and other third party mapping services, such as Google Maps and Apple Maps, that a user designated in the user's phone to process the user's location data within the App. Users may disable location-based services or features or disable the App's access to user's location by turning off the location service or turning off the App's access to the location service. Users' use of Bing Maps is governed by the Bing Maps End User Terms of Use available at\nhttps://go.microsoft.com/?linkid=9710837\nand the Bing Maps Privacy Statement available at\nhttps://go.microsoft.com/fwlink/?LinkID=248686\n. Users' use of third party mapping services, and any information users provide to them, is governed by their service specific end user terms and privacy statements. Users should carefully review these other end user terms and privacy statements.\nThe App may include links to other Microsoft services and third party services whose privacy and security practices may differ from those of Microsoft Dynamics CRM or Dynamics 365. 
IF USERS SUBMIT DATA TO OTHER MICROSOFT SERVICES OR THIRD PARTY SERVICES, SUCH DATA IS GOVERNED BY THEIR RESPECTIVE PRIVACY STATEMENTS. For the avoidance of doubt, data shared outside of Microsoft Dynamics CRM or Dynamics 365 is not covered by users' Microsoft Dynamics CRM or Dynamics 365 agreement(s) or the applicable Microsoft Dynamics Trust Center. Microsoft encourages users to review these other privacy statements.\nLicensed Dynamics 365 Online users with specific Security Roles (CEO – Business Manager, Sales Manager, Salesperson, System Administrator, System Customizer, and Vice President of Sales) are automatically authorized to access the service by using Dynamics 365 for tablets, as well as other clients.\nAn administrator has full control (at the user security role or entity level) over the ability to access and the level of authorized access associated with the tablet client. Users can then access Dynamics 365 (online) by using Dynamics 365 for tablets, and Customer Data will be cached on the device running the specific client.\nBased on the specific settings at the user security and entity levels, the types of Customer Data that can be exported from Dynamics 365 (online) and cached on an end user’s device include record data, record metadata, entity data, entity metadata, and business logic.\nIf you use Microsoft Dynamics 365 for Outlook, when you go offline, a copy of the data you are working on is created and stored on your local computer. The data is transferred from Dynamics 365 (online) to your computer by using a secure connection, and a link is maintained between the local copy and Dynamics 365 Online. 
The next time you sign in to Dynamics 365 (online), the local data will be synchronized with Dynamics 365 (online).\nAn administrator determines whether or not an organization’s users are permitted to go offline with Microsoft Dynamics 365 for Outlook by using security roles.\nUsers and administrators can configure which entities are downloaded via Offline Sync by using the\nSync Filters\nsetting in the\nOptions\ndialog box. Alternatively, users and Administrators can configure which fields are downloaded (and uploaded) by using\nAdvanced Options\nin the\nSync Filters\ndialog box.\nIf you use Dynamics 365 (online), when you use the Sync to Outlook feature, the Dynamics 365 data you are syncing is “exported” to Outlook. A link is maintained between the information in Outlook and the information in Dynamics 365 (online) to ensure that the information remains current between the two. Outlook Sync downloads only the relevant Dynamics 365 record IDs to use when a user attempts to track and set regarding an Outlook item. The company data is not stored on the device.\nAn administrator determines whether your organization’s users are permitted to sync Dynamics 365 data to Outlook by using security roles.\nIf you use Microsoft Dynamics 365 (online), exporting data to a\nstatic\nworksheet creates a local copy of the exported data and stores it on your computer. The data is transferred from Dynamics 365 (online) to your computer by using a secure connection, and no connection is maintained between this local copy and Dynamics 365 (online).\nWhen you export to a\ndynamic\nworksheet or PivotTable, a link is maintained between the Excel worksheet and Dynamics 365 (online). Every time a dynamic worksheet or PivotTable is refreshed, you’ll be authenticated with Dynamics 365 (online) using your credentials. 
You’ll be able to see the data that you have permissions to view.\nAn administrator determines whether or not an organization’s users are permitted to export data to Excel by using security roles.\nWhen Dynamics 365 (online) users print Dynamics 365 data, they are effectively “exporting” that data from the security boundary provided by Dynamics 365 (online) to a less secure environment, in this case, to a piece of paper.\nAn administrator has full control (at the user security role or entity level) over the data that can be extracted. However, after the data has been extracted it is no longer protected by the security boundary provided by Dynamics 365 (online) and is instead controlled directly by the customer.\nRelated content\nSecurity concepts\nPredefined security roles\nCopy a security role\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Create Security Roles", @@ -170,7 +170,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/database-security": { "content_hash": "sha256:59b6466c9a3a08063770cdf73f908b7525b88fa28c2197f79ddb4e7fed20f9fc", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nRole-based security roles for Dataverse\nFeedback\nSummarize this article for me\nMicrosoft Dataverse uses a role-based security model to control access to a database and its resources in an environment. Use security roles to configure access to all resources in an environment or to specific apps and data in the environment. A combination of access levels and permissions in a security role determines which apps and data users can view and how they can interact with those apps and data.\nAn environment can have zero or one Dataverse database. You assign security roles differently for\nenvironments that have no Dataverse database\nand\nenvironments that have a Dataverse database\n.\nPredefined security roles\nEnvironments include predefined security roles that reflect common user tasks. The predefined security roles follow the security best practice of \"minimum required access\": provide the least access to the minimum business data that a user needs to use an app. These security roles can be assigned to a user,\nowner team\n, and\ngroup team\n. The predefined security roles that are available in an environment depend on the environment type and the apps installed in it.\nAnother set of security roles is assigned to\napplication users\n. Those security roles are installed by our services and can't be updated.\nEnvironments without a Dataverse database\nEnvironment Maker and Environment Admin are the only predefined roles for environments that have no Dataverse database. These roles are described in the following table.\nSecurity role\nDescription\nEnvironment Admin\nThe Environment Admin role can perform all administrative actions on an environment, including:\nAdd or remove a user from either the Environment Admin or Environment Maker role.\nProvision a Dataverse database for the environment. 
After a database is provisioned, assign the System Customizer role to an Environment Admin to give them access to the environment's data.\nView and manage all resources created in an environment.\nCreate\ndata loss prevention policies\n.\nEnvironment Maker\nCan create new resources associated with an environment, including apps, connections, custom APIs, and flows using Microsoft Power Automate. However, this role doesn't have privileges to access data in an environment.\nEnvironment makers can also\ndistribute the apps they build\nin an environment to other users in your organization. They can share the app with individual users, security groups, or all users in the organization.\nEnvironments with a Dataverse database\nIf the environment has a Dataverse database, a user must be assigned the System Administrator role instead of the Environment Admin role to have full admin privileges.\nUsers who make apps that connect to the database and need to create or update entities must have the System Customizer role in addition to the Environment Maker role. The Environment Maker role doesn't have privileges on the environment's data. These security roles do not have the privileges to create or update security roles.\nThe following table describes the predefined security roles in an environment that has a Dataverse database. You can't edit these roles.\nSecurity role\nDescription\nApp Opener\nHas\nminimum privileges for common tasks\n. This role is primarily used as a template to\ncreate a custom security role\nfor model-driven apps. It doesn't have any privileges to the core business tables, such as Account, Contact, and Activity. However, it has\nOrganization\n-level read access to system tables, such as\nProcess\n, to support reading system-supplied workflows. This security role is used when a\nnew, custom security role is created\n.\nBasic User\nFor out-of-the-box entities only, can run an app in the environment and perform common tasks on the records they own. 
It has privileges to the core business tables, such as Account, Contact, Activity, and Process.\nNote\n: The Common Data Service\nUser\nsecurity role was renamed\nBasic User\n. Only the name was changed; user privileges and role assignment are the same. If you have a solution with the Common Data Service\nUser\nsecurity role, you should update the solution before you import it again. Otherwise, you might inadvertently change the security role name back to\nUser\nwhen you import the solution.\nDelegate\nAllows code to\nimpersonate\n, or run as, another user\n. Typically used with another security role to allow access to records.\nDynamics 365 Administrator\nDynamics 365 administrator\nis a Microsoft Power Platform service admin role. Users of this role can do admin functions on Microsoft Power Platform after they\nself-elevate\nto the system administrator role.\nEnvironment Maker\nCan create new resources associated with an environment, including apps, connections, custom APIs, and flows using Microsoft Power Automate. However, this role doesn't have any privileges to access data in an environment.\nEnvironment makers can also\ndistribute the apps they build\nin an environment to other users in your organization. They can share the app with individual users, security groups, or all users in the organization.\nGlobal Administrator\nGlobal administrator\nis a Microsoft 365 administrator role. A person who purchases the Microsoft business subscription is a global administrator and has unlimited control over products in the subscription and access to most data. Users of this role must\nself-elevate\nto the system administrator role.\nGlobal Reader\nThe\nGlobal Reader\nrole isn't supported yet in the Power Platform admin center.\nOffice Collaborator\nHas Read permission to tables in which a record was shared with the organization. Doesn't have access to any other core and custom table records. 
This role is assigned to the Office Collaborators owner team and not to an individual user.\nPower Platform administrator\nPower Platform administrator\nis a Microsoft Power Platform service administrator role. Users of this role can do admin functions on Microsoft Power Platform after they\nself-elevate\nto the system administrator role.\nService Deleted\nHas full Delete permission to all entities, including custom entities. This role is primarily used by the service and requires deleting records in all entities.\nThis role can't be assigned to a user or team.\nService Reader\nHas full Read permission to all entities, including custom entities. This role is primarily used by the service and requires reading all entities.\nThis role can't be assigned to a user or team.\nService Writer\nHas full Create, Read, and Write permission to all entities, including custom entities. This role is primarily used by the service and requires creating and updating records.\nThis role can't be assigned to a user or team.\nSupport User\nHas full Read permission to customization and business management settings, which allow support staff to troubleshoot environment configuration issues. This role doesn't have access to core records.\nThis role can't be assigned to a user or team.\nSystem Administrator\nHas full\npermission to customize\nor administer the environment, including creating, modifying, and assigning security roles. Can view all data in the environment.\nSystem Customizer\nHas full\npermission to customize the environment\n. Can view all custom table data in the environment. 
However, users with this role can only view records that they create in Account, Contact, Activity tables.\nWebsite App Owner\nA user who owns the\nwebsite application registration\nin the\nAzure portal\n.\nWebsite Owner\nThe user who created the Power Pages website.\nThis role is managed and can't be changed.\nIn addition to the predefined security roles described for Dataverse, other security roles might be available in your environment depending on the Power Platform components—Power Apps, Power Automate, Microsoft Copilot Studio—you have. The following table provides links to more information.\nPower Platform component\nInformation\nPower Apps\nPredefined security roles for environments with a Dataverse database\nPower Automate\nSecurity and privacy\nPower Pages\nRoles required for website administration\nMicrosoft Copilot Studio\nAssign environment security roles\nDataverse for Teams environments\nLearn more about\npredefined security roles in Dataverse for Teams environments\n.\nApp-specific security roles\nIf you deploy Dynamics 365 apps in your environment, other security roles are added. 
The following table provides links to more information.\nDynamics 365 app\nSecurity role docs\nDynamics 365 Sales\nPredefined security roles for Sales\nDynamics 365 Marketing\nSecurity roles added by Dynamics 365 Marketing\nDynamics 365 Field Service\nDynamics 365 Field Service roles + definitions\nDynamics 365 Customer Service\nRoles in Omnichannel for Customer Service\nDynamics 365 Customer Insights\nCustomer Insights roles\nApp profile manager\nRoles and privileges associated with app profile manager\nDynamics 365 Finance\nSecurity roles in the public sector\nFinance and operations apps\nSecurity roles in Microsoft Power Platform\nSummary of resources available to predefined security roles\nThe following table describes which resources each security role can author.\nResource\nEnvironment Maker\nEnvironment Admin\nSystem Customizer\nSystem Admin\nCanvas app\nX\nX\nX\nX\nCloud flow\nX (non–solution-aware)\nX\nX\nX\nConnector\nX (non–solution-aware)\nX\nX\nX\nConnection\n*\nX\nX\nX\nX\nData gateway\n-\nX\n-\nX\nDataflow\nX\nX\nX\nX\nDataverse tables\n-\n-\nX\nX\nModel-driven app\nX\n-\nX\nX\nSolution framework\nX\n-\nX\nX\nDesktop flow\n**\n-\n-\nX\nX\nAI Builder\n-\n-\nX\nX\n*Connections are used in\ncanvas apps\nand\nPower Automate\n.\n**Dataverse for Teams users don't get access to desktop flows by default. 
You need to upgrade your environment to full Dataverse capabilities and acquire\ndesktop flow license plans\nto use desktop flows.\nRelated content\nAssign a security role to a user\nSecurity roles and privileges\nHow access to a record is determined\nConfigure user security in an environment\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Database Security", @@ -179,7 +179,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/field-level-security": { "content_hash": "sha256:315fe900ab98ec6dcbc9788fb2ebd49c47e507567166d5498aaf7a777b003005", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nColumn-level security to control access\nFeedback\nSummarize this article for me\nManage access to records at the table level using\nprivileges associated with security roles\n. Some columns in a table might contain data that is more sensitive than others. Use column-level security to manage access to data in specific columns. Column-level security configurations are organization-wide and apply to all data access requests.\nYou can use column level security to prevent certain users from:\nSetting the value of a column in a record.\nViewing the data in a column. 
You can choose to mask this value to show a portion of it, or not return any data at all.\nNote\nTo configure column-level security, you need the system administrator role.\nColumn-level security doesn't apply for users who have the system administrator role. Data is never hidden from system administrators. To verify the configured results, you must use an account that doesn't have the system administrator security role assigned.\nColumn-level security is available\nfor most columns\nusing this process:\nEnable column-level security\non one or more columns for a given table.\nOptionally, select a\nmasking rule\n.\nAssociate one or more existing security profiles\n, or create one or more new security profiles to grant the appropriate access to specific users or teams.\nEnable column security\nUse the following steps to secure a column:\nSign in to\nPower Apps\n.\nSelect\nSolutions\n.\nSelect the unmanaged solution that contains the table that has the column, or create a new solution to hold your changes and add the table to it.\nWithin the solution, in\nObjects\n, within\nTables\n, select the table.\nUnder\nSchema\n, select\nColumns\n.\nIn the\nColumns\nlist, select a column.\nExpand\nAdvanced options\n, and then under\nGeneral\n, select\nEnable column security\n.\nSelect\nSave\n.\nTip\nLearn how a developer can retrieve a list of all the secured columns in an environment\nor\nsecure a column using code\nAdd teams or users to a column security profile to control access\nA column security profile determines:\nUsers and teams assigned access.\nPermissions to the secure columns.\nUse a column security profile to grant user or team members the following permissions:\nPermission\nOptions\nResult\nRead\nAllowed\nNot Allowed\nWhether people can view the data for the column.\nMasked values are shown if a masking rule is applied to the column.\nRead unmasked\nAll Records\nOne record\nNot Allowed\nWhen a secured column has a masking rule, a developer can write code to request 
unmasked data be returned.\nThis setting controls whether or not that request succeeds.\nThe default setting is\nNot Allowed\n.\nLearn more about granting permissions to a secured column with a masking rule\nUpdate\nAllowed\nNot Allowed\nWhether people can update the data in the column.\nCreate\nAllowed\nNot Allowed\nWhether people can set the data in the column when creating a record.\nConfigure a combination of these four permissions to determine the user privileges for a specific data column.\nImportant\nUnless one or more security profiles are assigned to a column with security, only users with the system administrator security role can access the column.\nAny users not defined in the column security profiles won't have access to the column on forms or views. The column value displays\n********, indicating that the column is secured.\nAdd a column and set permissions for a column security profile\nTo add a column and set permissions for a column security profile, use the following steps:\nSign in to the \nPower Platform admin center\n.\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. Then select an environment.\nSelect\nSettings\n>\nUsers + permissions\n>\nColumn security profiles\n.\nSelect an existing profile, or select\nNew Profile\n, enter a name, enter a description, and then select\nSave\n.\nSelect the\nTeams\nor\nUsers\ntab, select\n+ Add Teams\nor\n+ Add Users\n, select the teams or users whose access you want to control, and then select\nAdd\n.\nSelect the\nColumn Permission\ntab, in the\nName\ncolumn select one or more columns, and then select\nEdit\n. Configure the four properties for the desired access. 
These permissions control whether people in this security profile can read or set column values.\nSelect\nSave\n.\nTip\nLearn how a developer can provide access to secured columns using code\nWhich columns can be secured?\nWhen a column is eligible for column-level security, the\nEnable column security\ncheckbox is enabled in the\nAdvanced options\narea of the column definition in\nPower Apps\n.\nYou can view this area when you\ncreate or edit a column\n.\nColumns that can't be secured include:\nColumns in virtual tables\nLookup columns\nFormula columns\nPrimary name columns (The single-line of text column each table has to show the value in a lookup field. Typically with a name ending with\nname\n.)\nSystem columns like\ncreatedon\n,\nmodifiedon\n,\nstatecode\n, and\nstatuscode\n.\nNote\nFile and Image data types can be secured, but they can't be masked.\nText data type with Rich text format can be secured, but an embedded image in Rich text can't be masked or bypassed for masking.\nWhether the\nEnable column security\ncheckbox is enabled depends on the value of these column properties:\nCanBeSecuredForCreate\n,\nCanBeSecuredForRead\n, and\nCanBeSecuredForUpdate\n. You can view this data by installing the Metadata Browser solution described in\nBrowse table definitions in your environment\n.\nTip\nLearn how a developer can query Dataverse to get a list of all the columns that can be secured\nBest practices\nWhen a\ncalculated column\nincludes a column that is secured, data might be displayed in the calculated column to users that don't have permission to the secured column. Both the original column and the calculated column should be secured.\nComposite columns\ninclude data from multiple columns. For example, the\ncontact\ntable\nfullname\nand\naddress1_composite\ncolumns are composite columns. To completely secure data included in composite columns, you must secure and configure the appropriate column security profiles on multiple columns for the table. 
For example, to completely secure the\naddress1_composite\ncolumn, you need to secure all of these the columns that begin with\naddress1_\nin both the\ncontact\nand\naddress (\ncustomeraddress\n)\ntables.\nNote\nChanges to column security require a browser refresh from the end user on the client (like a model-driven app) for the changes to take effect. This should be considered when dynamically adjusting access rules.\nActivity logging data\nThe column values in the before-and-after audit change events show as \"*\" in the\nCreate\nand\nUpdate\nPurview activity logs\n.\nRelated information\nEnable or disable security for a column to control access\nColumn-level security example\nHierarchy security\nColumn-level security with code\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Column-Level Security", @@ -188,7 +188,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/manage-high-privileged-admin-roles": { "content_hash": "sha256:c6268573cbb2109bf57439b9b46dd6d7bb810fe3795f760174b98d4fd9a3077e", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nManage admin roles with Microsoft Entra Privileged Identity Management\nFeedback\nSummarize this article for me\nManage high-privileged admin roles in the Power Platform admin center using Microsoft Entra Privileged Identity Management (PIM).\nPrerequisites\nRemove old system administrator role assignments in your environments. You can use\nPowerShell scripts\nto inventory and remove unwanted users from the\nSystem Administrator\nrole in one or more Power Platform environments.\nChanges to feature support\nMicrosoft no longer automatically assigns the\nSystem Administrator\nrole to users with global or service level admin roles such as Power Platform Administrator and Dynamics 365 Administrator.\nThese admins can continue to sign in to the Power Platform admin center with these privileges:\nEnable or disable tenant level settings\nView analytics information for environments\nView capacity consumption\nThese admins can't perform activities that require direct access to Dataverse data without a license. Examples of these activities include:\nUpdating the security role for a user in an environment\nInstalling apps for an environment\nImportant\nGlobal admins, Power Platform admins, and Dynamics 365 service administrators must complete another step before they can perform activities requiring access to Dataverse. They must elevate themselves to the\nSystem Administrator\nrole in the environment where they need access. 
All elevation actions are logged to Microsoft Purview.\nIf you use Privileged Identity Management to get just-in-time access to admin roles in Microsoft Entra ID and then self-elevate, Microsoft removes your\nSystem Administrator\nrole when role assignment expires in Privileged Identity Management, usually after a short duration.\nKnown limitations\nWhen using the API, if the caller is a system administrator, the self-elevate call returns success instead of indicating that the role assignment already exists.\nThe user making the call must have the tenant admin role assigned. For a full list of users who meet the tenant admin criteria, see\nChanges to feature support\n.\nIf you're a Dynamics 365 administrator and the environment is protected by a security group, you must be a member of the security group. This rule doesn't apply to users with the global administrator or Power Platform administrator roles.\nThe user who needs to elevate their status must invoke the elevation API. It does not allow API calls to elevate another user's status.\nA workaround is available for customers using the Microsoft Power Platform CoE Starter Kit. See\nPIM Issue and Workaround #8119\nfor more information.\nRole assignments through groups aren't supported. Make sure that you assign roles directly to the user.\nSelf-elevate to the system administrator role\nWe support elevation using either PowerShell or an intuitive experience in the Power Platform admin center.\nNote\nUsers who attempt to self-elevate must be a Global admin, Power Platform admin, or Dynamics 365 admin. The user interface in Power Platform admin center isn't available for users with other Entra ID admin roles and attempting to self-elevate through the PowerShell API returns an error.\nSelf-elevate through PowerShell\nTo self-elevate through PowerShell, install the\nMSAL\nPowerShell module and follow the steps in this section.\nInstall-Module -Name MSAL.PS\nYou only need to install the module once. 
For more information about setting up PowerShell, see\nQuick Start Web API with PowerShell and Visual Studio Code\n.\nStep 1: Run the script to elevate\nIn this PowerShell script, you:\nAuthenticate, using the Power Platform API.\nBuild an\nhttp\nquery with your environment ID.\nRequest elevation, using the Power Platform API.\nLocate and add your environment ID\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, select\nEnvironments\n.\nOn the\nEnvironments\npage, choose the environment you want to modify.\nLocate the\nEnvironment ID\nin the\nDetails\npane.\nAdd your unique\n\nto the script.\nRun the script\nCopy and paste the script into a PowerShell console.\n# Set your environment ID\n$environmentId = \"\"\n$clientId = \"\"\n\nImport-Module MSAL.PS\n\n# Authenticate\n$AuthResult = Get-MsalToken -ClientId $clientId -Scope 'https://api.powerplatform.com/.default'\n\n$Headers = @{\n Authorization = \"Bearer $($AuthResult.AccessToken)\"\n 'Content-Type' = \"application/json\"\n} \n\n$uri = \"https://api.powerplatform.com/usermanagement/environments/$environmentId/user/applyAdminRole?api-version=2022-03-01-preview\";\n\ntry { \n\n $postRequestResponse = Invoke-RestMethod -Method Post -Headers $Headers -Uri $uri \n \n} \n \ncatch { \n \n # Dig into the exception to get the Response details. 
\n \n Write-Host \"Response CorrelationId:\" $_.Exception.Response.Headers[\"x-ms-correlation-id\"] \n \n Write-Host \"StatusCode:\" $_.Exception.Response.StatusCode.value__ \n \n Write-Host \"StatusDescription:\" $_.Exception.Response.StatusDescription \n \n $result = $_.Exception.Response.GetResponseStream() \n \n $reader = New-Object System.IO.StreamReader($result) \n \n $reader.BaseStream.Position = 0 \n \n $reader.DiscardBufferedData() \n \n $responseBody = $reader.ReadToEnd(); \n \n Write-Host $responseBody \n \n} \n \n$output = $postRequestResponse | ConvertTo-Json -Depth 2 \n \nWrite-Host $output\nStep 2: Confirm the result\nUpon success, you see an output similar to the following output. Look for\n\"Code\": \"UserExists\"\nas evidence that you successfully elevated your role.\n{\n \"errors\": [],\n \"information\": [\n {\n \"Subject\": \"Result\",\n \"Description\": \"[\\\"SyncMode: Default\\\",\\\"Instance df12c345-7b56-ee10-8bc5-6045bd005555 exists\\\",\\\"Instance df85c664-7b78-ee11-8bc5-6045bd005555 in enabled state\\\",\\\"Instance Url found https://orgc1234567.crm.dynamics.com\\\",\\\"User found in AD tenant\\\",\\\"User in enabled state in AD tenant\\\",\\\"SystemUser with Id:11fa11ab-4f75-ee11-9999-6045bd12345a, objectId:aaaaaaaa-0000-1111-2222-bbbbbbbbbbbb exists in instance\\\"]\",\n \"Code\": \"UserExists\"\n },\n { ... }\n}\nErrors\nYou might see an error message if you don't have the right permissions.\n\"Unable to assign System Administrator security role as the user is not either a Global admin, Power Platform admin, or Dynamics 365 admin. Please review your role assignments in Entra ID and try again later. 
For help, please reach out to your administrator.\"\nExample script\nRemove-RoleAssignmentFromUsers\n-roleName \"System Administrator\" \n-usersFilePath \"C:\\Users\\\\Desktop\\\"\n-environmentUrl \"-environment.crm.dynamics.com\"\n# Or, include all your environments\n-processAllEnvironments $true\n-geo \"NA\"\n-outputLogsDirectory \"C:\\Users\\\\Desktop\\\"\nSelf-elevate through Power Platform admin center\nTo self-elevate through Power Platform center, take the following steps:\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, select\nEnvironments\n.\nOn the\nEnvironments\npage, choose the environment you want to modify.\nIn the command bar, select\nMembership\nto request self-elevation.\nIn the\nSystem Administrators\npane, select\nAdd me\nto add yourself to the system administrator role.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "High-Privileged Admin Roles", @@ -197,7 +197,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/ip-firewall": { "content_hash": "sha256:a9f2914ad772f82e822f4c86e466587df68f0819d14b3bacd84b3e73637af643", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nIP firewall in Power Platform environments\nFeedback\nSummarize this article for me\nThe IP firewall protects your organizational data by ensuring users can only access Microsoft Dataverse from allowed IP locations. The IP firewall analyzes the IP address of each request in real time. For example, you can turn on the IP firewall in your production Dataverse environment and set allowed IP addresses in the ranges associated with your office locations and not any external IP location, like a coffee shop. If a user tries to access organizational resources from a coffee shop, Dataverse denies access in real time.\nKey benefits\nTurning on the IP firewall in your Power Platform environments offers several key benefits.\nMitigate insider threats like data exfiltration\n: A malicious user who tries to download data from Dataverse using a client tool like Excel or Power BI from a disallowed IP location is blocked from doing so in real time.\nPrevent token replay attacks\n: If a user steals an access token and tries to use it to access Dataverse from outside allowed IP ranges, Dataverse denies the attempt in real time.\nIP firewall protection works in both interactive and noninteractive scenarios.\nHow does the IP firewall work?\nWhen a request is made to Dataverse, the request IP address is evaluated in real time against the IP ranges configured for the Power Platform environment. If the IP address is in the allowed ranges, the request is allowed. If the IP address is outside the IP ranges configured for the environment, the IP firewall denies the request with an error message:\nThe request you are trying to make is rejected as access to your IP is blocked. 
Contact your administrator for more information\n.\nPrerequisites\nThe IP firewall is a feature of\nManaged Environments\n.\nYou must have a Power Platform admin role to enable or disable the IP firewall.\nEnable the IP firewall\nYou can enable the IP firewall in a Power Platform environment by using either Power Platform admin center or the Dataverse OData API.\nEnable the IP firewall using Power Platform admin center\nSign in to\nPower Platform admin center\n as an administrator.\nIn the navigation pane, select\nSecurity\n.\nIn the\nSecurity\npane, select\nIdentity and access\n.\nIn the\nIdentity and access management\npage, select\nIP firewall\n.\nIn the\nSet up IP firewall\npane, select an environment. Then select\nSet up IP firewall\n.\nIn the\nSet up IP firewall for this environment\npane, select\nIP Firewall\nto\nOn\n.\nUnder\nAllowed list of IP addresses\n, specify the allowed IP ranges in classless interdomain routing (CIDR) format as per\nRFC 4632\n. If you have multiple IP ranges, separate them with a comma. This field accepts up to 4,000 alphanumeric characters and allows a maximum of 200 IP ranges. IPv6 addresses are allowed both in hexadecimal and compressed format.\nSelect other advanced settings, as appropriate:\nAllowed list of service tags\n: From the list, select service tags that can bypass IP firewall restrictions.\nAllow access for Microsoft trusted services\n: This setting enables Microsoft trusted services like monitoring and\nsupport user\netc. to bypass the IP firewall restrictions to access the Power Platform environment with Dataverse. Enabled by default.\nAllow access for all application users\n: This setting allows\nall application users\nthird-party and first-party access to Dataverse APIs. Enabled by default. If you clear this value, it only blocks third-party application users.\nEnable IP firewall in audit-only mode\n: This setting enables the IP firewall but allows requests regardless of their IP address. 
Enabled by default.\nReverse proxy IP addresses\n: If your organization has reverse proxies configured, enter the IP addresses separated by commas. The reverse proxy setting applies to both\nIP-based cookie binding\nand the IP firewall. Reach out to your network administrator to get the reverse proxy IP addresses.\nNote\nReverse proxy must be configured to send user client IP addresses in the\nforwarded\nheader.\nSelect\nSave\n.\nEnable IP firewall at an environment group-level\nTo configure IP firewall settings at the environment group-level, complete the following steps.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nSecurity\n.\nIn the\nSecurity\npane, select\nIdentity and access\n.\nSelect the IP firewall pane.\nIn the pane that is displayed, select the\nEnvironment groups\ntab to which you want the security setting applied. Then select\nSet up IP firewall\n.\nIn the\nSet up IP firewall\npane, select\nIP Firewall\nto\nOn\n.\nUnder\nAllowed list of IP addresses\n, specify the allowed IP ranges in classless interdomain routing (CIDR) format as per\nRFC 4632\n. If you have multiple IP ranges, separate them with a comma. This field accepts up to 4,000 alphanumeric characters and allows a maximum of 200 IP ranges. IPv6 addresses are allowed both in hexadecimal and compressed format.\nSelect other advanced settings, as appropriate:\nAllowed list of service tags\n: From the list, select service tags that can bypass IP firewall restrictions.\nAllow access for Microsoft trusted services\n: This setting enables Microsoft trusted services like monitoring and\nsupport user\netc. to bypass the IP firewall restrictions to access the Power Platform environment with Dataverse. Enabled by default.\nAllow access for all application users\n: This setting allows\nall application users\nthird-party and first-party access to Dataverse APIs. Enabled by default. 
If you clear this value, it only blocks third-party application users.\nEnable IP firewall in audit-only mode\n: This setting enables the IP firewall but allows requests regardless of their IP address. Enabled by default.\nReverse proxy IP addresses\n: If your organization has reverse proxies configured, enter the IP addresses separated by commas. The reverse proxy setting applies to both\nIP-based cookie binding\nand the IP firewall. Reach out to your network administrator to get the reverse proxy IP addresses.\nSelect\nSave\n.\nNote\nReverse proxy must be configured to send user client IP addresses in the\nforwarded\nheader.\nSelected settings are applied to all the environments in that environment group.\nEnable IP firewall using the Dataverse OData API\nYou can use the Dataverse OData API to retrieve and modify values within a Power Platform environment. For detailed guidance, see\nQuery data using the Web API\nand\nUpdate and delete table rows using the Web API (Microsoft Dataverse)\n.\nYou have the flexibility to select the tools that you prefer. 
Use the following documentation to retrieve and modify values through the Dataverse OData API:\nUse Insomnia with Dataverse Web API\nQuick Start Web API with PowerShell and Visual Studio Code\nConfigure the IP firewall by using the OData API\nPATCH https://{yourorg}.api.crm*.dynamics.com/api/data/v9.2/organizations({yourorgID})\nHTTP/1.1\nContent-Type: application/json\nOData-MaxVersion: 4.0\nOData-Version: 4.0\nPayload\n[\n {\n \"enableipbasedfirewallrule\": true,\n \"allowediprangeforfirewall\": \"18.205.0.0/24,21.200.0.0/16\",\n \"enableipbasedfirewallruleinauditmode\": true,\n \"allowedservicetagsforfirewall\": \"AppService,ActionGroup,ApiManagement,AppConfiguration,AppServiceManagement,ApplicationInsightsAvailability,AutonomousDevelopmentPlatform,AzureActiveDirectory,AzureAdvancedThreatProtection,AzureArcInfrastructure,AzureAttestation,AzureBackup,AzureBotService\",\n \"allowapplicationuseraccess\": true,\n \"allowmicrosofttrustedservicetags\": true\n }\n]\nenableipbasedfirewallrule\n– Enable the feature by setting the value to\ntrue\n, or disable it by setting the value to\nfalse\n.\nallowediprangeforfirewall\n— List the IP ranges that should be allowed. Provide them in CIDR notation, separated by a comma.\nImportant\nMake sure that the service tag names exactly match what you see on the IP firewall's settings page. If there's any discrepancy, IP restrictions might not work correctly.\nenableipbasedfirewallruleinauditmode\n– A value of\ntrue\nindicates audit-only mode, whereas a value of\nfalse\nindicates enforcement mode.\nallowedservicetagsforfirewall\n– List the service tags that should be allowed, separated by a comma. 
If you don't want to configure any service tags, leave the value null.\nallowapplicationuseraccess\n– The default value is\ntrue\n.\nallowmicrosofttrustedservicetags\n– The default value is\ntrue\n.\nImportant\nWhen\nAllow Access for Microsoft trusted services\nand\nAllow access for all application users\nare disabled, some services that use Dataverse, such as Power Automate flows, might no longer work.\nTest the IP firewall\nYou should test the IP firewall to verify that it's working.\nFrom an IP address that isn't in the allowed list of IP addresses for the environment, browse to your Power Platform environment URI.\nYour request should be rejected with a message that says, \"The request you are trying to make is rejected as access to your IP is blocked. Contact your administrator for more information.\"\nFrom an IP address that's in the allowed list of IP addresses for the environment, browse to your Power Platform environment URI.\nYou should have the access to the environment that's defined by your security role.\nYou should test the IP firewall in your test environment first, followed by audit-only mode in Production environment before enforcing the IP firewall on your Production environment.\nNote\nBy default,\nTDS endpoint\nis turned on within the Power Platform environment.\nSPN filtering for application users\nThe IP Firewall feature in Power Platform allows administrators to restrict access to environments based on IP address ranges. 
For scenarios where specific application users (Service Principal Names or SPNs) need to bypass these restrictions, you can enable SPN filtering using an API-based approach.\nSteps to enable SPN filtering\nAdd the application user.\nIf not already added, add the\napplication user\nto the target environment and assign the appropriate security roles.\nExample:\nAdd the app user with ID 123 and name TestSPN to the environment and assign the necessary roles\nRetrieve the system user ID.\nUse the following API call to fetch the\nsystemuserid\nfor the application user:\nGET https://{root-url}/api/data/v9.0/systemusers?$filter=applicationid eq {application-id}&$select=systemuserid\nHTTP/1.1\nContent-Type: application/json\nOData-MaxVersion: 4.0\nOData-Version: 4.0\nAllowlist the application user.\nPOST https://{yourorg}.api.crm*.dynamics.com/api/data/v9.2/systemusers(SystemuserID)\nHTTP/1.1\nContent-Type: application/json\nOData-MaxVersion: 4.0\nOData-Version: 4.0\nPayload\n[\n {\n \"isallowedbyipfirewall\": true\n }\n]\nConfigure IP firewall settings in PPAC.\nNavigate to the Power Platform Admin Center (PPAC) and configure the IP Firewall settings.\nEnsure that the option \"Allow access for all application users\" is unchecked to enforce filtering.\nLicensing requirements for IP firewall\nThe IP firewall is only enforced on environments that are activated for Managed Environments. Managed Environments are included as an entitlement in standalone Power Apps, Power Automate, Microsoft Copilot Studio, Power Pages, and Dynamics 365 licenses that give premium usage rights. 
Learn more about\nManaged Environment licensing\nwith the\nLicensing overview for Microsoft Power Platform\n.\nIn addition, access to using IP firewall for Dataverse requires users in the environments where the IP firewall is enforced to have one of these subscriptions:\nMicrosoft 365 or Office 365 A5/E5/G5\nMicrosoft 365 A5/E5/F5/G5 Compliance\nMicrosoft 365 F5 Security & Compliance\nMicrosoft 365 A5/E5/F5/G5 Information Protection and Governance\nMicrosoft 365 A5/E5/F5/G5 Insider Risk Management\nLearn more about Microsoft 365 licenses\nFrequently asked questions (FAQ)\nWhat does the IP firewall cover in Power Platform?\nThe IP firewall is supported in any Power Platform environment that includes Dataverse.\nHow soon does a change to the IP address list take effect?\nChanges to the list of allowed IP addresses or ranges typically take effect in about 5-10 minutes.\nDoes this feature work in real time?\nIP firewall protection works in real time. Since the feature works at the network layer, it evaluates the request after the authentication request is completed.\nIs this feature enabled by default in all environments?\nThe IP firewall isn't enabled by default. The Power Platform administrator needs to enable it for Managed Environments.\nWhat is audit-only mode?\nIn audit-only mode, the IP firewall identifies the IP addresses that are making calls to the environment and allows them all, whether they're in an allowed range or not. It's helpful when you're configuring restrictions on a Power Platform environment. 
We recommend that you enable audit-only mode for at least a week and disable it only after careful review of the\naudit logs\n.\nIs this feature available in all the environments?\nThe IP firewall is available for\nManaged Environments\nonly.\nIs there a limit on the number of IP addresses that I can add in the IP address text box?\nYou can add up to 200 IP address ranges in CIDR format as per\nRFC 4632\n, separated by commas.\nWhat should I do if requests to Dataverse start to fail?\nAn incorrect configuration of IP ranges for IP firewall might be causing this issue. You can check and verify the IP ranges on the IP firewall settings page. We recommend that you turn on the IP firewall in Audit-only mode before enforcing it.\nHow do I download the audit log for audit-only mode?\nUse the Dataverse OData API to download the audit log data in JSON format. The format of the audit log API is:\nhttps://[orgURI]/api/data/v9.1/audits?$select=createdon,changedata,action&$filter=action%20eq%20118&$orderby=createdon%20desc&$top=1\nReplace\n[orgURI]\nwith the Dataverse environment URI.\nSet the action value to\n118\nfor this event.\nSet the number of items to return in\ntop=1\nor specify the number you want to return.\nMy Power Automate flows aren't working as expected after configuring the IP firewall on my Power Platform environment. What should I do?\nIn the IP firewall settings, allow the service tags listed in\nManaged connectors outbound IP addresses\n.\nI have configured the reverse proxy address correctly, but the IP firewall isn't working. What should I do?\nMake sure your reverse proxy is configured to send the client IP address in the forwarded header.\nIP firewall audit functionality isn't working in my environment. What should I do?\nIP firewall audit logs aren't supported in tenants enabled for bring-your-own-key\n(BYOK)\nencryption keys. 
If your tenant is enabled for bring-your-own-key, then all environments in a BYOK-enabled tenant are locked down to SQL only, therefore audit logs can only be stored in SQL. We recommend that you migrate to\ncustomer-managed key\n. To migrate from BYOK to customer-managed key (CMKv2), follow the steps in\nMigrate bring-your-own-key (BYOK) environments to customer-managed key\n.\nDoes IP firewall support IPv6 IP ranges?\nYes, IP firewall supports IPv6 IP ranges.\nNext steps\nSecurity in Microsoft Dataverse\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "IP Firewall", @@ -206,7 +206,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/cross-tenant-restrictions": { "content_hash": "sha256:78f71d18380e7458aadfd2f0604b4f05e54b39a5eb4d2920c99cb8bbd30516a4", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nCross-tenant inbound and outbound restrictions\nFeedback\nSummarize this article for me\nMicrosoft Power Platform has a rich ecosystem of connectors based on Microsoft Entra that allow authorized Microsoft Entra users to build compelling apps and flows establishing connections to the business data available through these data stores. 
Tenant isolation makes it easy for administrators to ensure that these connectors can be harnessed in a safe and secure way within the tenant while minimizing the risk of data exfiltration outside the tenant. Tenant isolation allows Power Platform administrators to effectively govern the movement of tenant data from Microsoft Entra authorized data sources to and from their tenant.\nPower Platform tenant isolation is different from Microsoft Entra ID-wide tenant restriction. It\ndoesn't\nimpact Microsoft Entra ID-based access outside of Power Platform. Power Platform tenant isolation only works for connectors using Microsoft Entra ID-based authentication such as Office 365 Outlook or SharePoint.\nWarning\nThere's a\nknown issue\nwith the\nAzure DevOps connector\nthat results in the tenant isolation policy not being enforced for connections established using this connector. If an insider attack vector is a concern, we recommend limiting use of the connector or its actions by using data policies.\nThe default configuration in Power Platform with tenant isolation\nOff\nis to allow cross-tenant connections to be established seamlessly, if the user from tenant A establishing the connection to tenant B presents appropriate Microsoft Entra credentials. If admins want to allow only a select set of tenants to establish connections to or from their tenant, they can turn tenant isolation\nOn\n.\nWith tenant isolation\nOn\n,\nall\ntenants are restricted. Inbound (connections to the tenant from external tenants) and outbound (connections from the tenant to external tenants) cross-tenant connections are blocked by Power Platform even if the user presents valid credentials to the Microsoft Entra-secured data source. You can use rules to add exceptions.\nAdmins can specify an explicit allow list of tenants that they want to allow\ninbound\n,\noutbound\n, or both, which bypasses tenant isolation controls when configured. 
Admins can use a special pattern “*” to allow\nall\ntenants in a specific direction when tenant isolation is turned on. All other cross-tenant connections except the ones in the allow list are rejected by Power Platform.\nTenant isolation can be configured in the Power Platform admin center. It affects Power Platform canvas apps and Power Automate flows. To set up tenant isolation, you need to be a tenant admin.\nPower Platform tenant isolation ability is available with two options: one-way or two-way restriction.\nUnderstand tenant isolation scenarios and impact\nBefore you begin configuring the tenant isolation restrictions, review the following list to understand the scenarios and impact of tenant isolation.\nAdmin wants to turn on tenant isolation.\nAdmin is concerned that existing apps and flows using cross tenant connections stop working.\nAdmin decides to enable tenant isolation and add exception rules to eliminate the impact.\nAdmin runs the cross-tenant isolation reports to determine the tenants that need to be exempt. More information:\nTutorial: Create cross tenant isolation reports (preview)\nTwo-way tenant isolation (inbound and outbound connection restriction)\nTwo-way tenant isolation blocks connection establishment attempts to your tenant from other tenants. Additionally, two-way tenant isolation also blocks connection establishment attempts from your tenant to other tenants.\nIn this scenario, the tenant admin allows two-way tenant isolation on the Contoso tenant while the external Fabrikam tenant hasn't been added to the allow list.\nUsers signed in to Power Platform in the Contoso tenant can’t establish outbound Microsoft Entra ID-based connections to data sources in the Fabrikam tenant despite presenting appropriate Microsoft Entra credentials to establish the connection. 
This is outbound tenant isolation for the Contoso tenant.\nSimilarly, users signed in to Power Platform in the Fabrikam tenant can’t establish inbound Microsoft Entra ID-based connections to data sources in the Contoso tenant despite presenting appropriate Microsoft Entra credentials to establish the connection. This is inbound tenant isolation for the Contoso tenant.\nConnection creator tenant\nConnection sign-in tenant\nAccess allowed?\nContoso\nContoso\nYes\nContoso (tenant isolation\nOn\n)\nFabrikam\nNo (outbound)\nFabrikam\nContoso (tenant isolation\nOn\n)\nNo (inbound)\nFabrikam\nFabrikam\nYes\nNote\nA connection attempt initiated by a guest user, from their host tenant that targets data sources within the same host tenant, isn't evaluated by the tenant isolation rules.\nTenant isolation with allow lists\nOne-way tenant isolation or inbound isolation blocks connection establishment attempts to your tenant from other tenants.\nScenario: Outbound allow list – Fabrikam is added to the outbound allow list of the Contoso tenant\nIn this scenario, the admin adds the Fabrikam tenant to the outbound allow list while tenant isolation is\nOn\n.\nUsers signed in to Power Platform in the Contoso tenant can establish outbound Microsoft Entra ID-based connections to data sources in the Fabrikam tenant if they present appropriate Microsoft Entra credentials to establish the connection. Outbound connection establishment to the Fabrikam tenant is permitted by virtue of the configured allow list entry.\nHowever, users signed in to Power Platform in the Fabrikam tenant still can't establish inbound Microsoft Entra ID-based connections to data sources in the Contoso tenant despite presenting appropriate Microsoft Entra credentials to establish the connection. 
Inbound connection establishment from the Fabrikam tenant is still disallowed even though the allow list entry is configured, because it permits only outbound connections.\nConnection creator tenant\nConnection sign-in tenant\nAccess allowed?\nContoso\nContoso\nYes\nContoso (tenant isolation\nOn\n)\nFabrikam added to outbound allow list\nFabrikam\nYes\nFabrikam\nContoso (tenant isolation\nOn\n)\nFabrikam added to outbound allow list\nNo (inbound)\nFabrikam\nFabrikam\nYes\nScenario: Bidirectional allow list – Fabrikam is added to the inbound and outbound allow lists of the Contoso tenant\nIn this scenario, the admin adds the Fabrikam tenant to both the inbound and outbound allow lists while tenant isolation is\nOn\n.\nConnection creator tenant\nConnection sign-in tenant\nAccess allowed?\nContoso\nContoso\nYes\nContoso (tenant isolation\nOn\n)\nFabrikam added to both allow lists\nFabrikam\nYes\nFabrikam\nContoso (tenant isolation\nOn\n)\nFabrikam added to both allow lists\nYes\nFabrikam\nFabrikam\nYes\nAllow tenant isolation and configure the allow list\nGo to the\nPower Platform admin center\n.\nIn the navigation pane, select\nSecurity\n.\nIn the\nSecurity\npane, select\nIdentity and access\n.\nOn the\nIdentity and access management\npage, select\nTenant isolation\n.\nTo allow tenant isolation, turn on the\nRestrict cross-tenant connections\noption.\nTo allow cross-tenant communication, select\nAdd exceptions\nin the\nTenant isolation\npane.\nIf tenant isolation is\nOff\n, you can still add or edit the exception list. However, the exception list isn't enforced until you turn on tenant isolation.\nFrom the\nAllowed direction\ndropdown list, select the direction of the allow list entry.\nEnter the value of the allowed tenant as either the tenant domain or tenant ID in the\nTenant ID\nfield. Once saved, the entry gets added to the allow list along with other allowed tenants. 
If you use the tenant domain to add the allow list entry, the Power Platform admin center automatically calculates the tenant ID.\nYou can use \"*\" as a special character to signify all tenants are allowed in the designated direction when tenant isolation is turned on.\nSelect\nSave\n.\nNote\nYou must have a Power Platform administrator role to see and set the tenant isolation policy.\nTo ensure that tenant isolation doesn't block any calls when used, turn tenant isolation\nOn\n, add a new tenant rule, set\nTenant ID\nas \"*\", and set allowed direction to\ninbound\nand\noutbound\n.\nDue to technical limitations, the threshold limit for rules is 500.\nYou can perform all the allow list operations like add, edit, and delete while tenant isolation is turned\nOn\nor\nOff\n. Allow list entries don't have an effect on the connection behavior when tenant isolation is turned\nOff\nsince all cross-tenant connections are allowed.\nDesign time impact on apps and flows\nUsers who create or edit a resource affected by the tenant isolation policy see a related error message. For example, Power Apps makers see the following error when they use cross-tenant connections in an app that's blocked by tenant isolation policies. The app doesn't add the connection.\nSimilarly, Power Automate makers see the following error when they try to save a flow that uses connections in a flow that's blocked by tenant isolation policies. The flow itself is saved, but it's marked as \"Suspended\" and isn't executed unless the maker resolves the data loss prevention policy (DLP) violation.\nRuntime impact on apps and flows\nAs an admin, you can decide to modify the tenant isolation policies for your tenant at any point. If apps and flows were created and executed in compliance with earlier tenant isolation policies, some of them might be negatively affected by any policy changes you make. Apps or flows that are in violation of the tenant isolation policy don't run successfully. 
For example, run history within Power Automate indicates that the flow run failed. Further, selecting the failed run shows details of the error.\nFor existing flows that don’t run successfully because of the latest tenant isolation policy, run history within Power Automate indicates that the flow run failed.\nSelecting the failed run shows details of the failed flow run.\nNote\nIt takes about an hour for the latest tenant isolation policy changes to be assessed against active apps and flows. This change isn't instantaneous.\nKnown issues\nAzure DevOps connector\nuses Microsoft Entra authentication as the identity provider, but uses its own OAuth flow and STS for authorizing and issuing a token. Since the token returned from the ADO flow based on that Connector’s configuration isn't from Microsoft Entra ID, the tenant isolation policy isn't enforced. As a mitigation, we recommend using other types of\ndata policies\nto limit the use of the connector or its actions.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Cross-Tenant Restrictions", @@ -215,7 +215,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/manage-encryption-key": { "content_hash": "sha256:60e05a60697f1591375706ccebd8057b63c91104f35d41777dfbfe9efbd26420", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nManage the encryption key\nFeedback\nSummarize this article for me\nAll environments of Microsoft Dataverse use SQL Server Transparent Data Encryption (TDE) to perform real-time encryption of data when written to disk. This is also known as encryption at rest.\nBy default, Microsoft stores and manages the database encryption key for your environments so you don't have to. The managed keys feature in the Microsoft Power Platform admin center gives administrators the ability to self-manage the database encryption key that is associated with the Dataverse tenant.\nImportant\nStarting January 6, 2026, we'll discontinue support for bring-your-own-key (BYOK). Customers are encouraged to transition to customer-managed keys (CMK), an enhanced solution that offers improved functionality, broader support for data sources, and better performance. Learn more in\nManage your customer-managed encryption key\nand\nMigrate bring-your-own-key (BYOK) environments to customer-managed key\n.\nEncryption key management is only applicable to Azure SQL environment databases. The following features and services continue to use the Microsoft-managed encryption key to encrypt their data and can't be encrypted with the self-managed encryption key:\nCopilots and generative AI features in\nMicrosoft Power Platform and Microsoft Dynamics 365\nDataverse search\nElastic tables\nMobile Offline\nActivity Log (Microsoft 365 portal)\nExchange (Server-side sync)\nNote\nThe self-managed database encryption key feature must be turned on by Microsoft for your tenant before you can use the feature.\nTo use the data encryption management features for an environment, the environment must be created\nafter\nthe self-managed database encryption key feature is turned on by Microsoft.\nAfter the feature is turned on in your tenant, all new environments are created with Azure SQL storage only. 
These environments, regardless of whether they're encrypted with bring-your-own-key (BYOK) or a Microsoft-managed key, have restrictions on file upload size, can't use Azure Cosmos DB and data lake services, and Dataverse search indexes are encrypted with a Microsoft-managed key. To use these services, you must\nmigrate to a customer-managed key\n.\nFiles\nand\nImages\nwith sizes of less than 128 MB can be used if your environment is version 9.2.21052.00103 or higher.\nMost existing environments have file and log data stored in non-Azure SQL databases. These environments can't be opted in to the self-managed encryption key. Only new environments (once you sign up for this program) can be enabled with a self-managed encryption key.\nIntroduction to key management\nWith key management, administrators can provide their own encryption key or have an encryption key generated for them, which is used to protect the database for an environment.\nThe key management feature supports both PFX and BYOK encryption key files, such as those stored in a hardware security module (HSM). To use the upload encryption key option, you need both the public and private encryption key.\nThe key management feature takes the complexity out of encryption key management by using Azure Key Vault to securely store encryption keys. Azure Key Vault helps safeguard cryptographic keys and secrets used by cloud applications and services. The key management feature doesn't require that you have an Azure Key Vault subscription, and for most situations there's no need to access encryption keys used for Dataverse within the vault.\nThe managed keys feature lets you perform the following tasks:\nEnable the ability to self-manage database encryption keys that are associated with environments.\nGenerate new encryption keys or upload existing PFX or BYOK encryption key files.\nLock and unlock tenant environments.\nWarning\nWhile a tenant is locked, no one can access any environments within the tenant. 
More information:\nLock the tenant\nUnderstand the potential risk when you manage your keys\nAs with any business-critical application, personnel within your organization who have administrative-level access must be trusted. Before you use the key management feature, you should understand the risk when you manage your database encryption keys. It's conceivable that a malicious administrator (a person who is granted or has gained administrator-level access with intent to harm an organization's security or business processes) working within your organization might use the managed keys feature to create a key and use it to lock all environments in the tenant.\nConsider the following sequence of events.\nThe malicious administrator signs in to the Power Platform admin center, goes to the\nEnvironments\npage and selects\nManage encryption key\n. The malicious administrator then creates a new key with a password and downloads the encryption key to their local drive, and activates the new key. Now all the environment databases are encrypted with the new key. Next, the malicious administrator locks the tenant with the newly downloaded key, and then takes or deletes the downloaded encryption key.\nThese actions result in disabling all the environments within the tenant from online access and make all database backups unrestorable.\nImportant\nTo prevent the malicious administrator from interrupting the business operations by locking the database, the managed keys feature doesn't allow tenant environments to be locked for 72 hours after the encryption key has changed or activated. 
This provides up to 72 hours for other administrators to roll back any unauthorized key changes.\nEncryption key requirements\nIf you provide your own encryption key, your key must meet these requirements that are accepted by Azure Key Vault.\nThe encryption key file format must be PFX or BYOK.\n2048-bit RSA.\nRSA-HSM key type (requires a Microsoft Support request).\nPFX encryption key files must be password protected.\nFor more information about generating and transferring an HSM-protected key over the internet, see\nHow to generate and transfer HSM-protected keys for Azure Key Vault\n. Only\nnCipher Vendor HSM key\nis supported. Before generating your HSM key, go to the Power Platform admin center\nManage encryption keys\n>\nCreate New key\nwindow to obtain the subscription ID for your environment region. You need to copy and paste this subscription ID into your HSM to create the key. This ensures that only our Azure Key Vault can open your file.\nKey management tasks\nTo simplify the key management tasks, the tasks are broken down into three areas:\nGenerate or upload the encryption key for a tenant\nActivate an encryption key for a tenant\nManage encryption for an environment\nAdministrators can use the\nPower Platform admin center\nor the\nPower Platform administration module\ncmdlets to perform the tenant protection key management tasks described here.\nGenerate or upload the encryption key for a tenant\nAll encryption keys are stored in the Azure Key Vault, and there can only be one active key at any time. Since the active key is used to encrypt all the environments in the tenant, managing the encryption is operated at the tenant level. 
Once the key is activated, each individual environment can then be selected to use the key for encryption.\nUse this procedure to set up the managed key feature for the first time for an environment or to change (or roll over) an encryption key for an already self-managed tenant.\nWarning\nWhen you perform the steps described here for the first time, you're opting in to self-managing your encryption keys. More information:\nUnderstand the potential risk when you manage your keys\nSign in to the\nPower Platform admin center\nas an admin (Dynamics 365 admin or Microsoft Power Platform admin).\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. The\nEnvironments\npage is displayed.\nSelect\nManage encryption keys\non the toolbar.\nSelect\nConfirm\nto acknowledge the managed key risk.\nSelect\nNew key\non the toolbar.\nOn the left pane, complete the details to generate or upload a key:\nSelect a\nRegion\n. This option is only shown if your tenant has multiple regions.\nEnter a\nKey name\n.\nChoose from the following options:\nTo create a new key, select\nGenerate new (.pfx)\n. More information:\nGenerate a new key (.pfx)\nTo use your own generated key, select\nUpload (.pfx or .byok)\n. More information:\nUpload a key (.pfx or .byok)\nSelect\nNext\n.\nGenerate a new key (.pfx)\nEnter a password and then reenter the password to confirm.\nSelect\nCreate\nand then select the created file notification on your browser.\nThe encryption key .pfx file is downloaded to your web browser's default download folder. Save the file in a secure location (we recommend that this key is backed up along with its password).\nUpload a key (.pfx or .byok)\nSelect\nUpload the Key\n, select the .pfx or .byok\n1\nfile, and then select\nOpen\n.\nEnter the password for the key and then select\nCreate\n.\n1\nFor .byok encryption key files, make sure you use the subscription ID as shown on the screen when you export the encryption key from your local HSM. 
More information:\nHow to generate and transfer HSM-protected keys for Azure Key Vault\nNote\nTo reduce the number of steps for the administrator to manage the key process, the key is automatically activated when it's uploaded the first time. All subsequent key uploads require an extra step to activate the key.\nActivate an encryption key for a tenant\nOnce an encryption key is generated or uploaded for the tenant, it can be activated.\nSign in to the\nPower Platform admin center\nas an admin (Dynamics 365 admin or Microsoft Power Platform admin).\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. The\nEnvironments\npage is displayed.\nSelect\nManage encryption keys\non the toolbar.\nSelect\nConfirm\nto acknowledge the managed key risk.\nSelect a key that has an\nAvailable\nstate and then select\nActivate key\non the toolbar.\nSelect\nConfirm\nto acknowledge the key change.\nWhen you activate a key for the tenant, it takes a while for the key management service to activate the key. The\nKey state\nstatus displays the key as\nInstalling\nwhen the new or uploaded key is activated.\nOnce the key is activated, the following occurs:\nAll encrypted environments automatically get encrypted with the active key (there's no downtime with this action).\nWhen activated, the encryption key is applied to all environments that are changed from Microsoft-provided to self-managed encryption key.\nImportant\nTo streamline the key management process so that all environments are managed by the same key, the active key can't be updated when there are locked environments. All locked environments must be unlocked before a new key can be activated. 
If there are locked environments that don't need to be unlocked, they must be deleted.\nNote\nAfter an encryption key is activated, you can't activate another key for 24 hours.\nManage encryption for an environment\nBy default, each environment is encrypted with the Microsoft-provided encryption key. Once an encryption key is activated for the tenant, administrators can elect to change the default encryption to use the activated encryption key. To use the activated key, follow these steps.\nApply encryption key to an environment\nSign in to the\nPower Platform admin center\nas an admin (Dynamics 365 admin or Microsoft Power Platform admin).\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. The\nEnvironments\npage is displayed.\nOpen a\nMicrosoft-provided\nencrypted environment.\nSelect\nSee all\n.\nIn the\nEnvironment Encryption\nsection, select\nManage\n.\nSelect\nConfirm\nto acknowledge the managed key risk.\nSelect\nApply this key\nto accept changing the encryption to use the activated key.\nSelect\nConfirm\nto acknowledge that you're managing the key directly and that there's downtime for this action.\nReturn a managed encryption key back to Microsoft-provided encryption key\nReturning to the Microsoft-provided encryption key configures the environment back to the default behavior where Microsoft manages the encryption key for you.\nSign in to the\nPower Platform admin center\nas an admin (Dynamics 365 admin or Microsoft Power Platform admin).\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. 
The\nEnvironments\npage is displayed.\nSelect an environment that is encrypted with a self-managed key.\nSelect\nSee all\n.\nIn the\nEnvironment Encryption\nsection, select\nManage\nand then select\nConfirm\n.\nUnder\nReturn to standard encryption management\n, select\nReturn\n.\nFor production environments, confirm the environment by entering the environment's name.\nSelect\nConfirm\nto return to standard encryption key management.\nLock the tenant\nSince there's only one active key per tenant, locking the encryption for the tenant\ndisables all the environments\nthat are in the tenant. All locked environments remain inaccessible to everyone, including Microsoft, until a Power Platform admin in your organization unlocks them by using the key that was used to lock them.\nCaution\nYou should never lock the tenant environments as part of your normal business process. When you lock a Dataverse tenant, all the environments are taken offline and they can't be accessed by anyone, including Microsoft. Additionally, services such as synchronization and maintenance are all stopped. If you decide to leave the service, locking the tenant can ensure that your online data is never accessed again by anyone.\nNote the following about locking tenant environments:\nLocked environments can't be restored from backup.\nLocked environments are deleted if not unlocked after 28 days.\nYou can't lock environments for 72 hours after an encryption key change.\nLocking a tenant\nlocks all active environments\nwithin the tenant.\nImportant\nYou must wait at least one hour after you lock active environments before you can unlock them.\nOnce the lock process begins, all encryption keys with either an Active or Available state are deleted. 
The lock process can take up to an hour, and during this time unlocking locked environments isn't allowed.\nSign in to the\nPower Platform admin center\nas an admin (Dynamics 365 admin or Microsoft Power Platform admin).\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. The\nEnvironments\npage is displayed.\nSelect\nManage encryption keys\non the toolbar.\nSelect the\nActive\nkey and then select\nLock active environments\n.\nOn the right pane, select\nUpload active key\n, browse to and select the key, enter the password, and then select\nLock\n.\nWhen prompted, enter the text that is displayed on your screen to confirm that you want to lock all environments in the region, and then select\nConfirm\n.\nUnlock locked environments\nTo unlock environments, you must first\nupload\nand then\nactivate\nthe tenant encryption key with the same key that was used to\nlock the tenant\n. Locked environments don't get unlocked automatically once the key has been activated. Each locked environment has to be unlocked individually.\nImportant\nYou must wait at least one hour after you lock active environments before you can unlock them.\nThe unlock process can take up to an hour. Once the key is unlocked, you can use the key to\nManage encryption for an environment\n.\nYou can't generate a new key or upload an existing key until all locked environments are unlocked.\nUnlock encryption key\nSign in to the\nPower Platform admin center\nas an admin (Dynamics 365 admin or Microsoft Power Platform admin).\nSelect\nManage\nin the navigation pane.\nIn the\nManage\npane, select\nEnvironments\n. The\nEnvironments\npage is displayed.\nSelect\nManage encryption keys\non the toolbar.\nSelect the key that has a\nLocked\nstate, and then on the command bar select\nUnlock key\n.\nSelect\nUpload locked key\n, browse to and select the key that was used to lock the tenant, enter the password, and then select\nUnlock\n.\nThe key goes into an\nInstalling\nstate. 
You must wait until the key is in an\nActive\nstate before you can unlock locked environments.\nTo unlock an environment, see the next section.\nUnlock environments\nGo to the\nEnvironments\npage, and then select the locked environment name.\nTip\nDon't select the row. Select the environment name.\nIn the\nDetails\nsection, select\nSee all\nto display the\nDetails\npane on the right.\nIn the\nEnvironment\nencryption section on the\nDetails\npane, select\nManage\n.\nOn the\nEnvironment encryption\npage, select\nUnlock\n.\nSelect\nConfirm\nto confirm that you want to unlock the environment.\nRepeat the previous steps to unlock other environments.\nEnvironment database operations\nA customer tenant can have environments that are encrypted using the Microsoft managed key and environments that are encrypted with the customer managed key. To maintain data integrity and data protection, the following controls are available when managing environment database operations.\nRestore\nThe environment to overwrite (the restored to environment) is restricted to the same environment that the backup was taken from or to another environment that is encrypted with the same customer-managed key. Additionally, a past backup taken when the environment was encrypted with Microsoft-managed key can't be restored to an environment that is currently encrypted with the customer-managed key. 
In other words, restoring a backup to an environment is allowed when the current environment encryption state (whether Microsoft-managed key or customer-managed key) matches the environment encryption state at the time the backup was taken.\nCopy\nThe environment to overwrite (the copied-to environment) is restricted to another environment that is encrypted with the same customer-managed key.\nNote\nIf a Support Investigation environment was created to resolve a support issue in a customer-managed environment, the encryption key for the Support Investigation environment must be changed to customer-managed key before the Copy environment operation can be performed.\nReset\nThe environment's encrypted data is deleted, including backups. After the environment is reset, the environment encryption reverts to the Microsoft-managed key.\nRelated content\nSQL Server: Transparent Data Encryption (TDE)\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources
You can try\nchanging directories\n.\nView and download Microsoft Dataverse analytics\nFeedback\nSummarize this article for me\nViewing metrics for your organization is now an improved experience. You no longer need to install or update a solution. Instead, you can view Dataverse analytics right from the Microsoft Power Platform admin center to quickly view adoption and user metrics for your organization.\nTo access these reports:\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, under\nProducts\n, select\nDataverse\n.\nView the Dataverse analytics options on the page.\nRequired roles to view the reports\nAdmins with these roles and a\nlicense\ncan view the reports in Dataverse analytics:\nEnvironment Admin - can view reports for the environments that the admin has access to.\nPower Platform admin – can view reports for all environments.\nDynamics 365 admin - can view reports for all environments.\nMicrosoft 365 Global admin – can view reports for all environments.\nFor more information on the different roles for managing your tenant across the platform, see\nUse service admin roles to manage your tenant\n.\nKey highlights\nMonitor adoption and use\n: Use data to work toward your goals over a period of time. 
You can identify your most active users, the number and types of operations they're performing, the number of page requests, most-used entities, workflows, plug-ins, and more.\nManage storage and performance\n: Optimize performance by monitoring storage quotas, storage use, and top tables by size.\nTroubleshoot effectively\n: Quickly diagnose and troubleshoot errors by drilling down into the details of your top failing workflows and API calls.\nHome (default) dashboard\nThe home (default) dashboard shows you information on the number of active Dataverse users, storage usage, the most active workflows, and more.\nHome dashboard details\nChart element\nDescription\nActive Users\nNumber of active users (unique users) who performed an operation that caused one of these SDK calls:\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate\n.\nAPI Calls\nNumber of API calls made by the environment with a Dataverse database for a selected time period.\nAPI Pass Rate\nPercentage of API calls that passed out of total API calls made in the environment with a Dataverse database over a specified time period.\nExecutions\nNumber of plug-ins executed in the environment with a Dataverse database over a specified time period.\nTotal Operations\nNumber of operations (\nCreate\n,\nUpdate\n,\nDelete\n,\nRead\n) that occurred in the environment with a Dataverse database over a specified time period.\nMost Active Users Performing Operations\nList of the most active users who performed an operation that caused a\nCreate\n,\nUpdate\n,\nRead\n, or\nDelete\nSDK call in the Dynamics 365 environment over a selected time period.\nTop Plug-ins by Failures\nList of the 10 most failing plug-ins in the environment with a Dataverse database over a specified time period.\nActive users dashboard\nThe Active users dashboard shows you how many Dynamics 365 users there are, how many licenses are in use, what custom entities are used most frequently, and more.\nActive users dashboard 
details\nNote\nExports are limited to a maximum of 3000 records.\nChart element\nDescription\nTotal Active Users\nTotal number of active users (unique users) who performed an operation that caused one of these SDK calls:\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate\n.\nMost Used Entities\nTen Entities which had the most\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate SDK Calls\n.\nTotal Page Requests\nThe number of page load requests for forms, dashboards, and reports. This is the count of requests received by the Dynamics 365 server. Pages that are cached while browsing won't be counted.\nTotal Operations\nThis chart shows how many operations (create, update, deletes, reads) have occurred in the environment with a Dataverse database for the selected time period.\nActive Users Performing Specific Operations\nTotal number of active users (unique users) over time who performed an operation that caused one of these SDK calls:\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate\n.\nActive Users\nNumber of active users (unique users) in your environment who performed an operation that caused one of these SDK calls:\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate\nover time.\nMost Active Users Performing Operations\nList of most active users (unique users) over time who performed an operation that caused one of these SDK calls:\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate\n.\nMost Used Custom Entities\nList of custom entities which had the most\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate SDK Calls\n.\nMost Used OOB Entities\nList of out-of-box entities which had the most\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate SDK Calls\n.\nUsage Active Users by OS\nThe number of active users by operating system.\nActive Users by Device Type\nThe number of active users by device type.\nActive Users by Browser\nThe number of 
active users by browser.\nActive Users by Security Roles\nThe number of active users by security roles.\nUsers by Business Unit\nThe number of active users by business unit.\nNumber of Creates by Entity\nHow many create operations are performed by the selected user in the environment with a Dataverse database for the selected time period.\nNumber of Updates by Entity\nHow many update operations are performed on different entities by the selected user in the environment with a Dataverse database for the selected time period.\nNumber of Reads by Entity\nHow many read operations are performed on different entities by the selected user in the environment with a Dataverse database for the selected time period.\nNumber of Deletes by Entity\nHow many delete operations are performed on different entities by the selected user in the environment with a Dataverse database for the selected time period.\nTotal Operations Over Time\nThe total operations performed by the selected user in the environment with a Dataverse database over the selected time period.\nTotal Operations by Entity\nThe total operations performed on different entities by the selected user in the environment with a Dataverse database for the selected time period.\nActive Users by Entities\nShow the active users distributed over different entities\nNote\nRetrieve\nand\nRetrieveMultiple\nSDK calls are reported as\nReads\n.\nActive usage chart update frequency\nThis table details the frequency of different active usage chart updates.\nChart\nUpdate frequency\nTotal Active Users\n24 hours\nMost Used Entities\n24 hours\nMost Active Users (Reads)\n24 hours\nTotal API Calls\n24 hours\nTotal Page Requests\n24 hours\nMost Active Users (Changes)\n24 hours\nTotal Operations\n24 hours\nActive Users Performing Specific Operations\n24 hours\nActive Users\n24 hours\nMost Active Users Performing Operations\n24 hours\nMost Used Custom Entities\n24 hours\nMost Used OOB Entities\n24 hours\nSystem Jobs dashboard\nThis dashboard 
helps you monitor and troubleshoot workflows.\nSystem jobs dashboard details\nChart element\nDescription\nWorkflow Executions\nThis chart shows the number of workflows executed in an environment with a Dataverse database over a specified time.\nSystem Jobs Pass Rate\nThis chart shows the system job pass rate as a percentage of system jobs executed in the environment with a Dataverse database over a specified time.\nSystem Jobs Throughput/Hour\nThis chart shows the average number of system jobs executed per hour in the environment with a Dataverse database over a specified time.\nExecutions and Backlog\nThis chart shows the number of executions and the backlog for system jobs in the environment with a Dataverse database over a specified time.\nMost Active Workflows\nThis chart shows the top-10 most executed workflows in the environment with a Dataverse database over a specified time.\nTop Workflows by Failures\nThis chart shows the top-10 most failing workflows in the environment with a Dataverse database over the specified time. 
Select a workflow to see the failures and their number of occurrences.\nSystem jobs chart update frequency\nThis table details the frequency of different system jobs chart updates:\nChart\nUpdate frequency\nWorkflow Executions\n24 hours\nSystem Jobs Pass Rate\n24 hours\nSystem Jobs Throughput / Hour\n24 hours\nMost Active Workflows\n24 hours\nSystem Jobs Executions and Backlog\n24 hours\nTop Workflows by Failures\n24 hours\nPlug-ins dashboard\nThis dashboard helps you monitor and troubleshoot plug-ins.\nPlug-in dashboard details\nChart element\nDescription\nPlug-in Success Rate\nThis chart shows the plug-in pass rate as a percentage of total plug-in executions in the environment with a Dataverse database over a specified time.\nPlug-in Executions\nThis chart shows how many plug-ins executed in the environment with a Dataverse database over a specified time.\nAverage Plug-in Execution Time\nThis chart shows the average time taken to successfully execute a plug-in in the environment with a Dataverse database over a specified time.\nMost Active Plug-ins\nThis chart shows the top-10 most executed plug-ins in the environment with a Dataverse database over a specified time.\nTop Plug-ins by Failures\nThis chart shows the top-10 most failing plug-ins in the environment with a Dataverse database over a specified time.\nPlug-in chart update frequency\nThis table details the frequency of different plug-in chart updates:\nChart\nUpdate frequency\nPlug-in Success Rate\n24 hours\nMost Active Plug-ins\n24 hours\nPlug-in Executions\n24 hours\nAverage Plug-in Execution Time\n24 hours\nTop Plug-ins by Failures\n24 hours\nAPI calls statistics dashboard\nThis dashboard helps you monitor and troubleshoot API calls.\nAPI calls statistics dashboard details\nChart element\nDescription\nAPI Success Rate\nThis chart shows the API success rate as a percentage of total API calls made in the environment with a Dataverse database over a specified time.\nTop API by Failures\nThis chart shows 
the top-10 failing API calls in the environment with a Dataverse database over a specified time.\nTotal API Calls\nThis chart shows the total number of API calls made in the environment with a Dataverse database over a specified time.\nMost Used API\nThis chart shows the top-10 most executed API calls in the environment with a Dataverse database. Adding the individual counts provides the total of the top-10 API calls. This isn't the same as the all-up Total API Calls metric.\nAPI Calls\nThis chart shows the number of API calls made over time in the environment with a Dataverse database over a specified time. Adding up the individual counts equals the Total API Calls count.\nAPI peak call rate\nThis chart shows capacity consumption relative to the API call limit. More information:\nAPI peak call rate report\nAPI calls statistics chart update frequency\nThis table details the frequency of API calls statistics chart updates:\nChart\nUpdate frequency\nAPI Success Rate\n24 hours\nTop API by Failures\n24 hours\nMost Used API\n24 hours\nTotal API Calls\n24 hours\nAPI Calls\n24 hours\nAPI peak call rate\n24 hours\nAPI peak call rate report (preview)\nImportant\nThis is a preview feature.\nPreview features aren't meant for production use and may have restricted functionality. These features are available before an official release so that customers can get early access and provide feedback.\nThe API peak call rate report shows an API usage graph with the number of requests per user/application for the selected interval. This report helps you monitor API usage and avoid hitting the\nservice protection limits\n.\nChart element\nDescription\nSDSService and OData\nThe bars show the max number of API requests by app/users within a 5-minute interval. The maximum is the number of requests per user per five minutes that is based on your licenses and capacity add-ons.\nAPI Peak limit\nThe peak requests per second recorded by the request count API limit. 
This is a measure of request count per unit time.\nAnalyze API peak call rate\nTo help you interpret and act on capacity, the graph shows the API peak limit. The bars show the max number of API requests by app/users within a 5-minute interval. The maximum is the number of requests per user per five minutes that is based on your licenses and capacity add-ons.\nYou also get a direct view of your actual capacity use relative to the limit, so you can confirm that you're staying within it.\nWhen the graph shows that your requests per user/app are beyond the peak limit (identified with a red line), it means that you have reached a peak and your requests are being throttled. The report shows data using a single unit of measure to make it easy to get an overview of API utilization.\nAPI peak call rate is calculated as the greater of the following:\nThe peak requests per second (RPS) recorded by the request count API limit. This is a measure of request count per unit time.\nThe peak cumulative execution time recorded by the time API limit. Each 150 ms of request execution time is counted as one API call, and then summed up for every 5-minute interval. This is a measure of compute time, converted to an equivalent number of API calls per unit time.\nFor more information about the API count and time limits, refer to\nservice protection API limits\n.\nAPI peak call rate example scenarios\nAPI peak call rate is based on either the number of requests or execution time measured by the service protection limits, whichever is greater. 
One request is equivalent to 150ms of execution time measured by the time limit.\nThe scenarios below show how the peak call rate is derived based on either request count or execution time using 150ms as the conversion factor from time to count.\nScenario 1\n: Client sends 150,000 web API calls in 5 minutes that each execute for 50ms.\nCount is 500 requests per second (150,000 per 5 minutes is 30,000 per minute)\nTime is equivalent to 167 requests per second (7,500,000ms total time, or 50,000 calls per 5 minutes (7,500,000ms / 150ms))\nRequest count is higher in this case, so the peak rate displayed is 500 requests per second.\nScenario 2\n: Client sends 300 web API calls in 5 minutes that each execute for 10 seconds.\nCount is 1 request per second (there are 300 seconds in 5 minutes)\nTime is equivalent to 67 requests per second (3,000,000ms total time, or 20,000 API calls per 5 minutes (3,000,000ms / 150ms)).\nExecution time converts to a higher request count in this case, so the peak rate displayed is 67 API calls per second.\nOptimize API peak call rate\nThe usage graph can help you identify which users/applications are approaching or exceeding the service protection limits and take action to mitigate as necessary.\nTo stay within the limit, consider reducing the number of API requests per user/app, or increase the limit by adding more capacity, which raises the peak limit (identified with a red line).\nMailbox Usage dashboard\nThis dashboard helps you monitor email mailbox usage.\nMailbox usage dashboard details\nChart element\nDescription\nMailbox Details by GEO\nThis chart shows mailbox details like:\nthe number of server-side synch configured mailboxes.\nthe number of server-side synch enabled mailboxes.\nthe number of server-side synch Appointments, Contacts, and Tasks enabled mailboxes.\nthe number of server-side synch incoming enabled mailboxes.\nthe number of server-side synch outgoing enabled mailboxes categorized by the geo location the mailbox is hosted 
in.\nMailboxes by Server Type\nThis chart shows the mailbox distribution by server type.\nActive Email Server Profiles by Geo\nThis chart shows active server-side synch enabled mailboxes distributed over the geo location they are hosted in.\nMailboxes by Exchange Configuration\nThis chart shows the number of mailboxes categorized by their Exchange configuration.\nNumber of Mailbox Configuration Errors\nThis chart shows the number of mailbox configuration errors that occurred over the user-selected time frame.\nMailbox Usage\nThis chart shows the number of server-side synch mailboxes over the time range selected by the user.\nNumber of Outlook Mailboxes\nThis chart shows the number of Outlook mailboxes configured for the organization.\nNumber of Active Email Server Profiles\nThis chart shows the number of active email server profiles for the time range configured by the user.\nMailbox usage chart update frequency\nThis table details the frequency of mailbox usage chart updates:\nChart\nUpdate frequency\nMailbox Details by Geo\n24 hours\nActive Email Server Profiles by Geo\n24 hours\nMailboxes by Server Type\n24 hours\nMailbox Usage\n24 hours\nNumber of Mailbox Configuration Errors\n24 hours\nNumber of Active Email Server Profiles\n24 hours\nNumber of Outlook Mailboxes\n24 hours\nMailboxes by Exchange Configuration\n24 hours\nDownload reports\nSelect\nDownload\nto view available downloads and then select any of the reports to download them into Microsoft Excel.\nAll the download reports, except\nActive Dynamics 365 Customer Engagement Plan Users by Application\n, show data for an environment, per the timeline set in the filters, for the out-of-box Dataverse analytics reports. If you select a certain date range for the out-of-box Dataverse reports, the same time filter applies to the downloads. 
The maximum duration for data availability is 30 days.\nThe\nActive Dynamics 365 Customer Engagement Plan Users by Application\nreport always shows the last 30 days of data at the tenant level.\nDownload reports dashboard details\nChart element\nDescription\nActive users by device type\nList of active users by device type used to access Dynamics 365.\nActive users by business unit\nList of active users by their business unit.\nNOTE\n: This is not specific to UI calls, and includes system calls in the context of the user.\nActive users by security role\nList of active users by their security roles.\nNOTE\n: This is not specific to UI calls, and includes system calls in the context of the user.\nActive users by client\nList of active users by client type used to access Dynamics 365.\nActive users by entities\nList of active users distributed by entity.\nMost active users performing operations\nList of most active users (unique users) over time who perform an operation that causes one of these SDK calls:\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate\n.\nMost used custom entities\nList of custom entities that had the most\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate SDK Calls\n.\nMost used OOB entities\nList of out-of-box entities that had the most\nRetrieve\n,\nRetrieve Multiple\n,\nDelete\n,\nCreate\n, and\nUpdate SDK Calls\n.\nMost active workflows\nList of top-10 most executed workflows in the environment with a Dataverse database over a specified time.\nMost active plug-ins\nList of top-10 most executed plug-ins in the environment with a Dataverse database over a specified time.\nMost used API\nList of top-10 most executed API calls in the environment with a Dataverse database.\nActive Dynamics 365 Customer Engagement Plan Users by Application\nActive Dynamics 365 Customer Engagement plan users by application. 
Helps customers understand usage across different apps so that, when it's time to renew their subscription, they can choose which individual apps to buy (for example, Dynamics 365 for Sales, Dynamics 365 for Customer Service, and more). The Customer Engagement plan, which was a suite of all Customer Engagement applications, is no longer sold, so customers need to choose the individual apps to buy.\nNon-conformant usage by users with Team Member license\nShows customers how their users (with team-member licenses) are using the product in ways that don't conform with the use rights of this license, per the licensing guide.\nEnvironment and date-time range data\nYou can view data for different environments and date-time ranges. Take these steps to get started:\nSelect\nChange filters\n.\nSelect the desired\nenvironment\nand\ntime-period\nfrom the drop-down lists.\nSelect\nApply\nto save the changes. All Dataverse analytics reports are available using this selection process.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources
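The peak-rate derivation described in the API peak call rate report (raw request count versus execution time converted at 150 ms per call, whichever is greater) can be sketched in Python. This is an illustrative sketch only; the function name and parameters are not part of any Power Platform API:

```python
def peak_call_rate(num_requests: int, ms_per_request: float,
                   window_seconds: int = 300, ms_per_call: float = 150.0) -> float:
    """Greater of the raw request rate and the execution-time-equivalent rate.

    Execution time is converted to an equivalent call count at 150 ms per call,
    then both measures are expressed as requests per second over the window.
    """
    count_rps = num_requests / window_seconds
    time_rps = (num_requests * ms_per_request) / ms_per_call / window_seconds
    return max(count_rps, time_rps)

# Scenario 2 from the report: 300 calls in 5 minutes, each running 10 seconds.
# Execution time dominates: 3,000,000 ms / 150 ms = 20,000 calls per 5 minutes.
print(round(peak_call_rate(300, 10_000)))  # 67
```

For count-dominated workloads (many short calls), `count_rps` wins instead, which matches Scenario 1's conclusion of 500 requests per second.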
You can try\nchanging directories\n.\nSet up Microsoft Power Platform self-service analytics to export inventory and usage data (preview)\nFeedback\nSummarize this article for me\n[This article is pre-release documentation and is subject to change.]\nWith the Power Platform admin center, you can export Power Platform inventory and usage data directly into Azure Data Lake Storage for your organization's business needs. Having the data in your own data lake means you can store data for the durations specified in your organization's data retention policies.\nYou can also create custom reports with Power BI, with views at the business unit level and detailed app reports at the tenant and environment level.\nImportant\nThis is a preview feature.\nPreview features aren’t meant for production use and might have restricted functionality. These features are available before an official release so that customers can get early access and provide feedback.\nDuring this preview, the user experience displaying the list of established subscriptions is limited to showing only the first 50 subscriptions created within the tenant.\nData Lake Storage seamlessly integrates with Azure Synapse Analytics, Power BI, and Azure Data Factory. Data Lake Storage offers a comprehensive cloud platform designed for handling large-scale data and advanced analytics.\nArchitected from the ground up for cloud scale and performance, Data Lake Storage is a cost-effective solution to run big data workloads. With Data Lake Storage, your organization can analyze all its data in a single place with no artificial constraints.\nThe enablement of data export is limited to customers with a paid, premium Microsoft Dataverse license available for the tenant. Details of other licensing requirements are provided in admin documentation and in general availability\nrelease plans\n. 
More details about minimum Dataverse capacity requirements to access the data export features are announced in advance of general availability.\nGovernment Community Cloud (GCC) customers who need to configure integration to Data Lake Storage hosted in an Azure Government subscription should open a\nsupport request\n.\nPrerequisites\nTo access data export in the\nPower Platform admin center\n, you must have one of these roles: Power Platform admin, Dynamics 365 admin, or Microsoft 365 Global admin.\nCreate a storage account\nto use with Azure Data Lake Storage Gen2. Make sure you select the same location for the Data Lake Storage account as your Power BI tenant. To learn more about how to determine your Power BI tenant location, go to\nWhere is my Power BI tenant located\n?\nThis preview feature supports the following Azure Data Lake Storage Gen2 configurations:\nStorage Account Types: Standard general-purpose v2 or Premium block blobs.\nHierarchical Namespace:\nEnable hierarchical namespace\nmust be selected.\nNetwork Connectivity, Network Access:\nEnable public access from all networks\nmust be selected.\nNetwork Routing, Routing Preference:\nMicrosoft network routing\nis recommended.\nSecurity:\nRequire secure transfer for REST API operations\nmust be selected.\nSimplify data with Data Lake Storage\nData Lake Storage\nenables you to store captured data of any size, type, or ingestion speed in one single, secure location for operational and exploratory analytics. You can use Microsoft Power Platform self-service analytics to export Power Apps inventory and usage data directly to your\nData Lake Storage Gen2\nlocations.\nYou can store exported data for extended durations, and you can move data to data warehouses. 
To learn more about building custom reports at tenant and environment levels across business units, see\nCreate custom dashboards by using Power Platform inventory and usage data\n.\nExtensible analytics with Data Lake Storage\nYou can use self-service options in the Power Platform admin center with Data Lake Storage to extend Power Apps remote monitoring using data from various sources. You can also use cloud analytics and AI to take advantage of predictive analytics within service monitoring solutions. The diagram illustrates an example of how to derive intelligence from Power Platform data collection.\nA diagram of limitless extensibility options through using cloud analytics and AI is divided into three areas. Microsoft Power Platform apps - Power BI, Power Apps, and Power Automate - are shown collectively supplying governance, monitoring, and management to the middle area, the customer's Data Lake Storage. The data lake includes Power Platform admin center analytics and organizational datasets, all informed by cloud intelligence. On the right, the customer's dashboard is the core of an app workspace where data lake data is analyzed and acted on.\nData\nThe amount of data that you can export depends on your app and flow usage. The initial export includes inventory data of all the Power Apps and Cloud flows across your environment. After the initial export, an incremental data push occurs daily.\nFor example, an enterprise customer with two years' worth of inventory data might have 300 MB of data to export. After the initial export, approximately five MB to 10 MB of that data is pushed daily.\nSet up the data export process for your tenant\nAdmins should use the Power Platform admin center to set up the data export. Before you export data, make sure that your Data Lake Storage Gen2 account is set up as described in this section. 
Make sure that the admin who sets up the data export already has access to your storage account.\nFollow these steps to set up your data lake:\nSign in to the\nPower Platform admin center\nas a Microsoft Entra Global Admin, and then select\nManage\n>\nData export\n.\nOn the command bar, select\nNew data export\n.\nSelect either\nPower Apps\nor\nPower Automate\n. If not already enabled, set\nEnable tenant-level analytics\nto\nOn\n, and then select\nNext\n.\nNote\nThe Global Admin user must have specific roles described in\nFirst-time setup of data export\n.\nChoose a subscription to associate with the Azure storage account.\nIn the list of resource groups under this subscription, select a resource group.\nIn the list of storage accounts under the selected resource group, select the Azure storage account.\nSelect\nCreate\nto set up the connection to Data Lake Storage Gen2.\nAllow up to 12 hours after you set up the data export for resource inventory and 30 days of historical usage data to be exported to the Azure Data Lake Storage account.\nFirst-time setup of a data export\nWhen setting up the first data export to your organization's data lake, Microsoft requires that your Microsoft Entra Global Admin be the one to create the connection.\nImportant\nTo enable principal access to your organization's property, specifically a\nData Lake Storage Gen2 account\n, a connection with Microsoft's tenant service is necessary. A one-time connection setup must be performed by a user who is a member of your organization's Microsoft Entra ID Global Admin built-in role\nwith elevated access\nto subscriptions. Or it can be performed by a Global Admin who has at least a \"Contributor\" Azure role-based access control (RBAC) role on the Azure subscription, along with \"User Access Administrator\" and \"Contributor\" Azure RBAC roles on the target Azure Storage account. 
This is required because the tenant must allow the service to access and assign specific permissions on the Data Lake Storage account.\nRelated articles\nCreate custom dashboards by using Power Platform inventory and usage data\nAzure Data Lake Storage
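The storage-account prerequisites listed for this preview (hierarchical namespace enabled, public network access, secure transfer required, and a Standard general-purpose v2 or Premium block blob account) lend themselves to a quick preflight check. A minimal Python sketch, assuming the account settings have been collected into a dictionary (the keys and helper name are hypothetical, not Azure SDK properties):

```python
# Documented prerequisites for the Data Lake Storage Gen2 account (assumed keys).
REQUIRED = {
    "hierarchical_namespace": True,   # Enable hierarchical namespace
    "public_network_access": True,    # Enable public access from all networks
    "https_only": True,               # Require secure transfer for REST API operations
}
ALLOWED_KINDS = {"StorageV2", "BlockBlobStorage"}  # Standard GPv2 or Premium block blobs

def check_storage_config(config: dict) -> list:
    """Return the names of settings that don't meet the documented prerequisites."""
    problems = [key for key, wanted in REQUIRED.items()
                if config.get(key) is not wanted]
    if config.get("kind") not in ALLOWED_KINDS:
        problems.append("kind")
    return problems
```

For example, an account configured without a hierarchical namespace would be reported as `['hierarchical_namespace']`, signaling that the export setup would fail the prerequisites.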
You can try\nchanging directories\n.\nMetrics and recommendations for Copilot Studio\nFeedback\nSummarize this article for me\nCopilot Studio operational health metrics are available in the Power Platform admin center.\nView Copilot Studio metrics\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nMonitor\n.\nIn the\nMonitor\npane, under\nProducts\n, select\nCopilot Studio\n.\nThe\nCopilot Studio\npage displays the metrics for conversational and autonomous agents created in Copilot Studio.\nCopilot Studio metrics and recommendations\nMetric definitions\nType\nMetric\nDefinition\nSupport\nAgents\nAgent session success rate\nA percentage that describes how often an autonomous agent is able to successfully execute its task, or how often a conversational agent's session was successfully resolved.\nPreview\nAgents\nAgent session success rate\nThe number of distinct user sessions in an agent in one day. A session begins when a user opens the agent and ends after a period of inactivity or when the agent is closed.\nPreview
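The session success rate metric above is, at its core, a percentage of successfully resolved sessions. A minimal illustrative calculation (this is not Copilot Studio's internal implementation; the function name is hypothetical):

```python
def session_success_rate(successful_sessions: int, total_sessions: int) -> float:
    """Percentage of agent sessions that were successfully resolved."""
    if total_sessions == 0:
        return 0.0  # no sessions recorded for the period
    return 100.0 * successful_sessions / total_sessions

# 45 of 60 sessions resolved successfully in a day.
print(session_success_rate(45, 60))  # 75.0
```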
You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nBusiness continuity and disaster recovery\nFeedback\nSummarize this article for me\nNote\nAs of September 3, 2025, the self-service disaster recovery feature supports failover for\nDynamics 365 Contact Center\n. With this enhancement, organizations can seamlessly initiate failover for their contact center environments, ensuring smooth execution of disaster recovery drills or continued operations from an alternate region when needed.\nSelf-service disaster recovery for finance and operations applications is now available in preview. Sign up\nusing this form\nif you're interested in participating in the preview.\nBusinesses expect their applications and customer data to be protected and resilient during unavoidable outages and disruptions. It's important to document a business continuity plan that minimizes the effects of outages. To recover and resume operations, make sure the plan lists stakeholders, processes, and specific steps.\nMicrosoft provides business continuity and disaster recovery capabilities to all\nproduction type environments\nin Dynamics 365 and Power Platform software as a service (SaaS) applications. This article describes how Microsoft keeps your production data resilient during outages.\nThe diagram shows a typical architecture of a geography that serves one or more countries or regions. 
Power Platform admins only need to know the geography location, but within each geography, Microsoft deploys more infrastructure to provide scale and extra protection for your data.\nA geography has at least one Azure region, which usually includes three\navailability zones\nbut never has fewer than two availability zones.\nBuilt-in disaster recovery in-region with Azure availability zones\nInfrastructure components like network, power, or cooling can fail unexpectedly, for example, because of a lightning strike, and can affect one or more data centers. To ensure resilience, Microsoft deploys availability zones, so your environment is replicated across at least two distinct zones.\nMicrosoft automatically detects availability zone-level failures and switches to other availability zones in the region almost instantly to protect you from data loss while keeping downtime near zero in most cases. This in-region capability is for production environments that host business-critical application processes and data. To avoid disruption, don't deploy production processes and data in nonproduction types like sandbox, developer, or trial environments.\nAvailability zones provide built-in resilience for seamless disaster recovery without manual intervention. Zone-redundant data services replicate data across multiple zones, so a failure in one zone doesn't affect data availability. The recovery point objective is near zero, and the recovery time objective is less than five minutes. If one zone fails, traffic is automatically rerouted to the remaining zones with minimal service disruption.\nBackup of production environments\nThe transition to availability zones significantly improves backup and failover processes for Dynamics 365 and Power Platform workloads. Previously, these workloads typically required contacting customer support for manual intervention. 
Your data and services stay highly available within the primary region, with built-in real-time redundancy across multiple zones.\nKey improvements include:\nAlways-on resilience\n: Your environments automatically replicate across multiple availability zones, so you don't need separate geo-secondary backups.\nFaster recovery\n: Synchronous replication across zones enables failover within a region to happen almost instantly, minimizing disruptions and data loss.\nSeamless experience\n: Unlike traditional backups that require restoration, availability zones keep your environment continuously active.\nReduced support dependency\n: Automated failover within the primary region means you don't need to contact Microsoft support for most disaster recovery scenarios.\nA limited number of customers in certain regions are transitioning to the improved architecture. Whether the region transitioned or is transitioning, the service always keeps a backup of environment data in more than one data center.\nAvailability zones are far enough apart to reduce the chance of an outage affecting more than one zone, but close enough to maintain low-latency connections to other availability zones. Availability zones are typically separated by several kilometers and are usually within 100 kilometers.\nCustomers who need greater distance within a geography can use self-service disaster recovery to keep a copy in a secondary region. 
With this feature, customers control failover operations and run disaster recovery drills as described in the following section.\nCross-region self-service disaster recovery\nMost geographies have region pairs separated by at least 300 miles when possible, to help protect your data in large-scale disasters.\nSelf-service disaster recovery is a Power Platform infrastructure capability that lets you replicate your environment across long distances and start environment failover between regions yourself.\nYou usually have multiple environments of different types in your tenant. This capability is available only for production environments.\nTo turn on self-service disaster recovery, make sure your environment is managed and linked to a\npay-as-you-go billing plan\n. For more information about managed environments, go to\nManaged Environments\n.\nAllow Virtual Network pairing for self-service disaster recovery in Dynamics 365\nIf you deploy your Dynamics 365 environment within a Virtual Network and plan to use self-service disaster recovery, you need to configure a\nVirtual Network pair\n. This pairing ensures that your primary and secondary environments can communicate securely during failover and failback operations. Without a Virtual Network pair, disaster recovery operations fail because network connectivity between regions can't be established.\nFor setup instructions, go to\nSet up virtual network support for Power Platform\n.\nTurn on self-service disaster recovery\nThis action sets up resources and starts replicating data between the primary and secondary locations. The process can take up to 48 hours to finish. 
Admins get a notification when the process finishes.\nTurning on disaster recovery in an environment doesn't affect the environment or its data.\nTo turn on disaster recovery, follow these steps.\nSign in to the\nPower Platform admin center\nas a system administrator.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, select\nEnvironments\n. The\nEnvironments\npage appears.\nSelect the production environment where you want to turn on self-service disaster recovery.\nSelect\nDisaster Recovery\nin the command bar at the top of the page. The\nDisaster Recovery\npane appears.\nSelect the checkbox to turn on\nDisaster Recovery\n.\nSelect\nSave\n.\nThe environment briefly displays the\nEdit details\npage.\nThe\nEnvironment details\npage displays that the process of turning on the feature has started.\nYou might also want to turn on disaster recovery for other events, like:\nDisaster recovery drill\nEmergency response for a major regional outage\nDisaster recovery drills\nYour company might have disaster recovery drills documented as a requirement in your internal business continuity plans. Some industries and companies might be required by government regulations to perform audits on their business continuity disaster recovery capabilities. In these cases, you can run a disaster recovery drill on an environment. A disaster recovery drill lets you do self-service disaster recovery without losing any data. The duration of the failover action can be slightly longer while all remaining data is replicated to the secondary region.\nWe recommend doing drills on a copy of a production environment, since this process involves downtime, which can last for minutes, when failing over to the remote region. 
For example, you might want to copy a production environment to a sandbox environment and then change the type from sandbox to production.\nEmergency response failover\nChoose this option during an emergency, when the primary region has an outage and you can't use environments or data. If you select this option, the environment fails over and doesn't copy any more data except for data that's already replicated before the outage.\nWhen you start an emergency response, you see the amount of potential data loss, expressed as a window of time. Compare this data loss to your recovery point objective to check if it's acceptable before you continue. The environment stays in a Running state until disaster recovery finishes and normal operation resumes from the secondary region.\nNote\nDatabase backups are\nnot replicated to secondary regions\nfor scenarios supported by self-service disaster recovery, unless you explicitly enable self-service disaster recovery. Without self-service disaster recovery, backups remain in the primary region only, which means cross-region failover can't be guaranteed. To ensure business continuity and compliance with your disaster recovery strategy, configure self-service disaster recovery for your environment.\nSwitch back to primary region\nAfter you fix an outage or finish your drill, switch the environment back to its primary region. The environment can operate with limited resources in the paired region. You don't lose data during this action.\nEnvironment disaster recovery status\nAdmins check the current disaster recovery state and location of an environment in the\nEnvironment details\npage. Admins also select\nDisaster Recovery\nin the command bar to open the\nDisaster Recovery\npane.\nTo check data replication latency at any time, select\nDisaster Recovery\n, and then select\nEmergency response\nas the disaster recovery reason. This action opens a confirmation dialog that shows the last replication time between regions for that environment. 
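The emergency-response check described above, comparing the last replication time against your recovery point objective, reduces to simple time arithmetic. A minimal sketch (illustrative only; the timestamps and the 15-minute RPO are assumed values, not output of the admin center):

```python
from datetime import datetime, timedelta, timezone

def potential_data_loss(last_replication: datetime, now: datetime) -> timedelta:
    """Window of data that would be lost if failover happened right now."""
    return now - last_replication

def within_rpo(last_replication: datetime, now: datetime, rpo: timedelta) -> bool:
    """True if the potential loss is acceptable under the stated RPO."""
    return potential_data_loss(last_replication, now) <= rpo

# Hypothetical values: last sync 4 minutes ago, RPO of 15 minutes.
now = datetime(2026, 3, 14, 6, 0, tzinfo=timezone.utc)
last_sync = now - timedelta(minutes=4)
print(within_rpo(last_sync, now, rpo=timedelta(minutes=15)))  # True: acceptable, proceed
```

Because data is replicated continuously, the last sync time (and hence this window) changes from one check to the next.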
You can select\nCancel\nif your only purpose is to check the potential loss of data if there's a failover operation. Remember, the last sync time always changes because data is replicated continuously.\nDocument your business continuity plan\nWe recommend that you perform disaster recovery drills or an emergency response before a real disaster strikes, so you can document all steps required for any integration points that are external to Power Platform. Your company is then more prepared for recovery if there's a real disaster.\nFrequently asked questions (FAQs)\nWhy use self-service disaster recovery?\nSuper storms, natural calamities, and unforeseen political uncertainties that have the potential to bring an entire region down are becoming more common. To minimize the impact of a disaster that brings an entire region down, maintain an asynchronous copy in a remote region. You might also want to maintain a copy in a remote region for compliance audits.\nSelf-service disaster recovery gives you control to fail over to a secondary region with the push of a button, and to fail back just as easily when the primary region is restored, to ensure business continuity. You can also simulate the primary region being down to run a real failover and failback to the secondary region to test a real compliance drill. Run drills with a copy of the production environment to avoid any downtime.\nWhy do I need self-service disaster recovery if I already have a secondary copy maintained in a remote, secondary region?\nFor the public cloud, the system doesn't maintain secondary copies in a remote, secondary region unless you turn on self-service disaster recovery.\nThe system maintains at least two (and in some cases, three) synchronous copies of production environments within a region, at no extra cost to you. 
These copies are hosted in availability zones in physically separated data centers with independent power, cooling, and networking, in compliance with legislated data residency regulations.\nWith the implementation of\navailability zones\n, these cross-region copies became redundant. Recovering from these copies was a complex and manual process that affected recovery times.\nWhat are the costs associated with using self-service disaster recovery?\nYou must turn on\npay-as-you-go\nfor the environment as a prerequisite to turning on self-service disaster recovery on that environment.\nThe selected environment must be a\nManaged Environment\n. This environment is a premium license tier.\nCapacity charges are based on the storage consumption of the environment's paired secondary region for database, file, and log storage types.\nCapacity consumption is reflected in the familiar licensing experience within the Power Platform admin center. Learn more in\nView usage and billing information\n.\nFor example, suppose a user has 10-GB capacity consumption in the primary location. When self-service disaster recovery is turned on, a copy of data is created in the remote secondary region and this copy consumes another 10 GB. You can pay for this 10 GB in the secondary region through storage entitlements. Only if you exceed your available free storage or available entitlements does a pay-as-you-go plan actively start billing.\nPay-as-you-go is designed to generate various alerts and warnings at various thresholds to warn administrators of depleting storage. Use the alert mechanism to your advantage.\nPay-as-you-go links the selected environment to the Azure subscription by using a billing policy. Once you link an environment to an Azure subscription, the usage of apps and any Dataverse or Power Platform usage that goes above the included storage amounts are billed against the Azure subscription by using Azure meters. For more information, go to\nPay-as-you-go meters\n. 
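The 10-GB example above comes down to simple arithmetic: enabling self-service disaster recovery roughly doubles billable storage (a full copy exists in the paired secondary region), and pay-as-you-go meters bill only for consumption beyond your free storage and entitlements. A sketch of that logic, assuming a hypothetical 15-GB entitlement:

```python
def billable_overage_gb(primary_gb: float, dr_enabled: bool, entitlement_gb: float) -> float:
    """Storage billed to pay-as-you-go meters after entitlements are consumed.

    With self-service disaster recovery on, a full copy of the environment's
    data also exists in the paired secondary region, doubling consumption.
    """
    total = primary_gb * 2 if dr_enabled else primary_gb
    return max(0.0, total - entitlement_gb)

# Hypothetical tenant: 15-GB entitlement, 10-GB environment.
print(billable_overage_gb(10, dr_enabled=False, entitlement_gb=15))  # 0.0 (within entitlement)
print(billable_overage_gb(10, dr_enabled=True, entitlement_gb=15))   # 5.0 (metered overage)
```

This is only the capacity side of the example; the actual meters, thresholds, and alerts are managed by the pay-as-you-go billing plan described above.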
If you acquire more storage entitlements, the pay-as-you-go plan stops running the meters; consumption from available free storage and entitlements takes precedence.\nHow does billing work for self-service disaster recovery?\nIf you configure your environment to draw capacity from your tenant's Dataverse capacity entitlement, the system consumes the entitled capacity first. You still need a pay-as-you-go billing plan to avoid capacity overages.\nThe pay-as-you-go plan generates multiple warnings at various thresholds to ensure that you're well-informed and can take appropriate action to avoid pay-as-you-go charges.\nAdmins can allocate capacity to the environment, after which the pay-as-you-go plan is billed.\nYou can't turn off the pay-as-you-go plan in the billing experience if you turn on self-service disaster recovery.\nCan I switch regions during a regional outage?\nIf there's a regional outage, the system supports failover only to the designated secondary region as part of self-service disaster recovery. It doesn't support switching to any other arbitrary region.\nIs my region supported for self-service disaster recovery?\nSelf-service disaster recovery depends on Azure region pairs. Regions that don't have a regional Azure pair aren't supported. For more information, go to\nAzure supported regions\n.\nAs of November 2025, Austria East, Belgium Central, Chile Central, Indonesia Central, Israel Central, Italy North, Malaysia West, Mexico Central, New Zealand North, and Poland Central are single regions and aren't supported. Once a region gets a regional pair, it's on our roadmap for Power Platform geo build-out and for supporting self-service disaster recovery.\nNote\nUAE, Brazil, and South Africa have regional pairs in constrained regions and are on the roadmap for Power Platform geo build-out followed by self-service disaster recovery support. 
Geo build-out prioritization is influenced by impact, opportunity, and resource constraints.\nWhat should I know about the capacity experience?\nWhen you allow self-service disaster recovery, you see more storage consumption displayed in the Dataverse capacity graph, clearly indicating the extra capacity used by the cross-region backup.\nWhen you don't allow self-service disaster recovery, the capacity graph shows standard usage without the extra storage for replication.\nWhen self-service disaster recovery is active, the capacity graph displays the extra consumption from cross-region replication, with a\nDisaster recovery active\ntag in the Dataverse capacity summary.\nHow do I disable self-service disaster recovery?\nTo disable self-service disaster recovery, go to the\ndisaster recovery pane\nin the Power Platform admin center and clear the\nTurn on disaster recovery\ncheckbox.\nWhat happens when I disable self-service disaster recovery?\nDisabling self-service disaster recovery deletes all replicated environment data in the paired region. You're prompted to confirm the environment's name before proceeding.\nCan I disable self-service disaster recovery while in a paired region (in a failed-over state)?\nNo, you can't disable self-service disaster recovery while the environment is in a failed-over state. You must switch to the primary region first.\nAre Power Apps and Power Pages supported with self-service disaster recovery?\nYes, self-service disaster recovery is supported for Power Apps and Power Pages.\nIs Power Automate supported with self-service disaster recovery?\nAs of October 2025:\nPower Automate desktop flows are fully supported for failover and failback with self-service disaster recovery.\nPower Automate cloud flows are now available in preview. Don't use features in preview with production workloads.\nHow can I find out where my data is being replicated to? 
Can I change my secondary destination region?\nFor security reasons, Microsoft doesn't disclose the exact details of where a customer's data resides, or whether it might need to be moved or replicated for various high-availability and resiliency scenarios. Customers can be assured that their data at rest respects geographical boundaries and abides by legislated residency laws. Even if self-service disaster recovery isn't turned on, Microsoft reserves the right to replicate, move, and relocate the data within a region for high availability and operational needs. The location of customer data within a geography (for example,\nAPAC\n) isn't disclosed and may change based on Azure capacity constraints.\nIs Field Service supported for self-service disaster recovery?\nField Service now supports self-service disaster recovery. You can manage work orders, scheduling, inventory, and customer communications in one unified platform, and in a disaster, fail over your automated service workflows, orders, inventory, and dispatching to a remote region for business continuity.\nAre there any known limitations during a region-wide outage that self-service disaster recovery can't mitigate?\nCopilot Studio conversation runtime requests fail until Microsoft restores the service in the primary region. Custom agents successfully fail over and fail back since they're stored in Dataverse.\nIn Dynamics 365, analytics and automation in sales observe latency impact. Relationship analytics KPIs aren't computed and new models for scoring aren't created during an outage.\nIn Dynamics 365 Customer Insights - Data, real-time updates are impacted. It doesn't support self-service disaster recovery today.\nIn Dynamics 365 Customer Service, basic scenarios that are 100% dependent on Dataverse, such as case creation or Knowledge Base articles, work. 
Case knowledge base access in customer service is unavailable.\nDynamics 365 Project Operations features aren't yet supported.\nData lake failover has known issues. Self-service disaster recovery isn't supported yet.\nConnectors may have recovery issues when dependent on external systems, like SharePoint, SQL Server, or third-party applications.\nFor Dynamics 365 Sales, analytics, reporting, and functions dependent on automation, such as sales forecasting, are unavailable.\nFinance and operations products aren't currently supported for self-service disaster recovery during regional outages.\nAI Builder may see latency impact.", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Business Continuity", @@ -269,7 +269,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/backup-restore-environments": { "content_hash": "sha256:09052b3e1f545b99df06948f579a190bd6980c1126a8a3e901d36a62abc14813", "normalized_content": "Back up and restore environments\nIt's important to protect your data on Microsoft Power Platform and in Dataverse and to provide continuous availability of service through system or manual backups.\nSystem backups are automatically created for environments that have a database. 
System backups of production environments that have a database and Dynamics 365 applications are retained for up to 28 days. By default, backups of production environments without Dynamics 365 applications and other nonproduction environments are retained for seven days. However, for managed production environments without Dynamics 365 applications, the retention period can be extended up to 28 days using PowerShell.\nManual backups are backups that the user initiates. It's recommended to create manual backups before performing major customizations, applying a version update, or making significant changes to the environment. You can create these backups for production and sandbox environments, but not for the default environment. Manual backups of production environments that have Dynamics 365 applications are kept for up to 28 days. Backups of environments that don't have Dynamics 365 applications are kept for seven days.\nSupported retention period\nEnvironment types\nSystem backup\nManual backup\nProduction with Dynamics 365 apps\n28 days\n28 days\nProduction without Dynamics 365 apps*\n7 days\n7 days\nSandbox\n7 days\n7 days\nDeveloper\n7 days\n7 days\nTeams\n7 days\n7 days\nDefault**\n7 days\nNot supported\nTrial\nNot backed up\nNot supported\nTrial (subscription-based)\nNot backed up\nNot supported\n* For managed production environments that don't have Dynamics 365 applications, we allow you to extend the retention period beyond seven days, to a maximum of 28 days, through PowerShell. Learn more in\nChange the backup retention period for production environments without Dynamics 365 applications\n.\n** We don't support restoring a system backup of the default environment through the Power Platform admin center. Learn more in\nBackup and restoration of the default environment\n.\nSystem backup and restore operations aren't supported for trial-type environments. 
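The retention table above reduces to a simple lookup by environment type. A sketch (the dictionary keys are illustrative labels taken from the table, not product identifiers):

```python
# (system_retention_days, manual_retention_days); None means not supported.
RETENTION_DAYS = {
    "Production with Dynamics 365 apps": (28, 28),
    "Production without Dynamics 365 apps": (7, 7),  # extendable to 28 via PowerShell for Managed Environments
    "Sandbox": (7, 7),
    "Developer": (7, 7),
    "Teams": (7, 7),
    "Default": (7, None),                            # manual backups not supported
    "Trial": (None, None),                           # not backed up
    "Trial (subscription-based)": (None, None),
}

system, manual = RETENTION_DAYS["Production with Dynamics 365 apps"]
print(system, manual)  # 28 28
```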
To use the full set of features, including system backup and restore options, go to\nConvert either type of trial environment to a production environment\n.\nSystem backups\nEnvironments that have a database are automatically backed up and can be restored. All your environments, except trial environments (both standard and subscription-based), have system backups. System backups are created continuously using the Azure SQL Database automated backup feature. Learn more in\nAutomated backups\n.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n, then in the\nManage\npane, select\nEnvironments\n.\nOn the\nEnvironments\npage, select an environment.\nIn the command bar, click\nBackup & Restore\n, then select\nRestore or manage\n.\nOn the\nSystem\ntab, select an available system backup by choosing a date and time.\nClick\nContinue\n.\nThe\nBackup retention\nside panel displays the backup details.\nAbout system backups\nSystem backups aren't counted toward storage capacity. To restore an environment, you need\n1 gigabyte (GB)\nof free capacity. If you're over capacity, learn more in:\nIs there a database size restriction for backing up or restoring an organization through the user interface or API?\n.\nCopying and restoring data might take more than one day, depending on the size of the data, especially if you must copy\naudit data\n.\nBackup and restore operations include only apps (created by using Power Apps) and flows (created by using Power Automate) in a Dataverse solution.\nDownloading a copy of a database backup for offline use isn't supported.\nChange the backup retention period for production environments without Dynamics 365 applications\nFor environments without Dynamics 365 applications, the default backup retention period is seven days. Admins who run production\nManaged Environments\nof this type can use PowerShell to change the retention period to 7, 14, 21, or 28 days. 
To change this setting, you must have an admin role, such as Power Platform admin or Dynamics 365 admin in Microsoft Entra ID.\nKeep these points in mind:\nIf you adjust the backup retention period, the new setting applies to all future backups and existing backups within the retention period. The change might take up to 24 hours to apply to existing backups, so some older backups might be removed earlier than you expect.\nFor all other nonproduction environments, including default-type environments, the backup retention period is seven days by default.\nFor example, you create an environment on January 1. On that day, the system starts to make backups of your environment, and it stores those backups for a default period of seven days. Therefore, on January 8, backups from January 1 to January 8 are available for restoration. If you change the retention period to 14 days on January 8, the system starts to keep the backups for a longer time. Therefore, on January 16, backups from January 3 to January 16 are available for restoration. In this way, you have more flexibility and control over your backup data.\nPrepare your environment for PowerShell\nThe PowerShell module for Power Platform Administrators is the recommended tool for managing administrative capabilities in Power Platform environments. For information that helps you get started with the PowerShell for Power Platform Administrators module, go to\nGet started with PowerShell for Power Platform Administrators\n.\nNote\nYou can extend the backup retention period only for production environments where Dynamics 365 applications aren't enabled. For production environments where Dynamics 365 applications are enabled, a retention period of 28 days is used. 
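The January example is straightforward date arithmetic: at any moment, the restorable window reaches back at most the current retention period, and never before the environment existed. A minimal sketch, treating restore points as whole dates (actual backups are continuous, so the exact window edge depends on backup timestamps):

```python
from datetime import date, timedelta

def restore_window(today: date, created: date, retention_days: int) -> tuple[date, date]:
    """Oldest and newest restore points available under the current retention period."""
    oldest = max(created, today - timedelta(days=retention_days))
    return oldest, today

# Environment created January 1, default 7-day retention, checked on January 8:
print(restore_window(date(2026, 1, 8), date(2026, 1, 1), 7))
# Retention raised to 14 days on January 8, checked on January 16:
print(restore_window(date(2026, 1, 16), date(2026, 1, 1), 14))
```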
For all other nonproduction environments, the default backup retention period of seven days is used, regardless of the setting's value.\nSet the retention period\nSet-AdminPowerAppEnvironmentBackupRetentionPeriod\nSupply values for the following parameters:\nSet the\nEnvironmentName\nparameter to the environment ID of your environment.\nSet the\nNewBackupRetentionPeriodInDays\nparameter to\n7\n,\n14\n,\n21\n, or\n28\n.\nVerify the retention period\nGet-AdminPowerAppEnvironment -EnvironmentName \"Environment ID\"\nSet the\nEnvironmentName\nparameter to the environment ID of your environment.\nRestore system backups\nYou can't directly restore backups to production environments. If you want to restore a system backup to a production environment, you must first\nchange the environment type\nto sandbox. Then, after the restore is completed, you can switch the environment type back to production. Learn more in:\nCan I restore to a production environment?\n.\nYou must restore an environment in the same region where it was backed up, so the target and source environments should be in the same region. When an environment is restored onto itself, audit logs aren't deleted. For example, when an environment is restored onto itself to a past time (t1), full audit data for the environment is available. 
This data includes any audit logs that were generated after t1.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n, then in the\nManage\npane, select\nEnvironments\n.\nOn the\nEnvironments\npage, select an environment.\nIn the command bar, click\nBackup & Restore\n, then select\nRestore or manage\n.\nUnder the\nSystem\ntab, select an available system backup by choosing a date and time.\nSelect\nContinue\n.\nOn the\nBackup retention\nside panel, select the target environment to overwrite.\nSelect\nRestore\n, then select\nConfirm\nto proceed with overwriting the environment.\nNote\nOnly sandbox environments can be restored to. For information about the effects of changing the environment type, go to the section:\nCan I restore to a production environment?\n.\nUnder\nEdit details\n, you can change the environment name.\nIf you don't see the environment that you want to restore to\nThese restrictions apply to restoration from both system backups and manual backups:\nYou must restore an environment in the same region where it was backed up. The target and source environments should be in the same region.\nThe source environment can be a production, sandbox, or developer environment. No other types of environments are supported.\nThe target environment can be a sandbox or developer environment. If the target is a developer environment, the source must also be a developer environment.\nA Managed Environment can be restored only to another Managed Environment. 
A non-Managed Environment can't be restored to a Managed Environment.\nIf the source environment has a customer-managed encryption key applied, the target environment must also have the same customer-managed encryption key applied.\nBackup and restore operations work only with source and target environments that have Dataverse.\nIf there are any enterprise policies applied to the source environment, then the target environment should also have the same set of policies applied.\nSandbox, Teams, and developer environments support self-restore backups.\nSource type\nTarget type\nProduction\nSandbox\nSandbox\nSandbox\nDeveloper\nSandbox, Developer\nTeams\nTeams (self-restore only)\nDefault\nDeveloper\nFor more information about how to restore to a production environment, go to the section:\nCan I restore to a production environment?\n.\nManual backups\nAlthough automated system backups are great, you should create your own backups before you do major customization or apply a version update. Manual backups might take up to 10 minutes to process before they're available for restoration. Therefore, wait at least 10 to 15 minutes before you try to restore your data from a manual backup.\nAbout manual backups\nYou can create backups of production, sandbox, Teams, and developer environments.\nYou can't create backups of the default environment.\nManual backups of production environments that have a database and Dynamics 365 applications are kept for up to 28 days. Manual backups for production environments that don't have Dynamics 365 applications are kept for seven days.\nSandbox backups are kept for up to seven days.\nCheck your expiration date.\nThe label of the backup file that is created reflects the restore point timestamp. The restore point timestamp is the closest available time to the time when the manual backup was created. 
The timestamp label can't be edited.\nThere's no limit on the number of manual backups that you can create.\nManual backups don't count against your storage capacity limits, but restoring an environment requires at least 1 GB of available capacity.\nYou must restore an environment in the same region where it was backed up.\nIf you don't see your target environment, refer to the\nIf you don't see the environment that you want to restore to\nsection for possible reasons and troubleshooting steps.\nCreate a manual backup\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n, then in the\nManage\npane, select\nEnvironments\n.\nOn the\nEnvironments\npage, select an environment.\nIn the command bar, click\nBackup & Restore\n, then select\nRestore or manage\n.\nSelect the\nManual\ntab, then click\nCreate a manual backup\n.\nFill in the information, then select\nCreate\nto proceed.\nThere's no real-time status indicator while the backup is being processed. However, you receive a confirmation message once the backup is successfully created. When the backup is completed, you receive the following message: \"The <\nbackup name\n> backup was successfully created.\"\nRestore a manual backup\nYou can restore backups only to sandbox environments. You can't restore them to production environments. If you want to restore a manual backup to a production environment, you must first change the environment type to sandbox. Then, after the restore is completed, you can switch the environment type back to production.\nImportant\nChanging the environment type to sandbox affects database retention. 
For more information about the effects of changing the environment type, go to the section:\nCan I restore to a production environment?\n.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n, then in the\nManage\npane, select\nEnvironments\n.\nOn the\nEnvironments\npage, select an environment.\nIn the command bar, select\nBackup & Restore\n, then select\nRestore or manage\n.\nOn the\nManual\ntab, select a manual backup to restore, then select\nRestore\nin the command bar.\nOn the\nBackup retention\nside panel, select the target environment to overwrite.\nSelect whether you want to include audit logs. The inclusion of audit logs can significantly increase the time that's required to restore an environment. Therefore, audit logs are excluded by default. Learn more in\nRestore audit logs\n.\nSelect\nRestore\n, then select\nConfirm\nto proceed with overwriting the environment.\nRestore audit logs\nRestoration of audit logs can significantly increase the time that is required to restore an environment. Therefore, audit logs are excluded by default. Follow these steps to include audit logs when you restore a manual backup.\nComplete steps 1 through 6 of the previous procedure.\nUnder\nAudit logs\n, select\nClick here\n.\nEnable copying of audit logs.\nContinue with step 8 of the previous procedure.\nDelete a manual backup\nYou can delete manual backups. You can't delete system backups.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n, then in the\nManage\npane, select\nEnvironments\n.\nOn the\nEnvironments\npage, choose an environment.\nIn the command bar, select\nBackup & Restore\n, then select\nRestore or manage\n.\nNavigate to the\nManual\ntab. 
Select the backup to delete, then select\nDelete\nin the command bar.\nSelect\nContinue\nto confirm the deletion.\nApp-specific backups\nFor information about backup and restore for specific apps, refer to the documentation for the appropriate app:\nDynamics 365 Marketing\nDynamics 365 Finance\nDynamics 365 Customer Service\nAzure Synapse Link for Dataverse\nPower Apps portals\nFAQ\nHow are system backups made?\nIn the current version of the product, system backups occur continuously. The underlying technology is Azure SQL Database. Learn more in\nAutomated backups\n.\nHow are manual, on-demand backups made?\nIn the current version of the product, system backups occur continuously. The underlying technology is Azure SQL Database. Learn more in\nAutomated backups\n.\nBecause Azure SQL Database continuously makes backups, there's no need to make other backups. Your on-demand backup is just a timestamp and a label that reflects that timestamp. We store this information in our system and use it during restore requests. This behavior differs from the behavior in previous versions that took a full backup during an on-demand backup.\nWhy can't I see the status of the manual backup?\nThere's no real-time status indicator while the backup is being processed. However, you receive a confirmation message once the backup is successfully created. When the backup is completed, you receive the following message: \"The <\nbackup name\n> backup was successfully created.\"\nShould I open a support ticket to make a full backup?\nNo. In the current version of the product, system backups occur continuously. This behavior differs from the behavior in previous versions, where backups were made once a day. The underlying technology is Azure SQL Database. 
For more information, see\nAutomated backups\n.\nBecause Azure SQL Database continuously makes backups, and there's no specific way to make other, on-demand backups, we recommend that you use the on-demand backup capabilities for labeled backups in Power Platform admin center.\nHow long are my manual, on-demand backups and system backups retained?\nSystem and manual backups for some production-type environments are retained for up to 28 days. Backups for other environment types are retained for only up to seven days. Learn more in the section:\nHow do I determine if backups of a production environment are retained for 28 days?\n.\nHow do I determine if backups of a production environment are retained for 28 days?\nProduction environments that have been created with a database give you the option to enable one or more Dynamics 365 applications (for example, Dynamics 365 Sales or Dynamics 365 Customer Service). However, you must purchase licenses that entitle you to deploy those applications. Backups of production environments that have a database and Dynamics 365 applications are retained for up to 28 days. By default, backups of production environments that don't have Dynamics 365 applications are retained for seven days. However, for Managed Environments, you can increase the retention period beyond seven days.\nCan I move my data from an online environment to an on-premises version?\nIt isn't possible to obtain a copy of your database backup. If you want to move your online data to Dynamics 365 Customer Engagement (on-premises), data migration is required. For smaller data sets, consider\nexporting data to Excel\n. For larger data sets, find a third-party data migration solution on\nMicrosoft Marketplace\n.\nHow can I download a copy of my backup?\nIt isn't possible to obtain a copy of your database backup. Moving your online data requires data migration. For smaller data sets, consider\nexporting data to Excel\n. 
For larger data sets, find a third-party data migration solution on\nMicrosoft Marketplace\n.\nIs there a database size restriction for backing up or restoring an organization through the user interface or API?\nThere are no restrictions on database size (or storage capacity/entitlement) for backups that are made through the user interface (UI) or API. However, if an organization's storage capacity usage exceeds the entitled capacity, the following admin operations are blocked:\nRestore an environment (requires minimum 1-GB capacity available)\nCreate new environment (requires minimum 1-GB capacity available)\nCopy an environment (requires minimum 1-GB capacity available)\nTo comply with storage usage requirements, customers can always\nfree up storage\n,\narchive data\n,\ndelete unwanted environments\n, or buy more capacity. To learn more about capacity add-ons, refer to the add-ons section in the\nMicrosoft Dynamics 365 Licensing Guide\nor the\nMicrosoft Power Platform Licensing Guide\n. You can work through your organization's standard procurement process to purchase capacity add-ons.\nCan I restore to a production environment?\nYou can't directly restore to a production environment. This restriction helps prevent accidental overwrites.\nIf you want to restore to a production environment, you must first change the environment type to sandbox. Learn more in\nSwitch an environment\n.\nIf you want to restore a system backup or a restore point from the past seven days, you can safely switch the environment type. If you think you might have to restore to a backup that is older than seven days, we strongly recommend that you keep the environment a production environment and consider restoring to a different environment of the sandbox type.\nIf you do switch a production environment to a sandbox environment for a manual restore, you can choose a backup only from the past seven days. 
After the restore is completed, be sure to change the environment back to a production environment\nas soon as possible\n, to help prevent the loss of any backups that are older than seven days.\nWhy is my organization in administration mode after a restore, and how do I disable it?\nThe newly restored environment is put in administration mode. To turn off administration mode, go to\nSet administration mode\n. You can set administration mode in sandbox or production environments.\nAfter a restore, what steps are needed to ensure that flows work as expected?\nFlows\n– In the target environment, existing solution flows are deleted, but existing nonsolution flows remain. Review the flows in the target environment to ensure that triggers and actions point to the correct locations. Solution flows are turned off. Therefore, enable flows as required. Solution flows must be enabled or turned on for the PowerShell and API commands to work with them.\nConnection references\n– Connection references require new connections. Create and set connections on connection references.\nCustom connectors\n– Custom connectors should be reviewed and, as required, deleted and reinstalled.\nDo apps that are shared with Everyone continue to be shared with Everyone in a restored environment?\nNo. Apps that are shared with Everyone in an environment that is backed up aren't shared with Everyone in the restored environment. Alternatively, a canvas app can be shared with a security group. In this case, the app in the restored environment is shared with that security group.\nAre app identifiers the same after backup and restore operations?\nNot for canvas apps. 
The app ID for a canvas app in a restored environment differs from the app ID when an environment was backed up.\nIf I restore my environment, do previous backups remain available?\nYes, all backups within the organization's retention period remain available.\nHow can I restore records after a bulk deletion without restoring over an organization?\nTo restore records after a bulk deletion without restoring over an organization, follow these steps.\nCreate a new, empty organization.\nRestore the backup from the current organization to the new organization.\nThis approach keeps the original organization together with all the records that were added since the backup. At the same time, it creates a new organization that has the records that were deleted.\nHow can I restore a deleted environment?\nYou can recover a recently deleted environment (within seven days of deletion) by using the Power Platform admin center or the Recover-AdminPowerAppEnvironment Power Apps cmdlet. Production environments that have Dynamics 365 applications are available for up to 28 days.\nFor more information about recovering an environment, go to\nRecover environment\n.\nTroubleshooting\nThe restore operation failed. What action can I take?\nThe restore process, especially for environments with large amounts of data, is a complex backend operation. If the restore operation fails, the target environment is left in a disabled state. To retry the restore process, the failed environment must be the target environment for the operation. Wait 30 minutes, and then retry the operation. The other actions you can take for the disabled target environment are to reset it, delete it, or use it as the target of a copy operation.\nYou don't see the environment that you want to restore to\nThe source environment can be a production, sandbox, or developer environment. No other types of environments are supported.\nThe target environment can be a sandbox or developer environment. 
If the target is a developer environment, the source must also be a developer environment.\nThe target and source environments should be in the same region.\nA Managed Environment can be restored only to another Managed Environment. Learn more in\nManaged Environments overview\n.\nIf the source environment has a customer-managed encryption key applied, the target environment must also have a customer-managed encryption key applied. Learn more in\nManage your customer-managed encryption key\n.\nIf an environment is enabled for\nVirtual Network support\n, the target environment must be in the same enterprise policy as the source environment.\nRestoration of an environment requires\n1 GB of available capacity\n. Learn more in the section:\nIs there a database size restriction for backing up or restoring an organization through the user interface or API?\n.\nBackup and restore operations work only with source and target environments that have Dataverse. Learn more in\nAdd a Microsoft Dataverse database\n.\nIf you don't have enough storage, go to\nAdd Microsoft Dataverse storage capacity\nto request more storage.",
-        "last_checked": "2026-03-11T12:59:29.613756+00:00",
+        "last_checked": "2026-03-14T06:51:10.083468+00:00",
         "last_status": 200,
         "last_changed": "2026-03-11T12:59:29.613756+00:00",
         "topic": "Backup and Restore",
@@ -278,7 +278,7 @@
       "https://learn.microsoft.com/en-us/power-platform/admin/regions-overview": {
         "content_hash": "sha256:d375d4a578beb7a1e0a113be00b6d9a41a84802870b534c2fdf6da69d3ed0eee",
         "normalized_content": "Regions overview\nFor multinational companies with employees and customers distributed around the world, you can create and manage environments specific to your global regions. You can create an environment in a different region than where your tenant resides. Local environments can provide quicker data access for users in that region. Be sure to read\nA multi-environment deployment\nto understand the features of multiple environments.\nHow do I find out where my app is deployed?\nYour app is deployed in the region that hosts the environment. For example, if your environment is created in the Europe region, then your app is deployed in Europe data centers.\nUsing Power Platform admin center\nIf you're an administrator, you can determine the region of each environment in the Power Platform admin center.\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nManage\n.\nIn the\nManage\npane, select\nEnvironments\n.\nOn the\nEnvironments\npage, locate the\nRegion\ncolumn.\nWhat regions are available?\nRefer to the\nMicrosoft Dynamics 365 and Power Platform data residency documentation\n.\nWho can create environments in these regions?\nWith Power Apps, you can create environments in various regions across the globe, which benefits your business in these ways:\nStore your data closer to your users\nMaintain the compliance requirement of your geography\nYou can create a database for an environment in one region (for example, United States) even if the Microsoft Entra tenant is in another region (for example, Canada or Europe). Note the following:\nTax laws prevent you from creating a database for an environment in India and Australia, if your Microsoft Entra tenant is not in India and Australia respectively. 
You can get an exception for Australia.\nOnly a US Government associated organization can create an environment in US Government (GCC).\nYour Microsoft Entra tenant's home location\nRegions where you can create a database\nIndia\nAny region except Australia\nAustralia\nAny region except India\nAny other location\nAny region except India and Australia\nWhat features are specific to a given region?\nEnvironments can be created in different regions, and are bound to that geographic location. When you create an app in an environment, that app is deployed in datacenters in that geographic location. This applies to any items you create in that environment, including databases in the Microsoft Dataverse, apps, connections, gateways, and custom connectors.\nFor optimal performance, if your users are in Europe, create and use the environment in the Europe region. If your users are in the United States, create and use the environment in the U.S.\nNote\nOn-premises data gateways aren't available in the India region.",
-        "last_checked": "2026-03-11T12:59:29.613756+00:00",
+        "last_checked": "2026-03-14T06:51:10.083468+00:00",
         "last_status": 200,
         "last_changed": "2026-03-11T12:59:29.613756+00:00",
         "topic": "Regions Overview",
@@ -287,7 +287,7 @@
       "https://learn.microsoft.com/en-us/power-platform/admin/capacity-storage": {
         "content_hash": "sha256:62ce4de8fbad4e828d92e4f9699c1b0284164973bc4d2de80a5a334880c262a2",
         "normalized_content": "Dataverse capacity-based storage details\nIf you purchased storage after April 2019, or if you have a mix of storage purchases made before and after April 2019, you see your storage capacity entitlement and usage by database, file, and log as it appears in the Microsoft Power Platform admin center today.\nData volume continues to grow exponentially as businesses advance their digital transformation journey and bring data together across their organization. Modern business applications need to support new business scenarios, manage new data types, and help organizations with the increasing complexity of compliance mandates. To support the growing needs of today's organizations, data storage solutions need to evolve continuously and provide the right solution to support expanding business needs.\nNote\nFor licensing information, see the\nPower Platform Licensing Guide\n.\nIf you purchased your Dynamics 365 subscription through a Microsoft partner, contact them to manage storage capacity. The following steps don't apply to partner-based subscriptions.\nLicenses for Microsoft Dataverse capacity-based storage model\nThe following licenses provide capacity by using the new storage model. 
If you have any of these licenses, you see the new model report:\nDataverse for Apps Database Capacity\nDataverse for Apps File Capacity\nDataverse for Apps Log Capacity\nTo check whether you have any of these licenses, sign in to the Microsoft 365 admin center and then go to\nBilling\n>\nLicenses\n.\nNote\nIf you have a mix of\nlegacy model licenses\nand new model licenses, a new model report is displayed.\nIf you have neither the\nlegacy model licenses\nnor the new model licenses, a new model report is displayed.\nVerifying your Microsoft Dataverse capacity-based storage model\nTo view the Capacity add-ons summary page, you need one of the following roles:\nTenant administrator\nPower Platform administrator\nDynamics 365 administrator\nAlternatively, a user with any of the preceding roles can grant permissions to the environment administrator to view the\nCapacity summary\ntab within the\nTenant setting\npage.\nFollow these steps to verify you have the Microsoft Dataverse capacity-based storage model:\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nLicensing\n.\nIn the Licensing pane, select\nCapacity add-ons\nto go to the Capacity add-ons summary page where you can see your tenant's storage, add-ons, and Microsoft Power Platform requests.\nLearn more in\nDataverse capacity-based storage overview\n.\nCapacity page details\nThe tabs\nSummary\n,\nDataverse\n,\nMicrosoft Teams\n,\nAdd-ons\n, and\nTrial\nare available on the Capacity add-ons page.\nSummary tab\nOn the Capacity page,\nSummary\nis the default view where you see a tenant-level view of where your organization is using storage capacity. You can view:\nStorage capacity usage\nStorage capacity, by source\nTop storage usage, by environment\nAll Dataverse tables, including system tables, are included in the storage capacity reports. Files such as .pdf (or any other file attachment type) are stored in file storage. 
However, the database stores certain attributes needed to access the files.\nStorage capacity usage\nIn the\nstorage capacity usage\nsection, you can see:\nFile and database\n: The following tables store data in file and database storage:\nAttachment\nAnnotationBase\nAny custom or out-of-the-box table that has columns of datatype file or image (full size)\nAny table that is used by one or more installed Insights applications and ends in -Analytics\nWebResourceBase\nRibbonClientMetadataBase\nLog\n: The following tables are used:\nAuditBase\nPlugInTraceLogBase\nElastic tables\nDatabase only\n: All other tables count for your database\nStorage capacity, by source\nIn the\nstorage capacity, by source\nsection, you can see:\nOrg (tenant) default\n: The default capacity given at the time of sign up\nUser licenses\n: More capacity added for every user license purchased\nAdditional storage\n: Any extra storage you bought\nTotal\n: Total storage available\nView self-service sources\n: Learn more at\nView self-service license amounts and storage capacity\nTop storage usage, by environment\nIn the\ntop storage usage, by environment\nsection, you can see the environments that consume the most capacity.\nAdd-ons\nIn the\nadd-ons\nsection, you can see the details of add-ons that your organization purchased. Learn more at\nView capacity add-ons in Power Platform admin center\n.\nIn the\nadd-ons\nsection, you can also select\nManage\nto assign add-ons to environments or\nDownload reports\nto view a downloaded report. Add-on reports expire after 30 days.\nDataverse tab\nOn the Capacity page, select\nDataverse\n. This page provides similar information to the summary tab, but with an environment-level view of where your organization is using capacity.\nNote\nThere's no technical limit on the size of a Dataverse environment. 
The limits mentioned on this page are entitlement limits based on product licenses you purchase.\nThis table highlights some of the features you can see on the Dataverse page.\nFeature\nDescription\nDownload\nSelect\nDownload\nabove the list of environments to download an Excel .CSV file with high-level storage information for each environment that the signed-in admin has permission to see in the Power Platform admin center.\nSearch\nUse\nSearch\nto search by the environment name and the environment type.\nDetails\nSelect the\nDetails\nbutton to see an environment-level detailed view of where your organization is using capacity, in addition to the three types of capacity consumption.\nDefault environment tip\nThe calculated storage usage in this view only displays what is\nabove\nthe default environment's included capacity. Tool tips indicate how to view actual usage in the\nDetails\nsection.\nNote\nThe following environments don't count against capacity and show as 0 GB:\nMicrosoft Teams\nTrial\nPreview\nSupport\nDeveloper\nThe default environment has the following included storage capacity: 3 GB Dataverse database capacity, 3 GB Dataverse file capacity, and 1 GB Dataverse log capacity.\nYou can select an environment that's showing 0 GB and then go to its environment capacity analytics page to see the actual consumption.\nFor the default environment, the list view shows the amount of capacity consumed beyond the included quota. 
Select the\nDetails\nbutton to see usage.\nThe capacity check, conducted before creating new environments, excludes the default environment's included storage capacity when calculating whether you have sufficient capacity to create a new environment.\nEnvironment storage capacity details\nSelect the\nDetails\nbutton associated with the environment you want to see more information about.\nThe following details are provided:\nActual database usage\nTop database tables and their growth over time\nActual file usage\nTop files tables and their growth over time\nActual log usage\nTop tables and their growth over time\nNote\nRefer to the\nstorage capacity reports\nunder\nDataverse long term retention\nto understand details on storage capacity with the retention feature.\nMicrosoft Teams tab\nOn the Capacity page, select\nMicrosoft Teams\n. This tab shows the capacity storage used by your Microsoft Teams environments. Teams environment capacity usage doesn't count towards your organization's Dataverse usage.\nFeature\nDescription\nDownload\nSelect\nDownload\nabove the list of environments to download an Excel .CSV file with high-level storage information for each environment that the signed-in admin has permission to see in the Power Platform admin center.\nSearch\nUse\nSearch\nto search by the environment name and the environment type.\nAdd-ons tab\nOn the Capacity page, select\nAdd-ons\n. This tab shows your organization's add-on usage details and lets you assign add-ons to environments. For more information, see\nView capacity add-ons in Power Platform admin center\n.\nNote\nThis tab only appears if your tenant includes add-ons.\nTrial tab\nOn the Capacity page, select\nTrial\n. This tab shows the capacity storage used by your trial environments. 
Trial environment capacity usage doesn't count towards your organization's Dataverse usage.\nFeature\nDescription\nDownload\nSelect\nDownload\nabove the list of environments to download an Excel .CSV file with high-level storage information for each environment that the signed-in admin has permission to see in the Power Platform admin center.\nSearch\nUse\nSearch\nto search by the environment name and the environment type.\nDataverse page in Licenses (preview)\nImportant\nThis is a preview feature.\nDon't use preview features in production environments. Preview features might have restricted functionality. They're subject to\nsupplemental terms of use\n. Microsoft makes preview features available before an official release so that customers can get early access and provide feedback.\nThis feature is being gradually rolled out across regions and might not be available in your region yet.\nTrack tenant usage\nYou can track and manage Dataverse capacity in the\nLicenses\nsection of Power Platform admin center.\nSign in to the\nPower Platform admin center\n.\nOn the navigation pane, select\nLicensing\n.\nOn the Licensing pane, select\nDataverse\nunder\nProducts\n.\nUsage per storage type\nIn the\nUsage per storage type\ntile, you can view the consumption of your database, log, and file storage. This section displays your prepaid entitled capacity along with the corresponding usage. Additionally, it indicates if any part of your Dataverse usage is billed under a pay-as-you-go plan.\nTop environment consuming storage\nThe\nTop environment consuming storage\ntile displays the environments using the most capacity. It also indicates whether any of these top-consuming environments are in overage and provides a breakdown of prepaid versus pay-as-you-go usage. 
You can select\nDatabase\n,\nFile\n, or\nLog\nto view the corresponding consumption details.\nDataverse environment usage\nIn the\nTop environments consuming storage\ntile, select\nSee all environments\nto view capacity consumption across all your Dataverse environments. The following details are provided:\nName of the environment\nOverage status if capacity is allocated to the environment\nWhether capacity is preallocated to the environment\nEnvironment type\nManaged Environment status\nPay-as-you-go plan linkage status\nAbility to draw capacity from available tenant pool\nDatabase, file, and log consumption\nTrack environment usage\nOn the\nDataverse\npage, select\nEnvironment\nand choose an environment from the list.\nAlternatively, in the\nTop environment consuming storage\ntile, select\nSee all environments\nand select an environment name.\nUsage per storage type tile\nIn the\nUsage per storage type\ntile, you can view the consumption of your database, log, and file storage. This section displays your prepaid allocated capacity, if any, along with the corresponding usage. Additionally, it indicates if any part of your Dataverse usage is billed under a pay-as-you-go plan.\nConsumption per table\nIn the\nConsumption per table\nsection, you can view the amount of storage consumed by each Dataverse table. To see table consumption for a specific storage type, select\nDatabase\n,\nFile\n, or\nLog\nin the\nUsage per storage type\ntile. Select the table name for the consumption trend, with the option to track daily usage trends, for up to the past three months.\nDataverse search consumption and reporting\nIn addition to database and file storage, Dataverse search includes the indexes that power different experiences. These indexes support search and generative AI across structured or tabular data and unstructured data stored in Dataverse, such as files.\nStorage consumed by Dataverse search is reported at the environment-level as a table called\nDataverseSearch\n. 
It was previously named\nRelevanceSearch\n.\nDataverse search can also be monitored in the Dataverse Environment report in the Power Platform admin center.\nThe Dataverse Environment report is located at\nLicensing\n>\nDataverse\n>\nEnvironments\ntab (consumption per table reporting).\nHow much does the indexed Dataverse search data cost?\nAll Dataverse indexes are reported at the Dataverse database capacity rate. Turning on Dataverse search doesn't turn on any other experience automatically. For more information, see\nWhat is Dataverse search?\nAllocate capacity for an environment\nWhen you select the\nDataverse\ntab, you can allocate capacity to a specific environment. After you allocate capacity, you can view the status of your environments to see whether they're within capacity or in an overage state.\nSign in to the\nPower Platform admin center\n.\nOn the navigation pane, select\nLicensing\n.\nOn the\nLicensing\npane, select\nDataverse\nin the\nProducts\nsection.\nOn the\nSummary\npage, select\nManage capacity\n.\nSelect the environment for which you want to allocate capacity.\nIn the\nManage capacity\npanel, view the currently allocated and consumed capacity for the environment.\nAllocate capacity by entering the desired value in the\nDatabase\n,\nFile\n, and\nLog\nfields. 
Make sure the capacity values are positive integers and don't exceed the available capacity displayed at the top of the panel.\nOpt in to receive daily email alerts sent to tenant and environment admins when the consumed capacity (database, log, or file) reaches a set percentage of the allocated capacity.\nSelect\nSave\nto apply the changes.\nManage capacity overage\nWhen an environment's capacity consumption exceeds the preallocated capacity, you have two options to manage the overage:\nIn the\nManage capacity\npane, use capacity available from the tenant's overall capacity pool.\nIn the\nManage capacity\npane, link the environment to a pay-as-you-go billing plan, where any overage is charged to the associated Azure subscription.\nChanges for exceeding storage capacity entitlements\nMicrosoft is making changes for what happens when an organization's storage capacity is close to or exceeds the capacity entitled or purchased through add-ons.\nNotifications for capacity approaching storage limits are triggered when any of the three storage capacities (database, file, or log) have less than 15% of space available. Another warning notification that admin operations could be impacted is sent when any of the three storage capacities have less than 5% of space available. The final tier of notification triggers when the tenant is \"in overage\" (storage usage exceeds capacity entitlements), which alerts the admin that the following operations aren't available until the overage is resolved:\nCreate a new environment (requires minimum 1-GB capacity available)\nCopy an environment\nRestore an environment\nConvert a trial environment to paid (requires minimum 1-GB capacity available)\nRecover an environment (requires minimum 1-GB capacity available)\nAdd Dataverse database to an environment\nNote\nThe storage driven capacity model calculation of these thresholds also considers the overflow usage allowed in the storage driven model. 
For example, extra database capacity can be used to cover log and file overuse and extra log capacity can be used to cover file overuse. Therefore, overflow usage is taken into consideration to reduce the number of emails a tenant admin receives.\nTenant admins, Power Platform admins, and Dynamics 365 admins receive these notifications on a weekly basis. At this time, there's no option for a customer to opt out of these notifications or delegate these notifications to someone else. All admin types listed earlier automatically receive these notifications.\nAdditionally, there's a notification banner in the Power Platform admin center when a tenant exceeds storage capacity.\nThe\nUniversal License Terms for Online Services\napply to your organization's use of the online service, including consumption that exceeds the online service's documented entitlements or usage limits.\nYour organization must have the right licenses for the storage you use:\nIf you use more than your documented entitlements or usage limits, you must buy more licenses.\nIf your storage consumption exceeds the documented entitlements or usage limits, Microsoft might suspend use of the online service. Microsoft provides reasonable notice before suspending your online service.\nExample storage capacity scenarios and overage enforcement\nStay within the limits for your entitled capacity for database, log, and file storage. If you use more capacity than you're entitled to, buy more capacity or free up some space. However, if you overuse database, log, or file capacity, review the following scenarios to understand when enforcement applies.\nScenario 1: Database storage is over capacity, overage enforcement\nType\nEntitled\nConsumed\nDatabase\n100 GB\n110 GB\nLog\n10 GB\n5 GB\nFile\n400 GB\n200 GB\nThis tenant uses 10 GB more than the database capacity. Even though the tenant has 200 GB of extra file storage, the tenant is in deficit. 
This tenant should free up storage or purchase more capacity.\nScenario 2: Log storage is over capacity, overage enforcement\nType\nEntitled\nConsumed\nDatabase\n100 GB\n95 GB\nLog\n10 GB\n20 GB\nFile\n400 GB\n200 GB\nThis tenant uses 10 GB more than the log capacity and has only 5 GB available in database capacity. Therefore, the tenant is in deficit and should free up storage or purchase more capacity.\nScenario 3: File storage is over capacity, overage enforcement\nType\nEntitled\nConsumed\nDatabase\n100 GB\n20 GB\nLog\n10 GB\n5 GB\nFile\n200 GB\n290 GB\nThis tenant is 90 GB over in file usage. Despite having 85 GB available (80-GB database + 5-GB log) in storage capacity, the tenant is considered to be in deficit. This tenant should free up storage or purchase more capacity.\nExample storage capacity scenario, no overage\nScenario 4: Log storage is over capacity\nType\nEntitled\nConsumed\nDatabase\n100 GB\n80 GB\nLog\n10 GB\n20 GB\nFile\n400 GB\n200 GB\nThis tenant is 10 GB over in log usage but has 20 GB available in database capacity. Therefore, the tenant isn't in deficit. File storage excess entitlement can't be used to compensate deficits in log or database storage.\nActions to take for a storage capacity deficit\nYou can always\nfree up storage\n,\ndelete unwanted environments\n, or buy more capacity to be compliant with storage usage. To learn more about capacity add-ons, go to the\nDynamics 365 Licensing Guide\nor the\n\"Add-ons\" section of the Power Apps and Power Automate Licensing Guide\n.\nYou can work through your organization's standard procurement process to purchase\ncapacity add-ons\n.\nFrequently asked questions (FAQ)\nWhy does my storage consumption decrease in the database and grow in the file storage?\nMicrosoft constantly optimizes Dataverse for ease of use, performance, and efficiency. Part of this ongoing effort is moving data to the best possible storage with the lowest cost for customers. 
File-type data such as \"Annotation\" and \"Attachment\" is moving from database to file storage. This change leads to decreased usage of database capacity and an increase in file capacity.\nWhy could my database table size decrease while my table and file data sizes remain the same?\nAs part of moving file-type data such as \"Annotation\" and \"Attachment\" out from database and into file storage, Microsoft periodically reclaims the freed database space. This change leads to decreased usage of database capacity, while the table and file data size computations remain unchanged.\nDo indexes affect database storage usage?\nDatabase storage includes both the database rows and index files used to improve search performance. Indexes are created and optimized for peak performance. The system frequently updates them by analyzing data use patterns. No user action is needed to optimize the indexes, as all Dataverse stores have tuning enabled by default. A fluctuation in database storage can be represented by an increased or decreased number of indexes. Dataverse is continually being tuned to increase efficiency and incorporate new technologies that improve user experience and optimize storage capacity. Common causes for an increase in index size are:\nAn organization makes use of new functionality. This functionality can be custom, out-of-the-box, or part of an update or solution installation.\nData volume or complexity changes.\nA change in usage patterns that indicate new indexes need reevaluation.\nIf you configure Quick Find lookups for data that's frequently used, this configuration also creates more indexes in the database. Admin-configured Quick Find values can increase the size of the indexes based on:\nThe number of columns chosen and the data type of those columns.\nThe volume of rows for the tables and columns.\nThe complexity of the database structure.\nBecause an admin creates custom Quick Find lookups in the org, these indexes can be user-controlled. 
Admins can reduce some of the storage used by these custom indexes by taking the following action:\nRemove unneeded columns or tables.\nEliminate multiline text columns from inclusion.\nNote\nThe Dataverse search indexed data is the data that improves the search quality for the global search and generative AI experiences, as well as interpreting the content by using natural language. This index data accrues to the overall Dataverse search consumption.\nWhat is the DataverseSearch table and how can I reduce it?\nThe\nDataverseSearch\ntable (previously known as\nRelevanceSearch\n) stores indexed data for the global search and generative AI experiences. It includes data from all searchable, retrievable, and filterable fields of the tables you indexed for your environment and Copilot semantic indexes.\nFor more information, see\nManaging Dataverse search\n.\nCan I manage Dataverse search?\nAn admin can manage Dataverse search through the three states associated with this setting: On, Default, and Off. Learn more in\nConfigure Dataverse search for your environment\n.\nNote\nDataverse search is set to\nOn\nfor any new production, sandbox, or default environment type. It's set to\nDefault\nfor any new other type of environment.\nIf you turn on Dataverse search as\nOn\nor\nDefault\n, no other setting is turned on.\nWhat actions can makers take?\nDepending on the experience that leverages Dataverse search and its usage, the consumption size might increase. Learn more in\nWhat is Dataverse search?\n.\nImportant\nDon't turn off Dataverse search. Turning off Dataverse search directly impacts all dependent generative AI experiences in your different applications and all users using them.\nTurning off Dataverse search\nWhen you turn off Dataverse search, the system deletes its indexed Dataverse data. 
All experiences that depend on this data, including search and generative AI conversational capabilities, become limited or unusable for all users.\nEnvironment admins have 12 hours to turn the feature back on without losing indexed data.\nDuring 12 hours:\nYou can turn Dataverse search back on without losing indexed data.\nAfter 12 hours:\nThe system permanently deletes all indexed Dataverse data.\nTurning Dataverse search back on re-triggers the indexing of Dataverse data.\nImportant\nTurning off Dataverse search deprovisions and removes the index within a period of 12 hours. If you turn on Dataverse search after it's been off for 12 hours, it provisions a fresh index that needs to go through a full sync. Syncing might take up to an hour or more for average size organizations, and a couple of days for large organizations. Be sure to consider these implications when you turn off Dataverse search temporarily.\nIndex removal (or provisioning) can take multiple days to complete, depending on the amount of Dataverse search consumption. For example, an organization with 10 GB of indexed data might take one day to clean up all indexes, while an organization with 500 GB might take multiple days to see it reflected in Dataverse search reporting. Please wait a few days or a week before submitting a support ticket, to ensure a complete removal of Dataverse search indexed data.\nWhat happens if I turn off Dataverse search?\nAll experiences that use Dataverse search become limited. For more information, see\nFrequently asked questions about Dataverse search\n.\nTurning on Dataverse search again\nSelecting On\n:\nWhen you set Dataverse search to\nOn\nafter setting it to\nOff\n, the system immediately re-triggers all indexes across all enabled experiences for them to work accordingly, and Dataverse search costs resume.\nSelecting Default\n:\nWhen you set Dataverse search to\nDefault\nafter setting it to\nOff\n, the system only regenerates the indexes when triggered. 
Examples include when a Copilot Studio agent uses a file—such as a local file, OneDrive file, SharePoint file upload, or Dataverse table—or if a prompt is submitted to an agent or Copilot. When the indexes are triggered, Dataverse search costs resume.\nNote\nYou can't turn Dataverse search\nOn\nor\nOff\nfor different applications in the same environment. The status of the setting applies to all applications in the environment that use Dataverse search.\nI just bought the new capacity-based licenses. How do I provision an environment by using this model?\nYou can provision environments through the Power Platform admin center. Learn more in\nCreate and manage environments in the Power Platform admin center\n.\nI'm a new customer and I recently purchased the new offers. My usage of database, log, or file is showing red. What should I do?\nConsider buying more capacity by using the\nLicensing Guide\n. Alternatively, you can\nfree up storage\n.\nI'm an existing customer, and my renewal is coming up. Will I be affected?\nCustomers who renew existing subscriptions can choose to continue to transact by using the existing offers for a certain period of time. Contact your Microsoft partner or Microsoft sales team for details.\nI'm a Power Apps or Power Automate customer and have environments with and without database. Do they consume storage capacity?\nYes. All environments consume 1 GB, regardless of whether they have an associated database.\nDo I get notified through email when my organization is over capacity?\nYes, tenant admins receive email notifications on a weekly basis if their organization is at or over capacity. Additionally, tenant admins get notified when their organization reaches 15% of available capacity, and when their organization reaches 5% of available capacity.\nWhy am I no longer getting storage notifications?\nCapacity email notifications are sent weekly to tenant admins based on three different thresholds. 
If you're no longer getting storage notifications, check your admin role. Your organization might also already be past all three predefined capacity thresholds; in that case, you don't receive an email notification.\nI'm an existing customer. Should I expect my file and log usage to change?\nLog and file data usage isn't expected to be exactly the same size as when the same data is stored by using the database, due to different storage and indexing technologies. The current set of out-of-the-box tables stored in file and log storage might change in the future.\nThe capacity report shows the entitlement breakdown per license, but I have more licenses in my tenant and not all of them are listed in the breakdown. Why?\nNot all licenses give per-user entitlement. For example, the Team Member license doesn't give any per-user database, file, or log entitlement. So in this case, the license isn't listed in the breakdown.\nWhich environments are counted in the capacity report?\nDefault, production, and sandbox environments count for consumption. Trial, preview, support, and developer environments don't count.\nWhat are tables ending in \"- analytics\" in my capacity report?\nTables ending in \"– Analytics\" are tables used by one or more Insights applications, for example Sales Insights, Customer Service Hub, or the Field Service resource scheduling and optimization analytics dashboard, to generate predictive insights or analytics dashboards. The data is synced from Dataverse tables. Go to the section\nMore information\nfor documentation covering the installed Insights applications and the tables used to create insights and dashboards.\nWhy can't I see the Summary tab in my capacity report?\nIn April 2023, Microsoft changed the roles that can see the\nSummary\ntab in the capacity report. Now, only users with the tenant admin, Power Platform admin, or Dynamics 365 admin roles can see the\nSummary\ntab. 
Users with other roles, such as environment admins, no longer see this tab and are redirected to the\nDataverse\ntab when accessing the report. If you need access to the\nSummary\ntab, ask your admin to assign one of the required roles.\nMore information:\nSales Insights\nField Service and resource scheduling optimization (RSO)\nCustomer Service Insights\nField Service\nWho can allocate capacity?\nUsers with global admin, Power Platform admin, and Dynamics 365 admin roles can allocate Dataverse capacity.\nDoes this change affect the total available capacity in my tenant?\nThis change doesn't affect the overall capacity available at the tenant level. Admins can choose to preallocate capacity from the tenant pool to an environment. When they preallocate capacity, it reduces the tenant level's total available capacity for use by other environments.\nWhat happens if capacity consumption goes beyond the allocated capacity?\nCurrently, only\nsoft enforcement\nthrough email notification is turned on. 
Admins (Power Platform admins and environment admins) start receiving notifications when capacity usage exceeds 85% of the allocated capacity.\nWhat types of Dataverse capacity can be allocated?\nYou can allocate database, file, and log capacity.\nDo I need to allocate capacity to every environment like other supported currencies?\nNo, admins can select specific environments to allocate capacity.\nRelated information\nAdd Microsoft Dataverse storage capacity\nCapacity add-ons\nAutomatic tuning in Azure SQL Database\nWhat's new in storage\nFree up storage space\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Capacity Storage", @@ -296,7 +296,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/powerapps-powershell": { "content_hash": "sha256:bd59f66c3d66a656f8a9dd689eab67aaa4c7abcb22cdf5c362276571ec70be37", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nPowerShell support for Power Apps and Power Automate\nFeedback\nSummarize this article for me\nWith\nPowerShell\ncmdlets for Power Platform creators and administrators, you can automate many monitoring and management tasks. 
These tasks are only possible\nmanually\ntoday in\nPower Apps\n,\nPower Automate\n, or the\nPower Platform admin center\n.\nCmdlets\nCmdlets\nare functions written in the\nPowerShell\nscript language that execute commands in PowerShell. Running these Power Apps cmdlets allows you to interact with your Business Application Platform without having to go through the admin portal in a web browser.\nYou can combine cmdlets with other PowerShell functions to write complex scripts that can optimize your workflow. You can still use the cmdlets if you're not an admin on the tenant, but you're limited to the resources you own. Administrative user accounts use cmdlets that start with\nAdmin\n.\nCmdlets are available on the PowerShell Gallery as two separate modules:\nAdministrator\nMaker\nFor information on the Power Apps admin module, see\nGet started using the Power Apps admin module\nand\nMicrosoft.PowerApps.Administration.PowerShell\n.\nGet started with PowerShell\nIf you're new to PowerShell and need help with finding and launching it, go to\nGetting Started with PowerShell\n. If you need help with using PowerShell or the cmdlets, go to\nThe PowerShell Help System\n.\nPrerequisites for PowerShell\nPowerShell in this article requires\nWindows PowerShell\nversion 5.x. To check the version of PowerShell running on your machine, run the following command:\n$PSVersionTable.PSVersion\nIf you have an outdated version, go to\nUpgrading existing Windows PowerShell\n.\nImportant\nThe modules described in this document use .NET Framework, which is incompatible with PowerShell 6.0 and later. 
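The version check above can be folded into a short guard at the top of a script. The following is a sketch, not from the article: it assumes only the built-in $PSVersionTable automatic variable and the Install-Module commands shown later in this article.

```powershell
# The Power Apps modules target .NET Framework, so Windows PowerShell 5.x is required;
# PowerShell 6.0 and later are built on .NET Core and can't load these modules.
if ($PSVersionTable.PSVersion.Major -ne 5) {
    throw "Windows PowerShell 5.x is required; detected $($PSVersionTable.PSVersion)."
}

# Once the version check passes, install the modules (per-user scope avoids
# needing local admin rights on the machine).
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell -Scope CurrentUser
Install-Module -Name Microsoft.PowerApps.PowerShell -AllowClobber -Scope CurrentUser
```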
These later versions use .NET Core.\nModule installation and sign in\nTo run PowerShell cmdlets for app creators:\nRun PowerShell as an administrator.\nImport the necessary modules.\nInstall-Module -Name Microsoft.PowerApps.Administration.PowerShell\nInstall-Module -Name Microsoft.PowerApps.PowerShell -AllowClobber\nAlternatively, if you don't have admin rights on your computer, use the\n-Scope CurrentUser\nparameter for installation.\nInstall-Module -Name Microsoft.PowerApps.Administration.PowerShell -Scope CurrentUser\nInstall-Module -Name Microsoft.PowerApps.PowerShell -AllowClobber -Scope CurrentUser\nIf you're prompted to accept the change to the\nInstallationPolicy\nvalue of the repository, accept\n[A] Yes\nto all modules by typing\nA\n, then press\nEnter\nfor each module.\nOptionally, before accessing the commands, you can provide your credentials. Credentials are refreshed for up to eight hours before you're required to sign in again. If credentials aren't provided before a command is executed, then a prompt for credentials appears.\n# Opens a prompt to collect credentials (Microsoft Entra account and password).\nAdd-PowerAppsAccount\n# Here is how you can pass in credentials (to avoid opening a prompt).\n$pass = ConvertTo-SecureString \"password\" -AsPlainText -Force\nAdd-PowerAppsAccount -Username user@contoso.com -Password $pass\nOptionally, a specific endpoint can be targeted. The default endpoint is\nprod\n. If a user wants to run a PowerShell script targeting an environment in a nonproduction region, such as GCC, the\n-Endpoint\nparameter can be changed to\nusgov\nfor GCC Moderate, or\nusgovhigh\nfor GCC High, or\ndod\nfor GCC DOD. 
The full list of endpoints supported is: \"prod,preview,tip1,tip2,usgov,usgovhigh,dod,china\".\n# An environment in another region, such as GCC, can be targeted using the -Endpoint parameter.\nAdd-PowerAppsAccount -Endpoint \"usgov\"\nModule updates\nYou can check the version of all your PowerShell modules using\nGet-Module\n.\nGet-Module\nAnd you can update all your PowerShell modules to the latest using\nUpdate-Module\n.\nUpdate-Module\nAlternately, check the Power Platform modules version, using\nGet-Module\nand the\n-Name\nparameter.\nGet-Module -Name \"Microsoft.PowerApps.Administration.PowerShell\"\nGet-Module -Name \"Microsoft.PowerApps.PowerShell\"\nUpdate the Power Platform PowerShell modules, using\nUpdate-Module\nand the\n-Name\nparameter.\nUpdate-Module -Name \"Microsoft.PowerApps.Administration.PowerShell\"\nUpdate-Module -Name \"Microsoft.PowerApps.PowerShell\"\nPower Apps cmdlets for app creators\nPrerequisites for Power Apps cmdlets\nUsers with a valid Power Apps license can perform the operations in these cmdlets. However, they only have access to resources, like apps and flows, that are created or shared with them.\nCmdlet list - Maker Cmdlets\nNote\nWe updated some of the cmdlets function names in the latest release in order to add appropriate prefixes to prevent collisions. 
For an overview of what changed, refer the following table.\nPurpose\nCmdlet\nAdd a canvas app to a Microsoft Dataverse solution\nSet-PowerAppAsSolutionAware\nRead and update environments\nGet-AdminPowerAppEnvironment\n(previously Get-PowerAppsEnvironment)\nGet-FlowEnvironment\nRestore-PowerAppEnvironment\n(previously Restore-AppVersion)\nRead, update, and delete a canvas app\nGet-AdminPowerApp\n(previously Get-App)\nRemove-AdminPowerApp\n(previously Remove-App)\nPublish-AdminPowerApp\n(previously Publish-App)\nRead, update, and delete canvas app permissions\nGet-AdminPowerAppRoleAssignment\n(previously Get-AppRoleAssignment)\nRemove-AdminPowerAppRoleAssignment\n(previously Remove-AppRoleAssignment)\nRead, update, and delete a flow\nGet-AdminFlow\nEnable-AdminFlow\nDisable-AdminFlow\nRemove-AdminFlow\nRead, update, and delete flow permissions\nGet-AdminFlowOwnerRole\nSet-AdminFlowOwnerRole\nRemove-AdminFlowOwnerRole\nRead and respond to flow approvals\nGet-AdminFlowApprovalRequest\nRemove-AdminFlowApprovals\nRead and delete connections\nGet-AdminPowerAppConnection\n(previously Get-Connection)\nRemove-AdminPowerAppConnection\n(previously Remove-Connection)\nRead, update, and delete connection permissions\nGet-AdminPowerAppConnectionRoleAssignment\n(previously Get-ConnectionRoleAssignment)\nSet-AdminPowerAppConnectionRoleAssignment\n(previously Set-ConnectionRoleAssignment)\nRemove-AdminPowerAppConnectionRoleAssignment\n(previously Remove-ConnectionRoleAssignment)\nRead, and delete connectors\nGet-AdminPowerAppConnector\n(previously Get-Connector)\nRemove-AdminPowerAppConnector\n(previously Remove-Connector)\nAdd, read, update, and delete custom connector permissions\nGet-AdminPowerAppConnectorRoleAssignment\n(previously Get-ConnectorRoleAssignment)\nGet-PowerAppConnectorRoleAssignment\n(previously Set-ConnectorRoleAssignment)\nRemove-PowerAppConnectorRoleAssignment\n(previously Remove-ConnectorRoleAssignment)\nRead, add, and remove policy URL 
patterns\nGet-PowerAppPolicyUrlPatterns\nNew-PowerAppPolicyUrlPatterns\nRemove-PowerAppPolicyUrlPatterns\nRead, register, and remove management apps\nGet-PowerAppManagementApp\nGet-PowerAppManagementApps\nNew-PowerAppManagementApp\nRemove-PowerAppManagementApp\nRead, create, update, and import protection keys\nGet-PowerAppRetrieveAvailableTenantProtectionKeys\nGet-PowerAppGenerateProtectionKey\nGet-PowerAppRetrieveTenantProtectionKey\nNew-PowerAppImportProtectionKey\nSet-PowerAppTenantProtectionKey\nPower Apps cmdlets for administrators\nFor more information on Power Apps and Power Automate cmdlets for admins, see\nGet started with PowerShell for Power Platform Administrators\n.\nTips\nUse\nGet-Help\nfollowed by a\nCmdletName\nto get a list of examples.\nAfter you type dash\n-\n, you can press\nTab\nto cycle through the input tags. Place this flag after the cmdlet name.\nExample commands:\nGet-Help Get-AdminPowerAppEnvironment\nGet-Help Get-AdminPowerAppEnvironment -Examples\nGet-Help Get-AdminPowerAppEnvironment -Detailed\nOperation examples\nFollowing are some common scenarios that show how to use new and existing Power Apps cmdlets.\nEnvironments Commands\nPower Apps Commands\nPower Automate commands\nAPI connection commands\nData policy commands\nData resource exemption cmdlets\nBlock trial licenses commands\nEnvironments commands\nUse these commands to get details on and update environments in your tenant.\nDisplay a list of all environments\nGet-AdminPowerAppEnvironment\nReturns a list of each environment across your tenant, with details of each (for example, environment name (guid), display name, location, creator, and more).\nDisplay details of your default environment\nGet-AdminPowerAppEnvironment –Default\nReturns the details for only the default environment of the tenant.\nDisplay details of a specific environment\nGet-AdminPowerAppEnvironment –EnvironmentName 'EnvironmentName'\nNote\nThe\nEnvironmentName\nfield is a unique identifier, which is different 
from the\nDisplayName\n(see first and second fields in the output in the following image).\nPower Apps commands\nThese operations are used to read and modify Power Apps data in your tenant.\nDisplay a list of all Power Apps\nGet-AdminPowerApp\nReturns a list of all Power Apps across the tenant, with details of each (for example, application name (guid), display name, creator, and more).\nDisplay a list of all Power Apps that match the input display name\nGet-AdminPowerApp 'DisplayName'\nThis command lists all Power Apps in your tenant that match the display name.\nNote\nUse quotations around input values that contain spaces. For example, use \"My App Name\".\nFeature an application\nSet-AdminPowerAppAsFeatured –AppName 'AppName'\nFeatured applications are grouped and pushed to the top of the list in the Power Apps mobile player.\nNote\nLike environments, the\nAppName\nfield is a unique identifier, which is different from the\nDisplayName\n. If you want to perform operations based on the display name, some functions will let you use the pipeline (see next function).\nMake an application a Hero app, using the pipeline\nGet-AdminPowerApp 'DisplayName' | Set-AdminPowerAppAsHero\nA Hero app appears at the top of the list in the Power Apps mobile player. There can only be one Hero app.\nThe pipe\n|\ncharacter between two cmdlets takes the output of the first cmdlet and passes it as the input value of the second, if the function is written to accommodate the pipe.\nNote\nAn app must already be a featured app before it's changed to a Hero.\nDisplay the number of apps each user owns\nGet-AdminPowerApp | Select –ExpandProperty Owner | Select –ExpandProperty displayname | Group\nYou can combine native PowerShell functions with the Power Apps cmdlets to manipulate data even further. Here we use the Select function to isolate the Owner attribute (an object) from the Get-AdminApp object. 
We then isolate the name of the owner object by pipelining that output into another Select function. Finally, passing the second Select function output into the Group function returns a nice table that includes a count of each owner's number of apps.\nDisplay the number of apps in each environment\nGet-AdminPowerApp | Select -ExpandProperty EnvironmentName | Group | %{ New-Object -TypeName PSObject -Property @{ DisplayName = (Get-AdminPowerAppEnvironment -EnvironmentName $_.Name | Select -ExpandProperty displayName); Count = $_.Count } }\nDownload Power Apps user details\nGet-AdminPowerAppsUserDetails -OutputFilePath '.\\adminUserDetails.txt' –UserPrincipalName 'admin@bappartners.onmicrosoft.com'\nThe previous command stores the Power Apps user details (basic usage information about the input user via their user principal name) in the specified text file. It creates a new file if there's no existing file with that name, and overwrites the text file if it already exists.\nExport a list of assigned user licenses\nGet-AdminPowerAppLicenses -OutputFilePath ''\nExports all the assigned user licenses (Power Apps and Power Automate) in your tenant into a tabular view .csv file. The exported file contains both self-service, sign-up, internal trial plans and plans sourced from Microsoft Entra ID. The internal trial plans aren't visible to admins in the Microsoft 365 admin center.\nThe export can take a while for tenants with a large number of Microsoft Power Platform users.\nNote\nOutput of the Get-AdminPowerAppLicenses cmdlet only includes licenses for users who accessed Power Platform services (for example, Power Apps, Power Automate, or Power Platform admin center). Users who had licenses assigned in Microsoft Entra ID (typically via the Microsoft 365 admin center) but never accessed Power Platform services don't have their licenses included in the generated .csv output. 
Furthermore, since the Power Platform licensing services caches the licenses, updates made to license assignments in Microsoft Entra ID can take up to seven days to reflect in the output for users who didn't access the service recently.\nSet logged in user as the owner of a canvas app\nSet-AdminPowerAppOwner –AppName 'AppName' -AppOwner $Global:currentSession.userId –EnvironmentName 'EnvironmentName'\nChanges the owner role of a Power App to the current user, and replaces the original owner as a \"can view\" role type.\nNote\nThe AppName and EnvironmentName fields are the unique identifiers (guids), not the display names.\nDisplay a list of deleted canvas apps in an environment\nGet-AdminDeletedPowerAppsList -EnvironmentName 'EnvironmentName'\nThis command displays all canvas apps recently deleted, as they might still be recovered. The restorable duration is 28 days. Any app deleted after 28 days isn't returned in this list and can't be recovered.\nRecover a deleted canvas app\nGet-AdminRecoverDeletedPowerApp -AppName 'AppName' -EnvironmentName 'EnvironmentName'\nThis command recovers a canvas app discoverable through the\nGet-AdminDeletedPowerAppsList\ncmdlet. Any canvas app that isn't displayed in the\nGet-AdminDeletedPowerAppsList\nisn't recoverable.\nDesignate SharePoint custom form environment\nThe following cmdlets can be used to specify and verify which environment SharePoint custom forms are saved to, instead of the default environment. When the designated environment for SharePoint custom forms changes, this environment is where newly created custom forms are saved. Existing custom forms don't automatically migrate to different environments as these cmdlets are used. The ability for a user to create a custom form in a designated environment requires that user to have the Environment Maker role. Users can be granted the Environment Maker role in the\nPower Platform admin center\n.\nAny environment that isn’t the default environment can be deleted. 
If the designated SharePoint custom form environment is deleted, the custom forms are deleted with it.\nGet-AdminPowerAppSharepointFormEnvironment \nThis command returns the\nEnvironmentName\nfor the environment currently designated for newly created SharePoint custom forms. If an environment has never been designated, the default environment is returned.\nSet-AdminPowerAppSharepointFormEnvironment –EnvironmentName 'EnvironmentName'\nThis command designates the environment that newly created SharePoint custom forms are saved to, instead of the default environment. Existing custom forms don't automatically migrate to the newly designated environment. Only a production environment can be designated for SharePoint custom forms.\nReset-AdminPowerAppSharepointFormEnvironment\nThis command resets the designated environment for SharePoint custom forms back to the default environment.\nDisplay tenant setting for ability to share apps with\nEveryone\n$settings = Get-TenantSettings \n$settings.PowerPlatform.PowerApps.disableShareWithEveryone\nThis setting controls whether users with the Environment Maker security role can share canvas apps with\nEveryone in an organization\n. When the setting is set to\ntrue\n, only users with an admin role (Dynamics 365 admin, Power Platform Service admin, Microsoft Entra tenant admin) can share apps with\nEveryone in an organization\n.\nRegardless of this tenant setting's value, makers with the sharing privilege can share apps with security groups of any size. 
This control only determines whether the\nEveryone\nshorthand can be used when sharing.\nChange tenant setting for ability to share apps with\nEveryone\n$settings = Get-TenantSettings \n$settings.powerPlatform.powerApps.disableShareWithEveryone = $True \nSet-TenantSettings -RequestBody $settings\nSurface your organization’s governance error message content\nIf you specify governance error message content to appear in error messages, the content in the error message is displayed when makers observe they don’t have permission to share apps with\nEveryone\n. See\nPowerShell governance error message content commands\n.\nAssociate in context flows to an app\nAssociate flows in context of an app to the app to create a dependency between the app and flows. To learn more about context flows, see\nWhat Power Automate capabilities are included in Power Apps licenses?\nAdd-AdminFlowPowerAppContext -EnvironmentName -FlowName -AppName [-ApiVersion ] []\nEnvironmentName and FlowName can be found in the flow url:\nFor a Non-Solution flow, the URL looks like this:\nhttps://preview.flow.microsoft.com/manage/environments/839eace6-59ab-4243-97ec-a5b8fcc104e7/flows/6df8ec2d-3a2b-49ef-8e91-942b8be3202t/details\nThe GUID after\nenvironments/\nis the EnvironmentName and the GUID after\nflows/\nis the FlowName\nFor Solution flow, the URL looks like this:\nhttps://us.flow.microsoft.com/manage/environments/66495a1d-e34e-e330-9baf-0be559e6900b/solutions/fd140aaf-4df4-11dd-bd17-0019b9312238/flows/53d829c4-a5db-4f9f-8ed8-4fb49da69ee1/details\nThe GUID after\nenvironments/\nis the EnvironmentName and the GUID after\nflows/\nis the FlowName\nThe AppName for a canvas app can be found on the canvas app details page.\nThe AppName for a model driven app can be found in solution explorer.\nTo see the examples, type:\nget-help Add-AdminFlowPowerAppContext -examples\n.\nTo get more information, type:\nget-help Add-AdminFlowPowerAppContext -detailed\n.\nTo get technical information, type:\nget-help 
Add-AdminFlowPowerAppContext -full\n.\nRemove in context flows of an app\nRemove the dependency between flows and an app with this PowerShell command. The Remove-AdminFlowPowerAppContext cmdlet removes the app context from the specified flow.\nRemove-AdminFlowPowerAppContext -EnvironmentName -FlowName -AppName [-ApiVersion ] []\n\n - To see the examples, type: \"get-help Remove-AdminFlowPowerAppContext -examples\".\n - For more information, type: \"get-help Remove-AdminFlowPowerAppContext -detailed\".\n - For technical information, type: \"get-help Remove-AdminFlowPowerAppContext -full\".\nPower Automate commands\nUse these commands to perform administration tasks related to Power Automate.\nFor a full list of Power Automate and Power Apps cmdlets for admins, see\nGet started with PowerShell for Power Platform Administrators\n.\nDisplay all flows\nGet-AdminFlow\nReturns a list of all flows in the tenant.\nDisplay flow owner role details\nGet-AdminFlowOwnerRole –EnvironmentName 'EnvironmentName' –FlowName 'FlowName'\nReturns the owner details of the specified flow.\nNote\nLike\nEnvironments\nand\nPowerApps\n,\nFlowName\nis the unique identifier (guid), which is different from the display name of the flow.\nDisplay flow user details\nGet-AdminFlowUserDetails –UserId $Global:currentSession.userId\nReturns the user details regarding flow usage. In this example, we're using the user ID of the currently logged-in user of the PowerShell session as input.\nRemove flow user details\nRemove-AdminFlowUserDetails –UserId 'UserId'\nDeletes the details on a flow user completely from the Microsoft database. All flows the input user owns must be deleted before the flow user details can be purged.\nNote\nThe UserId field is the Object ID of the user's Microsoft Entra record, which can be found in the\nAzure portal\nunder\nMicrosoft Entra ID\n>\nUsers\n>\nProfile\n>\nObject ID\n. 
You must be an admin to access this data from here.\nExport all flows to a CSV file\nGet-AdminFlow | Export-Csv -Path '.\\FlowExport.csv'\nExports all the flows in your tenant into a tabular view .csv file.\nAdd flows into Dataverse solutions\nAdd-AdminFlowsToSolution -EnvironmentName \nMigrates all the nonsolution flows in the environment.\nParameter variations can be used to migrate only specific flows, add into a specific solution, or migrate only a set number of flows at a time.\nFor technical details, see\nAdd-AdminFlowsToSolution\n.\nList HTTP Action flows\nGet-AdminFlowWithHttpAction -EnvironmentName \nLists flows with HTTP actions.\nDisplayName\nFlowName\nEnvironmentName\nGet Invoice HTTP\nflow-1\nenvironment-1\nPay Invoice from App\nflow-2\nenvironment-2\nReconcile Account\nflow-3\nenvironment-3\nAPI connection commands\nView and manage API connections in your tenant.\nDisplay all native Connections in your default environment\nGet-AdminPowerAppEnvironment -Default | Get-AdminPowerAppConnection\nDisplays a list of all API connections you have in the default environment. Native connections are found under the\nDataverse\n>\nConnections\ntab in\nPower Apps\n.\nDisplay all custom connectors in the tenant\nGet-AdminPowerAppConnector\nReturns a list of all custom connector details in the tenant.\nNote\nGet-AdminPowerAppConnector\ndoesn't list custom connectors that are in a solution. This is a known limitation.\nData policy commands\nThese cmdlets control the data policies on your tenant.\nCreate a data policy\nNew-DlpPolicy\nCreates a new data policy for the signed-in admin's tenant.\nRetrieve a list of data policy objects\nGet-DlpPolicy\nGets policy objects for the signed-in admin's tenant.\nNote\nWhen you view a data policy using PowerShell, the display name of connectors are from when the data policy was created or when the connectors were last moved within the policy. 
New changes to the display names of connectors aren't reflected.\nWhen you view a data policy using PowerShell, new connectors in the default group that weren't moved aren't returned.\nFor both of these known issues, a workaround is to move the affected connector to another group within the policy and then move it back to the correct group. After doing this, each of the connectors is visible with their correct name.\nUpdate a data policy\nSet-DlpPolicy\nUpdates details of the policy, such as the policy display name.\nRemove a policy\nRemove-DlpPolicy\nDeletes a data policy.\nData resource exemption cmdlets\nThese cmdlets allow you to exempt or unexempt a specific resource from a policy.\nRetrieve existing exempt resource list for a data policy\nGet-PowerAppDlpPolicyExemptResources -TenantId -PolicyName\nCreate a new exempt resource list for a data policy\nNew-PowerAppDlpPolicyExemptResources -TenantId -PolicyName -NewDlpPolicyExemptResources\nUpdate the exempt resource list for a data policy\nSet-PowerAppDlpPolicyExemptResources -TenantId -PolicyName -UpdatedExemptResources\nRemove the exempt resource list for a data policy\nRemove-PowerAppDlpPolicyExemptResources -TenantId -PolicyName\nTo exempt a resource from a data policy, you need the following information:\nTenant ID (GUID)\nData policy ID (GUID)\nResource ID (ends with a GUID)\nResource type\nYou can retrieve the resource ID and type using the PowerShell cmdlets Get-PowerApp for apps and Get-Flow for flows.\nExample exemption script\nTo exempt the flow with ID\nf239652e-dd38-4826-a1de-90a2aea584d9\nand the app with ID\n06002625-7154-4417-996e-21d7a60ad624\n, run the following cmdlets:\n$flow = Get-Flow -FlowName f239652e-dd38-4826-a1de-90a2aea584d9\n$app = Get-PowerApp -AppName 06002625-7154-4417-996e-21d7a60ad624\n$exemptFlow = [pscustomobject]@{\n    id = $flow.Internal.id\n    type = $flow.Internal.type\n}\n$exemptApp = [pscustomobject]@{\n    id = $app.Internal.id\n    type = $app.Internal.type\n}\n$exemptResources = [pscustomobject]@{\n    exemptResources = @($exemptFlow, $exemptApp)\n}\nNew-PowerAppDlpPolicyExemptResources -TenantId aaaabbbb-0000-cccc-1111-dddd2222eeee -PolicyName bbbbcccc-1111-dddd-2222-eeee3333ffff -NewDlpPolicyExemptResources $exemptResources\n\nexemptResources\n---------------\n{@{id=/providers/Microsoft.ProcessSimple/environments/Default-aaaabbbb-0000-cccc-1111-dddd2222eeee/flows/f239652e-dd38-4826-a1de-90a2aea584d9; type=Microsoft.ProcessSimple/environments/flows}, @{id=/providers/Microsoft.PowerApps/apps/06002625-7154-4417-996e-21d7a60ad..\nData policy exemption experience in the following scenarios\n#\nScenario\nExperience\n1\nUser launches an app that’s not data policy compliant but data policy exempt.\nApp launch proceeds with or without data policy enforcement.\n2\nMaker saves an app that’s not data policy compliant but data policy exempt.\nWith or without data policy exemption, data policy compliance doesn't block the app save operation. The data policy noncompliance warning is shown regardless of data policy exemption.\n3\nMaker saves a flow that’s not data policy compliant but data policy exempt.\nWith or without data policy exemption, data policy compliance doesn't block the flow save operation. The data policy noncompliance warning doesn't appear.\nGovernance error message content commands\nThe following cmdlets point your end users to your organization’s governance reference material: a link to governance documentation and a governance contact, shown when users are prompted by governance controls.
For instance, when governance error message content is set, it appears in Power Apps data policy runtime enforcement messages.\nSet governance error message content\nNew-PowerAppDlpErrorSettings -TenantId 'TenantId' -ErrorSettings @{ \n ErrorMessageDetails = @{ \n enabled = $True \n url = \"https://contoso.org/governanceMaterial\" \n } \n ContactDetails= @{ \n enabled = $True \n email = \"admin@contoso.com\" \n } \n}\nThe governance error message URL and email can be shown independently or together. You can enable or disable the governance error message with the\nenabled\nfield.\nGovernance error message content scenarios\n#\nScenario\nAvailability\n1\nUser launches an app created using Power Apps that’s not data policy compliant\nGenerally available\n2\nMaker shares a Power Apps canvas app but doesn’t have share privilege\nGenerally available\n3\nMaker shares a Power Apps canvas app with\nEveryone\nbut doesn’t have privilege to share with\nEveryone\nGenerally available\n4\nMaker saves an app created using Power Apps that’s not data policy compliant\nGenerally available\n5\nMaker saves a Power Automate flow that’s not data policy compliant\nGenerally available\n6\nUser launches an app without security group membership to the security group associated to Dataverse environment\nGenerally available\nDisplay governance error message content\nGet-PowerAppDlpErrorSettings -TenantId 'TenantId'\nUpdate governance error message content\nSet-PowerAppDlpErrorSettings -TenantId 'TenantId' -ErrorSettings @{ \n ErrorMessageDetails = @{ \n enabled = $True \n url = \"https://contoso.org/governanceMaterial\" \n } \n ContactDetails= @{ \n enabled = $True \n email = \"admin@contoso.com\" \n } \n}\nEnforce data policy for violating connections - environment\nThese cmdlets allow you to enforce data policy for violating connections at environment or tenant level.\nEnforce data policies for violating connections\nYou can enforce data policies on connections in an environment. 
Enforcing disables existing connections that violate data policies and enables any previously disabled connections that no longer violate data policies.\nStart-DLPEnforcementOnConnectionsInEnvironment -EnvironmentName [Environment ID]\nExample environment enforcement script\nStart-DLPEnforcementOnConnectionsInEnvironment -EnvironmentName c4a07cd6-cb14-e987-b5a2-a1dd61346963\nEnforce data policies for violating connections - tenant\nYou can enforce data policies on connections in the tenant. Enforcing disables existing connections that violate data policies and enables any previously disabled connections that no longer violate data policies.\nStart-DLPEnforcementOnConnectionsInTenant\nBlock trial licenses commands\nCommands:\nRemove-AllowedConsentPlans\nAdd-AllowedConsentPlans\nGet-AllowedConsentPlans\nThe allowed consent plans cmdlets can be used to add or remove access to a particular type of consent plan from a tenant. \"Internal\" consent plans are either trial licenses or developer plans that users can sign themselves up for via Power Apps/Power Automate portals/Power Automate for desktop. \"Ad-hoc subscription\" or \"Viral\" consent plans are trial licenses that users can sign themselves up for at\nhttps://signup.microsoft.com\n. Admins can assign users through Microsoft Entra ID or the Microsoft 365 admin portal.\nBy default, all types of consent plans are allowed in a tenant. However, a Power Platform admin might want to block users from assigning themselves trial licenses, but retain the ability to assign trial licenses on behalf of users. 
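The consent-plan commands above can be sketched as a short admin session. This is a minimal sketch, assuming the Microsoft.PowerApps.Administration.PowerShell module is installed and you're signed in with a Power Platform admin role; the -Types parameter for Add-AllowedConsentPlans is assumed to mirror the Remove form shown in this article.

```powershell
# Sign in interactively as a Power Platform admin.
Add-PowerAppsAccount -Endpoint prod

# Audit which consent plan types the tenant currently allows.
Get-AllowedConsentPlans

# Block self-service trial/developer sign-up ("Internal" plans);
# admins retain the ability to assign trial licenses on behalf of users.
Remove-AllowedConsentPlans -Types "Internal"

# Re-allow later if needed (parameter shape assumed symmetric):
# Add-AllowedConsentPlans -Types "Internal"
```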
This rule can be accomplished by using the\nRemove-AllowedConsentPlans -Types \"Internal\"\ncommand and by not allowing the setting\nUpdate-MgPolicyAuthorizationPolicy -AllowedToSignUpEmailBasedSubscriptions\nin Microsoft Entra ID.\nIf you have questions\nIf you have comments, suggestions, or questions, post them on the\nAdministering Power Apps community board\n.\nRelated information\nGet started using the Power Apps admin module.\nMicrosoft.PowerApps.Administration.PowerShell\nPreview: Programmability and extensibility overview\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "PowerShell", @@ -305,7 +305,7 @@ "https://learn.microsoft.com/en-us/power-platform/admin/powershell-getting-started": { "content_hash": "sha256:9ea21e24fd7d3fed5eee601adcbe3e50c1371102bddd52c224c542303f3660cc", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nGet started with PowerShell for Power Platform Administrators\nFeedback\nSummarize this article for me\nPowerShell for Power Platform Administrators cmdlets are designed for managing and administering Microsoft Power Platform environments, Power Apps, and Power Automate flows. 
Use PowerShell for Power Platform Administrators when you want to build automated tools that interact with these resources.\nThis article helps you get started with the PowerShell module and teaches the core concepts behind it.\nInstallation\nThe easiest way to get started with the PowerShell module is by installing it on your local machine. Follow the instructions in\nInstallation\nto import the module, or to update an outdated version you might have installed previously.\nSign in to Microsoft Power Platform\nSign in interactively with the Add-PowerAppsAccount cmdlet.\nAdd-PowerAppsAccount -Endpoint prod\nAlternatively, you can sign in with a client ID and secret or certificate. To do this, you need to\nCreate a service principal\n.\n$appId = \"CLIENT_ID_FROM_AZURE_APP\"\n$secret = \"SECRET_FROM_AZURE_APP\"\n$tenantId = \"TENANT_ID_FROM_AZURE_APP\"\n\nAdd-PowerAppsAccount -Endpoint prod -TenantID $tenantId -ApplicationId $appId -ClientSecret $secret -Verbose\nPrerequisites\nTo perform the administration operations in the cmdlets, you'll need the following:\nAny of these Microsoft Entra ID roles can access the Power Apps admin PowerShell cmdlets: Tenant admin, Power Platform administrator, or Dynamics 365 Service Administrator. These roles no longer require a Power Apps plan for administrative access to the Power Apps admin PowerShell cmdlets. However, these administrators need to sign in to the Power Platform admin center at least once before using the PowerShell cmdlets. If this isn't done, the cmdlets fail with an authorization error.\nPower Platform administrator or Dynamics 365 administrator permissions are required if you need to search through another user's resources.
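Combining the sign-in options above with a first administrative call, an end-to-end session might look like the following. A minimal sketch: Get-AdminPowerAppEnvironment is the same cmdlet used elsewhere in this document, and the session assumes the module is installed and your account holds one of the admin roles named in the prerequisites.

```powershell
# Interactive sign-in to the production endpoint.
Add-PowerAppsAccount -Endpoint prod

# First admin call: enumerate the environments you can administer.
Get-AdminPowerAppEnvironment

# Or scope to the default environment only.
Get-AdminPowerAppEnvironment -Default
```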
Note that environment admins only have access to those environments and environment resources for which they have permissions.\nFor Dataverse for Teams environments, you must be a Power Platform administrator to manage environments from which you aren't the owner of the team in Microsoft Teams.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "PowerShell Getting Started", @@ -314,7 +314,7 @@ "https://learn.microsoft.com/en-us/power-platform/guidance/adoption/dlp-strategy": { "content_hash": "sha256:99a8e61514ff882b1195a8bdb61cc94d284fa3bc8642ee6722d77d1db793e487", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nImplement a data policy strategy\nFeedback\nSummarize this article for me\nData policies, including data loss prevention (DLP) policies, act as guardrails to help prevent users from unintentionally exposing organizational data and to protect information security in the tenant. Data policies enforce rules for which connectors are enabled for each environment, and which connectors can be used together. Connectors are classified as either\nbusiness data only\n,\nno business data allowed\n, or\nblocked\n. A connector in the business data only group can only be used with other connectors from that group in the same app or flow. 
Learn more in\nData policies\n.\nEstablishing your data policies goes hand in hand with your\nenvironment strategy\n.\nQuick facts\nData policies\nact as guardrails to help prevent users from unintentionally exposing data.\nData policies can be scoped at the environment level and tenant level, offering flexibility to craft policies that are sensible and don't block high productivity.\nEnvironment data policies can't override tenant-wide data policies.\nIf multiple policies are configured for one environment, the most restrictive policy applies to the combination of connectors.\nBy default, no data policies are implemented in the tenant.\nPolicies can't be applied at the user level, only at the environment or tenant level.\nData policies are connector aware, but they don't control connections made using the connector. In other words, data policies can't determine whether the connector is used to connect to a development, test, or production environment.\nPowerShell and admin connectors can manage policies.\nUsers of resources in environments can view policies that apply.\nConnector classification\nBusiness and non-business classifications draw boundaries around what connectors can be used together in a given app or flow. Connectors can be classified across the following groups using data policies:\nBusiness\n: A given Power App or Power Automate resource can use one or more connectors from a business group. If a Power App or Power Automate resource uses a business connector, it can't use any non-business connector.\nNon-business\n: A given Power App or Power Automate resource can use one or more connectors from a non-business group. If a Power App or Power Automate resource uses a non-business connector, it can't use any business connector.\nBlocked\n: No Power App or Power Automate resource can use a connector from a blocked group. All Microsoft-owned premium connectors and third-party connectors (standard and premium) can be blocked. 
Microsoft-owned standard connectors and Common Data Service connectors can't be blocked.\nNote\nThe names \"business\" and \"non-business\" don't have any special meaning—they are simply labels. The grouping of the connectors themselves is of significance, not the name of the group they're placed in.\nLearn more:\nConnector classification\nGranular control\nYou can achieve more granular control by configuring\nconnector action control\n. Through action control, you can choose which actions on a connector are allowed or not allowed. This option is for blockable connectors that you have added to a policy's non-business or business data group. Using it, you might allow makers to use the \"read\" actions but not the \"modify\" actions on the connector. Connectors get new actions when they're updated. You can set whether to allow or block new actions.\nAnother way to get more granular control is by configuring\nconnector endpoint filtering\n. Endpoint filtering allows admins to govern which specific endpoints makers can connect to when building apps, flows, or chatbots. Connector endpoint filtering applies to six connectors: HTTP, HTTP with Microsoft Entra ID, HTTP Webhook, SQL Server, Azure Blob Storage, and SMTP. The rules only apply when a maker uses a static value to specify an endpoint.\nPower Platform allows makers to create and share\ncustom connectors\n. You can manage\ncustom connectors for tenant and environment level data policies\n.\nSpecifically:\nEnvironment admins can use the Power Platform admin center to classify individual, custom connectors by name for environment-level data policies.\nTenant admins can use the Power Platform admin center and PowerShell to classify custom connector by their Host URL endpoints using a pattern matching construct for tenant-level data policies.\nData policies for Copilot Studio\nData policies let you govern how agents connect and interact with data and services, within and outside your organization. 
Learn more in\nConfigure data policies for agents\n.\nData policies for desktop flows\nPower Automate allows you to create and enforce data policies that classify desktop flow modules and individual module actions as business, non-business, or blocked. This categorization prevents makers from combining modules and actions from different categories into a desktop flow or between a cloud flow and the desktop flows it uses. Learn more in\nData policies for desktop flows\n.\nStrategies for creating data policies\nAs an administrator taking over an environment or starting to support use of Power Platform, data policies should be one of the first things you set up. With a base set of policies in place, you can then focus on handling exceptions and creating targeted data policies that implement these exceptions once approved.\nWe recommend the following starting point for data policies for\nshared user and team productivity environments\n:\nCreate a policy spanning all environments except selected ones (for example, your production environments), keep the available connectors in this policy limited to Microsoft 365 and other standard microservices, and block access to everything else. This policy applies to the default environment, and to training environments you have for running internal training events. Additionally, this policy also applies to any new environments that are created.\nCreate appropriate and more permissive data policies for your\nshared user and team productivity environments\n. These policies could allow makers to use connectors like Azure services in addition to the Microsoft 365 services. 
The connectors available in these environments depend on your organization, and where your organization stores business data.\nWe recommend the following starting point for data policies for\nproduction (business unit and project) environments\n:\nExclude those environments from shared user and team productivity policies.\nWork with the business unit and project to establish which connectors and connector combinations they use and create a tenant policy to include the selected environments only.\nUse environment policies to categorize custom connectors as business-data only, as necessary.\nWe also recommend to:\nCreate a minimal number of policies per environment. There's no strict hierarchy between tenant and environment policies. At design and runtime, all policies that are applicable to the environment in which the app or flow resides are evaluated together to decide whether the resource is in compliance or in violation of data policies.\nMultiple data policies\napplied to one environment will fragment your connector space in complicated ways and might make it difficult to understand issues your makers are facing.\nCentrally manage data policies using tenant level policies, and use environment policies only to categorize custom connectors or in exception cases.\nWith a base strategy in place, plan how to handle exceptions. You can:\nDeny the request.\nAdd the connector to the default data policy.\nAdd the environments to the All Except list for the global default data policy and create a use case-specific policy with the exception included.\nExample: Contoso's data strategy\nLet's look at how Contoso Corporation, our sample organization for this guidance, set up their data policies. 
The setup of their data policies ties in closely with their\nenvironment strategy\n.\nContoso admins want to support user and team productivity scenarios, business applications, and Center of Excellence (CoE) activity management.\nThe environment and data policy strategy that Contoso admins apply includes:\nA tenant-wide restrictive data policy that applies to all environments in the tenant except some specific environments that they exclude from the policy scope. Admins intend to keep the available connectors in this policy limited to Microsoft 365 and other standard micro-services by blocking access to everything else. This policy also applies to the default environment.\nContoso admins create another shared environment for users to create apps for user and team productivity use cases. This environment has an associated tenant-level data policy that isn't as risk-averse as a default policy and allows makers to use connectors like Azure services in addition to the Microsoft 365 services. Because this environment isn't the default environment, admins actively control the environment maker list for it. This strategy takes a tiered approach to shared user and team productivity environment and associated data settings.\nBusiness units create development, test, and production environments for their tax and audit subsidiaries across various countries and regions to build line-of-business applications. Access for environment makers is carefully managed, and appropriate first- and third-party connectors are made available using tenant-level data policies in consultation with business unit stakeholders.\nSimilarly, development, test, and production environments are created for Central IT to develop and roll out relevant applications. These business application scenarios typically have a well-defined set of connectors that need to be available for makers, testers, and users in these environments. 
Access to these connectors is managed using a dedicated tenant-level policy.\nContoso also has a special purpose environment dedicated to their Center of Excellence activities. In Contoso, the data policy for the special purpose environment remains high touch, given the experimental nature of the CoE team's work. In this case, tenant admins delegate data management for this environment directly to a trusted environment admin of the CoE team and exclude it from all tenant-level policies. This environment is managed only by the environment-level data policy, which is an exception rather than the rule at Contoso.\nAs expected, any new environments that are created in Contoso map to the original all-environments policy.\nThis setup of tenant-centric data policies doesn't prevent environment admins from coming up with their own environment-level policies, if they want to introduce more restrictions or to classify custom connectors.\nSet up data policies\nCreate your policy in the\nPower Platform admin center\n. Learn more in\nManage data policies\n.\nUse the\nDLP SDK\nto add custom connectors to a data policy.\nClearly communicate your organization's data policies to makers\nSet up a\nSharePoint site or a wiki\nthat clearly communicates:\nTenant-level and key environment-level (for example, default environment, trial environment) data policies enforced in the organization, inclusive of lists of connectors classified as business, non-business, and blocked.\nYour admin group's email ID so makers can contact them for exception scenarios.
For example, admins can help makers comply by editing an existing data policy, moving the solution to a different environment, creating a new environment and a new data policy, and moving the maker and resource to this new environment.\nAlso clearly\ncommunicate your organization's environment strategy\nto makers.\nNext steps\nReview the detailed articles in this series to further enhance your security posture:\nDetect threats to your organization\nEstablish data protection and privacy controls\nConfigure identity and access management\nMeet compliance requirements\nSecure the default environment\nAfter reviewing the articles, review the security checklist to ensure Power Platform deployments are robust, resilient, and aligned with best practices.\nReview the security checklist\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Power Platform Governance", @@ -323,7 +323,7 @@ "https://learn.microsoft.com/en-us/power-platform/guidance/adoption/environment-strategy": { "content_hash": "sha256:6c0ef11537c2de11287146a00efb9e616cd9faaab18059d31801263e5932cf38", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nDevelop a tenant environment strategy to adopt Power Platform at scale\nFeedback\nSummarize this article for me\nEvery organization's journey to adopt Microsoft Power Platform is unique. A tenant environment strategy lays the foundation to help accelerate usage in a manageable and secure fashion.\nThis article shows you how to align your Power Platform tenant environment strategy with the product capabilities and vision. You learn how to best use the latest features of the platform to implement a strategy that can allow your adoption of Power Platform to reach enterprise scale.\nIntroduction\nPower Platform empowers organizations to build low-code solutions for rapid innovation. These solutions can focus on productivity for individuals and small teams, or apply across the organization. They can also extend to business processes, including external customers and partners. Supporting these solutions are Power Platform environments where the low-code resources are built, tested, and used. As an organization increases its adoption of Power Platform, implementing a good tenant environment strategy is essential to make it manageable and secure as the number of environments grows.\nTo help you be more successful, this article guides you on how best to use the features available to establish your first environment strategy or evolve your current plans. We also outline our vision for how these features are intended to work together and how they'll evolve for managing Power Platform at scale. In this guidance, we establish how to properly route new users to environments and group environments to consistently apply governance, security rules, and other important aspects of a tenant environment strategy. 
We also provide detailed steps to secure your default environment, which is a critical first step in implementing an environment strategy.\nWhile many perspectives are available for managing Power Platform environments, the approach in this article aligns with Microsoft's latest product direction and uses current features and near-term planned enhancements. This updated guidance can help you ensure that you use only the environment features and options that are strategic to how Microsoft intends for you to manage environments at scale.\nMicrosoft's tenant environment strategy vision\nMany organizations start their Power Platform journey with personal productivity apps and automations built and running in a shared central environment called the\ndefault environment\n. These resources often use only the basic capabilities included with Microsoft 365 and don't use the full capabilities of Power Platform. As this initial adoption accelerates, Microsoft provides organizations with an on-ramp to an environment strategy for enterprise scale adoption of the full Power Platform capabilities. These premium governance capabilities become available when users have a premium Power Platform (Power Apps, Power Automate, Microsoft Copilot Studio, and Dynamics 365) license. The\nPower Platform adoption maturity model\nprovides more insights to help organizations define their roadmap to achieve enterprise scale adoption beyond their environment strategy. This approach can help organizations mature from basic personal productivity to enterprise-scale adoption of Power Platform.\nPower Platform administrative, governance, and security features allow organizations to adopt and manage Power Platform for enterprise productivity and enterprise app usage at scale. Using Managed Environments activates a set of premium capabilities that enable greater visibility and control and reduce the manual effort to administer and secure environments. 
Using these capabilities, you can ensure consistent application of your governance and security policies. Admins can transition into an enterprise-scale, environment strategy using these capabilities. Spending less time and effort on the administration helps reduce the overall total cost of ownership (TCO) of the platform as your organization scales usage.\nA key element of the transition to enterprise scale is to enhance the shared, central environment strategy for makers by making it easier for them to use personal, development environments. In a shared, central environment strategy, makers build, use, and share apps in the default environment. This strategy can result in lack of isolation and makers encroaching on each other. Imagine if everyone in the company shared a single OneDrive folder for all their documents. Instead, use environment features to guide makers to their own, personal environment where they can safely build their apps protected from makers working on unrelated assets, with simplified governance for admins. Coworkers can be added as more makers to these environments to collaborate on building solutions.\nFigure: Illustration of a shared, central environment (left) and an environment routing strategy (right).\nNewly created maker environments can be automatically added to a group that applies rules to ensure that the environments have consistent governance and security policies. Admins can handle exceptions by moving a maker's environment to a group with relaxed rules.\nLow-code resources created by the makers represent the initial stage in a resource's application lifecycle management (ALM) journey. As part of this initial stage, it's important to capture each version of a resource and be able to recreate it, if necessary. When the resource is ready to be shared, the maker can use the continuous integration attached to the developer environment to promote it to a production environment. 
Users can then run the resource, isolated from any ongoing maker activity.\nPrioritize the built-in features of the platform for managing environments when possible, instead of building your own tools. If the built-in features don't meet your organization's unique requirements, use platform admin tooling to create custom tools. You should evaluate any custom tooling against new features as they become available. Monitoring Microsoft's platform roadmap and aligning it with your own roadmap makes this process easier.\nEstablish your environment strategy using the recommended environment capabilities tailored for your organization's unique needs. Don't think of creating your environment strategy as a one-time activity. It should evolve over time to incorporate new environment features as they become available.\nFeatures that support an enterprise-scale, environment strategy\nEnvironments\nare a building block for Power Platform administration, governance, and security. A complete feature overview is out of the scope of this article; however, this section highlights the features that support implementation of an environment strategy at enterprise scale.\nTypes of environments\ndescribes the different uses of environments as part of your strategy.\nManaged Environments\nprovides a set of premium capabilities that make environments easier to manage at scale.\nLicense auto-claim\nsimplifies license assignment by allowing users to claim Power Apps per user licenses when they're needed, instead of requiring an admin to identify users who need licenses in advance.\nEnvironment groups and rules\nexplains how to manage environments as groups and apply rules to groups to automate consistent governance policies.\nDefault environment routing\nautomatically moves makers away from creating resources in the default environment to their own personal environment.\nMicrosoft Dataverse\nprovides enhanced security and ALM.\nPreferred solutions\nhelps makers ensure that all the assets 
they build are in a Dataverse solution, making it easier to promote them to other environments.\nPipelines in Power Platform\nprovides a simplified process for promoting assets from development to test and production environments, making continuous integration and deployment (CI/CD) available to all makers.\nCatalog in Power Platform\nallows makers to share components, like apps and flows, and more advanced starting points such as templates.\nTypes of environments\nThe following table describes the types of environments you can create, their characteristics, and their intended uses.\nType\nCharacteristics and uses\nDefault\nThe environment that comes with every tenant. Many Microsoft 365 experiences use this environment for customizations and automations. This environment isn't intended for long-term or permanent work beyond the Microsoft 365 personal, productivity scenarios.\nProduction\nThis environment is intended to be used for permanent work in an organization. Production environments support extended, back-up retention, from seven days to up to 28 days.\nSandbox\nThese nonproduction environments support environment actions like copy and reset. Sandboxes are best used for testing and ALM build environments.\nDeveloper\nThese special environments are intended as makers' personal development workspaces, which isolate low-code assets from users and other makers. Makers can have up to three developer environments. They don't count against your tenant capacity. Developer environments that haven't been used for 90 days are automatically turned off and then removed from your tenant if the owner doesn't respond to notifications. Dynamics 365 apps aren't available in developer environments.\nTrial\nThese environments are intended to support short-term testing and proofs of concept. They're limited to one per user. 
Trial environments are automatically removed from your tenant after a short period of time.\nMicrosoft Dataverse for Teams\nThese environments are automatically created when you create an app in Teams or install an app from the app catalog. The security model for these environments aligns with the team they're associated with.\nSupport\nThese are special environments created by Microsoft Support to allow engineers to troubleshoot problems. These environments don't count against your tenant capacity.\nWhen creating an overall tenant environment strategy, consider the different types to support your recommendations.\nManaged Environments\nEnvironments have a base set of features and characteristics depending on the environment type. Managed Environments expand on the base features to provide a suite of premium capabilities that allow admins to more easily manage Power Platform at scale with more control, less effort, and more insights. These capabilities are unlocked when you set an environment as managed.\nThe following table lists the features of Managed Environments that are available, as of this writing. New features are added often, so check the\ndocumentation\nfor the latest list. 
Although all the features can help you build an environment strategy, the features in italics are more relevant for the strategy that's outlined in this article.\nMore visibility\nMore control\nLess effort\nUsage insights\nAdmin digest\nLicense reports\nData policy view\nExport data to Azure Application Insights\nAI-generated descriptions for all apps\nSharing limits\nData policies for desktop flows\nSolution checker\nMaker welcome content\nIP firewall\nIP cookie binding\nCustomer-managed keys\nCustomer Lockbox\nExtended back-ups\nEasy activation\nPower Platform pipelines\nEnvironment routing\nEnvironment groups and rules\nActions page\nLicense auto-claim\nAuto-claim policies\nautomate the assignment of Power Apps and Power Automate licenses to users when they need one to use certain apps or features. Automation can help reduce the number of licenses consumed and avoid the overhead of manually assigning licenses.\nAfter a policy is configured, any user in the organization who needs an individual Power Apps license is automatically granted one under the following conditions:\nIf a user without a standalone Power Apps license launches an app that demands a premium license, the system automatically assigns the user a Power Apps per user license.\nIf a user without a standalone Power Apps license launches an app in a Managed Environment, the system automatically assigns the user a Power Apps per user license.\nSimilarly, after a policy is configured, any user in the organization who needs an individual Power Automate license is automatically granted one under the following conditions:\nThe user triggers, saves, or turns on a premium cloud flow with attended RPA (Robotic Process Automation).\nThe user requests a Power Automate premium license.\nWe recommend configuring license auto-claim if your environment strategy includes Managed Environments. 
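To make the auto-claim conditions above concrete, here's a minimal sketch of the decision logic as a plain function. This is an illustrative model only; the names (`User`, `auto_claim_power_apps`) are hypothetical and not part of any Power Platform API.

```python
from dataclasses import dataclass

@dataclass
class User:
    has_power_apps_license: bool = False

def auto_claim_power_apps(user: User, *, app_requires_premium: bool,
                          in_managed_environment: bool) -> bool:
    """Model of the auto-claim conditions above: a user without a standalone
    Power Apps license is granted a per-user license when they launch an app
    that demands a premium license, or any app in a Managed Environment."""
    if user.has_power_apps_license:
        return False  # already licensed; nothing to claim
    if app_requires_premium or in_managed_environment:
        user.has_power_apps_license = True  # per-user license auto-assigned
        return True
    return False
```

A user who launches a premium app without a license is granted one on first launch; subsequent launches claim nothing, which is why auto-claim avoids over-consuming licenses.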
Users of apps and flows encounter the least amount of licensing friction, and you only consume licenses for users who are actively running apps or using Power Automate.\nEnvironment groups and rules\nAs Power Platform adoption in your tenant increases, so can the number of environments that require administration and governance. As the number of environments increases, it becomes more challenging to ensure that you've applied consistent settings and governance policies across them. The\nenvironment groups feature\nmakes this easier by allowing you to create named groups and associate environments with them, like placing related documents in a file folder.\nKeep the following considerations in mind as you think about using environment groups:\nAn environment must be managed to be included in a group.\nAn environment can be in only one group at a time.\nAn environment can be moved from one group to another.\nEnvironments in a group can be from multiple geographic regions.\nGroups can't contain other groups.\nTo help you apply consistent settings and governance, environment groups can have one or more of the following rules configured and turned on:\nSharing controls for canvas apps\nUsage insights\nMaker welcome content\nSolution-checker enforcement\nBack-up retention\nAI-generated descriptions\nA rule becomes active when it's published. Active rules are applied to all environments that are associated with the group.\nWhen a group rule is managing a setting, individual environment settings are locked. The only way to change them is to modify the rule. If the environment is removed from the group, it keeps the group settings, but an environment admin can change them. 
This approach is important for an environment strategy because it ensures that an environment admin can't override the policies set for the group.\nUsing environment groups allows you to organize your environments in logical ways, similar to your organization structure, product service hierarchy, or other frameworks that we explore later. The following diagram is a conceptual example of how the Contoso organization might think about organizing its environment groups.\nFigure: Conceptualization of an environment strategy for a Contoso tenant.\nWhen you're planning the rules to configure, think through what you could apply at each level of the conceptual hierarchy. Although you can't configure the group hierarchy yet, you can use a combination of naming conventions and rule configuration to implement your conceptual design. For example, given the Contoso tenant conceptualization shown earlier, the following illustration represents the environment groups the organization could use to implement its design.\nFigure: Example of implementing the conceptual environment groups into the actual tenant\nLater in this article, we explore more ways to use environment groups as part of a tenant environment strategy.\nDefault environment routing\nA key part of the environment strategy that we outline in this article is to move makers away from creating resources in the default environment. The\nenvironment routing feature\nredirects makers into their personal development environment and creates new developer environments, as needed.\nFigure: A maker is automatically redirected to a personal, developer environment instead of the default environment when building apps.\nThe developer environments that are created by routing are managed by default. Users with Developer Plan licenses are limited to creating and previewing resources in the environment. 
To run the resources as a user, they need an appropriate\nlicense\n.\nYou can use environment routing by itself, but the recommended way is to use it with environment groups. When used this way, any environment that's created is associated with the group that you designate to contain all new developer environments, ensuring that it's immediately covered by your governance policies.\nMakers are automatically assigned a security role that makes them an environment admin of their developer environment. When the environment is a part of an environment group, the maker—as the environment admin—can't change the environment settings because they're managed by the environment group rules. Only admins, who can modify the group rules, can make any changes.\nYou can impose even more control in two ways. First, you can disallow manual creation of developer environments in your tenant settings. When this option is set, makers can't create environments themselves in the admin portal. They also won't get one automatically created by the routing policy. Second, you can specify a security group, in the routing policy, to limit who can automatically get an environment created.\nInitially, environment routing supports routing new and existing makers away from the default environment when they use\nmake.powerapps.com\n. Over time, other Power Platform services will support the environment routing feature.\nMaker welcome content\nProvide\ncustomized welcome content\nto help makers get started with Power Apps and Copilot Studio. When you add your own help content, it replaces the default Power Apps first-time help experience for makers. The custom welcome message can inform makers about the company rules and what they can do in each environment or group of environments.\nHere are some suggestions for how your organization might use the welcome message in each type of environment. 
Include an image that identifies the environment type or owners to help with user adoption and error prevention.\nDefault environment\nThe default environment is often the most restricted, with data policies and sharing controls. Create a welcome message that warns your makers about restrictions and possible limitations, and include a link to your organization's policy website or document.\nFor example, you might want to inform makers to use the default environment only for solutions that are related to Microsoft 365 applications, to avoid using production applications in the default environment, and to share their canvas apps only with a limited number of individuals. The following example shows how to create such a message in Managed Environments settings:\nExample Markdown input:\n![Contoso](https://i.ibb.co/SNSTCx3/something.png)\n## Welcome to Contoso Personal Productivity Environment\n\n### Before you start, here are some considerations\n\nUse this environment if you plan to build apps that integrate with Office 365.\n\nBefore you start, be aware of these limitations:\n\n1. You can't share your apps with more than five users.\n1. The data in Dataverse is shared with everyone in the organization.\n1. You can only use Office 365 connectors.\n\nIf you're not sure you're in the right place, follow **[this guidance](#)**.\nHere's the rendered welcome message:\nProduction environments\nProduction environments are typically used for deploying solutions that support enterprise and team productivity. It's important that apps and data comply with organizational policies. Since you need to control which users have access to the production environment, it's a good idea to inform users if you have a policy of refreshing access. You might allow more connectors and increase the sharing limits in a production environment. You can also use the welcome message to inform makers of the right team to reach out to for support. 
The following example shows how to create such a message:\n![Contoso](https://i.ibb.co/SNSTCx3/something.png)\n## Welcome to HR Europe Environment\n\n### Before you start, here are some considerations\n\nUse this environment if you're on the HR team and your data is located in Europe.\n\nBefore you start, be aware of these limitations:\n\n1. You can only share apps with security groups. [Follow this process](#) to share your apps.\n1. The data in Dataverse is stored in Europe.\n1. You can only use social media connectors with read actions.\n1. If you need more connectors, [submit a request](#).\n\nIf you're not sure you're in the right place, follow **[this guidance](#)**.\nHere is sample output:\nDeveloper environments\nDeveloper environments are most often where developers build their solutions. Because developers are still working on their applications, those applications aren't in production and scalability is limited. Normally, dev environments have more relaxed data policies due to the nature of development work. To prevent developers from using production assets in their dev environments, limit sharing capabilities and use a specific data policy for this type of environment. Here's an example of a welcome message for a development environment:\n![Contoso](https://i.ibb.co/SNSTCx3/something.png)\n## Welcome to a Developer Environment\n\n### Before you start, here are some considerations\n\nUse this environment if you're a developer and you're building solutions.\n\nBefore you start, be aware of these limitations:\n\n1. You can only share resources with up to two members of your team. If you need to share with more people, [submit a change request](#).\n1. Use resources only while you're developing a solution.\n1. Be mindful of the connectors and data you're using.\n1. 
If you need more connectors, [submit a request](#).\n\nIf you're not sure you're in the right place, follow **[this guidance](#)**.\nHere is sample output for a developer environment:\nSandbox environments\nTypically, sandbox environments are used to test solutions. Because some tests involve a significant number of users, these environments scale, up to a point, and have more capacity than a developer environment. Sandbox environments are also commonly used as development environments and are typically shared by multiple developers. Here's an example of a welcome message for such an environment:\n![Contoso](https://i.ibb.co/SNSTCx3/something.png)\n## Welcome to a Test Environment\n\n### Before you start, here are some considerations\n\nUse this environment only if you're testing solutions.\n\nBefore you start, be aware of these limitations:\n\n1. You can only share resources with your team. If you need to share with more people, [submit a change request](#).\n1. You're not allowed to edit or import solutions directly in this environment.\n1. Be mindful of the test data and compliance.\n1. If you need help from a security expert or IT support, [submit a request](#).\n\nIf you're not sure you're in the right place, follow **[this guidance](#)**.\nHere is sample output for a sandbox or test environment:\nLimit sharing\nAdmins can\nlimit how broadly users can share canvas apps, flows, and agents\n. The limit only applies to future sharing, however. If you apply a sharing limit of 20 to an environment with resources that are already shared with more than 20 users, those resources continue to work for all users the resources were shared with. Create a process to inform makers of apps, flows, and agents shared with more than the new limit so they can reduce the number of users their resources are shared with. In some cases, you might decide to move the solution to another environment. 
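The behavior described above, where limits gate only future sharing, can be modeled with a small check. This is a hypothetical sketch for clarity, not an admin API:

```python
from typing import Optional

def can_add_shares(current_share_count: int, new_users: int,
                   limit: Optional[int]) -> bool:
    """Model of a sharing limit: existing shares above the limit keep
    working, but a new share request is allowed only if the resulting
    total stays within the limit. limit=None models "Don't set limits"."""
    if limit is None:
        return True
    return current_share_count + new_users <= limit
```

For example, with a limit of 20, a resource already shared with 25 users keeps working for all 25, but any further sharing is rejected until the maker reduces the count.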
Sharing limits apply to canvas apps, flows, and agents.\nAdmins typically need to control how makers share their apps, flows, and agents when:\nResources are shared in a personal productivity environment\n. If you have an environment where users can create resources for their own work, resources without global business value, or resources without support from IT, it's important that you don't allow makers to share them across the organization. If resources start as personal productivity but later become popular and are widely used, be mindful about the limit you set on sharing. A common limit is between 5 and 50 users.\nResources are shared with security groups or everyone\n. Resources that are shared with a security group can be run by all members of the group. In a developer environment, you might want the developer to control how resources are shared instead of relying on group membership. In other scenarios, you might want to allow sharing with everyone. If your organization's policy is that resources are shared with a security group that includes all users who are authorized to run the resource and is managed by the IT department, you might want to restrict makers from sharing with other security groups.\nHere are common sharing limits for each environment type:\nDefault\n: Select\nExclude sharing with security groups\n, select\nLimit total individuals who can share to\n, and select 20 for the value.\nDeveloper\n: Select\nExclude sharing with security groups\n, select\nLimit total individuals who can share to\n, and select 5 for the value.\nSandbox\n: Select\nExclude sharing with security groups\nand leave\nLimit total individuals who can share to\nunselected. Use this option if apps are shared with an IT-managed security group that includes the users who are authorized to run the application. 
If the maker, user, or team can manage which users are permitted to test a solution, select\nDon't set limits\n(default).\nProduction\n: Select\nDon't set limits\n(default). To control sharing based on a specific security group, select\nExclude sharing with security groups\nand leave\nLimit total individuals who can share to\nunselected.\nMicrosoft Dataverse\nDataverse securely stores and manages data that's used by applications. In the context of an environment strategy, the\nDataverse solution feature\nlets you transport apps and components from one environment to another. Makers build their assets in containers—solutions—that track what they build. Solutions can easily be transported to other environments. Using this approach, you can separate developer environments, where makers build resources, from the production environments where they're used. Both makers and users benefit. Makers can continue to evolve their resources, and users aren't surprised by sudden changes. When makers are ready to publish their changes, they can request to promote the updated resource to the production environment.\nDataverse solutions are the mechanism for implementing ALM in Power Platform products like Power Apps and Power Automate. Pipelines in Power Platform use solutions to automate CI/CD of assets that makers build. Solutions can be exported from Dataverse and stored in a source control tool like Azure DevOps or GitHub. The solution in source control becomes the source of truth if you need to recreate the development environment. For example, if a maker built a popular app and then deleted the developer environment, an exported solution stored in source control could be used to recreate a viable development environment.\nAnother important consideration when you create an environment with Dataverse is whether any Dynamics 365 applications will be deployed to the environment. 
If the potential exists, you must enable Dynamics 365 when you create the environment or you won't be able to install Dynamics 365 apps later.\nWe recommend that you provision Dataverse in any environment where makers create assets that will be shared with other users. This strategy makes it easier for the assets to be ALM-ready.\nPreferred solutions\nWhen a maker creates a Dataverse asset in a Dataverse environment—and doesn't start from a custom solution—the asset is associated with the default solution and may also be associated with the Common Data Service default solution. The default solution is shared by all makers who create assets in the environment. Identifying which maker created specific components or which assets belong to specific apps is challenging, making it harder to promote a popular app to another environment for sharing with a larger audience. To do so, you need to promote all the assets in the default solution, which isn't ideal.\nTo support your environment strategy and make assets easier to work with, makers should create a custom solution in their development environment, and then set it as the\npreferred solution\nin the environment. Makers set the preferred solution in an environment to indicate which solution an asset they created should be associated with. Preferred solutions can help ensure that when makers use pipelines to promote their resources to other environments, the promoted solution contains all the required assets. Think of this as preparing the assets to be ALM-ready.\nPipelines in Power Platform\nAs we've seen, a key tenet of a good environment strategy is to isolate where an asset is built from where it's deployed and used. This separation ensures that users who are trying to use an asset don't encounter downtime because a maker is updating it. 
However, it requires assets to be promoted to a production environment—ideally, as part of a Dataverse solution—before they can be used.\nDataverse solutions can be manually transported between environments. However, you can automate the process—and put policies in place to ensure that proper change management occurs—using\npipelines\n. Depending on the environment rules that you set in the\nsolution checker\n, pipelines automatically enforce all the rules before the solution is deployed, preventing downstream deployment errors. The following diagram illustrates how pipelines can automate the promotion of an asset from development to production.\nFigure: A pipeline automates promoting an asset that's stored in source control from development, through test, to production.\nYou can configure the number of environments and processes, like approvals, that need to be included in a pipeline.\nPipelines work together with environment groups. They can be preconfigured for development environments to allow makers to easily start the promotion process by responding to a prompt when they try to share their assets with other users. As part of a deployment request using pipelines, makers can propose whom to share their assets with and the required security roles. A pipeline admin can approve or reject the request before deployment, ensuring least privilege for the maker who originated it.\nSolution checker enforcement\nIt's common for a Center of Excellence (CoE) team to set up guardrails to reduce the risk of users importing noncompliant solutions into an environment. Admins can easily\nenforce rich static analysis checks of solutions\nagainst a set of best practice rules to identify problematic patterns. 
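The promotion flow described above can be sketched as a staged state machine with an admin approval gate. This is a conceptual model under assumed names (`STAGES`, `promote`), not the pipelines API:

```python
# Stage order for the development -> test -> production promotion path.
STAGES = ["development", "test", "production"]

def promote(solution: dict, approved_by_admin: bool) -> dict:
    """Move a solution one stage forward, mirroring a pipeline deployment
    request that a pipeline admin approves or rejects before deployment."""
    if not approved_by_admin:
        raise PermissionError("Deployment request rejected by the pipeline admin")
    index = STAGES.index(solution["stage"])
    if index == len(STAGES) - 1:
        raise ValueError("Solution is already in production")
    return {**solution, "stage": STAGES[index + 1]}
```

The approval gate runs before any stage transition, which is the design point of pipelines: change management is enforced by the process, not by maker discipline.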
Organizations with decentralized CoEs often find it necessary to activate solution checker enforcement along with proactively reaching out to makers by email to offer support.\nSolution checker enforcement offers three levels of control: None, Warn, and Block. Administrators configure the effect of the check: whether it warns but allows the import, or blocks the import altogether. In either case, the maker receives the result of the import.\nOrganizations that use this feature configure it differently depending on the environment type. It's normal to have exceptions, and this guidance should always be aligned with your needs. However, here are the most common settings for solution checker enforcement in each environment type:\nDefault: Select\nBlock\nand\nSend emails\n.\nDeveloper: Select\nWarn\nand leave\nSend emails\nunselected.\nSandbox: Select\nWarn\nand leave\nSend emails\nunselected.\nProduction: Select\nBlock\nand\nSend emails\n.\nTeams Environment: Select\nBlock\nand\nSend emails\n.\nCatalog in Power Platform\nOrganizations where developers and makers build and share components like apps, flows, and templates—advanced starting points—tend to get more value from Power Platform.\nThe Power Platform catalog\nmakes it easy for makers to share their components and templates across environments.\nThe catalog is installed in an environment and can be installed with the pipeline host in the same environment. It's also possible to handle unique resource segmentation requirements by having multiple environments with a catalog installed.\nOrganizations that encourage developers and makers to build and share components and templates in the catalog derive more value from their investment in Power Platform. Simply building isn't enough. Sharing the artifacts, at scale, fosters communities and supports groups that can unlock value from a diverse set of personnel in the organization. 
In fact, organizations that are most successful with Power Platform adopt a fusion team model, where pro developers, makers, and admins work together to help their fellow employees derive the highest value possible from the platform by reusing solutions, templates, and components.\nFeature roadmap\nAs Microsoft continues to evolve the features of Power Platform that support governance and administration, you can follow along in the\nrelease planner\n. You learn what's planned, what's in the upcoming release wave, and what you can try now. You can even create your own release plan by saving the items you want to follow.\nFoundation of an enterprise-scale environment strategy\nWe discussed our vision for a tenant environment strategy at enterprise scale and key environment features that support it. Now, we look at how you can use those features together as part of an environment strategy. Your strategy should be based on your organization's unique requirements, so let's start with a basic example before we tailor a strategy to meet your needs.\nIn this example, Contoso leadership wants to empower employees to take advantage of Power Platform and has identified the following high-level requirements:\nEmployees need to be able to build automated document approval processes and other Power Platform customizations with Microsoft 365.\nEmployees should be able to build Power Apps and Power Automate automations to improve their personal productivity.\nThe makers who are working on the company's Compliance Tracker app must be able to develop and maintain it.\nTo support these requirements, the Contoso admin and governance team developed the following environment topology:\nFigure: Proposed environment topology for Contoso's Power Platform at scale project.\nLet's explore this environment topology diagram in detail.\nThe default environment is used to build Microsoft 365 productivity customizations. 
Data policies and restrictions on sharing limit other types of maker activity and place guardrails around what makers can build in this environment.\nOnly admins are able to create trial, sandbox, and production environments. Makers use a custom Microsoft Form or another process to request a new environment. The\nMicrosoft Power Platform Center of Excellence (CoE) Starter Kit\nincludes\nan environment request\nthat could be used.\nFour environment groups are created: Development, Shared Development, UAT (user acceptance testing), and Production.\nAn environment routing policy set for the Development group routes makers away from the default environment into their own developer environments. As new development environments are created, they're automatically associated with the Development group and its rules are applied.\nThe Shared Development group supports environments that contain projects with multiple makers.\nThe UAT group contains environments that are used to test resources before they're promoted to production.\nThe Production group contains environments that host apps, flows, and other artifacts for production use.\nThis proposed topology is missing pipelines to automate promotion between development, test, and production environments. Let's add them now.\nFigure: The same environment topology with pipelines connecting a pipeline host environment to development, test, and production environments.\nIn the revised environment topology diagram, we've added a pipeline host environment and two pipelines. One pipeline moves resources from development to test and then to production environments. The pipeline rule on the Development group will be modified to use this pipeline. The other pipeline moves resources from the shared dev environment to test and then to production. 
The pipeline rule on the Shared Development group will be modified to use this pipeline.\nThis basic environment strategy provides a foundation that you can build on for other use cases, which we explore next.\nEnvironment strategies for specific scenarios\nHere are some common use cases that you might need to incorporate in the foundation tenant environment strategy.\nControl which makers can create developer environments\nBy default, anyone who has a Power Platform Premium license, a Developer Plan license, or a Power Platform tenant admin role can create a developer environment from the admin portal.\nIn the foundation environment strategy, environment routing ensures that makers are directed away from the default environment, to a new developer environment that's created in the designated group. However, makers can still manually create developer environments that aren't placed in an environment group and don't have its rules applied.\nTo refine which makers are eligible for environment routing, specify a security group in the routing configuration. When a security group is configured, only members of the security group are routed. All others fall back to the default environment.\nProvide more flexibility to advanced makers\nIn the foundation environment strategy, all new maker environments are routed to a designated developer environment group. Typically, this group of environments has a fairly restrictive set of governance rules applied.\nAs makers become more advanced, you can allow them to request access to more capabilities. 
Instead of removing them from the original environment group and manually managing the exception, you can use another environment group to track these advanced makers.\nFigure: Add more capable makers to an environment that has relaxed governance rules.\nOrganize developer environments by region or business unit\nIn the current implementation of environment routing, all new developer environments are created in a single environment group. What if you want to organize your makers' developer environments by region, for example, or business unit?\nUse routing to direct makers into a new developer environment that's created in the designated group. Then you can move it to another group that's based on region, organizational unit, or other criteria, where you can apply more granular governance rules.\nFigure: After environment routing creates developer environments in the designated group, move them to more structurally specific groups.\nMoving environments is a manual action today, but you'll be able to automate it when the Power Platform admin connector supports the group feature in a future update.\nDevelop an app for enterprise use\nA team in your organization might be developing an app for enterprise-wide use. The team might be IT-driven or include both IT and business users (what's known as a fusion team).\nIn the simplest environment strategy, the project team builds in a shared environment that's either a sandbox or a production type. A developer environment type isn't the best way to support multiple makers collaborating on a resource. Makers need to communicate with one another to avoid collisions and conflicts in the shared environment.\nDedicated testing and production environments aren't required. 
The app could be tested in and deployed to organization-wide testing and production environments that host multiple applications.\nFigure: Two enterprise apps under development in dedicated environments, then tested and deployed in environments that are shared with other apps.\nIn a more advanced variation, each maker has an individual developer environment. This strategy has the benefit of providing greater isolation to the maker, but can make combining individual work in an integration environment more complicated. Although working in isolation can be helpful for larger, sophisticated teams, it can add unnecessary overhead to smaller teams that can be more successful collaborating in a shared development environment.\nFigure: Two makers working on the same app in individual developer environments must combine their work in a shared integration environment before it moves to testing and production.\nThis variation commonly incorporates a source control strategy, with each development environment represented as a branch in source control that gets merged when changes are ready to be promoted. It's important to account for how the application will be maintained after the initial release.\nFor example, version 1.0 of the app might be in production while the team moves on to building version 2.0. Your environment strategy must support fixing a problem in version 1.0, while development of version 2.0 is underway.\nFigure: Version 1.0 must be patched, tested, and deployed while version 2.0 is being developed, tested, and deployed.\nEnvironment groups offer multiple approaches to handling this enterprise app scenario. For example, this could be a single app group or could involve having separate groups for each development stage. In the best practices section, we explore how to evaluate the options.\nMinimize use of developer environments\nIndividual developer environments are the recommended way to provide makers a workspace to build low-code solutions. 
They offer the highest level of isolation from other makers. If your organization wants to minimize the number of developer environments, multiple shared environments are better than encouraging makers to build assets in the default environment.\nIn this scenario, you would restrict the creation of developer environments and create shared production-type development environments. You could organize these shared environments by organization structure, region, or other criteria. An environment group could contain them to ensure that they have consistent governance rules applied. Grant makers permission to create low-code assets in the environment that's assigned to them.\nSecurity as part of your environment strategy\nEnvironments are a key component of using Power Platform securely. They represent security boundaries within your tenant that help protect apps and data. As part of your environment strategy, you must consider how your security requirements influence the number and purpose of the environments in your tenant.\nEnvironments enable you to create multiple security boundaries within your tenant to protect apps and data. The protection provided by the environment can be adjusted to meet the necessary security protection by applying a configurable set of security features on the environment. A detailed discussion of individual environment security features is beyond the scope of this article. However, in this section we offer recommendations for how to think of security as part of your tenant environment strategy.\nSecurity at the tenant level\nMost security settings that affect environments are configured for each environment individually. However, you can make some changes at the tenant level to help support your environment strategy.\nConsider\nturning off the Share with Everyone feature\nin Power Platform. 
Only admins would be able to share an asset with everyone.\nConsider\nsecuring integration with Exchange\n.\nApply cross-tenant isolation\nto help minimize the risk of data exfiltration between tenants.\nRestrict the creation of net-new production environments to admins.\nLimiting environment creation\nis beneficial to maintain control in general, both to prevent unaccounted capacity consumption and to reduce the number of environments to manage. If users have to request environments from central IT, it's easier to see what people are working on if admins are the gatekeepers.\nSecure the default environment\nThe default environment has a role in supporting Microsoft 365 productivity customizations. As part of the recommended environment strategy, though, it's best to minimize its use as much as possible. Instead, makers should build in their own isolated environments. Although you can't block access to the default environment, you can minimize what can be done in it.\nFirst, use environment routing to direct makers to their own workspace to build low-code assets.\nReview who has admin access to the default environment and limit it to roles that need it.\nConsider renaming the default environment to something more descriptive, like \"Personal Productivity.\"\nEstablish a data policy for the default environment that blocks new connectors and restricts makers to using only basic, unblockable connectors. Move all the connectors that can't be blocked to the business data group. Move all the blockable connectors to the blocked data group.\nCreate a rule\nto block all URL patterns used by custom connectors.\nSecuring the default environment is a priority. Implement it with tenant-level security as part of the first step in your environment strategy. Without these measures, makers can add more assets to the default environment. 
With these measures and environment routing in place, makers are encouraged to use their own environment.\nLearn more:\nSecure the default environment\nSecure other environments\nIf your organization is like most, you have several environments in addition to the default environment. The level of security each one requires can vary depending on the apps and data it contains. Developer environments typically have more relaxed rules than production environments. Some production environments require the most protection possible.\nAs part of establishing your environment strategy, identify common levels of security for your environments and the features that protect each level, as in the following example.\nFigure: An example of three tiers of environment security and the security features that apply to environments in each tier.\nIncorporate the security levels you identify into your group strategy, and where possible, use rules to enable the security features in your environments. In this example, a rule limits sharing in all the environments that are designated as normal or medium security.\nAlign environments to your data policy strategy\nData policies are another important part of an overall governance effort to control the services used by low-code resources in an environment. Environment groups don't have a rule to apply a data policy to an environment. However, you can align your data policy strategy with your environment groups. For example, you could create a data policy with the same or a similar name as an environment group and apply it to environments in that group.\nLearn more about how to implement a data policy strategy\n.\nFigure: In this example, environments in the Personal Dev group follow a data loss prevention (DLP) policy that blocks all non-Microsoft connectors.\nTailor an environment strategy for your organization\nIn earlier sections, we described our vision for how organizations can manage environments at scale. 
We explored essential features, how they contribute to an environment strategy, and what a foundation environment topology that uses them might look like. We gave examples of how to build on that foundation to accommodate common scenarios. Because every organization is unique, the next step is for you to tailor an environment strategy that meets your organization's needs.\nStart where you are\nWhether your organization is new to Power Platform or has been using it for years, the first step is to evaluate your situation. Assess, at a high level, what's in your default environment, what other environments you have, and what they're being used for. Often an environment strategy is done as part of an overall effort to establish governance of Power Platform in an organization. If that is the case, you might already have established some of the governance vision that is required to tailor a strategy for your organization.\nOrganization information you should know includes:\nWhat is the vision for how Power Platform will be used in the organization?\nWho in the organization will be building low-code assets?\nYou need to make some key decisions:\nHow will makers get new environments?\nWill you group your environments, and if so, how?\nWhat security levels are required for different environments, and how do environments get classified?\nHow will you decide whether an app, automation, or Copilot will use an existing environment or a new one?\nAre there any gaps between the baseline features of the platform and your requirements that require a custom governance process?\nHow will you handle any existing assets in the default environment?\nDo you have a tenant and environment data policy strategy, and if so, how does it align with the environment strategy you're creating?\nYou might find inspiration in the\ncloud operating models\nthat are part of the Cloud Adoption Framework for Azure.\nFill gaps using the platform\nYou'll almost always find requirements that the platform's 
built-in capabilities don't satisfy. As you evaluate these gaps, consider the following possible outcomes of your evaluation:\nThe gap is acceptable.\nThe gap can be filled using the Power Platform Center of Excellence Starter Kit.\nThe gap can be filled using the platform's capabilities, such as APIs, connectors and custom apps, or automations.\nThe gap can be filled using a third-party tool or app.\nCoE Starter Kit\nThe\nPower Platform Center of Excellence Starter Kit\nis a collection of components and tools that are designed to help your organization adopt and support the use of Power Platform. A key aspect of the starter kit is its ability to collect data about platform usage across your environments, which can be helpful as you develop and evolve your environment strategy.\nFor example, the Environments Power BI dashboard offers an overview that helps you understand which environments exist in your tenant, who created them, and what assets they contain.\nFigure: The Environments dashboard in Power BI.\nThe kit includes starting points or inspiration, such as a process that makers can use to\nrequest new environments\nand changes to data policies for their environments.\nFigure: Flow diagram illustrating an environment management process in the CoE Starter Kit.\nPlatform programmability and extensibility\nOne of the great things about a low-code platform is that you can use it to build apps, automations, portals, and copilots to help you manage it. 
You also have access to lower-level tools that can be used to fill gaps in support of your environment strategy.\nYou can use the following connectors to build apps and flows:\nPower Platform for Admins\nand\nPower Platform for Admins V2\nPower Apps for Admins\nand\nPower Apps for Makers\nPower Automate Management\nYou can use the\nPower Platform command-line interface (CLI)\nto develop automations to help you manage the environment lifecycle and other tasks related to DevOps practices.\nWith\nPowerShell cmdlets for Power Platform creators and administrators\n, you can automate many monitoring and management tasks.\nThe\nPower Platform DLP SDK\ncan help you manage your tenant and environment data loss prevention policies.\nBest practice recommendations\nIn this section of the article, we build on the recommendations in the foundation and scenario-specific sections.\nNew environments\nAs part of developing your strategy, consider when to create environments to support a workload. Your evaluation must balance the benefits of isolation that an environment provides, such as locking down particular environments for better security, with the disadvantages, like the friction users face when sharing data across apps.\nWhen you're evaluating whether an app or an automation belongs in its own environment, assess the different stages of the app's life cycle separately. During development, isolation from other apps is important. When multiple apps are developed in a single environment, you risk creating cross-app dependencies.\nAs a general recommendation, when possible, development environments should be single-purpose, disposable, and easily recreated.\nTesting multiple apps in the same environment makes sense if they run together in production. 
In fact, if you don't test with the apps that will be running in production, you risk not discovering compatibility problems.\nWhen you evaluate the production environment for an app, keep the following considerations in mind:\nIs the app compatible with existing apps in the environment?\nFor example, two apps that both use the Dataverse Contact table for different purposes might not be compatible. Are the apps compatible from a data policy perspective?\nAre there special compliance or regulatory requirements for separation of data?\nFor example, does the sensitivity of the data require it to be isolated? Is there a requirement that data can't be included with other data?\nIs the data highly confidential or sensitive? Would exfiltration cause monetary or reputational damage to the organization?\nIsolating in a separate environment can allow for more control over security.\nDoes the app need data from other apps and need to be collocated with them?\nFor example, two apps that both use your Customer table should be hosted together. 
Separating them would create redundant data copies and create problems with maintaining the data.\nDoes the data require regional data residency?\nIn some scenarios, the same app or automation can be deployed to regional environments to ensure appropriate data isolation and residency.\nAre most users in the same region as the environment?\nIf the environment is in EMEA, but most of the app's users are US-based, sharing an environment might not provide the best performance.\nWill new admins be needed, or will the existing admins be sufficient?\nIf the new app requires more admins, are they compatible with the existing admins (since all will have admin permissions on all apps in the environment)?\nWhat's the life expectancy of the app?\nIf the app or automation is temporary or short-lived, it might not be a good idea to install it in an environment with more permanent apps.\nWill users find it difficult to work across multiple environments for different apps?\nThis can affect everything from finding an app on their mobile device to self-service reporting that has to pull data from multiple environments.\nCapacity\nEach environment, except trial and developer environments, consumes 1 GB of capacity when it's initially provisioned. Capacity is shared across the tenant, so it needs to be allocated to those who need it.\nConserve capacity by:\nManaging shared test and production environments. Unlike shared development environments, permissions in test and production environments should be limited to user access for testing.\nAutomating cleanup of temporary development environments and encouraging use of trial environments for testing or proof-of-concept work.\nEnvironment groups\nEnvironment groups are flexible and allow you to accommodate various use cases unique to your organization. 
Here are a few ways you could consider grouping environments as part of your environment strategy:\nBy service or component; for example, a ServiceNow service tree\nBy development, test, and production stage\nBy department, business group, or cost center\nBy project\nBy location, if most environments in a location have similar governance needs; this can also help meet regional regulatory and legal compliance requirements\nFigure: Environment groups for two different departments with different rules.\nNaming environments and groups\nAs part of your strategy, consider how environments and groups are named.\nEnvironment names are visible to admins, makers, and users. Typically, only admins use environment groups, but makers might encounter them if they have privileges to create environments.\nDeveloper environments that are automatically created follow the pattern\n's Environment\n; for example, \"Avery Howard's Environment.\" Environment groups aren't named automatically.\nEnvironment and environment group names aren't required to be unique. However, to avoid confusion, it's a best practice to avoid duplicate names.\nNames are limited to 100 characters. Shorter names are easier to use.\nNaming conventions\nEstablish consistent naming conventions.\nConsistent names help admins know what the group's purpose is and what environments it manages. Consistent names also make automation and reporting easier.\nA common practice is to include the lifecycle stage in the name of an environment; for example, Contoso Dev, Contoso Test, Contoso Prod. 
The goal is to clearly separate environments that have the same content, but different purposes.\nAnother common practice is to include the department or business unit in the name when the environment is dedicated to that group of users.\nFor example, you might decide that all environment or environment group names must follow the pattern\n---\n(Prod-US-Finance-Payroll).\nKeep names short, meaningful, and descriptive.\nAvoid including confidential information in names. They can be visible to anyone who has access to the admin center.\nThink about how your groups will evolve and grow over time, and make sure your naming convention can accommodate these evolving needs.\nAssets in the default environment\nYour environment strategy should encourage (or enforce) the use of personal, development environments to reduce what gets created in the default environment. However, you should look at what makers have already created in the default environment and evaluate how to handle each use case. Is it appropriate to leave in the default environment, or should it be migrated to another environment?\nA key part of this hygiene effort is identifying widely used applications in your organization that need a protected development environment separate from the production environment.\nThe following table lists example use cases and migration actions. Ultimately, your organization needs to identify its own use cases and risk factors associated with leaving assets in the default environment. 
Learn more about when to\nmove assets from the default environment\n.\nDefault environment\nMigration action\nMicrosoft 365 personal productivity\nStay in the default environment.\nAssets with a single maker that have been used recently but aren't shared\nMove to the owner's individual developer environment.\nAssets with a single maker that have been used recently and are shared\nMove to the owner's individual developer environment and run from a shared production environment.\nAssets with multiple makers that have been used recently and are shared\nMove to a shared developer environment and run from a shared production environment.\nAssets that haven't been used recently\nNotify the owner and move to quarantine if there's no response.\nAssets in Dataverse for Teams environments\nMicrosoft Dataverse for Teams\nempowers users to build custom apps, bots, and flows in Microsoft Teams by using Power Apps, Microsoft Copilot Studio, and Power Automate. When a team owner adds this capability to their team, a Microsoft Power Platform environment with a Dataverse for Teams database is created and linked to their team. Learn how to\nestablish governance policies to manage Microsoft Dataverse for Teams environments\n.\nEnvironment strategy internally at Microsoft\nMicrosoft considers itself \"Customer Zero\" because it internally adopts Power Platform to drive automation and efficiency for its employees. The following numbers show the scale of use across Microsoft's internal tenant.\n50,000-60,000 active makers each month\nOver 250,000 applications and over 300,000 flows\nOver 20,000 environments\nMicrosoft is shifting from its prior environment strategy to one using the latest Power Platform governance features, including Managed Environments, environment groups, and rules.\nAs part of the enhanced strategy, Microsoft plans to group scenarios based on development type, organizational ownership, and risk level. 
Because so much is being built across the company, it's hard to focus on every possible scenario and to customize for each use case. Given the scale of innovation and change, automation is required, together with as many out-of-the-box controls as possible.\nMicrosoft is structuring its Power Platform environments into three broad categories that cover seven use cases, reflecting varying degrees of risk and control: personal productivity, team collaboration, and enterprise development.\nPersonal productivity\n: For users who just want to build an app or flow for themselves, without collaborating with others. These users are routed to personal development environments. These locked-down environments use Managed Environment features, including restricting sharing and controlling other actions. Connectors and actions in these environments are heavily restricted. These environments are the least risky. Using locked-down personal environments allows users to avoid the more rigorous compliance process required to build personal productivity apps and flows.\nTeam collaboration\n: For users who are building tooling, automation, and processes for their team. For this scenario, Microsoft recommends using Dataverse for Teams environments. Lifecycle, access management, and data labeling are controlled at the Microsoft 365 group-level, eliminating the need to spend time managing these users from a Power Platform governance perspective. This level of use is the next step up in the risk spectrum.\nEnterprise development/production-level used by all employees\n: For users building tooling or solutions used more broadly across the company. These environments may store the most sensitive data, use more powerful connectors, and require more governance. This level carries the highest risk and, therefore, significant effort is spent on governance. ALM is required, with preproduction work happening in sandbox environments and only managed solutions allowed in production environments. 
These environments must be linked to ServiceTree, which enforces recurring security and privacy reviews. The environment group rules are customized based on ServiceTree metadata and signals. Many environment groups and rules are used to manage and control these environments.\nMicrosoft's governance strategy isn't static. It's fluid and changes to adapt to new challenges and incorporate new Power Platform features.\nEvolve your tenant environment strategy\nIn this article, we described how to establish an enterprise-scale tenant environment strategy. The strategy grows with your business, regardless of where you're starting on the journey. Organizations of any size can benefit from the strategy we present; however, for organizations that are already at higher scale, the benefits are greater.\nDeveloping a tenant environment strategy isn't a one-time activity. It's a journey. Evolve your strategy over time as your needs change. Your strategy must also adjust to adopt new capabilities of the platform and to address new challenges.\nLike all journeys, different organizations join at different points along the way, but all have the same destination in mind. What follows are possible on-ramps that represent where your organization is today.\nStart\nYour organization is at the beginning of its journey to adopt Power Platform. This stage is often referred to as\ngreenfield\n. You're starting your journey in the best place because you don't have to worry about existing environments or the impact that new policies might have on how people in your organization are using Power Platform. This is the best time to implement an enterprise-scale environment strategy that's aligned with product features and best practices.\nExplore the key environment features and strategies that are outlined in this article. 
Take the time to understand the key themes and the considerations and decisions that you need to make to design and implement a tenant environment strategy that best fits your requirements.\nEstablishing a solid foundation now is essential to avoid having to wrangle an out-of-control situation that can occur later if you start without a defined strategy. Plan for rapid acceleration of your use of Power Platform, but avoid the temptation to over-engineer your environment strategy by adding complexity that isn't required. Remember, this is a journey, and you can continue to evolve your strategy as your needs change.\nAlign\nYour organization already has an environment strategy in place, but it needs to be modified to align with new Power Platform features and best practices. This stage is often referred to as\nbrownfield\n. Unlike organizations just starting out, you need to consider the impact on your organization of changing your environment strategy.\nExplore the key environment features and strategies that are outlined in this article, and evaluate what's required to bring your strategy in line with them. Usually, all that's needed are incremental adjustments. When possible, plan the rollout of changes to minimize the impact on your users.\nThe following suggestions are common incremental changes you could implement:\nTo start your alignment without affecting existing environments, create an environment group that contains new developer environments and establish rules for how you want to govern them. Turn on environment routing to ensure that all new developer environments are created in the designated group.\nEvaluate your grouping strategy and, if needed, create groups to support your existing environments. Establish rules on those groups that align with existing restrictions and exceptions. Move existing environments into those groups.\nIdentify broadly popular applications that are built and used in the default environment. 
Use pipelines to publish them to a production environment where users in your organization can run them. Then work on migrating development of those apps to either an individual developer environment or a dedicated development environment.\nCreate a plan to identify, quarantine, and remove assets in the default environment that aren't being used.\nEnhance\nThe environment strategy you're executing is already in line with the latest features and best practices, but your organization wants to add more controls or features.\nCommunicate your environment strategy to your organization\nYou implement your tenant environment strategy more successfully if your Power Platform users understand and are aligned with what you're trying to achieve. If you simply activate your strategy without any communication, users see the changes as restrictions and look for ways to work around them.\nAs part of developing or evolving your strategy, decide how you inform users of key elements of the strategy that affect their use of Power Platform. They don't need all the technical details of your strategy—only the essentials that help them stay productive. For example, communicate:\nThe purpose of the default environment\nWhere they should build new low-code assets\nHow they should use their personal developer environment\nHow to request custom environments for specific business units or projects\nGeneral connector usage policies, and how to request more connector privileges for their environments\nHow to share what they build with others\nThe responsibilities of a maker; for example:\nKeep the tenant clean. Delete your environments, apps, and flows if they're no longer needed. Use test environments if experimenting.\nShare wisely. Watch out for oversharing of your environments, apps, flows, and shared connections.\nProtect organization data. 
Avoid moving data from highly confidential or confidential data sources to unprotected or external storage.\nWhen your strategy changes, share how the changes affect your users so that they know what to do differently.\nA good start is to\nturn on the maker welcome content\nin the environment group where new makers are added.\nFigure: Use the welcome content to help new makers be successful.\nAnother effective approach to communicating with your users is establishing an internal Power Platform hub. The hub can be a place for people to collaborate on projects, share ideas, and discover new ways to apply technology to achieve more. The hub is where you might also share detailed information about your environment strategy that's relevant to your users. Learn how to\ncreate an internal Power Platform hub\n.\nConclusion\nIn this article, we explored features that are designed to help your organization manage Power Platform environments at enterprise scale and incorporate them into your tenant environment strategy.\nAs your organization adopts Power Platform and usage accelerates, the need for environments can change rapidly. You need an agile approach that helps your environment strategy keep up with changes and continue to meet your organization's evolving governance requirements.\nA key factor for success with a tenant environment strategy is communicating with your makers and users and gaining their support. Make sure that the people who build low-code applications and automations know how to follow your organization's environment strategy and where they should be building their low-code assets.\nEvery organization's journey to adopting Power Platform is unique. We presented some ideas to help you get started. 
Your Microsoft account team or Power Platform partner can help you create a more customized tenant environment strategy for your organization.\nRelated information\nEnvironment groups\nEnvironment routing\nSecurity overview and strategy\nLow-code security and governance\nSolution concepts in ALM", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Environment Strategy", @@ -332,7 +332,7 @@ "https://learn.microsoft.com/en-us/power-platform/guidance/coe/power-bi-monitor": { "content_hash": "sha256:e8417dcc35e2581d85cd232ce3495e00e5b06b3af77f2884447daf08a288bf2f", "normalized_content": "Monitor with the CoE Power BI dashboard\nWith the\nMonitor\nsection of the Center of Excellence (CoE) Power BI dashboard, you can query basic inventory (environments, apps, flows, makers, connectors, and audit logs) to monitor usage across your entire tenant and within each environment. 
These reports also support drill-downs and filtering, for example by maker department/country/city, connector usage, or premium feature usage.\nOverview\nThe\nOverview – Power Apps, Power Automate and Chatbots\npage provides you with a tenant-wide overview of resources:\nTotal number of environments (and environments created this month)\nTotal number of environment makers\nTotal number of custom connectors\nTotal number of apps, app makers, and apps created this month\nTotal number of flows, flow makers, and flows created this month\nTotal number of bots, bot makers, and bots created this month\nThe visualizations highlight environments and makers that have the most resources, and show a map of where your makers are based.\nEnvironments\nThe\nEnvironments\npage shows you how many environments, environment makers, and Microsoft Dataverse instances you have.\nThe graphs visualize:\nThe environment creation trend by date\nThe number of resources per environment\nThe number of environments by type\nTop environment creators\nThe number of Managed Environments\nThe filters allow you to drill down and analyze specific environment types, maker trends, and changes over time.\nTeams Environments\nThe\nTeams Environments\npage shows you the number of Microsoft Teams environments, environment makers, and resources in those environments.\nThe graphs visualize:\nThe environment creation trend\nThe number of resources per environment\nThe table of environments shows:\nEnvironment Name\nLink to the Environment in the Admin Center\nLink to the connected Microsoft Teams\nOwner\nLatest App launch in the environment\nNumber of apps and flows\nA red icon if no apps or flows exist in the environment\nCreated On date\nThe table of apps shows:\nApp name\nOwner\nLast launched\nCreated on date\nModified on date\nNote\nLast launched\ninformation is only available if the\nAudit Log\nhas been configured.\nInformation about bots created via Microsoft Copilot Studio in Microsoft Teams 
environments is currently not available in the CoE Starter Kit.\nThe filters allow you to filter by Owner as well as Created date.\nApps\nThe\nApps\npage provides an overview of apps in your environment:\nTotal number of apps\nTotal number of apps created this month\nTotal number of app makers\nTotal number of canvas apps and model-driven apps\nThe number of production apps (a\nproduction app\nhas had 50 active sessions, or active sessions by five unique users, in a month)\nOn the graphs, you can see your app creation trend, your makers over time, your top environments, and top connectors used in apps.\nFilters on this page can help you narrow down this view by app owner, app plan classification, app type, environment name and type, or connector used.\nThe hamburger menu on this page helps you navigate to other reports relevant to Power Apps.\nSharePoint form apps\nThe\nSharePoint Form Apps\npage provides an overview of apps created to customize SharePoint lists or document library forms.\nNavigate directly to the SharePoint site and view how many connectors are being used in a customized form.\nCloud flows\nThe\nCloud flows\npage provides an overview of cloud-based API automation flows in your environment:\nTotal number of flows\nTotal number of flows created this month\nTotal number of flow makers\nTotal number of started, suspended and stopped flows\nThrough visuals, you can see your flow creation trend, your top active departments, and top environments and top connectors used in flows.\nFilters on this page can help you narrow down this view by flow owner, flow state, flow display name, environment, maker department, or connector used.\nCustom connectors\nThe\nCustom Connectors\npage helps you understand what\ncustom connectors\nyou have, what endpoints they're connecting to, and which resources are using the custom connector.\nNext to the total number of custom connectors and number of test connectors (those that have the word\nTest\nin the display name), 
you'll also see a connector creation trend, which environments have the most custom connectors, and which flows and apps are using custom connectors.\nFilters help you narrow down the view by connector creator, environment, or created date.\nDesktop flows\nThe\nDesktop flows\npage provides an overview of UI-based robotic process automation (RPA) flows in your environment:\nTotal number of desktop flows\nTotal number of desktop flows created this month\nTotal number of desktop flow makers\nThrough visuals, you can see your flow creation trend and top environments with desktop flows. Use the list view of all flows to sort your flows by type and flow state.\nFilters on this page can help you narrow down this view by flow owner, flow display name, environment, maker department, or desktop flow type.\nBots\nThe\nBots\npage provides an overview of Microsoft Copilot Studio bots in your environment:\nTotal number of bots\nTotal number of bots created this month\nTotal number of bot makers\nTotal number of published bots\nThrough visuals, you can see your bot creation trend and top environments with bots. Use the list view of all bots to sort your bots by bot maker or bot state.\nFilters on this page can help you narrow down this view by environment and by maker.\nAI Builder models\nThe\nAI Builder Models\npage provides an overview of AI Builder Models in your environment:\nTotal number of AI Builder models\nTotal number of AI Builder models created this month\nTotal number of AI Builder models makers\nThrough visuals, you can see your AI Builder model creation trend and top environments with AI Builder models. 
Use the list view of all AI Builder models to sort your AI Builder models by maker or template.\nFilters on this page can help you narrow down this view by environment and by maker.\nPower Pages\nThe\nPower Pages\npage provides an overview of Power Pages in your environment:\nTotal number of Power Pages\nTotal number of Power Pages created this month\nTotal number of Power Pages makers\nThrough visuals, you can see your Power Pages creation trend and top environments with Power Pages. Use the list view of all Power Pages to sort your Power Pages by maker, website, website status, and table permission.\nFilters on this page can help you narrow down this view by environment and by maker.\nSolutions\nThe\nSolutions\npage provides an overview of Power Platform solutions in your environment:\nTotal number of solutions\nTotal number of solutions created this month\nTotal number of solution makers\nThrough visuals, you can see your solution creation trend and top environments with solutions. Use the list view of all solutions to sort your solutions by publisher, maker, or environment.\nFilters on this page can help you narrow down this view by environment, publisher, and maker.\nBusiness process flows\nThe\nBusiness Process Flows\npage provides an overview of Business process flows in your environment:\nTotal number of business process flows\nTotal number of business process flows created this month\nTotal number of business process flows makers\nThrough visuals, you can see your business process flow creation trend and top environments with business process flows. 
Use the list view of all business process flows to sort by state, maker, and environment.\nFilters on this page can help you narrow down this view by environment, state, and maker.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "CoE Power BI Monitor", @@ -341,7 +341,7 @@ "https://learn.microsoft.com/en-us/power-platform/release-plan/2025wave2/": { "content_hash": "sha256:0b8aed00eaf5b437dd61358fdc7c4d82acb703b60d4c44d22e2ed2f93618d189", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nMicrosoft Power Platform 2025 release wave 2 plan\nFeedback\nSummarize this article for me\nThe Microsoft Power Platform release plan for the 2025 release wave 2 announces the latest updates to customers as features are prepared for release. You can browse the release plan here\nonline\n(updated throughout the month), view it in the\nRelease planner\n, or download the document as a\nPDF file\n, which is updated with every publish. 
The plan for 2025 release wave 2 covers new features for Dynamics 365 releasing from\nOctober 2025\nthrough\nMarch 2026\n.\nDownload the 2025 release wave 2 PDF for Power Platform\nor select the option at the bottom of the table of contents.\nThe Dynamics 365 features coming in the 2025 release wave 2 have been summarized in a separate\nrelease plan\nand a downloadable\nPDF\n.\nThe role-based Copilot offerings features coming in the 2025 release wave 2 have been summarized in a separate\nrelease plan\nas well as a downloadable\nPDF\n.\n2025 release wave 2 overview\nMicrosoft Power Platform enables users and organizations to analyze, act on, and automate data to digitally transform their businesses. Microsoft Power Platform today is comprised of: Power Apps, Power Pages, Power Automate, Microsoft Copilot Studio, Microsoft Dataverse and Microsoft Power Platform governance and administration. The 2025 release wave 2 contains hundreds of new features across Power Platform applications, including Power Apps, Power Pages, Power Automate, and Microsoft Copilot Studio, as well as Microsoft Dataverse and Power Platform capabilities for governance and administration.\nPower Apps\nPower Apps\nenables human and agent collaboration. They include an agent feed to supervise the work of agents and extensible built-in agents for common tasks like entering, exploring, visualizing, and summarizing data. Bring business problems to Plan Designer and a team of agents will help you build enterprise solutions that include apps, agents, Power BI reports and more. Vibe-code with the App Agent to create data-connected experiences. Just describe what you need or provide an image, and it will be done!\nPower Pages\nPower Pages\nenables businesses to build secure, data-driven portals effortlessly. In this wave, we will further expedite site building for low-code makers and pro developers to build intelligent sites for your employees, customers, and partners. 
Introduction of enhanced security agent features will further empower low code makers, pro developers, and admins with actionable insights and abilities for securing their websites.\nPower Automate\nPower Automate\nis transforming how enterprises automate complex business processes; through new human in loop experiences such as advanced approvals, AI native capabilities such as Generative Actions and Intelligent Document processing. To manage complex automations at scale there is a comprehensive suite of governance, observability, and security controls coming to Automation Center and Power Platform Admin Center.\nMicrosoft Copilot Studio\nMicrosoft Copilot Studio\ncontinues its journey to make agent creation and operation even easier and more powerful with autonomous agents in Microsoft 365 Copilot, the ability to build complete teams of agents that work seamlessly together, and improved governance for enterprise scalability. Copilot Studio will offer even deeper integration with Azure AI Foundry and the Microsoft Graph, ensuring your agents can use the latest AI technology in coordination with your data in the Microsoft Graph.\nMicrosoft Dataverse\nMicrosoft Dataverse\ncontinues to serve as a trusted low-code data platform, enabling the creation of scalable agents, Copilot applications, and automations. This update introduces enhancements to core agentic capabilities - including Dataverse for Agents and Dataverse Search - to support smarter, AI-ready experiences. New features such as Dataverse MCP Server and AI-powered business logic tools further expand the ability to build dynamic, intelligent solutions grounded in enterprise data.\nGovernance and administration\nMicrosoft Power Platform governance and administration\nwill become the unified governance hub for managing intelligent agents, agent-driven apps, and automated workflows across the Microsoft ecosystem in this release wave. 
This will provide the most secure, governable, reliable platform for agent development.\nKey dates for the 2025 release wave 2\nThese release plans describe functionality that may not have been released yet. Delivery timelines and projected functionality may change or may not ship (see\nMicrosoft policy\n).\nHere are the key dates for the 2025 release wave 2.\nMilestone\nDate\nDescription\nRelease plans available\nJuly 16, 2025\nLearn about the new capabilities coming in the 2025 release wave 2 (October 2025 - March 2026) across Microsoft Power Platform, Dynamics 365, and role-based Copilot offerings.\nEarly access available\nAugust 4, 2025\nTest and validate new features and capabilities that will be part of 2025 release wave 2, coming in October, before they are enabled automatically for your users. You can view the Dynamics 365\n2025 release wave 2 early access features\nnow\n.\nRelease plans available in additional languages\nJuly 30, 2025\nThe Microsoft Power Platform, Dynamics 365, and role-based Copilot offerings release plans are published in 11 additional languages: Danish, Dutch, Finnish, French, German, Italian, Japanese, Norwegian, Portuguese (Brazilian), Spanish, and Swedish.\nGeneral availability\nOctober 1, 2025\nProduction deployment for the 2025 release wave 2 begins.\nRegional deployments will start on October 1, 2025.\nJust like the previous release waves, we continue to call out how each feature will be enabled in your environment:\nUsers, automatically\n: These features include changes to the user experience for users and are enabled automatically.\nAdmins, makers, or analysts, automatically\n: These features are meant to be used by administrators, makers, or business analysts and are enabled automatically.\nUsers by admins, makers, or analysts\n: These features must be enabled or configured by the administrators, makers, or business analysts to be available for their users.\nYou can get ready with confidence knowing which features will be 
enabled automatically.\nWe’ve done this work to help you—our partners, customers, and users—drive the digital transformation of your business on your terms. We’re looking forward to engaging with you as you put these new services and capabilities to work, and we’re eager to hear your feedback as you dig in to the 2025 release wave 2 plans.\nLet us know your thoughts. Share your feedback in the\nMicrosoft Power Platform community forum\n. We'll use your feedback to make improvements.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Release Plans", @@ -350,7 +350,7 @@ "https://learn.microsoft.com/en-us/power-platform/alm/pipelines": { "content_hash": "sha256:25d5efa453fcca72c6da99d4ed51fd671be0be17f41270a568087a450e7fc344", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nOverview of pipelines in Power Platform\nFeedback\nSummarize this article for me\nPipelines in Power Platform aim to democratize application lifecycle management (ALM) for Power Platform and Dynamics 365 customers by bringing ALM automation and continuous integration and continuous delivery (CI/CD) capabilities into the service in a manner that's more approachable for all makers, admins, and developers.\nPipelines significantly reduce the effort and domain knowledge previously required to realize the ROI from adopting healthy, automated ALM processes within your team or organization.\nAdmins easily configure automated deployment pipelines in minutes rather than days or weeks.\nMakers have an intuitive user experience for easily deploying their solutions.\nProfessional developers can (optionally)\nextend pipelines\nand run them using the Power Platform command line interface (CLI).\nAdmins centrally manage and govern pipelines\nPipelines enable admins to centrally govern citizen-led and pro-dev-led projects at scale with less effort. Admins set up the appropriate safeguards that govern and facilitate solution development, testing, and delivery across the organization. Other admin benefits include:\nLower total cost of ownership:\nPipelines significantly improve maker, developer, and admin productivity. Pipelines enable your business solutions to come to market faster, with higher quality, through a safe and governed process.\nMinimal effort to implement a secure and custom-tailored change management processes across your organization or team.\nSave time and money:\nThe system handles the heavy lifting and ongoing maintenance so you don't have to.\nScale ALM at your own pace:\nRegardless of where you're at in your ALM journey, you can extend pipelines to accommodate your evolving business needs. We aim for this upward transition to be as seamless and effortless as possible. 
More information:\nMicrosoft Power Platform CLI\npac pipeline\ncommand group\nAchieve compliance, safety, monitoring, and automation goals with:\nSecure production environments with approval based\ndelegated deployments\n.\nCustomizations and audit log saved automatically and are easily accessible.\nOut-of-the-box analytics provides better visibility within a central location.\nThe ability to view out-of-the-box Power BI reports within the pipelines app or create your own. More information:\nReporting overview for model-driven apps\nCustom tailor pipelines to the needs of your organization with\npipelines extensibility\nand Power Automate.\nMakers run preconfigured pipelines\nOnce pipelines are in place, makers can initiate in-product deployments with a few clicks. They do so directly within their development environments. Other benefits to makers include:\nNo prior knowledge of ALM processes or systems required. Citizen developers often view pipelines as a guided change management process.\nSolution deployments are prevalidated against the target environment to prevent mistakes and improve success rates. For example, missing dependencies and other issues are detected before deployment and makers are immediately guided to take the appropriate action.\nConnections and environment variables are provided upfront and validated before the deployment begins.\nThis helps ensure applications and automation are deployed without needing manual post-processing steps, and are connected to the appropriate data sources within each environment.\nAdmins can even preconfigure certain connections that will be used.\nDevelopers can use and extend pipelines\nProfessional developers are more productive with pipelines now handling the complex background operations. Developers can tell the system what they want to accomplish instead of executing the various underlying tasks necessary to accomplish the same goal. 
Using the Power Platform CLI, developers can:\nList pipelines to view pertinent details such as which stages and environments are ready to deploy their solutions to.\nDeploy a solution with a single command:\nWith pipelines, developers simply provide the required parameters and the system orchestrates all the end-to-end deployment operations in compliance with the organizational policies.\nNo need to connect to multiple environments, export solutions, download solution files, manually create connections and populate deployment settings files, import solutions, or handle various other tasks that were required previously.\nAdditionally, developers can\nextend pipelines\nand integrate with other CI/CD tools.\nFrequently asked questions\nWhat do pipelines deploy?\nPipelines deploy solutions as well as configuration for the target environment such as connections, connection references, and environment variables. Any Power Platform customization contained in your solution can be deployed using pipelines. Pipelines, or solutions in general, don't contain data stored within Dataverse tables.\nImportant\nPower BI Dashboards (preview) and Power BI Datasets (preview) are not currently supported in pipelines.\nWhy can't I see my pipeline from my environment?\nFirst, ensure that your source and target environments are linked properly. You'll only be able to view your pipeline in the assigned source environments, such as your development environments. When linking each of your environments to your pipeline during configuration, you have an option of\nDevelopment Environment\nor\nTarget Environment\nenvironment type. If your pipeline-associated environments are assigned their proper type, your pipeline appears as an option in your source development environment.\nDoes pipelines automatically store solution backups?\nYes. 
Both managed and unmanaged solutions are automatically exported and stored in the pipelines host for every deployment.\nCan customization bypass a deployment stage such as QA?\nNo. Solutions are exported as soon as a deployment request is submitted (when the maker selects\nDeploy\nfrom within their development environment), and the same solution artifact will be deployed. Similarly, the system doesn't re-export a solution for deployments to subsequent stages in a pipeline. The same solution artifact must pass through pipeline stages in sequential order. The system also prevents any tampering or modification to the exported solution artifact. This ensures customization can't bypass QA environments or your approval processes.\nAre standalone licenses required to use pipelines?\nDeveloper environments aren't required to be Managed Environments. They can be used for development and testing with the developer plan.\nThe pipelines host should be a production environment, but the pipelines host doesn't have to be a Managed Environment.\nAll other environments used in pipelines must be enabled as Managed Environments.\nLicenses granting premium use rights are required for all Managed Environments.\nA common setup example:\nEnvironment purpose\nEnvironment type\nStandalone license required\nHost\nProduction\nNo\nDevelopment\nDeveloper\nNo\nQA\nDeveloper\nNo\nProduction\nProduction\nYes\nCan I ensure pipeline targets are Managed Environments?\nYes. Tenant admins can automatically convert pipeline target environments to Managed Environments, ensuring compliance with Microsoft standards.\nTo enable an environment as a Managed Environment, go to the Power Platform admin center\nDeployments\n>\nSettings\n. Turn on the automatic managed environment setting for each pipeline host.\nImportant\nStarting February 2026, Microsoft will start enabling Managed Environments for any pipeline target environments that aren’t already enabled. 
Customers will be notified via Microsoft 365 Message center.\nWe recommend you review and enable Managed Environments for all pipeline targets now. You can do this manually now or set it to occur automatically:\nManually:\nGo to enable\nManaged Environments\n.\nAutomatically:\nConfigure the setting for new pipelines as described above.\nCan I configure approvals for deployments?\nYes. See\ndelegated deployments\n.\nCan I use different service principals for different pipelines and stages?\nYes. More information:\nDeploy with a service principal\nWhat connections can be used?\nSimilar to authoring experiences, makers running pipelines can either provide their own connections or connections they have access to. Service principal connections can also be used for connectors that support service principal authentication, including custom connectors.\nWhy can't I update existing connection references?\nCurrently, connection references without a value in the solution or targeted environment can't be updated during deployment. If a value was deployed previously, it can be updated in the targeted environment.\nWho owns deployed solution objects?\nThe deploying identity. For standard deployments, the owner is the requesting maker. For delegated deployments, the owner is the delegated service principal or user.\nCan pipelines deploy to a different tenant?\nNo. We recommend using Azure DevOps or GitHub for this scenario.\nWhy can't I access the \"Manage pipelines\" button in the command bar?\nIf the user has the \"Deployment Pipeline Administrator\" security role the \"Manage pipelines\" button will be enabled and it will open the \"Deployment Pipeline Configuration\" app. The button will also not be enabled if there is no platform host or custom host available. 
More information:\nAccessing the \"Deployment Pipeline Configuration\" app\nWhat should I do if my development or target environment is reset or deleted?\nYou should delete the environment record and update the pipeline configuration when needed. If an environment is reset, you re-create the environment record then associate it with your pipeline.\nCan I use pipelines in the default environment?\nYes. However, using the default environment as the pipelines host isn't recommended for all customers.\nCan I deploy using my own service principal?\nYes. More information:\nDeploy pipelines as a service principal or pipeline owner\n.\nCan pipelines be used with Azure DevOps, GitHub, or the ALM Accelerator?\nYes, together these tools are powerful while keeping maker experiences simple. More information:\nextend pipelines\nCan I roll back to a previous version?\nYes. If the pipeline setting is enabled, you can\nredeploy previous solution versions\nfrom the run history view on the Pipelines page. If the setting is disabled, only higher solution versions can be deployed or imported. As a work-around, admins can download the artifact from the pipelines host, increment the solution version in the solution.xml file, then manually import it into the target environment.\nCan I set retention policies for pipelines data?\nYes. You can configure bulk delete jobs in the Dataverse pipelines host to delete data on a defined schedule.\nCan I specify advanced solution import behaviors such as update versus upgrade?\nNot currently. Pipelines default import behavior is\nUpgrade\nwithout\nOverwrite customizations\n.\nCan an environment be associated with multiple hosts?\nNo. However, one environment can be linked to multiple pipelines within the same host. In order to associate an environment with a different host, add it to a pipeline in the new host. 
Then delete the environment record from the original host and verify everything works as expected.\nCan I customize or extend the first-party deployment pipeline app and tables?\nNot currently. However, intentional extension hooks are available to customize pipelines logic. More information:\nextend pipelines\n.\nWhere can I view and run pipelines?\nNavigate to an unmanaged solution in development to an environment associated with your pipeline. Pipelines can't be viewed or run from the default solution, managed solutions, or in target environments. Notice you can also retrieve and run pipelines from the Power Platform CLI.\nCan I deploy across regions?\nYes, but only if the\nCross-Geo Solution Deployments\nsetting is enabled in the host. If the setting is disabled, the host and all environments associated with pipelines in a host must be located within the same geographic location (as specified when creating environments). For example, if the setting is disabled, a pipeline can't deploy from Germany to Canada and a host in Germany can't manage environments in Canada. In a case where the tenant administrator would like to prevent cross-geo solution deployments, separate hosts should be used for Germany and Canada.\nCan I deploy the same solution using different pipelines?\nYes, this is possible, although we recommend starting with the same pipeline for a given solution. This helps avoid confusion and inadvertent mistakes. Pipeline run information is displayed in the context of one pipeline and one solution (within the solution experience). Therefore other pipelines might not show the latest deployed solution version or other important run information associated with different pipelines. 
Notice that the Deployment Pipeline Configuration app shows run information across all pipelines and all solutions for the current host.\nCan the host environment also be used as a development or target environment?\nUsing the same environment for development and the host isn't supported; other combinations aren't recommended as a best practice.\nHow can I view what changed between different versions?\nWithin the target environment, you can see layers of deployed objects as well as what changed between layers. Additionally, you can see XML diffs between layers for model-driven apps, site maps, and forms. Pipelines can also be extended to integrate with GitHub and other source control systems to compare granular diffs.\nShould my host environment be the same as where I installed the COE toolkit?\nThis is a valid configuration and should be evaluated based on the needs and policies within your organization.\nCan I deploy unmanaged solutions?\nNo. We recommend that you always deploy managed solutions to nondevelopment environments. Notice unmanaged solutions are automatically exported and stored in the pipelines host so you can download and import them to other development environments or put them in source control.\nCan I deploy multiple solutions at once?\nNot currently. You'll need to submit a different deployment for each solution. However, the same pipeline can be used for multiple solutions.\nDo pipelines publish unmanaged customizations before exporting the solution?\nNot currently. We recommend you publish individual objects as they're saved. 
Note that only certain solution objects require publishing.\nCan I use pipelines for multi-developer teams working in isolated development environments?\nThe current implementation uses a single development environment for a given solution.\nHow are pipelines different from the ALM Accelerator?\nBoth offer many valuable capabilities and the owning teams work together closely in developing the pipelines and broader ALM vision for Power Platform. Pipelines are more simplistic in nature and can be set up and managed with less effort. Access to other products and technologies isn't required as everything is managed in-house. The ALM Accelerator, on the other hand, is sometimes a better fit for more advanced ALM scenarios.\nWhile there are many additional functional differences, the fundamental difference is that pipelines are an official Microsoft Power Platform product feature—meaning it's designed, architected, engineered, tested, maintained, and supported by Microsoft product engineering. Pipelines are built into the product and can be accessed within native product experiences.\nWhen should I use pipelines versus another tool?\nWe encourage customers to use pipelines for core deployment functionality, and when needed, extend pipelines to integrate with other CI/CD tools. 
When used together, the workloads required within CI/CD tools often become less complicated and costly to maintain.\nNext steps\nSet up pipelines\nExtend pipelines\nRelated information\nDeploy solutions using Pipeline in Power Apps (video)\nSimplify Microsoft Power Platform deployments by using pipelines - Learning Path\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Pipelines Overview", @@ -359,7 +359,7 @@ "https://learn.microsoft.com/en-us/power-platform/alm/set-up-pipelines": { "content_hash": "sha256:7ce6c87aecc57a179debf05e9a8673ac51ba8ecf0db3964e0d2f18c701dccf31", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nSet up pipelines in Power Platform\nFeedback\nSummarize this article for me\nCreate and run pipelines to easily deploy solutions to environments. There are two different ways to set up pipelines:\nPlatform host\n. The default tenant-wide platform host, which can be configured by makers.\nCustom host\n. Admins configure a custom host to centrally govern citizen-led and pro-dev-led projects.\nInformation within each section of this article pertains to the specified host method for setting up pipelines.\nFrequently asked questions\nWill personal pipelines conflict with any pipelines that I have already set up?\nNo. 
Thanks to the host separation dynamic that we have in place, there's no way for a maker creating a personal pipeline (in the platform host) to associate an environment that is already associated with a custom host. By default, makers don't have permissions to create lightweight personal pipelines in environments already associated with a custom host. This means that your current pipelines UX, if in place, won't change.\nImportant\nMakers also don't receive elevated access to environments as a result of this feature. Selectable target environments are filtered to include only environments that a maker can already import to. This feature ensures that all personal pipelines are stored in the platform host that is accessible to administrators, and provides an easier way for makers to self-service their application lifecycle management (ALM).\nWhy can't I select or view certain environments when I create a pipeline?\nThe target environment picker filters out any environments that:\nThe current user doesn't have import-access to, or\nare outside of the geographical region that the pipelines host is located in if host-wide setting is disabled. More information:\nEnable cross-geo solution deployments\nYou also can't create a pipeline with a target environment that's already associated to the host as a development environment. To change an environment's type distinction in a host, you must\nplay the Deployment Pipeline Configuration app\n, delete the environment record, and re-create the environment record with the desired type.\nWhy am I seeing an error that states \"this environment is already associated with another pipelines host?\"\nThis error indicates that another host already contains an active environment record that you're trying to associate with the current host. 
To resolve this, go to\nUsing Force Link to associate an environment with a new host\nor\nDisassociating environments from one host and associating them with another host\n.\nDo the pipelines and data within the platform host count towards my Dataverse capacity?\nNo. The data consumption in the\nplatform host\ndoesn't count against your current plan since the pipelines data for the platform host is stored in Power Platform infrastructure. This data is stored within your tenant and accessible by administrators, but due to its implementation details, doesn't consume data capacity within a plan.\nHowever, capacity does apply to a\ncustom host\n, which isn't an implementation in the platform but is instead in a customizable environment.\nCan I enable makers to create personal pipelines in a custom host?\nYes. As an administrator, you can assign the\nDeployment Pipeline Default\nrole to anyone you want to grant lightweight pipeline creation permissions to. Administrators can also add users to the\nDeployment Pipeline Maker\nteam via the\nSecurity Teams\npage in the Deployment Pipeline Configuration app.\nThis Deployment Pipelines Default role isn't assigned to anyone by default in the case of custom host, so the lightweight personal pipeline creation experience is only visible by default in environments that aren't already associated with a custom host.\nAs an admin, how do I prevent makers from creating personal pipelines by default?\nBecause custom hosts don't grant pipeline create-access by default like the platform host does. You can\nset up a custom host\nand then\nuse force link\n, if necessary, to associate development environments with a custom host.\nIf there's already a custom host available skip this step. If not, you have to create one following the steps to\ncreate a pipeline using a custom pipelines host\n.\nOnce there's a custom host available, as an admin, navigate to the Deployment Pipeline Configuration app for the custom host. 
The app is located in the environment that you installed the Power Platform Pipelines package in.\nGo to\nEnvironments\nfrom the side navigation pane, and\ncreate new environment record(s)\nfor the development environments that you would like to prevent makers from creating new personal pipelines from. If the environment was already linked to another host, such as the platform host, the validation fails. If this occurs, select\nForce Link\non the command bar after validation failure to override the current link to the other pipelines host.\nFollowing these steps effectively disables the\ncreate pipeline\ncapability for any makers who access the pipelines feature in these development environments because they don't have pipelines permissions. Existing pipelines in the custom host, if any, are also not shared with any users by default. Admins can apply this workaround with any existing custom host as well.\nWhy am I not seeing the latest features for pipelines?\nThe pipelines package is always being updated to give you the latest and greatest for your ALM processes. Ensure that you have the latest Power Platform pipelines package in your\ncustom host\n:\nGo to the\nPower Platform admin center\n.\nSelect your pipelines host environment.\nSelect\nDynamics 365 apps\n, and locate\nPower Platform Pipelines.\nCheck if there's an update available.\nFor\nplatform hosts\n, the pipelines package is updated automatically, and might not be available as soon as the manual package update is made available for custom hosts.\nHow can I recover unmanaged solutions from Power Platform Pipelines past deployment history?\nThe\nImport Solutions from a Pipelines Host\nfeature in Power Platform allows users to recover unmanaged and managed solutions from past deployments. This capability is particularly useful in scenarios where a solution is accidentally lost or needs to be restored to a development environment. 
Alternatively, pipelines admins can use the Deployment Pipeline Configuration app to easily download the unmanaged solution from the deployment history record.\nNext steps\nExtend pipelines in Power Platform\nRun pipelines in Power Platform\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Set Up Pipelines", @@ -368,7 +368,7 @@ "https://learn.microsoft.com/en-us/power-platform/alm/run-pipeline": { "content_hash": "sha256:5fc1770a907c605150c6ec002b6cae7a998e7a49a39ebc49abf979349f9a95c5", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nRun pipelines in Power Platform\nFeedback\nSummarize this article for me\nPipelines automate solution deployments between Power Platform environments and facilitate healthy application management practices with minimal effort.\nPrerequisites\nOne or more pipelines must already be created and associated with the environment that's used for development.\nThe development environment must have Microsoft Dataverse or Dataverse plus Dynamics 365 customer engagement apps.\nYou must have access to run a pipeline. 
More information:\nGrant access to edit or run pipelines\nYou must have privileges to import solutions to the target environments associated with a pipeline.\nThe\nPower Platform Pipelines\napplication must be installed in your pipeline host environment. More information:\nInstall the pipelines application in your host environment\nFor more information about these prerequisites, go to\nSet up pipelines\n.\nRun a pipeline\nSign in to a Power Platform environment using Power Apps (\nmake.powerapps.com\n) or Power Automate (\nmake.powerautomate.com\n) and select your development environment.\nTo deploy a solution using a pipeline, go to\nSolutions\nand create or select an unmanaged solution to deploy.\nFrom the\nSolutions\narea, choose between two options to include the solution in the pipeline:\nSelect\nPipelines\nfrom the left navigation pane.\nSelect\nOverview\nfrom the left navigation pane, and then select\nDeploy\non the command bar.\nSelect the stage to deploy to, such as\nDeploy to Test\n, select\nDeploy here\n, and the deployment pane appears on the right.\nChoose to deploy\nNow\nor schedule for\nLater\n, and then select\nNext\non the right pane. This initiates validation of the solution against the test environment. This validation can also be referred to as preflight checks. Missing dependencies and other common issues are checked that might cause a deployment to fail.\nIf connection references or environment variables are present, you’re prompted to provide these (just as you would when manually importing solutions).\nReview the summary of the deployment and optionally add deployment notes.\nSelect\nDeploy\n. This initiates an automated deployment to the target environment.\nNote\nPipelines aren't visible within the default solution, managed solutions, or target environments.\nYou must complete the deployment stages in order. For example, you can't deploy version 1.0.0.1 to production before it has been deployed to test. 
After you deploy to test, the same solution that was deployed will then be deployed to production, even if afterward you made changes to the solution without incrementing the version.\nIf you see a message stating your request to deploy here is pending, your admin attached\nbackground processes or approvals\nthat run before your deployment can proceed.\nCancel a scheduled deployment\nIf you have a scheduled deployment, you can cancel it through three different methods:\nIn the pipeline\nDetails\nsection where you began your deployment, there's an option to\nCancel deployment\nbefore the scheduled deployment time.\nIn\nRun history\n, selecting\n...\non a scheduled deployment displays a\nCancel deployment\noption.\nIn the\nInformation\npane, select a deployment record in\nRun history\n, and then select\nCancel deployment\nunder the\nStatus\nof a scheduled deployment.\nChange the time of a scheduled deployment as a pipeline admin\nIn the Deployment Pipeline Configuration app, perform the following steps:\nNavigate to\nRun history\nunder\nDeployments\n.\nSelect the record for the scheduled deployment that you want to change.\nChange the\nScheduled Time\n(shown in UTC, which might differ from your time zone) as desired.\nMonitor pipeline deployments\nThe\nPipelines\npage in the\nSolutions\narea displays all deployment activity for the current pipeline and solution.\nSelect a pipeline, then select\nRun history\nto view more detail and error information if there was a failure.\nRelated articles\nCreate a pipeline in Microsoft Power Platform - Learn Module\nSolution concepts\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Run 
Pipelines", @@ -377,7 +377,7 @@ "https://learn.microsoft.com/en-us/power-platform/alm/solution-concepts-alm": { "content_hash": "sha256:1463424b4811c9073775ab64bf2c799c80fd8caa2cf3afa919f32ecff60054e8", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nSolution concepts\nFeedback\nSummarize this article for me\nSolutions are the mechanism for implementing application lifecycle management (ALM) in Power Apps and Power Automate. This article describes the following key solution concepts:\nTwo types of solutions (managed or unmanaged)\nSolution components\nLifecycle of a solution\nSolution publisher\nSolution and solution component dependencies\nManaged and unmanaged solutions\nA solution is either\nmanaged\nor\nunmanaged\n.\nUnmanaged solutions\nare developed. Unmanaged solutions are used in development environments while you make changes to your application. Unmanaged solutions can be exported either as unmanaged or managed. Exported unmanaged versions of your solutions should be checked into your source control system. Unmanaged solutions should be considered your source for Microsoft Power Platform assets. When an unmanaged solution is deleted, only the solution container of any customizations included in it is deleted. All the unmanaged customizations remain in effect and belong to the default solution.\nManaged solutions\nare deployed. Managed solutions are deployed to any environment that isn't a development environment for that solution. These environments include test, user acceptance testing (UAT), system integration testing (SIT), and production environments. 
Managed solutions can be serviced independently from other managed solutions in an environment. As an ALM best practice, managed solutions should be generated by exporting an unmanaged solution as managed and considered a build artifact. Additionally:\nYou can't edit components directly within a managed solution. To edit managed components, first add them to an unmanaged solution.\nWhen you edit a managed component, you create a dependency between your unmanaged customizations and the managed solution. When a dependency exists, the managed solution can't be uninstalled until you remove the dependency.\nSome managed components can’t be edited. To verify whether a component can be edited, view the\nManaged properties\n.\nYou can't export a managed solution. But you can export an unmanaged solution as managed.\nWhen a managed solution is deleted (uninstalled), all the customizations and extensions included with it are removed.\nImportant\nYou can't import a managed solution into the same environment that contains the originating unmanaged solution. To test a managed solution, you need a separate environment to import it into.\nWhen you delete a managed solution, the following data is lost: data stored in custom tables that are part of the managed solution and data stored in custom columns that are part of the managed solution on other tables that aren't part of the managed solution.\nMakers and developers work in development environments using unmanaged solutions, then import them to other downstream environments—such as test—as managed solutions.\nNote\nWhen you customize in the development environment, you're working in the unmanaged layer. Then, when you export the unmanaged solution as a managed solution to distribute to another environment, the managed solution is imported into the environment in the managed layer. More information:\nSolution layers\nSolution components\nA component, also known as objects, represents something that you can potentially customize. 
Anything that can be included in a solution is a component. To view the components included in a solution, open the solution you want. The components are listed in the\nComponents\nlist.\nNote\nA solution can be up to 95 MB in size.\nYou can't edit components directly within a managed solution.\nTo view a list of component types that can be added to any solution, go to\nComponentType Options\n.\nSome components are nested within other components. For example, a table contains forms, views, charts, columns, tables relationships, messages, and business rules. Each of those components requires a table to exist. Except for choice columns, all other columns can’t exist outside of a table. We say that the column is dependent on the table. There are twice as many types of components as shown in the preceding list, but most of them are nested within other components and not visible in the application.\nThe purpose of having components is to keep track of any limitations on what can be customized using managed properties and all the dependencies so that it can be exported, imported, and (in managed solutions) deleted without leaving anything behind.\nSolution lifecycle\nSolutions support the following actions that help support application lifecycle\nprocesses:\nCreate\n. Author and export unmanaged solutions.\nUpdate\n. Create updates to a managed solution that are deployed to the parent managed solution. You can't delete components with an update.\nUpgrade\n. Import the solution as an upgrade to an existing managed solution, which removes unused components and implements upgrade logic. Upgrades involve rolling up (merging) all patches to the solution into a new version of the solution. Solution upgrades delete components that existed but are no longer included in the upgraded version. You can choose to upgrade immediately or to stage the upgrade so that you can do some additional actions prior to completing the upgrade.\nPatch\n. 
A patch contains only the changes for a parent managed solution, such as adding or editing components and assets. Use patches when making small updates (similar to a hotfix). When patches are imported, they're layered on top of the parent solution. You can't delete components with a patch.\nSolution publisher\nEvery app and other solution components such as tables you create or any customization you make is part of a solution. Because every solution has a publisher, you should create your own publisher rather than use the default. You specify the publisher when you create a solution.\nNote\nBy default, if you don't use a custom solution you'll be working in the default system solutions, which are known as the\nCommon Data Service Default Solution\nand the\nDefault\nsolutions. More information:\nDefault Solution and Common Data Service Default Solution\nThe preferred solution is a solution you specify that becomes your default solution. More information:\nSet the preferred solution\nThe publisher of a solution where a component is created is considered the owner of that component. The owner of a component controls what changes other publishers of solutions including that component are allowed to make or restricted from making. It's possible to move the ownership of a component from one solution to another within the same publisher, but not across publishers. Once you introduce a publisher for a component in a managed solution, you can’t change the publisher for the component. Because of this restriction, it's best to define a single publisher so you can change the layering model across solutions later.\nThe solution publisher specifies who developed the app. For this reason, you should create a solution publisher name that's meaningful.\nSolution publisher prefix\nA solution publisher includes a prefix. The publisher prefix is a mechanism to help avoid naming collisions. 
This allows for solutions from different publishers to be installed in the same environment with few conflicts. For example, the Contoso solution displayed here includes a solution publisher prefix of\ncontoso\n.\nNote\nWhen you change a solution publisher prefix, you should do it before you create any new apps or metadata items because you can't change the names of metadata items after they're created.\nMore information:\nCreate a solution publisher prefix\nChange a solution publisher prefix\nSolution dependencies\nBecause of the way that managed solutions are layered, some managed solutions can be dependent on solution components in other managed solutions. Some solution publishers take advantage of this to build solutions that are modular. You might need to install a \"base\" managed solution first and then install a second managed solution that further customizes the components in the base managed solution. The second managed solution depends on solution components that are part of the first solution.\nThe system tracks these dependencies between solutions. If you try to install a solution that requires a base solution that isn’t installed, you won’t be able to install the solution. You get a message saying that the solution requires another solution to be installed first. Similarly, because of the dependencies, you can’t uninstall the base solution while a solution that depends on it is still installed. You have to uninstall the dependent solution before you can uninstall the base solution. More information:\nRemoving dependencies\nSolution component dependencies\nA solution component represents something that you can potentially customize. Anything that can be included in a solution is a solution component and some components are dependent on other components. For example, the website column and account summary report are both dependent on the account table. 
More information:\nDependency tracking for solution components\nSee also\nSolution layers\nCreate and manage environments in the Power Platform admin center\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Solution Concepts", @@ -386,7 +386,7 @@ "https://learn.microsoft.com/en-us/power-platform/alm/admin-deployment-hub": { "content_hash": "sha256:e100b1e28e9135db1f7dba56895d126e2b39b1475d9bff966e650900837ef642", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nAdmin deployment page\nFeedback\nSummarize this article for me\nThe\nDeployment\npage in the Power Platform admin center provides a streamlined experience to help administrators navigate the complexities of managing Power Platform application lifecycle management (ALM) workloads, including managing pipelines deployments at enterprise scale. Admins have visibility to all the deployments in their tenant and can approve deployment requests and troubleshoot issues.\nNote\nCurrently the deployment page doesn't have all the capabilities available within the\nDeployment Pipelines Configuration app\n.\nUse the deployment page\nSign in to the\nPower Platform admin center\n.\nIn the navigation pane, select\nDeployment\n.\nSelect the desired pipelines host. 
Deployment data isn't currently aggregated across hosts.\nLearn about the ALM process in Power Platform\nThe\nGet started\nsection provides helpful learning content and guidance to set up deployments using best practices.\nManage deployments from one central location\nPipelines in Power Platform allow you to manage the end-to-end deployment process across your organization. The deployment page makes it easy to view all the pipelines and deployment activity within the tenant.\nAdmins can view all\npipelines host\nenvironments in the tenant, including the platform host, and\nselect a host\nto view all the pipelines and deployment history managed by that host.\nSelect\nPipelines\non the left navigation pane to view all active pipelines within the pipelines host.\nNote\nYou can view up to the last 365 days by changing the filter.\nAdditional information and advanced pipeline configuration can be accessed by navigating to the pipelines host environment and opening the\nDeployment Pipelines Configuration app\n.\nThe\nRun history\nview shows all deployment activity managed by the selected pipelines host including:\nThe\nStart time\nand\nEnd time\nfor every deployment.\nPipeline\nthat facilitated each deployment.\nSource\nis the development environment where the solution was developed and exported from.\nTarget\nis the destination environment where the pipeline is deployed. 
For example, for integration testing, user acceptance testing (UAT), production, and so on.\nStatus\nindicates whether the deployment is in-progress, succeeded, failed, or canceled.\nSolution\nis the name of the artifact and the\nVersion\ndeployed to the target environment.\nImportant\nAll target environments used in a pipeline must be enabled as\nManaged Environments\n.\nTenant admins can enable automatic conversion of pipelines environments to\nManaged Environments\n.\nManage deployment settings\nAdmins can manage these\nSettings\nwithin the selected pipelines host (settings are managed separately for each host):\nEnable automatic conversion of pipelines environments to\nManaged Environments\n. This ensures pipelines environments meet Microsoft compliance standards automatically. When makers deploy to this environment, it gets automatically converted to a Managed Environment.\nSolution deployments across regions\n: Admins can opt in to allow deployments between environments in different geographic locations. For example, when the host and production environments are in North America but the development environment is in India.\nImportant\nThis setting enables data to be shared across geographical regions within your tenant.\nAllow makers to import shared solution deployments\n: Deployed solution backups are stored in the pipelines host. This setting allows nonadmins to import solutions that were shared with them, in addition to the ability to import solutions that they deployed.\nUse a custom pipelines host\n: Allows you to set one default host for the entire tenant. This replaces\npersonal pipelines\n- meaning admins control who can access pipelines and makers can no longer create personal pipelines in the platform host. It's also useful when a central team manages deployments for the entire tenant. 
This setting is only visible when the\nPlatform host\nis selected in the host picker.\nReview and approve deployment requests\nOn the deployment page, admins approve or reject deployment requests assigned to them. You’ll first need to\nset up delegated deployments\nwith service principals, which is the recommended way to securely deploy to production environments.\nIt’s important that admins review changes in the solution and the sharing request. Once approved, the solution is deployed, and solution objects and security roles are shared automatically. Note that other types of approvals within the pipelines host environment can also be managed.\nRetry failed deployments\nA dedicated\nFailed deployments\nview helps admins quickly identify and troubleshoot failures. Deployments shown as\nFailed\nin the run history view can be retried by selecting\nRetry\nin the details panel if the operation was\nDeploy\n. A confirmation message appears when you confirm the retry.\nFAQ\nAre Managed Environments required for deployment pipelines, and what does this mean for my organization?\nYes. All target environments used in Power Platform deployment pipelines have always been required to be Managed Environments for compliant usage. This requirement helps your organization benefit from enhanced governance, improved security, and streamlined license management.\nHow can I ensure pipelines targets are Managed Environments automatically?\nTenant admins (Power Platform and Dynamics 365 admins) can enable a setting that automatically converts pipelines target environments to Managed Environments, ensuring compliance with Microsoft standards. Managed Environments are then enabled on the target during the next deployment.\nTo enable the setting, go to the Power Platform admin center\nDeployments\n>\nSettings\n. 
Turn on the automatic managed environment setting for each pipeline host.\nWhy did I receive Message Center notification “Power Platform – Automatic enablement of Managed Environments for Deployment Pipelines”?\nYou receive a notification when you have environments that aren't managed and are target of a pipeline and used for deployment in the last six months.\nThe notification lists specific environments that need action.\nImportant\nStarting February 2026, Microsoft starts enabling Managed Environments for any pipeline target environments that aren’t already enabled.\nWe recommend that you review and enable Managed Environments for all pipeline targets now or set it to occur automatically.\nHow do I verify pipelines target environments requiring Managed Environments?\nGo to the Power Platform admin center\nDeployments\n>\nPipelines\n>\nRun History\n. Then select a host and change the filter to\nLast 180 days\n. If multiple hosts exist, review deployments in each. Environments listed under\nTarget\nrequire Managed Environments.\nNote\nAdditional run history information and the ability to export data and generate reports is available within the\nDeployment Pipelines Configuration app\n.\nDoes this automatic enablement affect end users or licensing, and how can I prepare?\nThere's no expected disruption for end users or their applications because of this automatic enablement. The changes focus on environment governance and compliance, so your users and apps continue to function as usual.\nImportant\nManaged Environments come with an\nautoclaim policy\n, which is applied automatically. The autoclaim policy ensures users who access apps in Managed Environments automatically receive the necessary licenses. Ensure you have appropriate license capacity in the tenant to utilize autoclaim.\nWill Microsoft enable unmanaged pipelines target environments in February 2026 if the automatic enablement setting is turned off?\nYes. 
These are different and one doesn't impact the other. The notification you received doesn't require opt-in and is specific to environments you've already deployed to. The automatic enablement setting is opt-in and helps you stay compliant going forward.\nCan I restrict access to personal pipelines?\nYes. Go to the Power Platform admin center >\nDeployments\n>\nSettings\n>\nUse a custom pipelines host\n, and then select a custom pipelines host. If there's no existing custom host, create one. Save the setting.\nThis overrides the platform host behavior, and nonadmins can't use pipelines unless you\ngrant access\nin the custom host environment.\nRelated articles\nOverview of pipelines in Power Platform\nView solutions on the deployment page for makers\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Admin Deployment Hub", @@ -395,34 +395,34 @@ "https://learn.microsoft.com/en-us/microsoft-copilot-studio/fundamentals-what-is-copilot-studio": { "content_hash": "sha256:576f275b138f31c3d9519111577aac93887b5254ff6e06c95bde3bbbdc31bb2a", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nCopilot Studio overview\nFeedback\nSummarize this article for me\nCopilot Studio is a graphical, low-code tool for building agents and agent flows.\nOne of the standout features of Copilot Studio is its ability to connect to other data sources by using prebuilt or custom connectors. With this flexibility, you can create and orchestrate sophisticated logic, ensuring that your agent experiences are powerful and intuitive.\nThe platform's low-code experience puts the power of AI at your fingertips, making it accessible even if you don't have an extensive technical background.\nWhat is an agent?\nAn agent is a powerful AI companion that can handle a range of interactions and tasks. It can resolve issues that require complex conversations and autonomously determine the best action to take based on its instructions and context. It coordinates language models, along with instructions, context, knowledge sources, topics, tools, inputs, and triggers to accomplish your goals.\nAgents can engage with customers and employees in multiple languages across websites, mobile apps, Facebook, Microsoft Teams, or any channel supported by the Azure Bot Service. They can also improve productivity by performing tasks as part of a conversation or in reaction to a trigger to assist users and organizations.\nYou can easily create agents in Copilot Studio without the need for data scientists or developers. Some of the ways you might use agents include:\nSales help and support issues\nOpening hours and store information\nEmployee health and vacation benefits\nPublic health tracking information\nCommon employee questions for businesses\nUse agents on their own or to extend Microsoft 365 Copilot with enterprise data and scenarios.\nWhat is an agent flow?\nAgent flows offer a powerful way to automate repetitive tasks and integrate your apps and services. 
Agent flows can be triggered manually, by other automated events or agents, or based on a schedule.\nWith Copilot Studio, you can create agent flows by using natural language or a visual editor.\nYou can run agent flows as standalone automations. You can also configure an agent flow to trigger from an agent as a tool, and return results to the same agent.\nHow does an agent conversation work?\nCopilot Studio agents use a customized NLU model and AI capabilities to understand what a user types or says, then respond with the best topic. A topic is a portion of a conversational thread between a user and the agent. For more information, see\nCreate and edit topics\n.\nFor example, you might create an agent for your customers to ask common questions about your business. Your agent reduces support overhead by deflecting support calls. In the agent, you can create a topic about your store's opening hours and name the topic\nStore hours\n.\nWhen a customer asks a question such as \"When do you open?\" or \"What are your opening hours?\", the agent uses natural language understanding (NLU) to understand the\nintent\nbehind the question. The agent matches that intent to the best topic, the\nStore hours\ntopic.\nThe agent follows the\nconversation flow\n—which is a group of connected nodes—that you define in the\nStore hours\ntopic. Some nodes can ask questions, while others use conditions (if/else) to determine which store the customer wants. The final output of the topic shows the hours and contact information for that specific store.\nHowever, you can't anticipate all the types of questions your customers ask. To help mitigate this issue, Copilot Studio incorporates powerful AI-powered capabilities that use the latest advancements in NLU models. Once your agent is linked to knowledge sources, it can automatically generate responses. 
These responses are conversational and in plain language, and you don't need to create topics for every eventuality.\nYou can also choose to let your agent access information outside its knowledge sources.\nCopilot Studio can use AI powered by the Azure OpenAI GPT model, also used in Bing, to create topics from a simple description of your needs. Similarly, you can modify and update any topic in your agent by describing the changes you want to make.\nAccess Copilot Studio\nAccess Copilot Studio as a standalone web app or as a discrete app within Teams. The Copilot Studio app for Teams supports classic chatbots only.\nWeb app\nUse cases:\nYou're an IT admin who wants to create agents to perform tasks or interact with customers.\nYou're familiar with agent services and want to trial or test Copilot Studio.\nYou want to explore advanced agent concepts, such as entities and variables, and create complex agents.\nGo to the web app at\nhttps://copilotstudio.microsoft.com\nExplore the Copilot Studio demo\nTeams app\nUse cases:\nYou're an employee or member of an organization who wants to use chatbots to answer common employee questions.\nYou want to use advanced concepts, such as entities and variables, and have a chatbot internally available in Teams.\nYou want to create and distribute a chatbot quickly.\nOpen or add the Copilot Studio app in Teams\nPlan your agent\nConsider the following points when planning your agent.\nExtend Microsoft 365 Copilot with an agent\nConsider extending Microsoft 365 Copilot with an agent if:\nYou want to craft your own agent by declaring instructions, tools, and knowledge to customize Microsoft 365 Copilot for specific tasks and domain knowledge.\nYou wish to utilize the existing Copilot orchestrator.\nYou want a standalone custom version of the Microsoft 365 Copilot chat experience.\nCreate an agent\nCopilot Studio makes it easy to create agents. You only need to describe the agent you want in plain language. 
Tell Copilot Studio what specific instructions, triggers, knowledge sources, and tools you want for your agent. Then test your agent before you deploy it. Publish your agent when you're ready across multiple channels.\nConsider creating an agent if:\nYou want an agent that can:\nIntegrate company data and documents\nRetrieve real-time data from external APIs\nTake actions in response to external events\nBe embedded in company applications\nYou require a customized end-to-end solution for your web or mobile app or automation workflow that meets specific business needs and allows for complete control over product branding.\nYou want to surface your agent to other agents as their supported agent extension.\nYou're a proficient developer looking to create a customized end-to-end solution to cater to your business needs, and want:\nFull control on product branding\nChoice of language models and orchestration\nOr, if you're building products like:\nA customer service chatbot for your e-commerce site\nA virtual assistant to schedule appointments for your healthcare service\nGaming experiences that incorporate generative AI\nAccessibility\nThe agent authoring canvas is built for accessibility in accordance with\nMicrosoft accessibility guidelines\nand supports standard navigational patterns.\nImportant information\nImportant\nMicrosoft Copilot Studio (1) is not intended or made available as a medical device for the diagnosis of disease or other conditions, or in the cure, mitigation, treatment or prevention of disease, or otherwise to be used as a component of any clinical offering or product, and no license or right is granted to use Microsoft Copilot Studio for such purposes, (2) is not designed or intended to be a substitute for professional medical advice, diagnosis, treatment, or judgment and should not be used as a substitute for, or to replace, professional medical advice, diagnosis, treatment, or judgment, and (3) should not be used for emergencies and does not 
support emergency calls. Any agent you create using Microsoft Copilot Studio is your own product or service, separate and apart from Microsoft Copilot Studio. You are solely responsible for the design, development, and implementation of your agent (including incorporation of it into any product or service intended for medical or clinical use) and for explicitly providing end users with appropriate warnings and disclaimers pertaining to use of your agent. You are solely responsible for any personal injury or death that may occur as a result of your agent or your use of Microsoft Copilot Studio in connection with your agent, including (without limitation) any such injuries to end users.\nRelated content\nAI-based agent authoring overview\nCreate and delete agents\nCreate and edit topics\nKey concepts - Publish and deploy your agent\nAnalytics overview\nAgent flows overview\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Copilot Studio Overview", "section": "Copilot Studio" }, "https://learn.microsoft.com/en-us/microsoft-copilot-studio/security-and-governance": { - "content_hash": "sha256:df3aa6c3045231a09b7632ae204ab22f8b032abaac40463f4a6cd7bca65ed7f2", - "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nKey concepts - Copilot Studio security and governance\nFeedback\nSummarize this article for me\nCopilot Studio follows a number of security and governance controls and processes, including geographic data residency, data loss prevention (DLP), multiple standards certifications, regulatory compliance,\nenvironment routing\n, and regional customization. For more information and details on how Copilot Studio agent handle data, see\nGeographic data residency in Copilot Studio\n.\nThis article provides an overview of the security practices followed by Copilot Studio, a list of security and governance controls and features, and examples and suggestions for employing safety and security within Copilot Studio for your agent makers and users.\nSecurity and governance controls\nControl\nCore scenario\nRelated content\nAgent runtime protection status\nMakers can see the security status of their agents from the Agents page.\nAgent runtime protection status\nData policy controls\nAdmins can use data policies in the Power Platform admin center to govern the use and availability of Copilot Studio features and agent capabilities, including:\nMaker and user authentication\nKnowledge sources\nActions, connectors, and skills\nHTTP requests\nPublication to channels\nAppInsights\nTriggers\nConfigure data policies for agents\nMakers audit logs in Microsoft Purview for admins\nAdmins have full visibility into maker audit logs in Microsoft Purview.\nView audit logs\nAudit logs in Microsoft Sentinel for admins\nAdmins can monitor and receive alerts on agent activities through Microsoft Sentinel.\nView audit logs\nRun tools with user credentials\nAgent makers can configure tools to use the user’s credentials by default.\nUse tools with custom agents\nSensitivity label for Knowledge with SharePoint\nAgent makers and users can see the highest sensitivity label applied to sources used in the agent's response and individual reference labels in the 
chat.\nView sensitivity labels for Sharepoint data sources\nUser authentication with certificates\nAdmins and makers can configure agents to use Entra ID manual authentication with certificate provider.\nConfigure user authentication\nMaker security warning\nMakers can see security alerts for their agent before publishing it when security and governance default configurations are modified.\nAutomatic security scan in Copilot Studio\nEnvironment routing\nAdmins can configure environment routing to provide their makers a safe space to build agents.\nWork with Power Platform environments\nMaker welcome message\nAdmins can configure a maker welcome message to inform makers about important privacy and compliance requirements.\nWork with Power Platform environments\nAutonomous agents governance with data policies\nAdmins can manage agent capabilities with triggers using data policies, ensuring protection against data exfiltration and other risks.\nConfigure data policies for agents\nCMK\nAdmins can enable customer-managed encryption keys (CMK) for their Copilot Studio environments.\nConfigure customer-managed encryption keys\nSecurity Development Lifecycle\nCopilot Studio follows the Security Development Lifecycle (SDL). The SDL is a set of strict practices that support security assurance and compliance requirements. Learn more at\nMicrosoft Security Development Lifecycle Practices\n.\nData processing and license agreements\nThe Copilot Studio service is governed by your commercial license agreements, including the\nMicrosoft Product Terms\nand the\nData Protection Addendum\n. 
For the location of data processing, refer to the\ngeographical availability documentation\n.\nCompliance with standards and practices\nThe\nMicrosoft Trust Center\nis the primary resource for Power Platform compliance information.\nLearn more at\nCopilot Studio compliance offerings\n.\nData loss prevention and governance\nCopilot Studio supports an extensive set of\ndata loss prevention features\nto help you manage the security of your data, along with\nPower Platform data policies\n.\nAdditionally, to further govern and secure Copilot Studio using generative AI features in your organization, you can:\nDisable agent publishing: Your admin can use the Power Platform admin center to turn off the ability to publish agents that use generative AI features for your tenant.\nDisable data movement across geographic locations\nfor Copilot Studio generative AI features outside the United States.\nUse the Microsoft 365 admin center to govern the conversational and AI actions and agents that show in Microsoft 365 Copilot\n.\nFinally, Copilot Studio supports securely accessing customer data using\nCustomer Lockbox\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:3e7489558143cb8fa29e034dd3c07129611600b1cbfa75aac77cbf433f1d1775", + "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nKey concepts - Copilot Studio security and governance\nFeedback\nSummarize this article for me\nCopilot Studio follows a number of security and governance controls and processes, including geographic data residency, data loss prevention (DLP), multiple standards certifications, regulatory compliance,\nenvironment routing\n, and regional customization. For more information and details on how Copilot Studio agent handle data, see\nGeographic data residency in Copilot Studio\n.\nThis article provides an overview of the security practices followed by Copilot Studio, a list of security and governance controls and features, and examples and suggestions for employing safety and security within Copilot Studio for your agent makers and users.\nSecurity and governance controls\nControl\nCore scenario\nRelated content\nAgent runtime protection status\nMakers can see the security status of their agents from the Agents page.\nAgent runtime protection status\nData policy controls\nAdmins can use data policies in the Power Platform admin center to govern the use and availability of Copilot Studio features and agent capabilities, including:\nMaker and user authentication\nKnowledge sources\nActions, connectors, and skills\nHTTP requests\nPublication to channels\nAppInsights\nTriggers\nConfigure data policies for agents\nMakers audit logs in Microsoft Purview for admins\nAdmins have full visibility into maker audit logs in Microsoft Purview.\nView audit logs\nAudit logs in Microsoft Sentinel for admins\nAdmins can monitor and receive alerts on agent activities through Microsoft Sentinel.\nView audit logs\nRun tools with user credentials\nAgent makers can configure tools to use the user’s credentials by default.\nUse tools with custom agents\nSensitivity label for Knowledge with SharePoint\nAgent makers and users can see the highest sensitivity label applied to sources used in the agent's response and individual reference labels in the 
chat.\nView sensitivity labels for SharePoint data sources\nUser authentication with certificates\nAdmins and makers can configure agents to use Entra ID manual authentication with certificate provider.\nConfigure user authentication\nMaker security warning\nMakers can see security alerts for their agent before publishing it when security and governance default configurations are modified.\nAutomatic security scan in Copilot Studio\nEnvironment routing\nAdmins can configure environment routing to provide their makers a safe space to build agents.\nWork with Power Platform environments\nMaker welcome message\nAdmins can configure a maker welcome message to inform makers about important privacy and compliance requirements.\nWork with Power Platform environments\nAutonomous agents governance with data policies\nAdmins can manage agent capabilities with triggers using data policies, ensuring protection against data exfiltration and other risks.\nConfigure data policies for agents\nCMK\nAdmins can enable customer-managed encryption keys (CMK) for their Copilot Studio environments.\nConfigure customer-managed encryption keys\nSecurity Development Lifecycle\nCopilot Studio follows the Security Development Lifecycle (SDL). The SDL is a set of strict practices that support security assurance and compliance requirements. Learn more at\nMicrosoft Security Development Lifecycle Practices\n.\nData processing and license agreements\nThe Copilot Studio service is governed by your commercial license agreements, including the\nMicrosoft Product Terms\nand the\nData Protection Addendum\n. 
For the location of data processing, refer to the\ngeographical availability documentation\n.\nCompliance with standards and practices\nThe\nMicrosoft Trust Center\nis the primary resource for Power Platform compliance information.\nLearn more at\nCopilot Studio compliance offerings\n.\nData loss prevention and governance\nCopilot Studio supports an extensive set of\ndata loss prevention features\nto help you manage the security of your data, along with\nPower Platform data policies\n.\nAdditionally, to further govern and secure Copilot Studio using generative AI features in your organization, you can:\nDisable agent publishing: Your admin can use the Power Platform admin center to turn off the ability to publish agents that use generative AI features for your tenant.\nDisable data movement across geographic locations\nfor Copilot Studio generative AI features outside the United States.\nUse the Microsoft 365 admin center to govern the conversational and AI actions and agents that show in Microsoft 365 Copilot\n.\nFinally, Copilot Studio supports securely accessing customer data using\nCustomer Lockbox\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Security and Governance", "section": "Copilot Studio" }, "https://learn.microsoft.com/en-us/microsoft-copilot-studio/publication-fundamentals-publish-channels": { - "content_hash": "sha256:ca30e1880e140402a3b66c12b1b78e14772478565e472a103f13d02cb57c0ae3", - "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy 
Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nKey concepts - Publish and deploy your agent\nFeedback\nSummarize this article for me\nWith Copilot Studio, you can publish agents to engage with your customers on multiple platforms or channels. For example, live websites, mobile apps, Microsoft 365 Copilot, and messaging platforms like Teams and Facebook.\nEach time you update your agent, you can publish it again from within Copilot Studio. Publishing your agent applies to all the channels associated with your agent.\nWeb app\nTeams\nYou need to publish your agent before your customers can engage with it. You can publish your agent on multiple platforms, or\nchannels.\nAfter you publish your agent to at least one channel, you can connect it to more channels. Remember to publish your agent again after you make any changes to it.\nWhen you publish your agent, this agent updates on all connected channels. If you make changes to your agent but don't publish after doing so, your customers won't be engaging with the latest content.\nThe agent comes with the\nAuthenticate with Microsoft\noption turned on. 
The agent automatically uses Microsoft Entra ID authentication for Teams, Power Apps, and Microsoft 365 Copilot without requiring any manual setup.\nIf you want to allow anyone to chat with your agent, select\nNo authentication\n.\nCaution\nSelecting the\nNo authentication\noption allows anyone who has the link to chat and interact with your bot or agent.\nWe recommend you apply authentication, especially if you are using your bot or agent within your organization or for specific users, along with\nother security and governance controls\n.\nIf you want to use other channels and still have authentication for your agent, select\nAuthenticate manually\n.\nImportant\nIf you select\nNo authentication\n, your agent can't use\ntools\nwith\nuser credentials\n.\nPublish the latest content\nWith your agent open for editing, in the navigation menu, select\nPublish\n.\nSelect\nPublish\n, and then confirm. Publishing can take a few minutes.\nTest your agent\nTest your agent after you publish it. You can\nmake the agent available to users in Teams and Microsoft 365 Copilot\nby using the installation link or from various places in the Microsoft Teams app store.\nYou can share your agent later by selecting\nMake the agent available to others\nfrom the\nPublish\npage, in Teams.\nYou can also install the agent for your own use in Microsoft Teams by selecting\nOpen the agent in Teams\n.\nIf you selected\nNo authentication\nor\nAuthenticate manually\n, select the\nDemo website\nlink to open a prebuilt website in a new browser tab, where you and your teammates can interact with the agent.\nThe demo website is also useful to gather feedback from stakeholders before you roll your agent out to customers. 
Learn how to\nconfigure the demo website and add the agent to your live website\n.\nTip\nWhat's the difference between the test chat and the demo website?\nUse the test chat (the\nTest agent\npane) while you're building your agent to make sure conversation flows as you expect and to spot errors.\nShare the demo website URL with members of your team or other stakeholders to try out the agent. The demo website isn't intended for production use. You shouldn't share the URL with customers.\nConfigure channels\nAfter publishing your agent at least once, you can add channels to make it reachable by your customers.\nTo configure channels for your agent:\nOn the top menu bar, select\nChannels\n.\nSelect the channel you want from the list of available channels.\nEach channel has different connection steps. Learn more:\nTeams and Microsoft 365 Copilot\nSharePoint\nWhatsApp\nDemo Website\nCustom Website\nMobile App\nFacebook\nAzure Bot Service channels\n, including:\nCortana\nSlack\nTelegram\nTwilio\nLine\nKik\nGroupMe\nDirect Line Speech\nEmail\nChannel experience reference table\nDifferent channels offer different user experiences. The following table shows a high-level overview of the experiences for each channel. Consider the channel experiences when you optimize your agent content for specific channels.\nExperience\nWebsite\nTeams and Microsoft 365 Copilot\nFacebook\nDynamics Omnichannel for Customer Service\nCustomer satisfaction survey\nAdaptive card\nText-only\nText-only\nText-only\nMultiple-choice options\nSupported\nSupported up to six (as hero card)\nSupported up to 13\nPartially Supported\nMarkdown\nSupported\nPartially Supported\nPartially supported\nPartially Supported\nWelcome message\nSupported\nSupported\nNot supported\nSupported for\nChat\n. 
Not supported for other channels.\nDid-You-Mean\nSupported\nSupported\nSupported\nSupported for\nMicrosoft Teams\n,\nChat\n, Facebook, and text-only channels (short message service (SMS) via\nTeleSign\nand\nTwilio\n,\nWhatsApp\n,\nWeChat\n, and\nTwitter\n).\nSuggested actions are presented as a text-only list; users must retype an option to respond.\nNext steps (Web app)\nArticle\nDescription\nPublish an agent to a live or demo website\nPublish your agent on your live website, or use a demo website to share internally.\nConnect and configure an agent for Teams and Microsoft 365 Copilot\nUse Teams and Microsoft 365 Copilot to distribute your agent.\nPublish an agent to Facebook\nAdd your agent to Facebook Messenger.\nPublish an agent to mobile or custom apps\nAdd your agent to mobile or custom native apps (developer coding required).\nPublish an agent to Azure Bot Service channels\nAdd your agent to Azure Bot Service channels (developer coding required).\nSee the\nweb app instructions for publishing latest content\nas they're the same in the Teams app.\nWhen publication is successful, you can\nmake the agent available to users in Microsoft Teams\nwith the installation link or from various places in the Microsoft Teams app store. You can share your agent later by selecting\nMake the agent available to others\nfrom the\nPublish\npage.\nYou can also install the agent for your own use in Microsoft Teams by selecting\nOpen the agent in Teams\n.\nTip\nTo prevent disrupting users who are having an existing conversation with the agent, they don't receive the latest published content until a new conversation starts. A new conversation starts after 30 minutes of inactivity.\nYou might want to try out the latest published content in Microsoft Teams right away. To do so, enter\nstart over\nin an existing conversation. 
This command restarts the conversation with the latest content you published.\nKnown limitations\nThe customer satisfaction survey in Microsoft Teams is a text-only version instead of an adaptive card.\nMicrosoft Teams can render up to six suggested actions in one question node.\nA user can't send or upload attachments to the chat. If they try to send an attachment, the agent replies:\nLooks like you tried to send an attachment. Currently, I can only process text. Please try sending your message again without the attachment.\nThis limitation applies to all channels, even if the channel or user-facing experience supports attachments. For example, if you're using the Direct Line API or Microsoft Teams.\nAttachments can be supported if the message is sent to a skill, where the skill bot supports the processing of attachments. Learn more in\nUse Microsoft Bot Framework skills in Copilot Studio\n.\nNext steps (Teams)\nArticle\nDescription\nConnect and configure an agent for Teams and Microsoft 365\nMake your agent available to users in Microsoft Teams and Microsoft 365.\nCreate a privacy statement and terms of use in Microsoft Teams\nCreate and link to a privacy statement and terms of use for agents you create.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:80819214ba8c3806351ed9446d2a3f5be8bbc36ed81de37b8de103947d518124", + "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nKey concepts - Publish and deploy your agent\nFeedback\nSummarize this article for me\nWith Copilot Studio, you can publish agents to engage with your customers on multiple platforms or channels. For example, live websites, mobile apps, Microsoft 365 Copilot, and messaging platforms like Teams and Facebook.\nEach time you update your agent, you can publish it again from within Copilot Studio. Publishing your agent applies to all the channels associated with your agent.\nWeb app\nTeams\nYou need to publish your agent before your customers can engage with it. You can publish your agent on multiple platforms, or\nchannels.\nAfter you publish your agent to at least one channel, you can connect it to more channels. Remember to publish your agent again after you make any changes to it.\nWhen you publish your agent, this agent updates on all connected channels. If you make changes to your agent but don't publish after doing so, your customers won't be engaging with the latest content.\nThe agent comes with the\nAuthenticate with Microsoft\noption turned on. 
The agent automatically uses Microsoft Entra ID authentication for Teams, Power Apps, and Microsoft 365 Copilot without requiring any manual setup.\nIf you want to allow anyone to chat with your agent, select\nNo authentication\n.\nCaution\nSelecting the\nNo authentication\noption allows anyone who has the link to chat and interact with your bot or agent.\nWe recommend you apply authentication, especially if you are using your bot or agent within your organization or for specific users, along with\nother security and governance controls\n.\nIf you want to use other channels and still have authentication for your agent, select\nAuthenticate manually\n.\nImportant\nIf you select\nNo authentication\n, your agent can't use\ntools\nwith\nuser credentials\n.\nPublish the latest content\nWith your agent open for editing, in the navigation menu, select\nPublish\n.\nSelect\nPublish\n, and then confirm. Publishing can take a few minutes.\nTest your agent\nTest your agent after you publish it. You can\nmake the agent available to users in Teams and Microsoft 365 Copilot\nby using the installation link or from various places in the Microsoft Teams app store.\nYou can share your agent later by selecting\nMake the agent available to others\nfrom the\nPublish\npage, in Teams.\nYou can also install the agent for your own use in Microsoft Teams by selecting\nOpen the agent in Teams\n.\nIf you selected\nNo authentication\nor\nAuthenticate manually\n, select the\nDemo website\nlink to open a prebuilt website in a new browser tab, where you and your teammates can interact with the agent.\nThe demo website is also useful to gather feedback from stakeholders before you roll your agent out to customers. 
Learn how to\nconfigure the demo website and add the agent to your live website\n.\nTip\nWhat's the difference between the test chat and the demo website?\nUse the test chat (the\nTest agent\npane) while you're building your agent to make sure conversation flows as you expect and to spot errors.\nShare the demo website URL with members of your team or other stakeholders to try out the agent. The demo website isn't intended for production use. You shouldn't share the URL with customers.\nConfigure channels\nAfter publishing your agent at least once, you can add channels to make it reachable by your customers.\nTo configure channels for your agent:\nOn the top menu bar, select\nChannels\n.\nSelect the channel you want from the list of available channels.\nEach channel has different connection steps. Learn more:\nTeams and Microsoft 365 Copilot\nSharePoint\nWhatsApp\nDemo Website\nCustom Website\nMobile App\nFacebook\nAzure Bot Service channels\n, including:\nCortana\nSlack\nTelegram\nTwilio\nLine\nKik\nGroupMe\nDirect Line Speech\nEmail\nChannel experience reference table\nDifferent channels offer different user experiences. The following table shows a high-level overview of the experiences for each channel. Consider the channel experiences when you optimize your agent content for specific channels.\nExperience\nWebsite\nTeams and Microsoft 365 Copilot\nFacebook\nDynamics Omnichannel for Customer Service\nCustomer satisfaction survey\nAdaptive card\nText-only\nText-only\nText-only\nMultiple-choice options\nSupported\nSupported up to six (as hero card)\nSupported up to 13\nPartially Supported\nMarkdown\nSupported\nPartially Supported\nPartially supported\nPartially Supported\nWelcome message\nSupported\nSupported\nNot supported\nSupported for\nChat\n. 
Not supported for other channels.\nDid-You-Mean\nSupported\nSupported\nSupported\nSupported for\nMicrosoft Teams\n,\nChat\n, Facebook, and text-only channels (short message service (SMS) via\nTeleSign\nand\nTwilio\n,\nWhatsApp\n,\nWeChat\n, and\nTwitter\n).\nSuggested actions are presented as a text-only list; users must retype an option to respond.\nTroubleshoot publishing errors\nIf you run into issues when publishing your agent, use the following troubleshooting steps to resolve common publishing errors:\nVerify all configurations are correct.\nMake sure that the agent settings, authentication options, and channel configurations are set up properly before publishing.\nCheck for any missing dependencies.\nEnsure that all required components, such as topics, flows, connectors, and data sources, are available and properly configured.\nReview error logs for specific error codes and messages.\nGo to the\nPublish\npage and check the publish status for any error details. Use the error codes and messages to identify and address the root cause.\nNext steps (Web app)\nArticle\nDescription\nPublish an agent to a live or demo website\nPublish your agent on your live website, or use a demo website to share internally.\nConnect and configure an agent for Teams and Microsoft 365 Copilot\nUse Teams and Microsoft 365 Copilot to distribute your agent.\nPublish an agent to Facebook\nAdd your agent to Facebook Messenger.\nPublish an agent to mobile or custom apps\nAdd your agent to mobile or custom native apps (developer coding required).\nPublish an agent to Azure Bot Service channels\nAdd your agent to Azure Bot Service channels (developer coding required).\nSee the\nweb app instructions for publishing latest content\nas they're the same in the Teams app.\nWhen publication is successful, you can\nmake the agent available to users in Microsoft Teams\nwith the installation link or from various places in the Microsoft Teams app store. 
You can share your agent later by selecting\nMake the agent available to others\nfrom the\nPublish\npage.\nYou can also install the agent for your own use in Microsoft Teams by selecting\nOpen the agent in Teams\n.\nTip\nTo prevent disrupting users who are having an existing conversation with the agent, they don't receive the latest published content until a new conversation starts. A new conversation starts after 30 minutes of inactivity.\nYou might want to try out the latest published content in Microsoft Teams right away. To do so, enter\nstart over\nin an existing conversation. This command restarts the conversation with the latest content you published.\nKnown limitations\nThe customer satisfaction survey in Microsoft Teams is a text-only version instead of an adaptive card.\nMicrosoft Teams can render up to six suggested actions in one question node.\nA user can't send or upload attachments to the chat. If they try to send an attachment, the agent replies:\nLooks like you tried to send an attachment. Currently, I can only process text. Please try sending your message again without the attachment.\nThis limitation applies to all channels, even if the channel or user-facing experience supports attachments. For example, if you're using the Direct Line API or Microsoft Teams.\nAttachments can be supported if the message is sent to a skill, where the skill bot supports the processing of attachments. 
Learn more in\nUse Microsoft Bot Framework skills in Copilot Studio\n.\nNext steps (Teams)\nArticle\nDescription\nConnect and configure an agent for Teams and Microsoft 365\nMake your agent available to users in Microsoft Teams and Microsoft 365.\nCreate a privacy statement and terms of use in Microsoft Teams\nCreate and link to a privacy statement and terms of use for agents you create.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Agent Publishing", "section": "Copilot Studio" }, "https://learn.microsoft.com/en-us/microsoft-copilot-studio/admin-share-bots": { "content_hash": "sha256:761913d4c2a718db1afaf449a8f77fa357fd16d3b5f5093dcbd89f5b130c72ae", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nShare agents with other users\nFeedback\nSummarize this article for me\nYou can share your agents with others in either of the following ways:\nGrant security groups, or your whole organization, permission to chat with the agent.\nInvite users to collaborate on your agent project. 
Collaborators always have permission to chat with the agent.\nPrerequisites\nUser authentication\nfor the agent must be configured to\nAuthenticate manually\n, with\nAzure Active Directory\nor\nMicrosoft Entra ID\nas the provider.\nRequired user sign-in\nmust be enabled to manage who can chat with the agent in your organization.\nShare an agent for chat\nWeb app\nTeams\nCollaborators\nwith authoring permissions for a shared agent can always chat with it. However, you can also grant users permission to chat with an agent in Copilot Studio without granting them authoring permissions.\nTo grant users permission to only chat with the agent, you can either:\nShare your agent with individual users.\nShare your agent with a security group.\nShare your agent with everyone in your organization.\nNote\nWhen sharing an agent for\nchat\n, you can't share it with Microsoft 365 groups with\nSecurityEnabled\nset to false. To change this setting, see\nShare an app with Microsoft 365 groups\nin the Power Apps documentation.\nTo author agents in Copilot Studio, makers need at least the\nEnvironment Maker\nrole. The\nBot Author\nrole is deprecated. When a maker shares an agent for coauthoring, the other user is granted the\nBot Contributor\nand\nEnvironment Maker\nroles. Users in these roles can only access agents they created or the agents shared with them. Additionally, makers must have the\nprvAssignRole\nprivilege, included in the\nSystem Administrator\nrole, to share an agent for coauthoring. If the new coauthor holds the\nEnvironment Maker\nrole, the original maker doesn't need the\nprvAssignRole\nprivilege.\nImportant\nSharing Copilot Studio agents that have connections to external services such as Azure Databricks Genie can be challenging because connections are tied to the identity of each individual agent user. 
This behavior explains why an agent might work for you but not for other users.\nTo address this behavior, use one of the following approaches:\nUse a\nservice principal\nfor authentication, if supported.\nConfigure\nenvironment-level connections\nso the agent runs using a shared, non user-specific identity.\nRequire users to\nauthenticate individually\nwhen OAuth-based authentication is required.\nShare an agent with security groups\nShare an agent with security groups so their members can chat with it.\nOpen the agent you want to share in Copilot Studio.\nOn the top menu bar, select the three dots (\n…\n) and then select\nShare\n.\nEnter the name of every security group that you want to share the agent with.\nReview the permissions for each security group.\nIf you want to let users know you shared the agent with them, select the three dots (\n…\n) at the top of the sharing screen, select\nUse Classic Sharing\n, and then select\nSend an email invitation to new users\nat the bottom of the screen.\nNote\nUsers can only receive an email invitation if their security group has email enabled. Alternatively, select\nCopy link\nand then share the link directly with the users to inform them that they can now chat with your agent.\nSelect\nShare\nto share the agent with the security groups you specified.\nShare an agent with everyone in the organization\nYou can share your agent so that everyone in the same organization as the agent can chat with it.\nOpen the agent you want to share in Copilot Studio.\nOn the top menu bar, select the three dots (\n…\n) and then select\nShare\n.\nSelect\nEveryone in \n(where\n\nis your organization's name).\nSelect\nUser - can use the agent\n.\nNote\nCopilot Studio doesn't send email invitations to everyone in an organization. 
To inform users that they can now chat with your agent, select\nCopy link\nand then share the link directly with the users.\nSelect\nShare\nto share the agent with everyone in the organization.\nShare your agent with other users so they can chat with the agent or collaborate to author it.\nA user can always chat with an agent if the user created the agent in the same team. You can also share agents with users outside of a team.\nShare an agent with security groups\nShare an agent with security groups so their members can chat with it.\nOpen the agent you want to share in Copilot Studio for Teams.\nSelect\nShare\nat the top of the\nOverview\npage.\nEnter the security group name that you want to share the agent with.\nNote\nYou can only share an agent with security groups. You can't share with individual users directly.\nYou can manage individual user access by adding or removing users from the security group.\nLearn more about\nsecurity groups\nin the Microsoft Graph documentation.\nReview the security group's permissions.\nIf you want to let users know you shared the agent with them, select the three dots (\n…\n) at the top of the sharing screen, select\nUse Classic Sharing\n, and then select\nSend an email invitation to new users\nat the bottom of the screen.\nNote\nUsers can only receive an email invitation if their security group has email enabled. 
Alternatively, select\nCopy link\nand then share the link directly with the users to inform them that they can now chat with your agent.\nSelect\nShare\nto share the agent with the security group you specified.\nShare an agent with everyone in the organization\nYou can share your agent so that everyone in the same organization as the agent can chat with it.\nOpen the agent you want to share in Copilot Studio for Teams.\nSelect\nShare\nat the top of the\nOverview\npage.\nSelect\nEveryone in \n(where\n\nis your organization's name).\nSelect\nUser - can use the chatbot\n.\nNote\nCopilot Studio doesn't send email invitations to everyone in an organization. To inform users that they can now install the agent in Microsoft Teams and chat with it, select\nCopy link\nand then share the link directly with the users.\nSelect\nShare\nto share the agent with everyone in the organization.\nShare an agent for collaborative authoring\nWeb app\nTeams\nWhen you share an agent with individual users, you give them permission to view, edit, configure, share, and publish the agent. They can't delete the agent.\nNote\nYou can only share an agent with users who have a Microsoft Copilot Studio per user license. Users who don't have a license can\nsign up for a trial\n.\nOpen the agent you want to share in Copilot Studio.\nOn the top menu bar, select the three dots (\n…\n) and then select\nShare\n.\nEnter the name or email address of each user that you want to share the agent with.\nNote\nWhen sharing an agent for\ncollaborative authoring\n, you can only share it with individual users in your organization.\nReview the permissions for each user.\nIf you want to let the collaborators know you shared the agent with them, select the three dots at the top of the sharing screen (\n…\n), then\nUse Classic Sharing\n. You can then select\nSend an email invitation to new users\nat the bottom of the screen.\nNote\nUsers can only receive an email invitation if their security group has email enabled. 
Alternatively, select\nCopy link\nand then share the link directly with the users to inform them that they can now chat with your agent.\nSelect\nShare\nto share the agent with the users you specified.\nImportant\nIf a user isn't already a\nmember of the environment\nfor the shared agent, it can take up to 10 minutes before the agent becomes available in Copilot Studio for this user.\nYou can always collaborate with others when building agents in the Copilot Studio app for Teams. This collaboration means other members of your team can make changes, and you can see who else is editing a topic.\nYour\nMicrosoft Teams roles\ndetermine your permissions in the team where you create an agent:\nTeam Owners can create, view, edit, and configure all agents in teams they own.\nTeam Members can create, edit, and configure their own agents. They can view the other team members' agents.\nNote\nIf you're an owner for a Microsoft Entra ID group associated with a team, without being a member of that team, you might not be able to see the team in the Power Apps and Copilot Studio apps in Microsoft Teams. To resolve this problem, add yourself to the team. It can take a few minutes before the team becomes visible to you.\nTo share an agent with other users for collaboration,\nadd them to your team\n. It can take up to 15 minutes before a new team member sees the team in the Copilot Studio app for Teams.\nWhen you select the\nAgents\ntab on the top menu bar, you see a list of your teams. Select a team to see the agents in that team.\nTip\nMy agents\nshows all the agents you created and is an easy way for you to find your agent across multiple teams. You can find agents created by other team members by selecting the team.\nSelect the name of an agent to open it for editing.\nIf you select the check mark next to the name of an agent, a secondary menu bar appears. From this menu bar, you can go straight to the\nTopics\nor\nAnalytics\npage for your agent. 
You can also select\nEdit\nto go to the\nOverview\npage.\nIf you select the three dots (\n…\n) next to the name of an agent, you can select\nEdit\nto go to the\nOverview\npage, or go to the\nTopics\nor\nAnalytics\npages.\nIf you rename, restore, or delete a team, it can take up to two hours for the changes to be reflected in the Copilot Studio app.\nCollaborate on agents\nAfter you share an agent with other users, they can all edit its topics.\nOn the\nTopics\npage, the\nEditing\ncolumn shows who's working on topics. Select a person's icon to quickly chat with them in Teams or send them an email.\nThis information can help prevent conflicts when multiple authors work on the same topic.\nNote\nThe list of authors in the\nEditing\ncolumn refreshes only when the page loads.\nWhen you open a topic for editing, icons at the top of the authoring canvas also show who's currently working on this topic.\nIf an author doesn't make any changes to the topic, disconnects their computer, or closes the browser window, they're considered to have abandoned the topic. After 30 minutes of inactivity, the user isn't identified as editing the topic.\nOccasionally, multiple authors make changes to a topic and attempt to save their changes concurrently. For example, you might open and start editing a topic. Your coworker opens the same topic, makes a small change, and saves it. Then, when you finish editing the topic, and attempt to save it, Copilot Studio detects a conflict. 
When a conflict happens, Copilot Studio prevents you from overwriting your coworker's changes, by offering you two options:\nSelect\nDiscard changes\nto reload your agent with the latest changes (discarding your work).\nSelect\nSave copy\nto save a copy of the topic (keeping your changes in a copy of the topic).\nIf you save your changes as a new topic, you can then review your coworker's changes, merge the two topics, and delete the copy when you're done.\nStop sharing an agent\nWeb app\nTeams\nYou can stop sharing an agent with individual users, a security group, or everyone in your organization.\nStop sharing with security groups\nOn the top menu bar, select the three dots (\n…\n) and then select\nUpdate\n. (If you're using the classic sharing panel, select\nShare\n.)\nSelect the\nX\nicon next to each security group you want to stop sharing the agent with.\nSelect\nShare\nto stop sharing the agent with these security groups.\nStop sharing with everyone in the organization\nOn the top menu bar, select the three dots (\n…\n) and then select\nShare\n.\nSelect\nEveryone in \n(where\n\nis your organization's name).\nSelect\nNone\n.\nSelect\nShare\nto stop sharing the agent with everyone in the organization.\nStop sharing an agent with individual users\nYou can stop sharing an agent with a user. Any shared user can stop the agent from being shared with other users, except for the owner. 
Owners always have access to their agents.\nOn the top menu bar, select the three dots (\n…\n) and then select\nShare\n.\nSelect the\nX\nicon next to each user you want to stop sharing the agent with.\nSelect\nShare\nto stop sharing the agent with these users.\nYou can stop sharing an agent with security groups or everyone in your organization.\nNote\nWhen you stop sharing an agent, affected users can't access the agent after their current conversation times out (after 30 minutes of idle time).\nStop sharing with security groups\nSelect\nShare\nat the top of the\nOverview\npage.\nSelect the\nX\nicon next to each security group you want to stop sharing the agent with.\nSelect\nShare\nto stop sharing the agent with these security groups.\nStop sharing with everyone in the organization\nSelect\nShare\nat the top of the\nOverview\npage.\nSelect\nEveryone in \n(where\n\nis your organization's name).\nSelect\nNone\n.\nSelect\nShare\nto stop sharing the agent with everyone in the organization.\nShare Power Automate flows used in an agent\nYou can\nadd actions to an agent using flows in Power Automate\n. However, sharing an agent doesn't automatically share the flows in the agent.\nUsers who don't have access to flows in a shared agent can still run these flows by using the Test panel in Copilot Studio.\nTest your agents to make sure users who chat with them have the required permissions to run the\nPower Automate flows\n.\nTo let other users edit or add flows, you must share them in Power Automate. 
You can open flows directly from the topic where the flow is used.\nSelect\nView flow details\nto go to the flow's details page in Power Automate.\nSelect\nEdit\nin the\nOwners\nsection.\nEnter the name or email address of the user you want to give editing permissions to.\nAssign environment security roles\nIf you're a\nSystem Administrator\n, you can assign and manage environment security roles when sharing an agent.\nThe\nEnvironment security roles\nsection appears when you share an agent and only if you're a\nSystem Administrator\n. It lets you share agents with users who don't have sufficient environment permissions to use Copilot Studio.\nYou must be a\nSystem Administrator\nof the environment where the agent is located to view and add security roles.\nNote\nYou can only\nassign\nsecurity roles when sharing an agent. You can't remove security roles when sharing. For full security role management, use the\nPower Platform admin center\n.\nLearn more about\nsecurity roles\nand\npredefined security roles\nin the Power Platform admin documentation.\nAssign the Environment Maker security role during agent sharing\nWhen sharing an agent, if a user doesn't have sufficient permissions to use Copilot Studio in the environment, you're notified that the\nEnvironment Maker\nsecurity role is assigned to the user so they can use the agent.\nAssign the Bot Transcript Viewer security role during agent sharing\nWhen sharing an agent, you can assign the\nBot Transcript Viewer\nsecurity role to users who don't have conversation transcript access.\nDepending on the content and target audience of the agent, consider granting transcript access only to users who have the appropriate privacy training.\nImportant\nEnvironment security roles determine conversation transcript access. 
Users with the\nBot Transcript Viewer\nsecurity role can access conversation transcripts for all agents that they create or that are shared with them in the environment.\nBy default, only admins have the\nBot Transcript Viewer\nrole. To control which users can view conversation transcripts,\ncreate a new environment for your agents\n.\nInsufficient environment permissions\nUsers in an environment need the\nEnvironment Maker\nsecurity role before you can share an agent with them.\nA system administrator for the environment must assign the\nEnvironment Maker\nsecurity role to a user before you share an agent with them. If you have the\nSystem Administrator\nsecurity role, you can\nassign the Environment Maker role\nto users when you share agents.\nLearn more about\nsecurity roles\nand\npredefined security roles\n.\nManage security roles\nYou can\nmanage environment security roles at the Power Platform admin center\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Share and Manage Agents", @@ -431,7 +431,7 @@ "https://learn.microsoft.com/en-us/microsoft-copilot-studio/analytics-overview": { "content_hash": "sha256:c20b5888a6d29e365c42c1c59f1735e9bccba95e539dbeed8ead6164d1638522", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nAnalytics overview\nFeedback\nSummarize this article for me\nUse analytics to understand how well your agent is performing and to identify areas for improvement.\nThe\nAnalytics\npage in Copilot Studio shows you comprehensive data for your agent, from an overview of key metrics to in-depth usage analytics for your agent's components. You can drill down into each piece of data to get more details.\nThe analytics experience is tailored for\nconversational agents\nand for\nautonomous agents\n.\nAnalytics are available in all geographies. Time-and-date stamps in analytics are in Coordinated Universal Time (UTC). The time-and-date stamps include day start and end times, session times, and any other time markers in your agent's data.\nNote\nAnalytics aren't available on the\nAnalytics\npage for activity completed when you test your agent in Copilot Studio by using the\ntest panel\n.\nTo access analytics:\nOpen your agent in Copilot Studio.\nSelect\nAnalytics\non the top menu bar.\nThe\nSummary\narea uses Copilot to generate an AI summary of key analytics insights about your agent. Select\nView More\nto see the full summarized list of insights.\nYou can provide feedback to Microsoft about this section with the\nThumbs up\nand\nThumbs down\nicons\n. Use the\nSubmit feedback to Microsoft\npanel to add a comment and share related files. 
By providing descriptive feedback like this, we can work together to continuously improve our product.\nOn the\nSubmit feedback to Microsoft\npanel, describe in natural language your likes or dislikes, depending on which icon you selected to open the panel.\nChoose whether to share prompt, generated response, relevant content samples, and additional log files.\nSelect\nSubmit\n.\nYou can also check a high-level performance summary in the\nOverview\narea, then dive deeper into its performance.\nConversational sessions only\nAnalytics for conversational agents\nin Copilot Studio track user engagement with your agent and try to capture how well your agent handles user tasks.\nConversational analytics uses the following concepts and terms:\nConversations are an ongoing interaction between a specific user, or group of users, on a\nchannel\nand your agent.\nConversations can pause and resume later, or be\ntransferred to a customer service representative\n. The conversation might be one-way, either from the customer to the agent, or from the agent to the customer, but it's more commonly a back-and-forth interaction between the customer and the agent.\nA conversation times out after 30 minutes of inactivity.\nFor agents published to the Telephony channel, conversations time out 3 minutes after an\nEnd Conversation\nevent.\nA single conversation can contain one or more analytics sessions.\nAn analytics session in classic mode is associated with the last custom topic triggered by a user. If the session doesn't include custom topics, it's the last system topic triggered directly by the user.\nEvent trigger sessions only\nAn autonomous agent is an agent with an event trigger. Only\nanalytics for agents with triggers\nare available for these agents.\nAn\nanalytics session\nfor agents with triggers tracks from when an agent receives a payload from a trigger through any actions the agent runs in response. 
These analytics sessions capture what your agent is responding to, and how well your agent performs.\nNote\nIf a trigger fails and the agent doesn't receive a trigger payload, an analytics session can't begin. Only successfully triggered runs are tracked in analytics.\nHybrid view - both conversational and event-triggered sessions\nWhen your agent's data includes sessions for at least one conversation and at least one trigger-based run in the report's period, the\nAnalytics\npage displays a\nhybrid\nview of relevant metrics for both conversational and autonomous sessions. In this view, the\nOverview\nand\nEffectiveness\nsections show metrics side by side for both the conversational and event-triggered sessions. The\nUse\nsection allows you to select between\nConversations\nor\nRuns\n, which displays only relevant data. Selecting\nAll\nshows all use-related data. The\nSatisfaction\nsection displays satisfaction metrics for conversational sessions.\nFor more information about:\nDisplayed metrics relevant to conversation sessions, see\nAnalyze conversational agent effectiveness\n.\nDisplayed metrics relevant to event-triggered sessions, see\nAnalyze autonomous agent health\n.\nDownload conversational transcripts\nYou can download conversation transcripts a few minutes after the conversation times out. You can download any time period within the last 29 days. You can download them in\nDataverse using the Power Apps portal\nand as\nsession chat transcripts using the Copilot Studio app\n. It can take up to an hour after the analytic session ends before the related data appears on the analytics dashboard.\nNote\nConversation transcripts in Dataverse aren't available for download on the Copilot Studio app in Teams. To review and export transcripts in Dataverse, you need to sign up for the\nCopilot Studio web app\n. Session chat transcripts can be downloaded using the\nCopilot Studio app\n. 
For more information, see\nDownload agent session transcripts\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Analytics", @@ -449,7 +449,7 @@ "https://learn.microsoft.com/en-us/microsoft-copilot-studio/advanced-connectors": { "content_hash": "sha256:e3b0e8d41c9ac8a228143d98ff4807068c57e41845f7388ed6355d5b5878e16b", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nUse Power Platform connectors as tools\nFeedback\nSummarize this article for me\nConnectors from Microsoft Power Platform\nact as proxies or \"wrappers\" around APIs. They enable Copilot Studio, Power Automate, Power Apps, and Azure Logic Apps to communicate with other apps and services. By using connectors, you can connect your accounts and use prebuilt tools and triggers to build your apps and workflows.\nBy using connectors, you can access various services (both within the Microsoft ecosystem and outside it) to perform a wide array of tasks automatically.\nMany connectors are available. Connectors include connections between and to Microsoft services like Office 365, SharePoint, and Dynamics 365, as well as connections to non-Microsoft services like Twitter, Google services, Salesforce, and more. 
Connectors are categorized as:\nPrebuilt connectors\n, which are built-in connections to popular services available to use in Copilot Studio agents. These connectors include:\nStandard connectors\n, such as SharePoint, which are included with all Copilot Studio plans.\nPremium connectors\n, which are available in select\nCopilot Studio plans\n.\nCustom connectors\n, which you use to connect to any publicly available API for services not covered by existing connectors.\nIntegration with Copilot Studio\nConnectors are useful tools that greatly extend the functionality of Copilot Studio agents. By using connectors, you can connect to various external services and applications to perform a wide range of tasks. By using these connectors, you can create more dynamic, responsive, and useful agents that are tailored to specific business needs and processes.\nYou can call connectors as tools in your agent, at the agent level, or in a\ntopic\n.\nNote\nFor more information on adding connectors as a knowledge source, see\nAdd Power Platform connectors as knowledge (preview)\n.\nAdd tools from a prebuilt connector to your agent\nYou can select and add tools from prebuilt connectors directly to your agent. Connector tools represent specific actions or operations you want your agent to perform by using that connector.\nAdd a tool from a prebuilt connector\nSelect\nAgents\nand select the agent you want to add a connector to.\nGo to the\nTools\npage of your agent and select\nAdd a tool\n.\nSelect\nConnector\n. The different services with connectors available are displayed.\nSelect the service you want to connect to, or search for the service by name in the search box. You see a list of tools available for the service connector.\nSelect the tool you want to add. The\nAdd tool\npane opens.\nIf the connection doesn't already exist, select\nCreate new connection\n. 
The details of setting up the connection depend on the connector you selected.\nSelect\nSubmit\nor\nCreate\nas applicable when you're done.\nSelect\nAdd and configure\n. The configuration page for the new tool opens, showing the new tool and its details.\nBy default, the connection uses user credentials. For more information about the supported authentication modes, see\nConfigure user authentication for tools\n. To change this behavior, see the following section.\nAdd a tool from a prebuilt connector in a topic\nSelect\nAgents\nand select the agent you want to add a connector to.\nGo to the\nTopics\npage and select the topic you want to add a connector to.\nSelect\nAdd node\n(\n+\n) on the authoring canvas.\nIn the node selection window, select\nAdd a tool\n>\nConnector\n, and search for the connector tool you want to add.\nSet up connection details as needed for the connector.\nSelect\nSubmit\n.\nBy default, the connection uses user credentials. For more information about the supported authentication modes, see\nConfigure user authentication for tools\n. To change this behavior, see\nCreate a custom connector to add to an agent\n.\nCreate a custom connector to add to an agent\nSelect\nAgents\nand select the agent you want to add a connector to.\nGo to the\nTools\npage and select\nAdd a tool\n.\nSelect\nNew tool\n>\nCustom connector\n. You're taken to the Power Apps portal under the\nCustom connectors\nsection.\nSelect\nNew custom connector\nand select the method you want to use to create the connector.\nUse connectors with maker-provided credentials\nConnectors need a valid set of credentials. By default, connectors ask users (users of your agent) to enter their credentials for the associated service when they use the tool. 
To have your agent use the maker's credentials, follow these steps:\nConfigure your agent to use an\nauthenticated channel\n.\nAdd a connector tool to your agent, and configure it.\nGo to the connector tool\nOverview\npage.\nUnder\nDetails\n>\nAdditional details\n>\nCredentials to use\n, select\nMaker-provided credentials\n.\nPublish and test the experience in the\nTest your agent\npane, or in the desired channel.\nShare connection\nTo share your connection with others:\nGo to\nmake.powerapps.com\n.\nSelect\nConnections\nin the left navigation bar.\nSelect the connection and select\nShare\n.\nIn the\nShare\ndialog, search for the desired user and select the user.\nUnder\nPermission\n, next to the user, select\nCan use + share\n.\nLimitations\nSingle sign-on (SSO) isn't supported for connectors when an agent uses custom Active Directory authentication and the agent is deployed to Microsoft Teams. In this configuration, connectors can't use SSO, and users must authenticate to each connector manually. To enable your users to connect successfully, make sure the necessary connections are set up in Copilot Studio. 
For guidance on setting up connections, see\nConfigure and manage connections\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Connectors", @@ -458,18 +458,18 @@ "https://learn.microsoft.com/en-us/microsoft-copilot-studio/knowledge-copilot-studio": { "content_hash": "sha256:84e89b67b49371797457158a69ce9d9ec9b99b7826ce0160e119794ebfdbe47e", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nKnowledge sources summary\nFeedback\nSummarize this article for me\nIn Copilot Studio, knowledge sources work together with generative answers. When knowledge sources are added, agents can use enterprise data from Power Platform, Dynamics 365 data, websites, and external systems. Knowledge sources allow your agents to provide relevant information and insights for your customers.\nPublished agents that contain knowledge use the configured knowledge sources to ground the published agent. 
Knowledge can be incorporated at the agent level, in the\nKnowledge\npage, or at the topic level, with a\ngenerative answers node\nin an agent topic.\nKnowledge sources can be incorporated into agents during their initial creation, added after the agent is created, or added to a generative answers topic node.\nAdd and manage knowledge for generative answers\nGenerative answers allow your agent to find and present information from multiple sources, internal or external, without having to create specific topics. Generative answers can be used as primary information sources or as a fallback source when authored topics can't answer a user's query. As a result, you can quickly create and deploy a functional agent. Makers don't need to manually author multiple topics, which might not address all customer questions.\nBy default, when you create an agent, Copilot Studio automatically creates the\nConversational boosting\nsystem topic. This topic contains a generative answers node, which allows you to begin utilizing knowledge sources immediately. 
All knowledge sources that are added at the agent level are added to generative answers node in the\nConversational boosting\nsystem topic.\nFor prerequisites and information on limitations, see\nGenerative answers\n.\nFor information on analytic metrics on a per knowledge source basis, see:\nGenerated answer rate and quality\nfor conversational agents.\nKnowledge source use\nfor autonomous agents.\nDrill down on a theme\nfor knowledge source metrics in the context of themes.\nSupported knowledge sources\nName\nSource\nDescription\nNumber of inputs supported in generative answers\nAuthentication\nPublic website\nExternal\nSearches the query input on Bing, only returns results from provided websites\nGenerative mode: 25 websites\nClassic mode: Four public URLs (for example,\nmicrosoft.com\n)\nNone\nDocuments\nInternal\nSearches documents uploaded to Dataverse, returns results from the document contents\nGenerative mode: All documents\nClassic mode: Limited by the Dataverse file storage allocation\nNone\nSharePoint\nInternal\nConnects to a SharePoint URL, uses GraphSearch to return results\nGenerative mode: 25 URLs\nClassic mode: Four URLs per generative answers topic node\nAgent user's Microsoft Entra ID authentication\nDataverse\nInternal\nConnects to the configured Dataverse environment and uses a retrieval-augmented generative technique in Dataverse to return results\nGenerative mode: Unlimited\nClassic mode: Two Dataverse knowledge sources (and up to 15 tables per knowledge source)\nAgent user's Microsoft Entra ID authentication\nEnterprise data using connectors\nInternal\nConnects to connectors where your organization data is indexed by Microsoft Search\nGenerative mode: Unlimited\nClassic mode: Two per custom agent\nAgent user's Microsoft Entra ID authentication\nNote\nAgent user authentication for knowledge sources means that when a specific user asks a question of the agent, the agent only surfaces content that the specific user can access.\nKnowledge 
sources in generative answers nodes currently don't support Bing Custom Search, Azure OpenAI, or Custom Data. Instead, from the generative answers node properties, use the\nClassic data\noption for\nBing Custom Search\n,\nAzure OpenAI\n, or\nCustom Data\nsources.\nFor websites, you need to confirm which website(s) your organization owns that Bing will search through Copilot Studio.\nYou can perform language-agnostic querying across all supported file types and languages.\nIf you're using unstructured data, such as individual SharePoint files and folders, OneDrive files and folders, or connectors, there are different limits and limitations. For more information, go to\nLimits and limitations\n.\nCurrently, citations returned from a knowledge source can't be used as inputs to other tools or actions.\nKnowledge search in classic and generative modes\nHow knowledge sources are searched depends on which\norchestration mode\nthe agent uses:\nclassic\nor\ngenerative\n.\nClassic orchestration\nWhen an agent is configured to use classic orchestration, the following applies:\nIn the\nConversational boosting\nsystem topic, the number of knowledge sources the agent can search is limited, and depends on the type of knowledge source. Your agent can search any combination of knowledge sources, up to the maximum number indicated for each type in the following table:\nType of knowledge source\nLimit\nAzure OpenAI Service connection\n5\nBing Custom Search Custom Configuration IDs\n2\nCustom data sources\n3\nDataverse knowledge sources\n2 sources with up to 15 tables each\nSharePoint URLs\n4\nUploaded files\nUnlimited\nWebsite URLs\n4\nYou can also embed a\ngenerative answers node\nin a topic, so that a search is performed for specific intents, and not only as a fallback. 
The preceding knowledge source limits apply.\nClassic orchestration supports\ncustom data sources\n, in addition to the other knowledge sources.\nGenerative orchestration\nWhen an agent is configured to use generative orchestration, the following applies:\nIf there are more than 25 different knowledge sources, the agent filters the knowledge sources using an internal GPT based on the description given to the knowledge source. More information:\nAuthoring descriptions\nNote\nFiles uploaded\nto the agent aren't part of the 25 knowledge source search limit.\nGenerative orchestration doesn't support\ncustom data\nor\nBing Custom Search\nas knowledge sources. To use those knowledge sources, you must embed them inside a\ngenerative answers node\nin a topic.\nEnable Web Search for your agent\nThe\nUse information from the web\nsetting is available on the\nGenerative AI\nsettings page or the\nWeb Search\nsetting in the\nKnowledge\nsection of the agent's\nOverview\npage. This setting lets your agent access broad, real-time, and up-to-date information beyond what is available in predefined or enterprise-specific knowledge bases. This setting requires that the agent has generative orchestration turned on.\nWhen turned on, the\nUse information from the web\n/\nWeb Search\nsetting is triggered when a user's question might benefit from information on the web. It searches all public websites indexed by Bing. This type of search happens in parallel with any searches of public websites you added as knowledge sources. Results from\nUse information from the web\n/\nWeb Search\nare interleaved with results from your configured public website knowledge sources.\nNote\nUse information from the web\n/\nWeb Search\nuses\nGrounding with Bing Search\nto return information from the web.\nAllow the agent to use general knowledge\nThe\nUse general knowledge\nsetting on the\nGenerative AI\nsettings page configures your agent to use\ngenerative AI\n. 
This setting requires that the agent has generative orchestration turned on.\nGenerative AI includes general knowledge, which refers to the foundational knowledge that the generative AI is trained on. When this setting is turned on, it allows your agent to use this general knowledge in its answers. This general knowledge setting means that the agent answers questions unrelated to the domain of your agent. If you prefer that your agent is grounded with your specific knowledge sources only, turn off this setting.\nTenant graph grounding with semantic search\nThe\nTenant graph grounding with semantic search\nsetting on the\nGenerative AI\nsettings page determines whether your agent uses\nsemantic search\nto improve search results. This setting requires that the agent has generative orchestration turned on.\nThis feature requires the agent to share a tenant with a Microsoft 365 Copilot license. It also requires that a semantic index is configured for use. To use a semantic index, the Microsoft 365 Copilot license must be assigned to at least one user in the enterprise.\nImportant\nThe\nTenant graph grounding with semantic search\nfeature requires that the agent's\nuser authentication\nis set to\nAuthenticate with Microsoft\n. If authentication is set to any other method than\nAuthenticate with Microsoft\n, the setting can't be changed.\nWhen the feature is turned on and the maker has a Microsoft 365 license in the same tenant, the agent supports SharePoint and connectors containing files up to 200 MB. The feature is turned on by default.\nNote\nFor agents grounded in SharePoint knowledge sources, turning on\nTenant graph grounding with semantic search\nprovides significantly better knowledge retrieval and response quality. This feature uses cutting-edge internal retrieval tools that allow the agent to obtain a greater volume of context, with greater precision. 
However, due to the increased system complexity, certain users and queries might experience a small increase in latency.\nIf you don't have a Microsoft 365 Copilot license in the same tenant as your agent, or you experience lower response quality, turn off the feature.\nThe agent maker doesn't need to have a Microsoft 365 Copilot license to create an agent with a semantic index.\nSharePoint and Microsoft Copilot connectors support files up to 512 MB if they have PDF, PPTX, or DOCX extensions. For more information on supported file types, see\nSupported content types\n.\nThe\nTenant graph grounding with semantic search\nfeature is a separate feature from the\nDataverse search\nfeature. For more information about how Dataverse search works, see\nFrequently asked questions about Dataverse search\n.\nSource authentication\nIf you're using SharePoint, Dataverse, or enterprise data using Microsoft Copilot connectors, you need to incorporate authentication. For more information, see\nConfigure user authentication in Copilot Studio\n, and for individual generative answers nodes, see\nAuthentication\n.\nIn addition, you might need to account for\nURL considerations\nthat require extra authentication for your sources.\nContent moderation\nThe content moderation settings allow your agent to provide more answers. However, the increase in answers might affect the allowance of\nharmful content\nfrom the agent.\nThe following areas allow you to configure the content moderation settings:\nThe setting in the\nGenerative AI\nsettings page sets the moderation at the agent level.\nThe setting in the generative answers node sets the moderation at the topic level.\nThe setting in the prompt tool sets the moderation at the prompt level.\nAt runtime, the setting at the topic level takes precedence. 
If content moderation isn't set at the topic level, it defaults to the\nGenerative AI\nsettings configuration.\nTo override agent or topic content moderation for prompt tools, configure the\nCompletion\nsetting of the prompt tool to\nsend a specific response\n.\nTo adjust the\ncontent moderation settings at the agent level\n, change your agent's\nGenerative AI\noption to\nGenerative\n.\nSelect the desired moderation level for your agent.\nThe moderation levels range from\nLowest\nto\nHighest\n. The lowest level generates the most answers, but they might contain harmful content. The highest level of content moderation generates fewer answers, and applies a stricter filter to restrict harmful content. The default moderation level is\nHigh\n.\nSelect\nSave\n.\nTo adjust the\ncontent moderation settings at the topic level\n, change the setting in your generative answers node.\nTo adjust the\ncontent moderation settings for the prompt tool\n, change the setting in the prompt builder.\nOfficial sources\nWhen adding knowledge sources to your agent, you might not always control how the information evolves over time, or you might not fully trust this information. It's important to let your users know that they should consider answers with caution, and they should verify them when appropriate.\nHowever, when you know that information from a specific knowledge source goes through a strict verification process and is highly trusted, you can mark this knowledge source as an official source that can be used directly, without verification.\nTo mark a knowledge source as official, on the\nKnowledge\npage, select the three dots (\n⋮\n) for the knowledge source, point to\nOfficial source\nand select\nYes\n.\nNote\nThis feature isn't yet compatible with\ngenerative orchestration\n. 
If you want your agent to use official knowledge sources and mark them as such, turn off generative orchestration.\nWhen an agent uses authoritative knowledge sources, the response starts with a distinctive indication.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Knowledge Sources", "section": "Copilot Studio" }, "https://learn.microsoft.com/en-us/microsoft-copilot-studio/nlu-gpt-overview": { - "content_hash": "sha256:bae2de8906a38fc14a4a2cedaf77830cbb3f91a96d386c1aab651ce181b62166", - "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nAI-based agent authoring overview\nFeedback\nSummarize this article for me\nCopilot Studio offers generative AI features to reduce manual authoring and dramatically expand the scope of an agent's knowledge and its ability to interact with users.\nGenerative AI is an artificial intelligence technology that uses language models to generate original content and provide natural language understanding and responses. Learn more about\nGenerative AI\nin the Artificial Intelligence (AI) playbook.\nIn Copilot Studio, you can use the following generative AI features to retrieve and create content, either individually or all together.\nCreate an agent\n. 
With no manual authoring of topics required, an\nempty\nagent can generate answers based on knowledge sources you specify such as websites and files. See\nGenerative answers\nand the\nQuickstart\n.\nHarness AI general knowledge\n. When this option is enabled, the agent can answer general questions unrelated to your specific knowledge sources or topics. See\nAI general knowledge\n.\nAuthor topics using natural language\n. Describe what you want your topic to do, and Copilot Studio creates it for you. Your agent includes conversational responses and multiple types of nodes. Use the suggested default topic as a starting point for further development. See\nCreate and edit topics with Copilot\n.\nAuthor prompts using natural language\n. Describe the prompt you want to create, and Copilot Studio generates it for you. You can use the suggested default prompt as a starting point for further development. See\nCreate and edit prompts with Copilot\n.\nCreate agent flows using natural language\n. Describe the overall flow you want your agent to follow, and Copilot Studio generates it for you. You can use the suggested default flow as a starting point for further development. See\nBuild an agent flow with natural language\n.\nTurn on generative orchestration\n. Let the agent select the most appropriate topics, tools, agents, and knowledge sources at runtime. See\nOrchestrate agent behavior with generative AI\n.\nUsing generative AI in Copilot Studio transforms how you build agents, significantly reducing manual work and configuration.\nGenerative answers\nGenerative answers in Copilot Studio let your agent find and present information from multiple sources, internal or external, without creating topics. Generative answers can be used as primary information sources or as a fallback source when authored topics can't answer a user's query. As a result, you can quickly create and deploy a functional agent. 
You don't need to manually author multiple topics that might not address all customer questions.\nWhat changed?\nIn traditional chatbots, when an agent can't determine a user's intent, it asks the user to rephrase their question. If after two prompts, the agent still can't determine the user's intent, the agent escalates to a live agent, using the\nEscalate\nsystem topic.\nWith generative answers, before escalating to a live agent, the agent uses natural language processing (NLP) to:\nParse what a user types to determine what they're asking.\nFind, collate, and parse relevant information from a specified source. This source can be your company's website, or multiple sources, including SharePoint.\nSummarize search results into plain language delivered to the agent user.\nYour workflow might look like this:\nYou create an agent and enable the\nGenerative\noption in the\nGenerative AI\npage of Settings. You test the agent thoroughly.\nAfter testing, you publish your agent to instantly provide answers, help, and guidance to your agent users.\nYou create individual topics for frequently asked questions. These topics might develop from\nanalytics from previous agents\nor existing support issues.\nAI general knowledge\nIn addition to generative answers, you can use AI general knowledge to allow your agent to find and present information in response to your customer's questions. General knowledge saves you from needing to manually author multiple topics, which might not address all your customer's questions. 
It can also help when a user's intent can't be addressed by existing agent topics.\nWhat is AI general knowledge?\nAI general knowledge applies the capabilities of AI to access and provide information, insights, and assistance across a wide range of topics.\nWhy use it?\nAccessibility\n: The agent can instantly access a vast repository of information and expertise across a wide range of subjects.\nVersatility\n: It's capable of addressing diverse topics and tasks, making it a versatile resource for various needs.\nNote\nWhile AI general knowledge can provide valuable information and assistance, you need to critically evaluate the information it provides and consider consulting other sources for verification or further clarification when necessary.\nPrerequisites\nAn account for Copilot Studio. If you don't have an account, follow the instructions in\nSign up for a Copilot Studio trial\n.\nThe current version of Copilot Studio. The agent type must not be Classic. Classic agents have (classic) added to their name, for example \"Contoso store hours (classic).\"\nReview AI response generation training, model, and usage in the\nFAQ for generative answers\nand\nLearn more about Azure OpenAI\n.\nWhat's supported?\nAI-based authoring might be subject to usage limits or capacity throttling.\nQuotas\nQuotas are default constraints applied to agents that limit how often messages can be sent to an agent. The purpose of quotas is to throttle the client's service load, which protects a service from being overloaded and the client from unexpected resource usage.\nAgents with generative answers enabled have a limit on the number of queries they can make to derive answers from the URL you specified. 
Normal conversations that use agent topics follow the\nusual quotas and limitations\n.\nLanguages\nSee\nLanguage support\nfor the list of supported languages.\nRelated content\nGet up and running with\nQuickstart: Create and deploy an agent\n.\nAdd\nknowledge sources\nto your agent.\nHave a conversation to\nauthor topics using natural language\n.\nUse\ngenerative orchestration\nto call your actions automatically at runtime.\nBuild an agent flow with natural language\n.\nCreate prompts with Copilot\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:65aed619e47ee85536ca42fc004108748440502d5e3e2b3c251c6b4337992108", + "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nAI-based agent authoring overview\nFeedback\nSummarize this article for me\nCopilot Studio offers generative AI features to reduce manual authoring and dramatically expand the scope of an agent's knowledge and its ability to interact with users.\nGenerative AI is an artificial intelligence technology that uses language models to generate original content and provide natural language understanding and responses. 
Learn more about\nGenerative AI\nin the Artificial Intelligence (AI) playbook.\nIn Copilot Studio, you can use the following generative AI features to retrieve and create content, either individually or all together.\nCreate an agent.\nWith no manual authoring of topics required, an\nempty\nagent can generate answers based on knowledge sources you specify such as websites and files. Learn more in\nQuickstart: Create and deploy an agent\n.\nHarness AI general knowledge.\nWhen\nUse general knowledge\nis turned on, the agent can answer general questions unrelated to your specific knowledge sources or topics. Learn more in\nAllow the agent to use general knowledge\n.\nAuthor topics using natural language.\nDescribe what you want your topic to do, and Copilot Studio creates it for you. Your agent includes conversational responses and multiple types of nodes. Use the suggested default topic as a starting point for further development. Learn more in\nCreate and edit topics with Copilot\n.\nAuthor prompts using natural language.\nDescribe the prompt you want to create, and Copilot Studio generates it for you. You can use the suggested default prompt as a starting point for further development. Learn more in\nCreate a prompt with Copilot\n.\nCreate agent flows using natural language.\nDescribe the overall flow you want your agent to follow, and Copilot Studio generates it for you. You can use the suggested default flow as a starting point for further development. Learn more in\nBuild an agent flow with natural language\n.\nUse generative orchestration.\nLet the agent select the most appropriate topics, tools, agents, and knowledge sources at runtime. Learn more in\nOrchestrate agent behavior with generative AI\n.\nThe generative AI features of Copilot Studio transform how you build agents, significantly reducing manual work and configuration.\nPrerequisites\nHave an account for Copilot Studio. 
If you don't have an account, follow the instructions in\nSign up for a Copilot Studio trial\n.\nUnderstand the\nlimitations of generative answers\n.\nLearn about\nAzure OpenAI\n.\nGenerative answers\nWith generative answers, your agent can find and present information from multiple sources, both internal and external. You don't need to manually create multiple topics that might not address all customer questions. Use generative answers as primary information sources or as a fallback when authored topics can't answer a user's query. By using generative answers, you can quickly create and deploy a functional agent.\nWhat changed?\nWhen a traditional chatbot can't determine a user's intent, it asks the user to rephrase their question. If after two prompts, the chatbot still can't determine the user's intent, it escalates to a live agent by using the\nEscalate\nsystem topic.\nBefore it escalates to a live agent, a Copilot Studio agent with generative answers capabilities uses natural language processing (NLP) to:\nParse what a user types to determine what they're asking.\nFind, collate, and parse relevant information from specified sources, such as your company's website or multiple sources including SharePoint.\nSummarize search results into plain language delivered to the agent user.\nYour workflow might look like this:\nYou create an agent and test it thoroughly. By default, your agent has generative orchestration turned on in its\nGenerative AI\nsettings.\nAfter testing, you publish your agent. It can instantly provide answers, help, and guidance to your customers.\nYou create individual topics for frequently asked questions. These topics might develop from\nanalytics from previous agents\nor existing support issues.\nAI general knowledge\nIn addition to generative answers, you can let your agent use AI general knowledge to find and present information in response to your customer's questions. 
General knowledge saves you from needing to manually create multiple topics, which might not address all your customer's questions. It can also help when the existing topics can't address the customer's intent.\nWhat is AI general knowledge?\nAI general knowledge applies the capabilities of AI to access and provide information, insights, and assistance across a wide range of topics.\nWhy use it?\nAccessibility\n: The agent can instantly access a vast repository of information and expertise across a wide range of subjects.\nVersatility\n: It's capable of addressing diverse topics and tasks, making it a versatile resource for various needs.\nNote\nWhile AI general knowledge can provide valuable information and assistance, you need to critically evaluate the information it provides and consider consulting other sources for verification or further clarification when necessary.\nWhat's supported?\nAI-based authoring might be subject to usage limits or capacity throttling.\nNote\nLong prompts can cause\ngenerative answers nodes\nto fail due to usage limits or capacity throttling. Use short, focused prompts or split queries. Learn more in\nFAQ for generative answers\n.\nQuotas\nQuotas are default constraints applied to agents that limit how often you can send messages to an agent. The purpose of quotas is to throttle the client's service load, which protects a service from being overloaded and protects the client from unexpected resource usage.\nAgents with generative answers capabilities have a limit on the number of queries they can make to derive answers from the URL you specified. 
Normal conversations that use agent topics follow the\nusual quotas and limitations\n.\nLanguages\nFor the list of supported languages, see\nLanguage support\n.\nRelated content\nGet up and running with\nQuickstart: Create and deploy an agent\n.\nAdd\nknowledge sources\nto your agent.\nHave a conversation to\nauthor topics using natural language\n.\nUse\ngenerative orchestration\nto call your actions automatically at runtime.\nBuild an agent flow with natural language\n.\nCreate a prompt with Copilot\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Generative AI", "section": "Copilot Studio" }, @@ -485,7 +485,7 @@ "https://learn.microsoft.com/en-us/microsoft-copilot-studio/external-security-provider": { "content_hash": "sha256:f7ece1e2c133529ac7687c247bf75d9e3374530b15b0f9ebaf82dc880288b271", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nEnable external threat detection and protection for Copilot Studio custom agents (preview)\nFeedback\nSummarize this article for me\n[This article is prerelease documentation and is subject to change.]\nCustom agents created in Copilot Studio are secure by default. They include built-in protection against various threats, such as user prompt-injection attacks (UPIA) and cross-domain prompt injection Attacks (XPIA). 
At runtime, the agent blocks attacks of these types, reducing the risk of data exfiltration.\nTo further increase the monitoring capabilities and security of custom agents, Copilot Studio lets organizations configure\nexternal threat detection systems\nfor enhanced oversight. These tools operate during the agent's runtime, continuously evaluating agent activity. If the system detects any suspicious tools or actions, it can intervene to block them from executing. This threat detection provides an extra layer of real-time protection and compliance enforcement.\nImportant\nExternal threat detection is only called on generative agents that use generative orchestration. External threat detection is skipped for classic agents.\nHow it works\nAn external threat detection system is set up as a web service, exposing a REST API with a threat detection endpoint. A secure connection is configured between the agent and the endpoint. At runtime, every time the orchestrator considers invoking a tool, it sends relevant data about the proposed tool use to the threat detection endpoint for evaluation. The threat detection system analyzes the data and returns a decision to either allow or block the tool invocation.\nIf the threat detection system detects a security issue during an agent's operation, the agent immediately stops processing, and notifies the user that their message is blocked. On the other hand, if the system approves the operation, the agent proceeds seamlessly, with no visible effect or interruption for the user.\nImportant\nThis article contains Microsoft Copilot Studio preview documentation and is subject to change.\nPreview features aren't meant for production use and may have restricted functionality. 
These features are available before an official release so that you can get early access and\nprovide feedback\n.\nIf you're building a production-ready agent, see\nMicrosoft Copilot Studio Overview\n.\nOptions for setting up external threat detection\nCopilot Studio supports a flexible \"bring your own protection\" approach. Organizations have the freedom to integrate security solutions that best fit their unique requirements.\nOptions include:\nDevelop your own custom monitoring tools, or have someone develop them for you. For more information on how to set up the system endpoint so that your agent can connect to it, see\nBuild a runtime threat detection system for Copilot Studio agents\nApply a robust enterprise solution by\nMicrosoft Defender\nUse products from other trusted security providers\nWhat data is shared with the threat detection provider?\nOnce you configure a connection to a threat detection system, the agent shares data with the external security provider during its run. The agent communicates with the service whenever it considers invoking a tool. This data sharing ensures efficient decision-making by the configured system, without degrading the experience of your agent's users.\nThe high-level data shared with the system includes:\nThe user's recent prompt and the latest history of chat messages exchanged between the agent and the user.\nOutputs of previous tools used by the agent.\nConversation metadata: Identity of the agent, the user who interacts with it, the user's tenant, and the trigger that triggered it (where applicable).\nThe tool the agent wants to invoke, including agent-generated reasoning of why this tool was selected and the proposed inputs and values.\nImportant\nThe provider data-handling policies might be different from the policies used by Microsoft. 
The differences could include processing and storing your data outside your geographic region.\nYou must ensure that the provider and terms meet the standards and comply with the regulations required to protect your organization's data.\nIf you want to block sharing data with the threat detection service, you can disconnect the integration at any time.\nPrerequisites\nBefore you begin, you need:\nAn external threat detection service set up to evaluate agent tool use requests. The service needs to expose a REST API endpoint. For the setup on the Copilot Studio side of the integration, you need the\nbase URL\nfor the security provider web service. This article refers to this URL as the\nendpoint\n. The agent sends requests for threat detection to APIs at this base URL. You should receive this URL from your security provider.\nA Microsoft Entra tenant where you can register an application for authentication between the agent and the threat detection service.\nA user with a Power Platform Administrator role to configure a connection between the agent and the external threat detection system for both the individual environment level, and the environment group level.\nConfigure an external threat detection system\nThe process to configure an external threat detection system for your agent has two steps:\nConfigure a Microsoft Entra application.\nConfigure threat detection in Power Platform admin center.\nStep 1: Configure a Microsoft Entra application\nThere are two paths you can take to configure a Microsoft Entra application:\nOption A: Configure using PowerShell script (recommended)\nOption B: Configure manually using Azure portal\nOption A: Configure using PowerShell (Recommended)\nYou can use a provided PowerShell script to automate the creation and configuration of your Microsoft Entra application.\nPrerequisites for PowerShell Configuration\nWindows PowerShell 5.1 or later\nSufficient permissions to create application registrations in your Microsoft Entra 
tenant\nThe base URL of the threat detection web service. The URL is referred to as the\nEndpoint\nin the script parameters that follow. You should receive this URL from your security provider.\nYour organization's Microsoft Entra tenant ID. The tenant ID is referred to as the\nTenantId\nin the script parameters that follow.\nDownload and prepare the script\nDownload the\nCreate-CopilotWebhookApp.ps1\nscript.\nScript parameters\nThe script accepts the following parameters:\nParameter\nType\nRequired\nDescription\nTenantId\nString\nYes\nYour Microsoft Entra tenant ID in GUID format (for example,\n12345678-1234-1234-1234-123456789012\n).\nEndpoint\nString\nYes\nThe base URL for the external threat detection service (provided by your security provider). If you're using Microsoft Defender as your security provider, you can get the endpoint from the\nDefender portal\n.\nDisplayName\nString\nYes\nA unique display name you provide for the application registration. Can be between 1 and 120 characters.\nFICName\nString\nYes\nA unique name you provide for the Federated Identity Credential. Can be between 1 and 120 characters.\nDryRun\nSwitch\nNo\nOptional flag. When the\n-DryRun\nflag is provided, the script performs a validation run without creating resources.\nExecute the script\nTo create the application:\nOpen Windows PowerShell as an administrator.\nGo to the directory containing the script.\nExecute the following script, replacing the placeholder values for\nTenantId\n,\nEndpoint\n,\nDisplayName\n, and\nFICName\nwith your own parameters:\n.\\Create-CopilotWebhookApp.ps1 `\n-TenantId \"11111111-2222-3333-4444-555555555555\" `\n-Endpoint \"https://provider.example.com/threat_detection/copilot\" `\n-DisplayName \"Copilot Security Integration - Production\" `\n-FICName \"ProductionFIC\"\nThe interactive script runs in the command line. The script outputs the App ID of the created Microsoft Entra application. 
You need the App ID later when configuring threat detection in Power Platform admin center.\nOption B: Configure manually using Azure portal\nPrerequisites for manual configuration\nYour organization's Microsoft Entra tenant ID. The tenant ID is referred to as the\ntenantId\nin the instructions that follow.\nSufficient permissions to create application registrations in your Microsoft Entra tenant.\nThe base URL of the threat detection web service you're using. This is referred to as the\nendpoint\nin the following instructions. You should receive this URL from your security provider.\nRegister a Microsoft Entra application\nFollow these steps to create a Microsoft Entra application registration. The application is used to secure authentication between the agent and the threat detection web service. See\nRegister an application in Microsoft Entra ID\nto learn how to create such an app.\nSign in to Azure portal and navigate to the\nMicrosoft Entra ID\npage.\nUnder\nApp registrations\n, select\nNew registration\n.\nProvide a name and select\nAccounts in this organizational directory only (Single tenant)\nas the supported account type.\nRegister\nthe app.\nAfter the app is created, copy the App ID. You need the App ID later when configuring threat detection in Power Platform admin center.\nAuthorize the Microsoft Entra application with your provider of choice\nThe agent uses Federated Identity Credentials (FIC) as a secure, secret-less authentication method for exchanging data with the threat detection system provider. Follow these steps to configure FIC for your Microsoft Entra application. For more information, see\nConfigure a user-assigned managed identity to trust an external identity provider\n.\nOpen the Azure portal and go to\nApp registrations\n. 
Select the application you created in step 1 above.\nIn the sidebar, select\nManage\n>\nCertificates & secrets\n>\nFederated credentials\n.\nSelect\nAdd credential\n.\nIn the Federated credentials scenario drop-down, select\nOther issuer\n.\nFill the fields according to these instructions:\nIssuer\n: Enter the following URL, replacing\n{tenantId}\nwith your organization's Microsoft Entra tenant ID:\nhttps://login.microsoftonline.com/{tenantId}/v2.0\nType\n: Select\nExplicit subject identifier\n.\nValue\n: Input a string structured as follows:\n/eid1/c/pub/t/{base 64 encoded tenantId}/a/m1WPnYRZpEaQKq1Cceg--g/{base 64 encoded endpoint}\nPerform base64 encoding for your organization's Microsoft Entra tenant ID and the base URL of the threat detection web service. Replace the placeholder\n{base 64 encoded tenantId}\nwith the base64-encoded value of your tenant ID, and the placeholder\n{base 64 encoded endpoint}\nwith the base64-encoded base URL.\nTo get the base64 encoding of your tenant ID and endpoint URL, use the following PowerShell script. 
Make sure to replace the placeholder values \"11111111-2222-3333-4444-555555555555\" and \"https://provider.example.com/threat_detection/copilot\" with your actual values for tenant ID and endpoint URL:\n# Encoding tenant ID\n$tenantId = [Guid]::Parse(\"11111111-2222-3333-4444-555555555555\")\n$base64EncodedTenantId = [Convert]::ToBase64String($tenantId.ToByteArray()).Replace('+','-').Replace('/','_').TrimEnd('=')\nWrite-Output $base64EncodedTenantId\n\n# Encoding the endpoint\n$endpointURL = \"https://provider.example.com/threat_detection/copilot\"\n$base64EncodedEndpointURL = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($endpointURL)).Replace('+','-').Replace('/','_').TrimEnd('=')\nWrite-Output $base64EncodedEndpointURL\nName\n: Choose a descriptive name.\nSelect the\nAdd\nbutton.\nStep 2: Configure the threat detection system\nNext, you need to configure the threat detection system in Power Platform admin center to connect your agent to the external security provider.\nPrerequisites for configuring threat detection in Power Platform admin center\nThe App ID of the Microsoft Entra application you created in the previous step.\nThe endpoint link provided by your external monitoring system's provider. The endpoint link is the same base URL endpoint you use when configuring the Microsoft Entra application.\nA user with a Power Platform Administrator role to configure the connection.\nPerform any other steps required by your security provider to authorize your registered application. You should consult your provider's documentation (as applicable) for any specific onboarding and authorization steps.\nTo configure the threat detection system in Power Platform admin center, follow these steps:\nSign in to the\nPower Platform admin center\n.\nOn the side navigation, select\nSecurity\nand then select\nThreat detection\n. The\nThreat detection\npage opens.\nSelect\nAdditional threat detection\n. 
A pane opens.\nSelect the environment for which you want to enhance agent protection and select\nSet up\n. A pane opens.\nSelect\nAllow Copilot Studio to share data with a threat detection provider\n.\nUnder\nAzure Entra App ID\n, enter the App ID of the Microsoft Entra application you created previously.\nEnter the\nEndpoint link\nprovided by your external monitoring system's provider. The endpoint link is the same base URL endpoint you use when configuring the Microsoft Entra application.\nUnder\nSet error behavior\n, define the system's default behavior for when the threat detection system doesn't respond in time or responds with an error. By default, this is set to\nAllow the agent to respond\n, but you can also choose the\nBlock the query\noption to further reduce risk.\nSelect\nSave\n.\nImportant\nThe save will fail if your Microsoft Entra app is not properly configured in Microsoft Entra or not properly authorized with your provider of choice.\nNote\nOnce configured, the threat detection system triggers before any tool invocation by an agent. If the agent doesn't receive a decision from the system (either allow or block) within one second, it proceeds to\nallow\nthe tool to execute as planned.\nTroubleshooting\nHere's some information on issues that might occur and how to handle them.\nPower Platform admin center threat detection configuration issues\nThe following table describes common errors that might happen when you select\nSave\nin the previous step, and how to handle these errors:\nError\nHow to handle\nThere was a problem saving your settings. Try saving again, and if that doesn't work, contact your admin for help.\nA general issue in saving the configuration. Try again. If that doesn't work, contact Copilot Studio for support.\nThere was a problem connecting to the protection provider. Contact the provider for help.\nThis error is displayed when a call to the provided endpoint times out or fails. 
Contact the provider and verify there are no issues with its service.\nThere was a problem connecting to the protection provider. Try checking the endpoint link. If that doesn't work, contact the protection provider for help.\nThis error is displayed when a call to the provided endpoint fails. Check the provided endpoint link and if that doesn't work, contact the threat detection service provider, and verify there are no issues with its service.\nThere was a problem connecting to the protection provider. Try again, and if that doesn't work, contact the protection provider for help.\nThis error is displayed when a call to the provided endpoint fails. Try again, and if that doesn't work, contact the provider and verify there are no issues with its service.\nThere was a problem with the configuration. Try checking the details you entered and the Microsoft Entra configuration. If the problem persists, contact your admin for help.\nThe token acquisition failed. Check the Microsoft Entra application configuration and the Federated Identity Credentials. More details on the specific issue can be found after selecting \"Copy error info.\"\nTo change a configuration, make sure you have Power Platform admin permissions.\nHave a user with the required permissions\nFor more error details, select\nCopy error info\n.\nCommon Microsoft Entra and authentication issues\nHere are some other common issues that might occur with your Microsoft Entra app and authentication.\nMicrosoft Entra application doesn't exist\nExample\n: Failed to acquire token: AADSTS700016: Application with identifier '55ed00f8-faac-4a22-9183-9b113bc53dd4' wasn't found in the directory 'Contoso'. This can happen if the application isn't installed by the administrator of the tenant or consented to by any user in the tenant. 
You might have sent your authentication request to the wrong tenant.\nHow to handle\n: Make sure the application ID provided is correct and exists in Azure.\nNo FIC configured on the app\nExample:\nFailed to acquire token: A configuration issue is preventing authentication—check the error message from the server for details. You can modify the configuration in the application registration portal. See\nhttps://aka.ms/msal-net-invalid-client\nfor details. Original exception: AADSTS70025: The client '57342d48-0227-47cd-863b-1f4376224c21'(Webhooks test) has no configured federated identity credentials.\nHow to handle\n: The app provided doesn't have any FIC configured on it. Follow the documentation and configure FIC accordingly.\nInvalid FIC Issuer\nExample:\nFailed to acquire token: A configuration issue is preventing authentication—check the error message from the server for details. You can modify the configuration in the application registration portal. See\nhttps://aka.ms/msal-net-invalid-client\nfor details. Original exception: AADSTS7002111: No matching federated identity record found for presented assertion issuer 'https://login.microsoftonline.com/262d6d26-0e00-40b3-9c2f-31501d4dcbd1/v2.0'. Make sure the federated identity credential Issuer is 'https://login.microsoftonline.com/{tenantId}/v2.0'.\nHow to handle\n: No FIC with the expected issuer was found on the app. Open your FIC configuration and set the issuer to\nhttps://login.microsoftonline.com/{tenantId}/v2.0\n(filling in your tenant ID).\nInvalid FIC Subject\nExample:\nFailed to acquire token: A configuration issue is preventing authentication—check the error message from the server for details. You can modify the configuration in the application registration portal. See\nhttps://aka.ms/msal-net-invalid-client\nfor details. 
Original exception: AADSTS7002137: No matching federated identity record found for presented assertion subject '/eid1/c/pub/t/Jm0tJgAOs0CcLzFQHU3L0Q/a/iDQPIrayM0GBBVzmyXgucw/aHR0cHM6Ly9jb250b3NvLnByb3ZpZGVyLmNvbeKAiw'. Make sure the federated identity credential Subject is '/eid1/c/pub/t/{tenantId}/a/iDQPIrayM0GBBVzmyXgucw/aHR0cHM6Ly9jb250b3NvLnByb3ZpZGVyLmNvbeKAiw'.\nHow to handle\n: No FIC with the expected subject is found on the app. Open your FIC configuration and set the subject to the expected value as suggested in the error (fill in your tenant ID). Make sure there are no extra whitespaces or blank lines in the subject fields.\nApp isn't allowlisted with provider (Microsoft Defender specific)\nExample:\nThe application ID in your authentication token doesn't match the registered application for webhook access. Ensure you're using the correct application credentials.\nHow to handle\n: Application isn't allowlisted with the provider. Refer to the provider documentation to grant the app webhook access.\nDisconnect the protection by the threat detection system\nIf you no longer want the threat detection system to monitor your agent, follow these steps:\nSign in to the\nPower Platform admin center\n.\nOn the side navigation, select\nSecurity\nand then select\nThreat detection\n. The\nThreat detection\npage opens.\nSelect\nAdditional threat detection\n. A pane opens.\nSelect the environment for which you want to turn off enhanced agent protection and select\nSet up\n. 
A pane opens.\nUnselect\nAllow Copilot Studio to share data with your selected provider\n.\nSelect\nSave\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "External Threat Detection", @@ -494,7 +494,7 @@ "https://learn.microsoft.com/en-us/microsoft-copilot-studio/advanced-hand-off": { "content_hash": "sha256:624af01242672e835bf611a9225c9b4e18bbaf142b8d7de0c59ee17f1074ba43", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nHand off to a live agent\nFeedback\nSummarize this article for me\nWith Copilot Studio, you can configure your agent to hand off conversations to live agents seamlessly and contextually.\nWhen your agent hands off a conversation, it can share the full history of the conversation, and all relevant variables. 
With this context, a live agent that uses a connected engagement hub can be notified that a conversation requires a live agent, see the context of the prior conversation, and resume the conversation.\nFor more information about how to configure handoff with\nOmnichannel for Customer Service\n, see\nConfigure handoff to Dynamics 365 Customer Service\n.\nNote\nYou can choose to escalate an agent conversation without linking to an engagement hub:\nAt the bottom of the desired topic, select the\nAdd node\nicon\n, point to\nTopic Management\n, and select\nGo to another topic\n.\nSelect\nEscalate\n.\nEscalate\nis a\nsystem topic\nthat, by default, provides a simple message to a user if they ask for a human agent.\nYou can edit the topic to include a simple URL to a support website or ticketing system, or to include instructions for emailing or contacting support.\nPrerequisites\nAn agent built with Microsoft Copilot Studio\nAn engagement hub that is being used by live agents, such as\nOmnichannel for Customer Service\n, and you need to configure the connection, as described in\nConfigure handoff to Omnichannel for Customer Service\nConfigure the Escalate system topic\nWhen you create an agent from Dynamics 365 Customer Service, the\nEscalate\nsystem topic already includes a\nTransfer conversation\nnode. However, agents created in Copilot Studio aren't configured with this node by default. To add a\nTransfer conversation\nnode to the\nEscalate\nsystem topic, follow these steps:\nIn the side navigation pane, select\nTopics\n, switch to the\nSystem\ntab, and select the\nEscalate\ntopic.\nAt the bottom of the topic, select the\nAdd node\nicon\n, point to\nTopic Management\n, and select\nTransfer conversation\n.\nTrigger handoff to a live agent\nCustomers engaging with your agent can ask for a live agent at any point in the conversation. 
This escalation can happen in two ways, with an implicit trigger or an explicit trigger.\nUpon triggering the handoff topic, the agent starts the handoff to the configured engagement hub, and sends over all conversation context to find the next best live agent to ramp them up so they can resume the conversation.\nImplicit triggers\nIn some instances, your agent might be unable to determine the intent of a customer's conversation. For example, the customer might be asking a specific question for which there's no\ntopic\n, or no matching option within a topic.\nIn other instances, the customer might ask to be handed off to a live agent immediately. For example, a customer might type \"talk to agent\" mid-way into a conversation.\nWhen the agent detects an escalation in this manner, it automatically redirects the user to the\nEscalate system topic\n. This type of trigger is known as\nimplicit\ntriggering.\nExplicit triggers\nWhen creating topics for your agent, you may determine that some topics require interaction with a human. This type of trigger is known as\nexplicit\ntriggering.\nIn these instances, you must add a\nTransfer conversation\nnode to the topic. This node lets you add a\nPrivate message to agent\n, which is sent to the connected engagement hub to help the live agent understand the history and context of the conversation.\nNote\nConversations that reach this node are marked as\nEscalated\nsessions in\nreporting analytics\n.\nTo configure explicit triggering for a topic:\nAt the bottom of the topic, select the\nAdd node\nicon\n, then select\nSend a message\nto add a message node. Enter what the agent should say to indicate that transfer to a live agent is about to occur.\nBelow the message node, select the\nAdd node\nicon\n, point to\nTopic Management\n, and select\nTransfer conversation\n.\nEnter an optional private message to the live agent in the\nTransfer conversation\nnode. 
This optional message can be useful if you have multiple topics with\nTransfer conversation\nnodes as the information is stored in the\nva_AgentMessage\ncontext variable\n.\nThe topic starts the transfer to a live agent when this node is reached. You can test the handoff by triggering the topic in the test canvas.\nNote\nOnce you add a\nTransfer conversation\nnode into a conversation, each time you trigger handoff your users will see a \"No renderer for this activity\" message on the demo website. This message suggests the need to\ncustomize your chat canvas\nto implement custom client-side code that brings in a human agent from your engagement hub into the conversation.\nContext variables available upon handoff\nBeyond providing an automated way for a conversation to be ported into an engagement hub, it's important to ensure that the best agent for a specific problem is engaged. To help route conversations to the most appropriate live agent there are context variables that are also passed to the engagement hub.\nYou can use these variables to automatically determine where the conversation should be routed. For example, you might have added\nTransfer conversation\nnodes to several different topics, and you want to route conversations related to certain topics to specific agents.\nThe following table lists the context variables available by default.\nContext\nPurpose\nExample\nva_Scope\nRoute escalations to a live agent.\n\"agent\"\nva_LastTopic\nRoute escalations to a live agent and help them ramp-up. Includes the last topic that was triggered by an utterance from the user.\n\"Return items\"\nva_Topics\nRamp-up a live agent. Only includes topics triggered by customers using a trigger phrase. 
Doesn't include topics that were redirected to.\n[ \"Greetings\", \"Store Hours\", \"Return Item\" ]\nva_LastPhrases\nRoute escalation to a live agent and help them ramp-up.\n\"Can I return my item\"\nva_Phrases\nRamp-up a live agent.\n[\"Hi\", \"When does store open\", \"Can I return my item\" ]\nva_ConversationId\nUniquely identify an agent conversation.\n6dba796e-2233-4ea8-881b-4b3ac2b8bbe9\nva_AgentMessage\nRamp-up a live agent.\n\"Got a gift from: HandoffTest\"\nva_BotId\nIdentify the agent that's handing off a conversation.\n6dba796e-2233-4ea8-881b-4b3ac2b8bbe9\nva_Language\nRoute escalation to a live agent.\n\"en-us\"\nAll\nuser-defined topic variables\nRamp-up a live agent.\n@StoreLocation = \"Bellevue\"\nA customer might go through several topics prior to escalating. Your agent gathers all context variables across topics and merges them before sending to the engagement hub.\nIf there are topics with similarly named context variables, the agent promotes the most recently defined topic variable.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Human Agent Handoff", @@ -503,7 +503,7 @@ "https://learn.microsoft.com/en-us/microsoft-copilot-studio/authoring-test-bot": { "content_hash": "sha256:341b662f25ce8dd87ff988db1c2bf438b8ebe40947ab4456ad8b2be2d9d29ba6", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. 
You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nTest your agent\nFeedback\nSummarize this article for me\nAs you design your agent in Copilot Studio, use a test panel to see how the agent leads a customer through the conversation. It's a good way to make sure your topics work and that conversations flow as you expect.\nWhen you test an agent that uses generative orchestration, you can follow the orchestrator's plan in real time on the\nactivity map\n. Close the activity map if you want to follow through the conversation path step by step with tracking between topics turned on.\nIn addition to testing your agent in the\nTest your agent\npanel, you can create test sets of multiple queries for automated testing. For more information, see\nCreate test cases to evaluate your agent (preview)\n.\nUse the test chat\nWeb app\nClassic\nTeams\nUse the\nTest your agent\npanel to walk through your agent conversations as a user. It's a good way to make sure your topics are working and that conversations flow as you expect.\nIn addition to testing your agent in\nTest your agent\npanel, you can create test sets of multiple queries for\nautomated testing\n. To start an automated test, select the evaluate\nbutton.\nPreview a conversation\nIf the\nTest your agent\npanel is hidden, open it by selecting\nTest\nat the top of any page.\nIn the field at the bottom of the\nTest your agent\npanel, enter some text. If the text is similar to a trigger phrase for a topic, that topic begins.\nSelect the agent response in the test chat. This action takes you to the topic and the node that sent the response. 
Nodes that fired have a colored checkmark and a colored bottom border.\nAs you continue the conversation within the active topic, notice that each node that fires is marked with the checkbox and bottom border, and centered on the canvas.\nIf you want to follow the whole conversation automatically as it moves from topic to topic, select the three dots (\n…\n) at the top of the test panel and turn on\nTrack between topics\n. For an agent that uses generative orchestration (default), consider turning off\nShow activity map when testing\nto avoid having to collapse the activity map at every conversation turn.\nContinue the conversation until you're satisfied that it flows as intended.\nTip\nYou can update a topic at any time while interacting with the test agent. Save your topic to apply changes and continue the conversation with your agent.\nYour conversation isn't automatically cleared when you save a topic. If you want your agent to forget the test conversation and start over, select the\nReset\nicon\nat the top of the test panel.\nYou can use queries you send in the test chat to\ncreate automated test sets\nfor evaluating your agent.\nIf the\nTest bot\npanel is hidden, select\nTest\non the top menu bar.\nUnless you want to continue an earlier conversation, select the\nReset\nicon\nat the top of the\nTest bot\npanel to clear the previous conversation. Clearing previous conversations makes it easier to see the flow of the topic you want to test.\nAt the\nType your message\nprompt at the bottom of the\nTest bot\npanel, enter a trigger phrase for the topic you want to test. The trigger phrase starts the topic's conversation, and the\nTest bot\npanel displays the bot responses and user response choices you specified.\nContinue the conversation and verify that it flows as expected. If you want to follow the conversation automatically as it moves from topic to topic, turn on\nTracking between topics\nat the top of the panel. 
You can turn it off if you prefer to focus on a specific topic.\nSelect any response in the test chat. This action takes you to the originating node in the topic editor. The test panel automatically resets itself when you save changes to a topic.\nIf the\nTest bot\npanel is hidden, select the\nTest your chatbot\nicon at the bottom of the left pane.\nUnless you want to continue an earlier conversation, select the\nReset\nicon\nat the top of the\nTest bot\npanel to clear the previous conversation. Clearing previous conversations makes it easier to see the flow of the topic you want to test.\nAt the\nType your message\nprompt at the bottom of the\nTest bot\npanel, enter a trigger phrase for the topic you want to test. The trigger phrase starts the topic's conversation, and the\nTest bot\npanel displays the bot responses and user response choices you specified.\nContinue the conversation and verify that it flows as expected. If you want to follow the conversation automatically as it moves from topic to topic, turn on\nTracking between topics\nat the top of the panel. You can turn it off if you prefer to focus on a specific topic.\nSelect any response in the test chat. This action takes you to the originating node in the topic editor. The test panel automatically resets itself when you save changes to a topic.\nTest variable values\nYou can observe the values of your variables as you test your agent.\nSelect\nVariables\non the secondary toolbar. The\nVariables\npanel appears.\nSwitch to the\nTest\ntab and expand the desired variable categories. As you proceed with your test conversation, you can monitor the value of the variables in use.\nTo inspect variable properties, select the desired variable. The\nVariable properties\npanel appears.\nSave conversation snapshots\nWhile you're testing your agent, you can capture the content of the conversation and diagnostics data, and save it. 
You can then analyze the data to troubleshoot issues, such as the agent not responding in the way you expect.\nWarning\nThe snapshot file contains all your agent content, which might include sensitive information.\nWeb app\nClassic / Teams\nAt the top of the \nTest your agent\n pane, select the three dots (\n…\n), then select\nSave snapshot\n. A message appears, notifying you that the snapshot file might include sensitive information.\nSelect \nSave\nto save the agent content and conversational diagnostics in a .zip archive named \nbotContent.zip\n.\nThe \nbotContent.zip\n archive contains two files:\ndialog.json\n contains conversational diagnostics, including detailed descriptions of errors.\nbotContent.yml\n contains the agent's topics and other content, including entities and variables.\nAt the top of the\nTest bot\npanel, select the three dots (\n⋮\n), then select\nSave snapshot\n. A message appears, notifying you that the snapshot file might include sensitive information.\nSelect\nSave\nto save the bot content and conversational diagnostics in a .zip archive named\nDSAT.zip\n.\nThe\nDSAT.zip\narchive contains two files:\ndialog.json\ncontains conversational diagnostics, including detailed descriptions of errors.\nbotContent.json\ncontains the bot's topics and other content, including entities and variables.\nManage connections\nIf your agent requires\nuser connections\n, you can manage the connections used by your test chat: Select the three dots (\n…\n) at the top of the test panel, then select\nManage connections\n.\nReport issues\nHelp us improve Copilot Studio by reporting issues. All information collected remains anonymous.\nWeb app\nClassic / Teams\nAt the top of the \nTest your agent\n pane, select the three dots (\n…\n), then select\nFlag an issue\n.\nSelect\nFlag issue\n. This action sends your conversation ID to Microsoft. The ID is a unique identifier that Microsoft uses to troubleshoot issues in a conversation. 
When you report an issue, you don't send other information, such as what is stored in a conversation snapshot file.\nSelect the\nFlag an issue\nicon on the\nChat\nbanner at the top of the\nTest bot\npanel.\nSelect\nFlag issue\n. This action sends your conversation ID to Microsoft. The ID is a unique identifier that Microsoft uses to troubleshoot issues in a conversation. When you report an issue, you don't send other information, such as what is stored in a conversation snapshot file.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Test Your Agent", @@ -512,7 +512,7 @@ "https://learn.microsoft.com/en-us/microsoft-copilot-studio/admin-network-isolation-vnet": { "content_hash": "sha256:068da96883a115dd4432bc5bcc2b06c555851f597d2d67a6e28452e6cc2214b9", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nConfigure Virtual Network support for outbound connections from agents\nFeedback\nSummarize this article for me\nWhen you use\nVirtual Network support in a Power Platform environment\n, you can securely connect to and integrate Power Platform and Dataverse components with cloud services, or services hosted inside your private enterprise network, without exposing them to the public internet.\nCopilot Studio integrates with Power Platform virtual networks over a private endpoint for these scenarios:\nAgents that retrieve keys from Azure Key Vault\nover HTTP\nAgents that send telemetry to a private endpoint-enabled instance of\nApplication Insights\nAgents that use a virtual network-supported connector (like the SQL Server connector) to get data from Azure SQL Server\nIf you set up a virtual network for a Power Platform environment and enable Copilot Studio to\ncapture telemetry with Application Insights\nor\nmake HTTP requests with your agent\nover the virtual network, then the calls from Power Platform to Azure resources and Application Insights go through your private network.\nPrerequisites\nYour environment must be\na Managed Environment in Power Platform\nYou must have\nVirtual Network support enabled for your Power Platform environment\n. 
Also see\nSet up Virtual Network support for Power Platform\nto create virtual networks and delegate subnets that can connect between Azure resources and your Power Platform environment.\nYou must be a Power Platform\ntenant admin\nor have the\nEnvironment Admin role\nEnable virtual network support for your environment\nTo connect to services through a private endpoint, you must have\nvirtual network support enabled for Power Platform\n.\nYou can enable virtual network support manually, by following the instructions at\nSet up Virtual Network support for Power Platform\nto create virtual networks and delegate subnets that can connect between Azure resources and your Power Platform environment.\nYou can also use a prebuilt Azure Resource Manager (ARM) template to configure and connect your Power Platform environment with Azure and enable virtual network support:\nDownload the\nARM template from the Microsoft Copilot Studio samples repository on GitHub\n.\nOpen PowerShell, connect to your Azure subscription and deploy the template with the\nNew-AzDeployment command\nas follows:\nConnect-AzAccount -Subscription \"<subscription-id>\"\nNew-AzSubscriptionDeployment -Name \"<deployment-name>\" -TemplateFile \"<template-file>\" -Location \"<location>\"\nwhere:\n<subscription-id>\nis your subscription ID.\n<deployment-name>\nis the name you want to give this deployment.\nThe name can be anything you choose, but defaults to the template's filename if you leave it blank.\n<template-file>\nis the path and filename of the template file.\n<location>\nis the geographic region where you want the deployment management files to go, such as\nWest US\n. The region doesn't control where the template creates the resources.\nSee\nDeploy resources with ARM templates and Azure PowerShell\nfor more information about ARM templates and management.\nNote\nYou only need to configure your virtual network using either the ARM template, or manually. 
You don't need to do both.\nReview the overview about\nVirtual Network support for Power Platform\nbefore following the instructions at\nSet up Virtual Network support for Power Platform\nto create virtual networks and delegate subnets that can connect between Azure resources and your Power Platform environment.\nRetrieve keys from Azure Key Vault over HTTP\nWhen you\nset up a virtual network for your Power Platform environment\n, you can configure your Copilot Studio agents to retrieve information from Azure resources with HTTP calls.\nFirst, you set up a private link and endpoint for Azure Key Vault. Then, after validating that the link is working, you add an HTTP Request node from the agent's authoring canvas in Copilot Studio to connect to Key Vault.\nSet up a private link\nFollow the instructions at\nIntegrate Key Vault with Azure Private Link\nto:\nCreate a new key vault and establish a private link\nthat scopes the link to your Azure subscription and the resource group where your Key Vault is located, or\nEstablish a private link connection to an existing key vault\n.\nValidate that the private link to Key Vault is working\n.\nTip\nIf your endpoint isn't correct, review the instructions and related articles for private links and private endpoints in the\nDiagnose private links configuration issues on Azure Key Vault\narticle.\nUse HTTP Request nodes to connect over a private network\nAfter you configure the private link to Key Vault, you add an HTTP Request node to an agent in Copilot Studio to connect over the private network. You specify the connection details to the private endpoint in the node, and when that node is reached in the agent's conversation, the request is made and the information retrieved.\nIn Copilot Studio, on the top menu bar, select an environment where Virtual Network support is enabled.\nCreate or open an existing agent in that environment. 
If you create a new agent, you can skip the initial configuration steps in the conversational canvas.\nWith the agent open, create or modify a topic in the authoring canvas.\nFollow the instructions at\nMake HTTP requests\nto add an HTTP request node to the topic.\nUse the following settings in the HTTP Request node:\nURL\n: Enter the URL for your Azure Key Vault private endpoint, for example,\nhttps://yourkeyvault.vault.azure.net/secrets?api-version=7.3\n. Replace\nyourkeyvault\nwith the name of your Key Vault.\nMethod\n: Select\nGET\nto retrieve secrets from Key Vault.\nHeaders and body\n: Select edit.\nIn\nHTTP Request properties\n, enter\nAuthorization\nas the\nKey\n, and\nBearer <access token>\nas the\nValue\n, where\n<access token>\nis your Azure access token.\nSave the topic, and test that the node works by triggering the conversation in the agent's test canvas.\nSend telemetry to a private endpoint-enabled instance of Application Insights\nWhen you\nset up a virtual network for your Power Platform environment\n, you can configure your Copilot Studio agents to send telemetry to a private endpoint-enabled instance of Application Insights. Doing so allows you to monitor and analyze the performance and usage of your agents without exposing the data to the public internet.\nFirst, you set up a private link and endpoint for Application Insights. 
Then, after validating that the link is working, you connect Copilot Studio to Application Insights and it'll send telemetry data over the private link.\nSet up a private link\nAn\nAzure Private Link to Azure Monitor\nlets Copilot Studio use your virtual network to send agent telemetry to Azure Monitor over a private IP address instead of a public IP address.\nAzure Monitor is the backend data platform used to collect and store telemetry data, including Application Insights data.\nFollow the instructions at\nConfigure private link for Azure Monitor\nand:\nCreate an Azure Monitor Private Link Scope (AMPLS)\nto scope the link to your Azure subscription and the resource group where your Azure Monitor resources are located.\nConnect Application Insights component resources to the AMPLS\n.\nCreate a private endpoint for the Application Insights resources you added to the scope\nthat Copilot Studio can connect to in your virtual network and over your subnet. This endpoint is used to send telemetry data from the agent to the AMPLS.\nValidate that the private link to Azure Monitor is working\n.\nYou can also\nconfigure which networks can connect to resources in your AMPLS, without using a scope, in the\nNetwork Isolation\npage for your AMPLS\n. 
Directly configuring networks is useful if you have multiple virtual networks and want to restrict access to the AMPLS to only certain networks or subnets.\nTip\nIf your endpoint isn't correct, review the instructions and related articles for private links and private endpoints in the\nConfigure private link for Azure Monitor\narticle.\nConnect Copilot Studio to Application Insights\nWith the private link set up, you can connect Copilot Studio to Application Insights and it'll use your virtual network to send telemetry data.\nFollow the instructions at\nCapture telemetry with Application Insights\n.\nImportant\nEnsure you get the correct\nConnection string\nfor the private endpoint-enabled Application Insights instance.\nYou can validate it's the correct resource by checking the values under\nResource group\nand\nSubscription\non the\nOverview\nsection for Application Insights in the Azure portal.\nTelemetry from Copilot Studio agents appears in the Application Insights resource you configured. 
You can use the\nLive Metrics Stream\nto see telemetry data in real time, or use the\nLogs\nsection to query and analyze the data.\nUse virtual network-supported connectors to get data\nWhen you\nset up a virtual network for your Power Platform environment\n, you can configure your Copilot Studio agents to use virtual network-supported connectors to connect to data and services over your private network.\nYou can\nuse any connector that has native support for virtual networks\n.\nUsing virtual network-supported connections lets you securely connect to your cloud-hosted data sources, such as\nAzure SQL\nor\nSQL Server\n, over private endpoints without exposing them to the internet.\nFollow the instructions at\nUse Power Platform connectors in Copilot Studio\nto add and configure the connector you want to use in a topic or tool.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "VNet Support", @@ -521,7 +521,7 @@ "https://learn.microsoft.com/en-us/microsoft-copilot-studio/whats-new": { "content_hash": "sha256:54a998cb6222354487fd7d532b61c618eb1d2a7de9a4599d6f43b34244eecfb3", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nWhat's new in Copilot Studio\nFeedback\nSummarize this article for me\nThis article provides resources to learn about new features in Copilot Studio.\nRelease plans\nFor information about new features being released over the next few months that you can use for planning, see\nRelease Planner\n.\nReleased versions\nFor information about the new features, fixes, and improvements released in the past few weeks, see\nReleased versions of Microsoft Copilot Studio\n.\nNote\nReleases are rolled out over several days. New or updated functionality might not appear immediately.\nNotable changes\nThe following sections list features released in the past months, with links to related information.\nMarch 2026\n(Preview) Use\nWork IQ\ntools to connect Microsoft 365 Copilot and your agents to the Work IQ service, enabling access to real-time work insights and context from Microsoft 365 files, emails, meetings, chats, and more.\nFebruary 2026\nImproved agent responses for ticket‑based Microsoft 365 Graph\nconnectors\n. Agents more accurately retrieve\nServiceNow\ntickets and\nAzure DevOps\nwork items and generate clear, actionable summaries, which improves workflow reliability, and time to value.\nSelect Claude Sonnet 4.5 (beta) for\nComputer Use\nagents. 
This model improves nuanced decision‑making for complex tasks, increasing reliability, and success rates for advanced uses.\nEnhancements to the\nprompt builder\ninclude:\nConfigure\ncontent moderation sensitivity per prompt\nto control how hate/fairness, sexual, violence, and self‑harm content is filtered—supporting regulated and document‑processing scenarios with low or high sensitivity settings for managed models.\nOptimize\nprompts\nwith new Claude models by choosing Claude Opus 4.6 or Claude Sonnet 4.5, enabling fine‑grained control over reasoning depth, quality, latency, and cost per prompt.\nEdit prompt instructions and settings inline\nin agent tool details, bringing model selection, inputs, knowledge, and testing into a single, streamlined authoring experience.\nJanuary 2026\n(Preview) New enhancements to agent evaluations:\nProvide real‑time\nthumbs‑up/down feedback\non evaluation results to verify grading performance and drive ongoing improvements to evaluation reliability.\nView your agent's sequence of inputs, decisions, and outputs with\nactivity maps\nso you can quickly diagnose issues and get clearer insight into how your agent behaves at runtime.\nUse a validated\nCSV template\nto create test sets, reducing formatting errors, and helping your team standardize evaluation data more quickly.\n(Preview) Expand\ncomputer use\ncapabilities with new model support, built‑in credentials,\nenhanced audit logging\nwith session replay, and\nCloud PC pooling\n—giving you greater security, scalability, and governance for agent‑driven workflows.\n(General availability) Use the\nMicrosoft Copilot Studio extension for Microsoft Visual Studio Code\nto build, edit, and manage agents inside Microsoft Visual Studio Code, supporting advanced and highly flexible developer workflows.\nDecember 2025\nCompare multiple agent versions side by side to validate improvements and quickly spot regressions when\nevaluating agents with test sets\n.\nNovember 2025\nUpdates for models 
used in Copilot Studio:\nOn November 24, 2025, GPT-5 Chat rolls out to Copilot Studio in\ngeneral availability\nfor European and United States regions. You can use generally available models for orchestration in production agents. For more information on choosing orchestration models, see\nSelect a primary AI model for your agent\n.\n(Preview) Automatically create\nMicrosoft Entra agent identities\nfor agents. When turned on, automatically apply identity management to individual agents by assigning a Microsoft Entra agent identity, helping admins secure and manage agents more effectively.\nImproved knowledge retrieval for SharePoint-grounded agents using\ntenant graph grounding\n. Updated system architecture and new retrieval methods deliver more precise, context-rich responses, enhancing answer quality.\nNote\nSome queries might lead to slightly higher latency.\nImprove response accuracy with\nSharePoint metadata filters\n. Use metadata like filename, owner, and modified date to refine knowledge retrieval and ensure responses come from the most relevant, up-to-date documents.\nOrchestrate multiple agents\nto break down complex tasks across specialized agents, improving accuracy and speeding up end‑to‑end automation. Enhance your agents by linking them to other agents—either within your environment or external sources like\nMicrosoft Fabric\ndata agents—for modular, task-specific functionality.\n(Preview) Add tool groups to agents for faster setup. Quickly equip your agents with curated sets of tools from Outlook and SharePoint connectors in one step. This streamlines setup, reduces errors, and ensures consistent, reliable orchestration.\nCopy agents from\nMicrosoft 365 Copilot to Copilot Studio\n. 
Easily move agents you created in Microsoft 365 Copilot into Copilot Studio to unlock advanced capabilities like multistep workflows, custom integrations, and broader deployment options.\n(Preview) Add human input to agent workflows with the\nrequest for information\naction. Pause an agent flow to collect details from designated reviewers via Outlook, then resume execution using their responses as dynamic parameters. This action ensures workflows can handle missing data or context without relying on hard-coded values.\nUpdate Power Platform API calls to use the\nnew 'copilotstudio' namespace\n. The previous namespace will continue to work temporarily, but switching now ensures compatibility with future updates.\nUse\ncomponent collections\nwith new enhancements. Access collections directly from the sidebar, quickly export or import collections, and take advantage of support for primary agents and new connector types, including child agents and Model Context Protocol (MCP).\nOctober 2025\nUpdates for models used in Copilot Studio:\nBetween October 27 and 31, 2025, GPT‑4o will be retired in Copilot Studio for agents using generative orchestration, except for GCC customers who can continue using GPT‑4o. The new default model is\nGPT‑4.1\n, which delivers improved performance, reliability, and consistency across experiences. 
GPT‑4o remains available until November 26, 2025 if you turn on the \"\nContinue using retired models\n\" option.\nChoose from multiple\nAI models\nto tailor your agent's performance to your needs.\n(Preview) Test and deploy\nGPT‑5\nmodels to explore advanced capabilities and enhance your agent's performance.\nLearn about\nCopilot Studio Kit\n, a suite of tools developed by the Power Customer Advisory Team (Power CAT) to help test custom agents, validate AI-generated content, analyze conversation key performance indicators, and more.\n(Preview) Group related user questions into\nthemes\nand drill down into analytics to uncover patterns and gain deeper insights.\nTrack\ntime and cost savings\nfor both autonomous and conversational agents to measure ROI and optimize performance.\nAccess a\nunified activity and transcript view\n, pin sessions, and submit feedback for faster, more effective troubleshooting.\n(Preview)\nAccelerate flow execution\nto minimize timeouts and deliver a faster, smoother user experience.\nUse the\nModel Context Protocol (MCP) server\nto access dynamic, real-time content—such as files, database records, and API responses—for richer context and improved agent responses.\n(Preview) Evaluate your agents using customizable\ntest sets\n—whether uploaded, manually created, or AI-generated. Test sets can include test cases using different test methods (graders) measured against defined reference answers, helping teams identify strengths and areas for improvement. 
This capability supports more reliable, high-quality agent experiences across diverse scenarios.\nSeptember 2025\n(Preview) Automate tasks in desktop applications on Windows using\nComputer-Using Agents (CUA)\n, which combines vision and reasoning to interact with interfaces—even when APIs aren't available.\nEmbed Copilot agents into Android, iOS, and Windows apps using the\nClient SDK\nto provide rich, multimodal conversations within native experiences.\n(Preview) Upload Excel, CSV, and PDF files for your agent to analyze using Python code, powered by the\ncode interpreter in chat\n.\nAugust 2025\n(General availability) Use\ncode interpreter\nto generate Python code-based actions from natural language in both the prompt builder and agent workflows.\n(General availability) Enhance agentic response accuracy in Copilot Studio agents by using\nfile groups\nto organize local files to be uploaded as a single knowledge source and apply variable-based instructions.\nAllow users to\nupload files and images\nthat your Copilot Studio agent can analyze and use to generate responses, then pass those files to downstream systems using Agent Flows, Power Automate, connectors, tools, and topics for seamless integration.\n(General availability) Track and analyze unanswered queries and response generative AI quality with the\ngenerated answer rate and quality\nsection in the Analytics page to improve your agent's performance.\nConnect to an existing\nMCP server\ndirectly within Copilot Studio using a guided experience.\nJuly 2025\nUse\nadvanced NLU customization\nto define topics and entities using your own data for higher accuracy and improved containment, especially for Dynamics 365 scenarios.\nSearch across your agent\n's knowledge, topics, tools, skills, and entities instantly using a new in-app search experience accessible via keyboard shortcut or top-level search.\nEstimate time and cost savings based on successful runs or actions and customizable to your organization's 
metrics with\nROI analytics\nfor agents with autonomous capabilities.\nView user comments submitted with\nthumbs up/down reactions\nin analytics, offering deeper insight into customer feedback on agent responses.\n(Preview) Display\nMicrosoft Information Protection (MIP)\nlabels across connectors, test chat, Teams, and Microsoft 365 Copilot to prevent oversharing and support secure, compliant AI experiences. With new integrations between Copilot Studio, Dataverse, and Microsoft Purview, you can automatically classify sensitive data and ensure agents respect Purview sensitivity labels.\nPublish agents directly to a\nWhatsApp\nphone number, making it easier to reach customers.\n(Preview) Streamline authentication for Microsoft Entra ID–backed actions and connectors with the SSO Consent Card by allowing users to grant consent directly within the chat with no redirects and no interruptions.\nJune 2025\nImproved experience for tools:\nGrouping and filtering for easier search and discovery of tools.\nSupport for IntelliSense automatic completion and input widgets such as calendar control, file picker, and timezone picker, when configuring tools.\nImproved tool invocation experience for customers with more affordances for complex inputs and clearer error messaging.\nAutomatic detection of SSO for connectors.\n(Preview)\nSupport for Microsoft 365 Copilot Tuning\nto train models on your own enterprise data for domain-specific tasks and integrate these models into Microsoft 365 experiences like Copilot in Teams, Word, and Chat. 
You can also connect your fine-tuned models to custom agents.\nActionable insights for questions that generative AI left unanswered, grouped by themes, in the\nAnswer rate and quality\nsection of the\nAnalytics\npage.\nKnowledge sources analysis for autonomous agents.\nAbility to insert Power Fx formulas directly in the embedded\nprompt builder prompt editor\n.\nSimplified text validation and extraction with regular expression support for\nPower Fx formulas\nthat use IsMatch, Match, or MatchAll functions.\n(Preview) Support for\nfile groups\nas knowledge sources.\n(Preview) Generative orchestration available for all\nsupported languages\n.\nRedesigned\nChannels\npage.\n(US-only preview) Ability to select the GPT-4.1 mini\nexperimental response model\nfor generative answers.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "What's New", @@ -557,7 +557,7 @@ "https://learn.microsoft.com/en-us/microsoft-365/admin/manage/manage-copilot-agents-integrated-apps": { "content_hash": "sha256:6a0f37766e6073def34ef0cc5bb82857b6a66fe4bed6a127e51d7c4b8a9aaad3", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nManage Copilot agents in the Microsoft 365 admin center\nFeedback\nSummarize this article for me\nImportant\nThis article is intended for IT administrators.\nThe capability is enabled by default in all Microsoft 365 Copilot licensed tenants.\nMicrosoft 365 Copilot combines the power of large language models with your data and apps in Microsoft 365. It captures natural language commands to produce content and analyze data. It enables access to and use of other apps, such as Jira,\nDynamics 365\n, or Bing Web Search.\nYou can manage agents for Copilot by using the\nMicrosoft 365 admin center\n. You can enable, disable, assign, block, or remove agents for your organization, and manage Copilot capabilities.\nNote\nResearcher and Analyst are first-party Microsoft experiences built on the same foundation as Microsoft 365 Copilot, operating entirely within the Microsoft 365 commercial data processing boundary. These tools inherit all existing security, privacy, and compliance commitments that apply across the suite of Microsoft 365 products. These tools are available in Microsoft 365 Copilot Chat under\nTools\nand can be invoked by the user anytime. While Researcher and Analyst coexist with agents and abide by all the agent-related governance capabilities, Researcher and Analyst are part of the core Copilot chat experience and will not fall under any agent-related settings. For related information, see\nAgent settings in Microsoft 365 admin center\n.\nMicrosoft Agent 365 is the control plane for AI agents, empowering your organization to confidently deploy, govern, and manage all your agents at scale, regardless of where these agents are built or acquired. For more information, see\nOverview of Microsoft Agent 365\nand\nMicrosoft Agent 365 documentation\n.\nOverview\nAgents enhance the functionality of Copilot by adding search capabilities, custom actions, connectors, and APIs. 
Agents are custom versions of Microsoft 365 Copilot that combine instructions, knowledge, and skills to perform specific tasks or scenarios. For more information, see\nGet started with agents for Microsoft 365 Copilot\n.\nHowever, before users can access these agents, the agents must undergo a streamlined process of submission and approval. To learn more, see\nPublish agents\n.\nThe hub Copilot experience shows the list of agents that are available and deployed for the user. Users can toggle it on or off to restrict access of Copilot to any specific agents during the interaction. Users can also add or remove agents in their Copilot experience by right-clicking on the agents and selecting the appropriate option. Users can only access the agents that the admin allows and that they install or are assigned to.\nAgent types you can manage\nYou can manage several types of agents in Microsoft 365 Copilot, each serving different purposes:\nPublished by your organization\n: Built with predefined instructions and actions. These agents follow structured logic and are best for predictable, rule-based tasks. Before agents become available to users, these agents go through an admin approval and publishing process to ensure compliance and readiness.\nNote\nPublishing agents to the organization is supported in Microsoft 365 Government Community Cloud High (GCCH) and Government Community Cloud Moderate (GCCM) environments.\nShared by creator\n: Shared agents are custom versions of Microsoft 365 Copilot that combine instructions, knowledge, and skills to perform specific tasks or scenarios. Creators can create and share these agents through multiple channels, such as Microsoft 365 Copilot Studio, Microsoft 365 Copilot Agent Builder, and more. Shared agents enhance the functionality of Copilot by adding search capabilities, custom actions, connectors, and APIs. 
For more information, see\nShare agents with other users\n.\nAs an admin, you can view shared agents on the\nAgents\npage in the Microsoft 365 admin center. You can see a list of all shared agents, including details such as the agent's name, creator, creation date, host products, and availability status. You can search for specific agents and manage their lifecycle, including blocking agents that are deemed unsafe or noncompliant.\nFor your users, shared agents are available through Copilot on different surfaces. Users can interact with these agents to perform specific tasks or get assistance based on the agent's capabilities.\nMicrosoft agents\n: Developed by Microsoft and integrated with Microsoft 365 services.\nExternal partner agents\n: Created by external developers or vendors. You can control their availability and permissions.\nFrontier agents\n: Experimental or advanced agents that use new capabilities or integrations. These agents might be in early stages of development or testing and could require more oversight or limited rollout.\nApp Builder agent\n: A type of Frontier agent developed by Microsoft that can be managed as part of Microsoft 365 Copilot. You can also manage App Builder using\nPower Platform admin center\n.\nWorkflows agent\n: A type of Frontier agent developed by Microsoft that can be managed as part of Microsoft 365 Copilot. Flows created in Copilot are saved to the default environment unless\nenvironment routing\nis enabled for Copilot Studio. You can also manage flows using the\nPower Platform admin center\n.\nGet started\nThe following administrator roles can manage agents in the Microsoft 365 admin center:\nAI Admin\nGlobal Reader (view-only, no edit)\nImportant\nUse roles with the fewest permissions. Accounts with lower permissions help improve security for your organization. Global Administrator is a highly privileged role. Limit its use to emergency scenarios when you can't use an existing role. 
For more information, see\nAbout admin roles in the Microsoft 365 admin center\n.\nYou can manage agents in the\nMicrosoft 365 admin center\nby using the\nAgents\npage. On this page, you can:\nView available, deployed, or blocked agents.\nConfigure agent availability and access.\nPerform actions such as publishing, deploying, blocking, or removing agents.\nRelated articles\nAgent 365 Overview in the Microsoft 365 admin center\n.\nAgent Registry in the Microsoft 365 admin center\n.\nAgent Settings in Microsoft 365 admin center\n.\nManage agent instances in Microsoft 365 admin center\n.\nManage Connected Agents for Researcher in the Microsoft 365 admin center\n.\nManage Tools for Agent 365 in the Microsoft 365 admin center\nOverview of Microsoft Agent 365\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Manage Copilot Agents", @@ -566,7 +566,7 @@ "https://learn.microsoft.com/en-us/microsoft-365/admin/activity-reports/microsoft-365-copilot-usage": { "content_hash": "sha256:f2119b2cd39b6175f457baa7055318c7d9818be0e33e985f5b2d6cd1620f7b47", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nMicrosoft 365 reports in the admin center – Microsoft 365 Copilot usage\nFeedback\nSummarize this article for me\nThe Microsoft 365 Usage page shows you the activity overview across the Microsoft 365 productivity apps in your organization. It enables you to drill into individual product-level reports to give you granular insight about the activities within each app. To view all reports, check out the\nReports overview article\n.\nIn the Microsoft 365 Copilot usage report, which is in continuous enhancement, you can view a summary of user adoption, retention, and engagement with Microsoft 365 Copilot and its associated enabled apps, including agent usage. For Copilot activity on a given day, the report typically becomes available within 72 hours of the end of that day (in UTC).\nHow do I get to the Microsoft 365 Copilot usage report?\nIn the admin center, go to\nReports\n>\nUsage\n.\nSelect the\nMicrosoft 365 Copilot\npage.\nSelect the Usage tab to view adoption and usage metrics.\nInterpret the Microsoft 365 Copilot usage report\nYou can use this report to see the usage of Microsoft 365 Copilot in your organization.\nAt the top, you can filter by different timeframes. 
The Microsoft 365 Copilot report can be viewed over the last 7 days, 30 days, 90 days, or 180 days.\nYou can view several numbers for Microsoft 365 Copilot usage, which highlight the enablement number and the adoption of the enablement:\nEnabled Users\nshows the total number of unique users in your organization with Microsoft 365 Copilot licenses over the selected timeframe.\nActive Users\nshows the total number of enabled users in your organization who tried a user-initiated Microsoft 365 Copilot feature, in one or more apps in Microsoft 365 over the selected timeframe.\nActive users rate\nshows you the number of active users in your organization divided by the number of enabled users.\nIn\nRecommendations\n, the recommended action card highlights\nMicrosoft Copilot Dashboard\n, where you can deliver insights to your IT leaders to explore Copilot readiness, adoption, and impact in Viva Insights.\nActive agent users\nshows you the total number of unique Microsoft 365 Copilot users in your org who used agents built by your org (including admin-approved agents and agents created via agent builder and shared with users in your org).\nNote\nAgent usage is available starting November 1, 2024, and is currently limited to agents built by your org. 
Usage of agents built by Microsoft and Microsoft Partners will be introduced in the coming months.\nTotal prompts submitted\nshows you the total number of prompts users sent to Microsoft 365 Copilot Chat during the selected time frame.\nAverage prompts submitted per user\nrepresents the mean number of prompts each active user sent to Microsoft 365 Copilot Chat during the selected timeframe.\nIn the\nAdoption\nsection,\nAdoption by app\nshows enabled users and active users of Copilot in Microsoft 365 apps.\nYou can see the following summary charts in this report as default view:\nThe definitions for Enabled Users and Active Users metrics are the same as provided earlier.\nSummary view\nshows you the total usage of Microsoft 365 Copilot apps of the time frame.\nTrend view\nshows you the daily time trend of Microsoft 365 Copilot apps of the time frame.\nWhen switching to\nTrend\nview, you can select one product in the dropdown list to see daily usage.\nIn the\nPrompts submitted\nsection,\nSummary view\nshows the total number of prompts users submitted to Microsoft Copilot Chat over the selected time frame.\nTrend view\nshows the daily trend of prompts submitted in Microsoft 365 Copilot over the selected time frame.\nCopilot Chat adoption\nshows enabled users and total usage of Copilot Chat and split usage between Copilot Chat (work) and Copilot Chat (web).\nAgent adoption\nshows active users of agents in Microsoft 365 Copilot for the selected period. 
Only usage of agents created by your org, including both admin-approved agents and agents shared by users in your org, is included in this chart.\nSummary view\nshows the total number of agent users in Microsoft 365 Copilot over the selected time frame.\nTrend view\nshows the daily trend of active agent users in Microsoft 365 Copilot over the selected time frame.\nThe following table lists the features included for active users of Copilot apps:\nCopilot app\nFeatures\nHow to use\nLearn more about the feature\nMicrosoft Edge\nCopilot Chat (web)\nTyping a message into the chat window or selecting a suggested prompt and submitting. Or selecting ‘Ask Copilot’ from the right-click menu on contextual web info.\nCopilot - Microsoft Edge\nCopilot Chat (work)\nTyping a message into the chat window or selecting a suggested prompt and submitting.\nCopilot - Microsoft Edge\nMicrosoft 365 Copilot (app)\nCopilot Chat (web)\nTyping a message into the chat window or selecting a suggested prompt and submitting.\nGet started with Microsoft 365 Copilot Chat\nCopilot Chat (work)\nTyping a message into the chat window or selecting a suggested prompt and submitting.\nGet started with Microsoft 365 Copilot Chat\nOutlook\nSummarize an Outlook email thread\nIn an email thread, selecting\nSummarize by Copilot or Summarize\nat the top of the email thread. (User experience is slightly different among web, Windows, Mac, or mobile.)\nSummarize an email thread with Copilot in Outlook - Microsoft Support\nGenerate an Outlook email draft\nSelecting the Copilot icon from the toolbar, selecting\nDraft with Copilot\n, typing a prompt in the Copilot box and submitting. (User experience is slightly different among web, Windows, Mac, or mobile.)\nDraft an email message with Copilot in Outlook - Microsoft Support\nCoach\nSelecting the Copilot icon in the email message and choosing\nCoaching by Copilot\n; Copilot then reviews the email and offers suggestions on improving the tone, clarity, and reader sentiment. 
(User experience is slightly different among web, Windows, Mac, or mobile.)\nEmail coaching with Copilot in Outlook - Microsoft Support\nCopilot Chat (work)\nGoing to the left side of the Outlook web app, selecting Copilot from the apps list, typing a prompt, and sending. This feature is included in the Outlook app-level and all-up Microsoft 365 active usage counts effective August 28, 2024.\nGet started with Microsoft 365 Copilot Chat\nCopilot Chat (web)\nGoing to the left navigation of the Outlook app, selecting Copilot from the apps list, selecting the “Web” option at the top of the chat pane, then typing a prompt into the chat window or selecting a suggested prompt and submitting. This feature is included in the Outlook app-level and all-up Microsoft 365 active usage counts effective July 01, 2025.\nGet started with Microsoft 365 Copilot Chat\nApp Chat\nGoing to the top-right corner of the Outlook web app, selecting Copilot next to the settings option, typing a prompt, and sending. This feature is included in the Outlook app-level and all-up Microsoft 365 active usage counts effective August 17, 2024.\nFrequently asked questions about Copilot in Outlook\nTeams\nSummarizing key points during meetings\nSummarizing key discussion points during a meeting using Copilot in Microsoft Teams.\nGet started with Copilot in Microsoft Teams meetings - Microsoft Support\nSummarize chats and channel conversations\nTyping a prompt or selecting a prompt from 'More prompts' in the Copilot compose box in a chat or channel and submitting.\nUse Copilot in Microsoft Teams chat and channels - Microsoft Support\nRewrite and adjust messages\nWriting a message in the message box, then selecting\nRewrite/Adjust\nin Copilot beneath the message box to rewrite or adjust the whole message or a specific selection.\nRewrite and adjust your messages with Copilot in Microsoft Teams - Microsoft Support\nIntelligent Recap\nSelecting the\nRecap\ntab in the meeting chat for a Teams calendar event and viewing the AI Notes section after the meeting 
ends (if the meeting is recorded and transcribed).\nGet started with Microsoft 365 Copilot in Teams - Microsoft Support\nInterpreter\nTurning on Interpreter under the\nMore\nicon and listening to a meeting in one of the selected languages.\nInterpreter in Microsoft Teams meetings and calls\nFacilitator\nTurning on Facilitator under the\nMore\nicon and activating AI-generated notes, including Agenda building.\nFacilitator in Microsoft Teams meetings\nCopilot Chat (work)\nGoing to Chat on the left side of Teams, selecting Copilot from the top of your Teams chat list, typing a prompt and sending.\nGet started with Microsoft 365 Copilot in Teams - Microsoft Support\nCopilot Chat (web)\nGoing to the left navigation of the Teams application, selecting Copilot from the apps list, selecting the “Web” option at the top of the chat pane, then typing a prompt into the chat window or selecting a suggested prompt and submitting. This feature is included in the Teams app-level and all-up Microsoft 365 active usage counts effective July 01, 2025.\nGet started with Microsoft 365 Copilot in Teams - Microsoft Support\nWord\nAll Copilot in Word features are automatically included in the Microsoft 365 Copilot usage report. Usage of any Copilot in Word feature counts towards the Active users metric and is indicated in the per-user Last activity date (UTC).\nTo learn more about Copilot in Word features, refer to\nWelcome to Copilot in Word - Microsoft Support\n.\nExcel\nAll Copilot in Excel features are automatically included in the Microsoft 365 Copilot usage report, with the exception of Agent Mode. 
Usage of any Copilot in Excel feature (other than Agent Mode) counts towards the Active users metric and is indicated in the per-user Last activity date (UTC).\nTo learn more about Copilot in Excel features, refer to\nGet started with Copilot in Excel - Microsoft Support\n.\nPowerPoint\nAll Copilot in PowerPoint features are automatically included in the Microsoft 365 Copilot usage report, with the exception of Agent Mode. Usage of any Copilot in PowerPoint feature (other than Agent Mode) counts towards the Active users metric and is indicated in the per-user Last activity date (UTC).\nTo learn more about Copilot in PowerPoint features, refer to\nWelcome to Copilot in PowerPoint - Microsoft Support\n.\nOneNote\nAll Copilot in OneNote features are automatically included in the Microsoft 365 Copilot usage report. Usage of any Copilot in OneNote feature counts towards the Active users metric and is indicated in the per-user Last activity date (UTC).\nTo learn more about Copilot in OneNote features, refer to\nWelcome to Copilot in OneNote - Microsoft Support\n.\nLoop\nAll Copilot in Loop features are automatically included in the Microsoft 365 Copilot usage report. Usage of any Copilot in Loop feature counts towards the Active users metric and is indicated in the per-user Last activity date (UTC).\nTo learn more about Copilot in Loop features, refer to\nGet started with Microsoft 365 Copilot in Loop - Microsoft Support\n.\nNote that Active users data for Word, Excel, and PowerPoint is incomplete prior to January 25, 2024.\nThe following table lists the features included for active users of agents:\nFeature\nHow to use\nLearn more about the feature\nUX interactions that count towards agent usage\nEnd-users can interact with agents in two ways:\n1. by at-mentioning the agent in a chat experience or\n2. 
by selecting the agent from the right-side panel in Copilot Chat or from the menu icon in the top left corner in Copilot in Word or PowerPoint.\nAn active user of an agent is a user who sends a prompt request to an agent and receives a response\nLearn about\nGetting started with agents for Microsoft 365 Copilot\nImportant\nThe metrics displayed in the Microsoft 365 Copilot usage report are powered by data that is classified as required service data. Optional diagnostic data isn't required for comprehensive information, although this might change in the future.\nLearn more about required service data\n.\nIn the\nAdoption\nsection, you might see a recommendation card:\nTo learn more about using organizational messages for Microsoft 365 Copilot, see\nMicrosoft 365 features adoption using organizational messages\n.\nYou can also export the report data into an Excel .csv file by selecting the ellipses and then\nExport\nin the top-right corner.\nYou can view a table list to show each Microsoft 365 Copilot enabled user's last activity date among Microsoft 365 Copilot apps.\nSelect\nChoose columns\nto add or remove columns from the table.\nYou can also export the report data into an Excel .csv file by selecting the\nExport\nlink. This link exports the Microsoft 365 Copilot usage data of all users and enables you to do simple sorting, filtering, and searching for further analysis.\nTo ensure data quality, we perform daily data validation checks for the past three days and fill any gaps detected. 
You might notice differences in historical data during the process.\nUser last activity table\nItem\nDescription\nUser name\nThe user's principal name.\nDisplay name\nThe full name of the user.\nPrompts submitted (any app)\nThe total number of prompts submitted by this user to Microsoft 365 Copilot Chat during the selected timeframe.\nCopilot Chat (work) prompts submitted\nThe total number of prompts submitted by this user to Copilot Chat (work) during the selected timeframe.\nCopilot Chat (web) prompts submitted\nThe total number of prompts submitted by this user to Copilot Chat (web) during the selected timeframe.\nActive Days\nThe total number of days the user submitted prompts to Microsoft 365 Copilot Chat within the selected timeframe.\nLast activity date (UTC (Coordinated Universal Time))\nThe most recent date on which the user sent a message to Microsoft 365 Copilot Chat in Teams, Outlook, m365.cloud.microsoft/chat, Microsoft Edge, the Microsoft 365 Copilot (app), Word, Excel, PowerPoint, or OneNote. This date remains fixed even if the timeframe of the report is changed. 
\nLast activity date of Teams Copilot (UTC)\nThe latest date the user had activity in Microsoft Teams Copilot, including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of Word Copilot (UTC)\nThe latest date the user had activity in Word Copilot, including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of Excel Copilot (UTC)\nThe latest date the user had activity in Excel Copilot, including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of PowerPoint Copilot (UTC)\nThe latest date the user had activity in PowerPoint Copilot, including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of Outlook Copilot (UTC)\nThe latest date the user had activity in Outlook Copilot, including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of OneNote Copilot (UTC)\nThe latest date the user had activity in OneNote Copilot, including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of Loop Copilot (UTC)\nThe latest date the user had activity in Loop Copilot, including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of Copilot Chat (work) (UTC)\nThe latest date the user had activity in Copilot Chat (work), including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of Copilot Chat (web) (UTC)\nThe latest date the user had activity in Copilot Chat (web), including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of Microsoft 365 App (UTC)\nThe latest date the user had 
activity in Copilot Chat in entry point Microsoft 365 App, including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of Microsoft Edge (UTC)\nThe latest date the user had activity in Copilot Chat in entry point Microsoft Edge, including any of the intentional activities, regardless of the selected timeframe of past 7/30/90/180 days.\nLast activity date of any agent (UTC)\nThe latest date the user had activity with an agent built by your org, regardless of the selected timeframe of past 7/30/90/180 days.\nDisplay user-specific data\nBy default, usernames and display names in Copilot Search usage reports are anonymous. Global administrators can update the settings to reveal usernames and display names.\nImportant\nMicrosoft recommends that you use roles with the fewest permissions. This helps improve security for your organization. Global Administrator is a highly privileged role that should be limited to emergency scenarios when you can't use an existing role.\nIn the admin center, go to the\nSettings\n>\nOrg Settings\npage.\nSelect the\nServices\ntab, then select\nReports\n.\nIn the\nReports\npanel, select the checkbox next to\nDisplay Concealed user, group, and site names in all reports\n.\nSelect\nSave\n.\nFAQ\nHow is a user considered active in Microsoft 365 Copilot usage?\nA user is considered active in a given app if they performed an intentional action for an AI-powered capability. For example, if a user selects the Copilot icon in the Word ribbon to open the Copilot chat pane, this action doesn't count towards active usage. 
However, if the user interacts with the chat pane by submitting a prompt, this action would count towards active usage.\nWhat's the difference between the user activity table and audit log?\nThe audit log data that powers Microsoft Purview solutions, such as Data Security Posture Management for AI (previously called AI Hub), are built for data security and compliance purposes, and provide comprehensive visibility into Copilot interactions for these use cases. (For example, to discover data oversharing risks or to collect interactions for regulatory compliance or legal purposes). They aren't, however, intended to be used as the basis for Copilot usage reporting. Any aggregated metrics that customers build on top of this data, such as \"prompt count\" or \"active user count,\" might not be consistent with the corresponding data points in the official Copilot usage reports provided by Microsoft. Microsoft can't provide guidance on how to use audit log data as the basis for usage reporting, nor can Microsoft guarantee that aggregated usage metrics built on top of audit log data will match similar usage metrics reported in other tools.\nTo access accurate information on Microsoft 365 Copilot usage, use one of the following reports: the\nMicrosoft 365 Copilot usage report\nin the Microsoft 365 Admin Center or the\nCopilot Dashboard\nin Viva Insights.\nWhat's the scope of the user-level table?\nThe user-level table in the report is configured to show all users who were licensed for Microsoft 365 Copilot at any point over the past 180 days, even if the user has since had the license removed or never had any Copilot active usage.\nI assigned the Microsoft 365 Copilot license to users, but why is 'last activity date' for users empty in rare cases?\nBased on system constraints, some users might not have a 'last activity date' in the user-level table of the report under the following conditions:\nThe user used Microsoft 365 Copilot within a short time window (less than 24 
hours) after the Microsoft 365 Copilot license was assigned.\nThe user later had no other Microsoft 365 Copilot usage up to the date on which the report is viewed.\nWhy is the 'Last activity date of Word, Excel, PowerPoint, OneNote, or Outlook Copilot (UTC)' sometimes blank or newer than the actual date, even when users have recently used Copilot features?\nThis might be caused by a known limitation: the uploading of client events data for Copilot features in Word, Excel, PowerPoint, OneNote, and Outlook can be delayed for various reasons, such as when end users disconnect from the internet immediately after taking a Copilot action.\nHow do the numbers in this report compare to what is shown in the Microsoft Copilot Dashboard in Viva Insights?\nThe data in these reports is based on the same underlying definitions of active usage, but the population of users included in the analysis and the timeframe displayed might differ. To learn more, see\nUse Microsoft Copilot Dashboard advanced features with a Viva Insights subscription\n.", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Copilot Usage Reports", @@ -593,7 +593,7 @@ "https://learn.microsoft.com/en-us/purview/dlp-policy-reference": { "content_hash": "sha256:cef2ad6f98889f291a3ad72caa7a9f5014625e4ac6a714a941df6c60809fa8e5", "normalized_content": "Note\nAccess to this page requires authorization. 
You can try\nsigning in\nor\nchanging directories\n.\nData Loss Prevention policy reference\nMicrosoft Purview Data Loss Prevention (DLP) policies have many components to configure. To create an effective policy, you need to understand what the purpose of each component is and how its configuration alters the behavior of the policy. This article provides a detailed anatomy of a DLP policy.\nTip\nGet started with Microsoft Security Copilot to explore new ways to work smarter and faster using the power of AI. Learn more about\nMicrosoft Security Copilot in Microsoft Purview\n.\nRecommended reading\nAdministrative units\nLearn about Microsoft Purview Data Loss Prevention\nPlan for data loss prevention (DLP)\n- by working through this article you will:\nIdentify stakeholders\nDescribe the categories of sensitive information to protect\nSet goals and strategy\nCollection Policies solution overview\nCollection policy reference\nData Loss Prevention policy reference\n- this article introduces all the components of a DLP policy and how each one influences the behavior of a policy\nDesign a DLP policy\n- this article walks you through creating a policy intent statement and mapping it to a specific policy configuration.\nCreate and Deploy data loss prevention policies\n- This article presents some common policy intent scenarios that you map to configuration options, then it walks you through configuring those options.\nLearn about investigating data loss prevention alerts\n- This article introduces you to the lifecycle of alerts from creation, through final remediation and policy tuning. 
It also introduces you to the tools you use to investigate alerts.\nDLP platform considerations\nAlso, you need to be aware of these constraints of the platform:\nMaximum number of MIP + MIG policies in a tenant: 10,000\nMaximum size of a DLP policy (100 KB)\nMaximum number of DLP rules:\nIn a policy: Limited by the size of the policy\nIn a tenant: 600\nMaximum size of an individual DLP rule: 100 KB (102,400 characters)\nGenerate Incident Report evidence limit: 100, with each SIT evidence, in proportion of occurrence\nMaximum size of text scanned from a file: The first 2 million characters (~2 MB) of extractable text. If a file exceeds this limit, first two million characters are scanned, and a “Document didn’t complete scanning” signal is emitted.\nMaximum number of nested levels of data scanned from a file: The first three levels. If a file exceeds this limit, data in the first three levels are scanned, and a\nDocument didn’t complete scanning\nevent is emitted.\nRegex size limit for all matches predicted: 20 KB\nPolicy name length limit: 64 characters\nPolicy rule length limit: 64 characters\nComment length limit: 1,024 characters\nDescription length limit: 1,024 characters\nMaximum size of Endpoint DLP Settings: 16,384 characters\nPolicy templates\nThere are five types of DLP policy templates across two categories.\nEnterprise applications & devices\nenhanced policies\npolicies that can detect and protect types of\nFinancial\ninformation.\npolicies that can detect and protect types of\nMedical and health\ninformation.\npolicies that can detect and protect types of\nPrivacy\ninformation.\nA\nCustom\npolicy template that you can use to build your own policy if none of the others meet your organization's needs.\nThe following table lists all policy templates and the sensitive information types (SIT) that they cover.\nCategory\nTemplate\nSIT\nFinancial\nAustralia Financial Data\n-\nSWIFT code\n-\nAustralia tax file number\n-\nAustralia bank account 
number\n-\nCredit card number\nFinancial\nCanada Financial data\n-\nCredit card number\n-\nCanada bank account number\nFinancial\nFrance Financial data\n-\nCredit card number\n-\nEU debit card number\nFinancial\nGermany Financial Data\n-\nCredit card number\n-\nEU debit card number\nFinancial\nIsrael Financial Data\n-\nIsrael bank account number\n-\nSWIFT code\n-\nCredit card number\nFinancial\nJapan Financial Data\n-\nJapan bank account number\n-\nCredit card number\nFinancial\nPCI Data Security Standard (PCI DSS)\n-\nCredit card number\nFinancial\nSaudi Arabia Anti-Cyber Crime Law\n-\nSWIFT code\n-\nInternational banking account number (IBAN)\nFinancial\nSaudi Arabia Financial Data\n-\nCredit card number\n-\nSWIFT code\n-\nInternational banking account number (IBAN)\nFinancial\nUK Financial Data\n-\nCredit card number\n-\nEU debit card number\n-\nSWIFT code\nFinancial\nUS Financial Data\n-\nCredit card number\n-\nU.S. bank account number\n-\nABA Routing Number\nFinancial\nU.S. Federal Trade Commission (FTC) Consumer Rules\n-\nCredit card number\n-\nU.S. bank account number\n-\nABA Routing Number\nFinancial\nU.S. Gramm-Leach-Bliley Act (GLBA) Enhanced\n-\nCredit card number\n-\nU.S. bank account number\n-\nU.S. Individual Taxpayer Identification Number (ITIN)\n-\nU.S. social security number (SSN)\n-\nU.S./U.K. passport number\n-\nU.S. driver's license number\n-\nAll Full Names\n-\nU.S. Physical Addresses\nFinancial\nU.S. Gramm-Leach-Bliley Act (GLBA)\n-\nCredit card number\n-\nU.S. bank account number\n-\nU.S. Individual Taxpayer Identification Number (ITIN)\n-\nU.S. 
social security number (SSN)\nMedical and health\nAustralia Health Records Act (HRIP Act) Enhanced\n-\nAustralia tax file number\n-\nAustralia medical account number\n-\nAll Full Names\n-\nAll Medical Terms And Conditions\n-\nAustralia Physical Addresses\nMedical and health\nAustralia Health Records Act (HRIP Act)\n-\nAustralia tax file number\n-\nAustralia medical account number\nMedical and health\nCanada Health Information Act (HIA)\n-\nCanada passport number\n-\nCanada social insurance number\n-\nCanada health service number\n-\nCanada Personal Health Identification Number\nMedical and health\nCanada Personal Health Information Act (PHIA) Manitoba\n-\nCanada social insurance number\n-\nCanada health service number\n-\nCanada Personal Health Identification Number\nMedical and health\nCanada Personal Health Act (PHIPA) Ontario\n-\nCanada passport number\n-\nCanada social insurance number\n-\nCanada health service number\n-\nCanada Personal Health Identification Number\nMedical and health\nU.K. Access to Medical Reports Act\n-\nU.K. national health service number\n-\nU.K. national insurance number (NINO)\nMedical and health\nU.S. Health Insurance Act (HIPAA) Enhanced\n-\nInternational classification of diseases (ICD-9-CM)\n-\nInternational classification of diseases (ICD-10-CM)\n-\nAll Full Names\n-\nAll Medical Terms And Conditions\n-\nU.S. Physical Addresses\nMedical and health\nU.S. 
Health Insurance Act (HIPAA)\n-\nInternational classification of diseases (ICD-9-CM)\n-\nInternational classification of diseases (ICD-10-CM)\nPrivacy\nAustralia Privacy Act Enhanced\n-\nAustralia driver's license number\n-\nAustralia passport number\n-\nAll Full Names\n-\nAll Medical Terms And Conditions\n-\nAustralia Physical Addresses\nPrivacy\nAustralia Privacy Act\n-\nAustralia drivers license number\n-\nAustralia passport number\nPrivacy\nAustralia Personally Identifiable Information (PII) Data\n-\nAustralia tax file number\n-\nAustralia driver's license number\nPrivacy\nCanada Personally Identifiable Information (PII) Data\n-\nCanada driver's license number\n-\nCanada bank account number\n-\nCanada passport number\n-\nCanada social insurance number\n-\nCanada health service number\n-\nCanada Personal Health Identification Number\nPrivacy\nCanada Personal Information Protection Act (PIPA)\n-\nCanada passport number\n-\nCanada social insurance number\n-\nCanada health service number\n-\nCanada Personal Health Identification Number\nPrivacy\nCanada Personal Information Protection Act (PIPEDA)\n-\nCanada driver's license number\n-\nCanada bank account number\n-\nCanada passport number\n-\nCanada social insurance number\n-\nCanada health service number\n-\nCanada Personal Health Identification Number\nPrivacy\nFrance Data Protection Act\n-\nFrance national ID card (CNI)\n-\nFrance social security number (INSEE)\nPrivacy\nFrance Personally Identifiable Information (PII) Data\n-\nFrance social security number (INSEE)\n-\nFrance driver's license number\n-\nFrance passport number\n-\nFrance national ID card (CNI)\nPrivacy\nGeneral Data Protection Regulation (GDPR) Enhanced\n-\nAustria Physical Addresses\n-\nBelgium Physical Addresses\n-\nBulgaria Physical Addresses\n-\nCroatia Physical Addresses\n-\nCyprus Physical Addresses\n-\nCzech Republic Physical Addresses\n-\nDenmark Physical Addresses\n-\nEstonia Physical Addresses\n-\nFinland Physical Addresses\n-\nFrance 
Physical Addresses\n-\nGermany Physical Addresses\n-\nGreece Physical Addresses\n-\nHungary Physical Addresses\n-\nIreland Physical Addresses\n-\nItaly Physical Addresses\n-\nLatvia Physical Addresses\n-\nLithuania Physical Addresses\n-\nLuxembourg Physical Addresses\n-\nMalta Physical Addresses\n-\nNetherlands Physical Addresses\n-\nPoland Physical Addresses\n-\nPortuguese Physical Addresses\n-\nRomania Physical Addresses\n-\nSlovakia Physical Addresses\n-\nSlovenia Physical Addresses\n-\nSpain Physical Addresses\n-\nSweden Physical Addresses\n-\nAustria Social Security Number\n-\nFrance Social Security Number (INSEE)\n-\nGreece Social Security Number (AMKA)\n-\nHungarian Social Security Number (TAJ)\n-\nSpain Social Security Number (SSN)\n-\nAustria Identity Card\n-\nCyprus Identity Card\n-\nGermany Identity Card Number\n-\nMalta Identity Card Number\n-\nFrance National ID Card (CNI)\n-\nGreece National ID Card\n-\nFinland National ID\n-\nPoland National ID (PESEL)\n-\nSweden National ID\n-\nCroatia Personal Identification (OIB) Number\n-\nCzech Personal Identity Number\n-\nDenmark Personal Identification Number\n-\nEstonia Personal Identification Code\n-\nHungary Personal Identification Number\n-\nLuxemburg National Identification Number natural persons\n-\nLuxemburg National Identification Number (Non-natural persons)\n-\nItaly Fiscal Code\n-\nLatvia Personal Code\n-\nLithuania Personal Code\n-\nRomania Personal Numerical Code (CNP)\n-\nNetherlands Citizen's Service (BSN) Number\n-\nIreland Personal Public Service (PPS) Number\n-\nBulgaria Uniform Civil Number\n-\nBelgium National Number\n-\nSpain DNI\n-\nSlovenia Unique Master Citizen Number\n-\nSlovakia Personal Number\n-\nPortugal Citizen Card Number\n-\nMalta Tax ID Number\n-\nAustria Tax Identification Number\n-\nCyprus Tax Identification Number\n-\nFrance Tax Identification Number (numéro SPI.)\n-\nGermany Tax Identification Number\n-\nGreek Tax identification Number\n-\nHungary Tax identification 
Number\n-\nNetherlands Tax Identification Number\n-\nPoland Tax Identification Number\n-\nPortugal Tax Identification Number\n-\nSlovenia Tax Identification Number\n-\nSpain Tax Identification Number\n-\nSweden Tax Identification Number\n-\nAustria Driver's License\n-\nBelgium Driver's License Number\n-\nBulgaria Driver's License Number\n-\nCroatia Driver's License Number\n-\nCyprus Driver's License Number\n-\nCzech Driver's License Number\n-\nDenmark Driver's License Number\n-\nEstonia Driver's License Number\n-\nFinland Driver's License Number\n-\nFrance Driver's License Number\n-\nGerman Driver's License Number\n-\nGreece Driver's License Number\n-\nHungary Driver's License Number\n-\nIreland Driver's License Number\n-\nItaly Driver's License Number\n-\nLatvia Driver's License Number\n-\nLithuania Driver's License Number\n-\nLuxemburg Driver's License Number\n-\nMalta Driver's License Number\n-\nNetherlands Driver's License Number\n-\nPoland Driver's License Number\n-\nPortugal Driver's License Number\n-\nRomania Driver's License Number\n-\nSlovakia Driver's License Number\n-\nSlovenia Driver's License Number\n-\nSpain Driver's License Number\n-\nSweden Driver's License Number\n-\nAustria Passport Number\n-\nBelgium Passport Number\n-\nBulgaria Passport Number\n-\nCroatia Passport Number\n-\nCyprus Passport Number\n-\nCzech Republic Passport Number\n-\nDenmark Passport Number\n-\nEstonia Passport Number\n-\nFinland Passport Number\n-\nFrance Passport Number\n-\nGerman Passport Number\n-\nGreece Passport Number\n-\nHungary Passport Number\n-\nIreland Passport Number\n-\nItaly Passport Number\n-\nLatvia Passport Number\n-\nLithuania Passport Number\n-\nLuxemburg Passport Number\n-\nMalta Passport Number\n-\nNetherlands Passport Number\n-\nPoland Passport\n-\nPortugal Passport Number\n-\nRomania Passport Number\n-\nSlovakia Passport Number\n-\nSlovenia Passport Number\n-\nSpain Passport Number\n-\nSweden Passport Number\n-\nEU Debit Card Number\n-\nAll Full 
Names\nPrivacy\nGeneral Data Protection Regulation (GDPR)\n-\nEU debit card number\n-\nEU driver's license number\n-\nEU national identification number\n-\nEU passport number\n-\nEU social security number or equivalent identification\n-\nEU Tax identification number\nPrivacy\nGermany Personally Identifiable Information (PII) Data\n-\nGermany driver's license number\n-\nGermany passport number\nPrivacy\nIsrael Personally Identifiable Information (PII) Data\n-\nIsrael national identification number\nPrivacy\nIsrael Protection of Privacy\n-\nIsrael national identification number\n-\nIsrael bank account number\nPrivacy\nJapan Personally Identifiable Information (PII) Data enhanced\n-\nJapan Social Insurance Number (SIN)\n-\nJapan My Number - Personal\n-\nJapan passport number\n-\nJapan driver's license number\n-\nAll Full Names\n-\nJapan Physical Addresses\nPrivacy\nJapan Personally Identifiable Information (PII) Data\n-\nJapan resident registration number\n-\nJapan Social Insurance Number (SIN)\nPrivacy\nJapan Protection of Personal Information Enhanced\n-\nJapan Social Insurance Number (SIN)\n-\nJapan My Number - Personal\n-\nJapan passport number\n-\nJapan driver's license number\n-\nAll Full Names\n-\nJapan Physical Addresses\nPrivacy\nJapan Protection of Personal Information\n-\nJapan resident registration number\n-\nJapan Social Insurance Number (SIN)\nPrivacy\nSaudi Arabia Personally Identifiable (PII) Data\n-\nSaudi Arabia National ID\nPrivacy\nU.K. Data Protection Act\n-\nU.K. national insurance number (NINO)\n-\nU.S./U.K. passport number\n-\nSWIFT code\nPrivacy\nU.K. Privacy and Electronic Communications Regulations\n-\nSWIFT code\nPrivacy\nU.K. Personally Identifiable Information (PII) Data\n-\nU.K. national insurance number (NINO)\n-\nU.S./U.K. passport number\nPrivacy\nU.K. Personal Information Online Code of Practice (PIOCP)\n-\nU.K. national insurance number (NINO)\n-\nU.K. 
national health service number\n-\nSWIFT code\nPrivacy\nU.S. Patriot Act Enhanced\n-\nCredit card number\n-\nU.S. bank account number\n-\nU.S. Individual Taxpayer Identification Number (ITIN)\n-\nU.S. social security number (SSN)\n-\nAll Full Names\n-\nU.S. Physical Addresses\nPrivacy\nU.S. Patriot Act\n-\nCredit card number\n-\nU.S. bank account number\n-\nU.S. Individual Taxpayer Identification Number (ITIN)\n-\nU.S. social security number (SSN)\nPrivacy\nU.S. Personally Identifiable Information (PII) Data Enhanced\n-\nU.S. Individual Taxpayer Identification Number (ITIN)\n-\nU.S. social security number (SSN)\n-\nU.S./U.K. passport number\n-\nAll Full Names\n-\nU.S. Physical Addresses\nPrivacy\nU.S. Personally Identifiable Information (PII) Data\n-\nU.S. Individual Taxpayer Identification Number (ITIN)\n-\nU.S. social security number (SSN)\n-\nU.S./U.K. passport number\nPrivacy\nU.S. State Breach Notification Laws Enhanced\n-\nCredit card number\n-\nU.S. bank account number\n-\nU.S. driver's license number\n-\nU.S. social security number (SSN)\n-\nAll Full Names\n-\nU.S./U.K. passport number\n-\nAll Medical Terms And Conditions\nPrivacy\nU.S. State Breach Notification Laws\n-\nCredit card number\n-\nU.S. bank account number\n-\nU.S. driver's license number\n-\nU.S. social security number (SSN)\nPrivacy\nU.S. State Social Security Number Confidentiality Laws\n-\nU.S. social security number (SSN)\nInline web traffic policies\nA\nCustom\npolicy template that you can use to build your own policy.\nPolicy scoping\nSee\nAdministrative units\nto make sure you understand the difference between an unrestricted admin and an administrative unit restricted admin.\nDLP policies are scoped at two different levels. 
The first level applies unrestricted admin scope policies to all of the following in your organization (depending on the locations that are selected) or to subgroups of your organization, called\nAdministrative Unit restricted policies\n:\nusers\ngroups\ndistribution groups\naccounts\nsites\ncloud app instances\non-premises repositories\nFabric and Power BI workspaces\nAt this level, an administrative unit restricted admin can only pick from the administrative units that they're assigned to. DLP supports admin unit scoping for some of the locations protected under Enterprise applications & devices.\nThe second level of DLP policy scoping is by the\nlocations\nthat DLP supports. At this level, both unrestricted and administrative unit restricted administrators see only the users, distribution groups, groups, and accounts that were included in the first level of policy scoping and that are available for that location.\nSupport for adding SharePoint sites to Administrative Units\nMicrosoft Purview supports\nadding SharePoint sites to existing administrative units\n. When you assign a DLP policy for the SharePoint location to an administrative unit, the policy will only apply to the sites that are part of that administrative unit. The option to further edit the scope to include or exclude specific sites isn't available. The policy applies to all sites that are part of the administrative unit.\nHere's an example use case:\nContoso has created an Entra ID administrative unit for the engineering department and has assigned certain administrators to manage the users and groups for that department. The engineering department has a SharePoint site that is used to store sensitive information. Contoso wants to ensure that the DLP policy for the engineering department SharePoint sites only applies to the SharePoint site that is part of the administrative unit. 
By assigning the DLP policy to the administrative unit, the policy will only apply to the SharePoint site that is part of that administrative unit. Also, the administrative unit restricted administrator will only be able to manage the DLP policy for that site and only see policy match result data for the administrative unit in activity explorer and the alert dashboard.\nUnrestricted policies\nUnrestricted policies are created and managed by users in these role groups:\nCompliance administrator\nCompliance data administrator\nInformation Protection\nInformation Protection Admin\nSecurity administrator\nSee the\nPermissions\narticle for more details.\nUnrestricted administrators can manage all policies and see all the alerts and events that flow from policy matches into the\nAlerts dashboard\nand\nDLP Activity Explorer\n.\nAdministrative unit restricted policies\nAdministrative units are subsets of your Microsoft Entra ID directory and are created for the purposes of managing collections of users, groups, distribution groups, and accounts. These collections are typically created along business group lines or geopolitical areas. Administrative units have a delegated administrator who is associated with an administrative unit in the role group. These are called administrative unit restricted admins.\nDLP supports associating policies with administrative units. See\nAdministrative units\nfor implementation details in the Microsoft Purview portal. 
Administrative unit admins need to be assigned to one of the same roles or role groups as administrators of unrestricted DLP policies in order to create and manage DLP policies for their administrative unit.\nDLP Administrative Role Group\nCan\nUnrestricted administrator\n- create and scope DLP policies to entire organization\n- edit all DLP policies\n- create and scope DLP policies to administrative units\n- view all alerts and events from all DLP policies\nAdministrative Unit Restricted administrator\n- must be a member of/\nassigned to a role group/role\nthat can administer DLP\n- create and scope DLP policies only to the administrative unit that they're assigned to\n- edit DLP policies that are associated with their administrative unit\n- view alerts and events only from the DLP policies that are scoped to their administrative unit\nLocations\nA DLP policy can find and protect items that contain sensitive information across multiple locations.\nLocation\nSupports Administrative Units\nInclude/Exclude scope\nData state\nOther prerequisites\nExchange Online\nYes\n- distribution groups (assigned or dynamic)\n- security groups\n- non email enabled security groups (assigned or dynamic)\n- Microsoft 365 groups (assigned or dynamic)\ndata-in-motion\nNo\nSharePoint\nYes\nsite location at the policy level. 
If the policy is scoped to an administrative unit that includes SharePoint sites, the policy applies to all sites in the administrative unit; no further scoping is possible\ndata-at-rest\ndata-in-use\nNo\nOneDrive\nYes\n- Distribution groups\n- Security groups\n- Non email enabled security groups\n- Microsoft 365 groups (Group members only, not the group as an entity)\ndata-at-rest\ndata-in-use\nNo\nTeams chat and channel messages\nYes\n- Distribution groups\n- Security groups\n- Mail-enabled security groups\n- Microsoft 365 groups (Group members only, not the group as an entity)\ndata-in-motion\ndata-in-use\nSee\nScope of DLP protection\nInstances\nNo\nCloud app instance\ndata-at-rest\n-\nUse data loss prevention policies for non-Microsoft cloud apps\nDevices\nYes\n- Distribution groups\n- Security groups\n- Non email enabled security groups\n- Microsoft 365 groups (Group members only, not the group as an entity)\ndata-in-use\ndata-in-motion\n-\nLearn about Endpoint data loss prevention\n-\nGet started with Endpoint data loss prevention\n-\nConfigure device proxy and internet connection settings for Information Protection\nOn-premises repositories (file shares and SharePoint)\nNo\nRepository\ndata-at-rest\n-\nLearn about the data loss prevention on-premises repositories\n-\nGet started with the data loss prevention on-premises repositories\nFabric and Power BI\nNo\nWorkspaces\ndata-in-use\nNo\nThird-party apps\nNone\nNo\nNo\nNo\nMicrosoft 365 Copilot (preview)\nNo\nAccount or Distribution group\ndata-at-rest\ndata-in-use\n- Only available in the\nCustom\npolicy template\nManaged cloud apps\nNo\nAccount or Distribution group\ndata-in-motion\n- Only available in the\nCustom\npolicy template\nUnmanaged cloud apps\nNo\nAccount or Distribution group\ndata-in-motion\n- Only available in the\nCustom\npolicy template\nExchange location scoping\nIf you choose to include specific distribution groups in Exchange, the DLP policy is scoped to the emails sent by members 
of that group or sent to members of that group. Similarly, excluding a distribution group excludes all the emails sent by the members of that distribution group from policy evaluation.\nGroup Type\nMembership Type\nSupported during Policy Creation\nSupported during Rule Evaluation\nNotes\nNon-Mail Enabled Security Groups\nAssigned\nYes\nNo\nEnabled for specific customers only\nNon-Mail Enabled Security Groups\nDynamic\nYes\nNo\nMail-Enabled Security Groups\nAssigned\nYes\nYes\nDistribution Groups\nAssigned\nYes\nYes\nDistribution Groups\nDynamic\nYes\nYes\nMicrosoft 365 Groups\nAssigned\nYes\nYes\nMicrosoft 365 Groups\nDynamic\nYes\nYes\nAdaptive Scopes\nDynamic\nNo\nNo\nSender is\nRecipient is\nResultant behavior\nIn scope\nN/A\nPolicy is applied\nOut of scope\nIn scope\nPolicy isn't applied\nExchange location scope calculation\nHere's an example of how Exchange location scope is calculated:\nSay you have four users in your org and two distribution groups that you use for defining Exchange location inclusion and exclusion scopes. 
Group membership is set up like this:\nDistribution Group\nMembership\nGroup1\nUser1, User2\nGroup2\nUser2, User3\nNo group\nUser4\nInclude setting\nExclude setting\nPolicy applies to\nPolicy doesn't apply to\nExplanation of behavior\nAll\nNone\nAll senders in the Exchange org (User1, User2, User3, User4)\nN/A\nWhen neither are defined, all senders are included\nGroup1\nNone\nMember senders of Group1 (User1, User2)\nAll senders who aren't members of Group1 (User3, User4)\nWhen one setting is defined and the other isn't, the defined setting is used\nAll\nGroup2\nAll senders in the Exchange org who aren't members of Group2 (User1, User4)\nAll senders who are members of Group2 (User2, User3)\nWhen one setting is defined and the other isn't, the defined setting is used\nGroup1\nGroup2\nUser1\nUser2, User3, User4\nExclude overrides include\nYou can choose to scope a policy to the members of distribution lists, dynamic distribution groups, and security groups. A DLP policy can contain no more than 50 such inclusions and exclusions.\nOneDrive location scoping\nWhen scoping a policy for OneDrive locations, in addition to applying your DLP policies to all users and groups in your organization, you can limit the scope of a policy to specific users and groups. DLP supports scoping policies to up to 100 individual users.\nFor instance, if you want to include more than 100 users, you must first put those users in distribution groups or security groups, as appropriate. You can then scope your policy to up to 50 groups.\nIn some cases, you might want to apply a policy to one or two groups, plus two or three individual users who don't belong to either of those groups. Here, the best practice is to\nput those two or three individuals into a group of their own\n. This is the only way to make sure that the policy is scoped to all intended users.\nThe reason for this is that, when you list only users, DLP adds all of the users specified to the policy scope. 
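Stepping back to the Exchange example above: the include/exclude resolution (exclude overrides include) amounts to a simple set check. A minimal sketch with hypothetical group data; `in_policy_scope` is an illustrative helper, not a Purview API:

```python
# Minimal sketch (hypothetical data): how an Exchange location scope resolves
# include/exclude distribution groups. An exclusion always overrides an
# inclusion; "All" is modeled as include=None.
def in_policy_scope(sender, include, exclude, all_senders):
    """Return True if the DLP policy applies to this sender."""
    included = all_senders if include is None else include  # None means "All"
    excluded = exclude or set()
    return sender in included and sender not in excluded  # exclude wins

group1 = {"User1", "User2"}
group2 = {"User2", "User3"}
org = {"User1", "User2", "User3", "User4"}

# Include Group1, exclude Group2 -> only User1 remains in scope,
# matching the last row of the example table.
scope = {u for u in org if in_policy_scope(u, group1, group2, org)}
print(scope)  # {'User1'}
```

The same helper reproduces the other table rows: with include and exclude both unset (`None` and empty), every sender in the org is in scope.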
Similarly, when you add only groups, DLP adds all the members of all the groups to the policy scope.\nSay you have the following groups and users:\nDistribution Group\nMembership\nGroup1\nUser1, User2\nGroup2\nUser2, User3\nIf you limit the scope of a policy to\nonly\nusers or\nonly\ngroups, DLP applies the policy to users as illustrated in the following table:\nSpecified scope\nDLP Scope evaluation behavior\nUsers in scope\n(Users only)\nUser1\nUser2\nDLP takes the union of the specified users\nUser1, User2\n(Groups only)\nGroup1\nGroup2\nDLP takes the union of the specified groups\nUser1, User2, User3\nHowever, when users and groups are mixed in the scoping configuration, things get complicated. Here's why: DLP only scopes policies to\nthe intersection\nof the listed groups and users.\nDLP uses the following order of operations when determining which users and groups to include in the scope:\nEvaluate the union of group membership\nEvaluate the union of users\nEvaluate the intersection of group members and users, that is, where the results overlap\nIt then applies the scope of the policy to the intersection of group members and users.\nLet's extend our example, working with the same set of groups, and let's add User4, who isn't in a group:\nDistribution Group\nMembership\nGroup1\nUser1, User2\nGroup2\nUser2, User3\nNo group\nUser4\nThe following table explains how policy scoping works in cases where users and groups are both included in the scoping instructions.\nSpecified scope\nDLP Scope evaluation behavior\nUsers in scope\nGroup1\nGroup2\nUser3\nUser4\nFirst evaluation: Union of groups:\n(Group1 + Group2) = User1, User2, User3\nSecond evaluation: Union of users:\n(User3 + User4) = User3, User4\nThird evaluation: Intersection of groups and Users (the overlap):\n(Group1 + Group2) = User1, User2, User3\n(User3 + User4) = User3, User4\nUser3\n(User3 is the only user that appears in the results of both the first and second 
evaluations.)\nGroup1\nGroup2\nUser1\nUser3\nUser4\nFirst evaluation: Union of groups:\n(Group1 + Group2) = User1, User2, User3\nSecond evaluation: Union of users:\n(User1 + User3 + User4) = User1, User3, User4\nThird evaluation: Intersection of groups and Users (the overlap):\n(Group1 + Group2) = User1, User2, User3\n(User1 + User3 + User4) = User1, User3, User4\nUser1\n,\nUser3\n(These are the only users that appear in the results of both the first and second evaluations.)\nSharePoint location scoping\nWhen scoping a policy for SharePoint locations, you can limit the scope of a policy to specific SharePoint sites. DLP supports scoping policies to up to 100 sites.\nDevice scoping\nMicrosoft Purview Data Loss Prevention (DLP) policies that include the\nDevices\nlocation can be scoped by\nusers\nand by\ndevices\n. A policy is enforced on an endpoint only when\nboth\nthe user and the device are included in the policy scope. If a user is in scope but the device is not, the policy isn’t applied. Similarly, if a device is in scope but the user is not, the policy isn’t applied.\nNote\nUse build 101.25072 or higher for this feature support on macOS.\nConfigure scope for common outcomes\nHere's how to configure the scope of a DLP policy for different outcomes.\nIf you want to target the policy to...\nSet user scope to...\nSet device scope to...\nExample use case\nAll users on all onboarded devices\nAll users and groups\nAll devices and device groups\nUse this for general enforcement of DLP policies on all devices in your organization. 
This is the default setting for DLP policies.\nAll users on specific devices\nAll users and groups\neither\nAll devices and device groups\nwith\nExclude devices and device groups\nand add the devices to be excluded or\nSpecific devices and device groups\nand add the devices to be included\nUse this to apply a restrictive policy to kiosk devices that will be used by multiple users.\nSpecific users on all onboarded devices\neither\nAll users and groups\nwith\nExclude users and groups\nand add the users to be excluded or\nSpecific users and groups\nand add the users to be included\nAll devices and device groups\nUse this to help control data leakage by specific users on all devices in your organization.\nSpecific users on specific devices\nSpecific users and groups\nSpecific devices and device groups\nSay you have special use devices in payroll that are used for printing checks and there are only a few accounts that are allowed to use those devices for printing checks. You can scope a restrictive endpoint DLP policy to those user accounts on those specific devices.\nLocation support for how content can be defined\nDLP policies detect sensitive items by matching them to a sensitive information type (SIT), or to a sensitivity label or a retention label. Each location supports different methods of defining sensitive content. How content can be defined when you combine locations in a policy can change from how it can be defined when limited to a single location.\nImportant\nWhen you select multiple locations for a policy, a \"no\" value for a content definition category takes precedence over a \"yes\" value. For example, when you select SharePoint sites only, the policy supports detecting sensitive items by one or more of SIT, by sensitivity label or by retention label. 
But, when you select SharePoint sites\nand\nTeams chat and channel messages locations, the policy will only support detecting sensitive items by SIT.\nLocation\nContent can be defined by SIT\nContent can be defined by sensitivity label\nContent can be defined by retention label\nExchange email online\nYes\nYes\nNo\nSharePoint in Microsoft 365 sites\nYes\nYes\nYes\nOneDrive for work or school accounts\nYes\nYes\nYes\nTeams Chat and Channel messages\nYes\nNo\nNo\nDevices\nYes\nYes\nNo\nInstances\nYes\nYes\nYes\nOn-premises repositories\nYes\nYes\nNo\nFabric and Power BI\nYes\nYes\nNo\nMicrosoft 365 Copilot (preview)\nNo\nYes\nNo\nManaged cloud apps\nYes\nNo\nNo\nUnmanaged cloud apps\nYes\nYes\nNo\nDLP supports using trainable classifiers as a condition to detect sensitive information. Content can be defined by trainable classifiers in Exchange, SharePoint sites, OneDrive accounts, Teams Chat and Channels, Devices, and unmanaged cloud apps. For more information, see\nTrainable Classifiers\n.\nNote\nDLP supports detecting sensitivity labels on emails and attachments. For more information, see\nUse sensitivity labels as conditions in DLP policies\n.\nRules\nRules are the business logic of DLP policies. They consist of:\nConditions\nthat, when matched, trigger the policy\nActions\nthat determine the activities included and outcomes of a match.\nUser notifications\nto inform your users when they're doing something that triggers a policy and help educate them on how your organization wants sensitive information treated\nUser Overrides\nwhen configured by an admin, allow users to selectively override a blocking action\nIncident reports\nthat notify admins and other key stakeholders when a rule match occurs\nAdditional options\nwhich define the priority for rule evaluation and can stop further rule and policy processing.\nA policy contains one or more rules. 
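The precedence rule described above (a "no" for any selected location overrides a "yes") is effectively a logical AND across the selected locations. A minimal sketch with a partial, illustrative capability table; the dictionary rows are lifted from the table above, and `combined` is a hypothetical helper, not a Purview API:

```python
# Minimal sketch: when locations are combined in one policy, a content-
# definition method is available only if every selected location supports it.
CAPABILITIES = {  # location -> which content-definition methods it supports
    "SharePoint": {"sit": True, "sensitivity": True, "retention": True},
    "Teams": {"sit": True, "sensitivity": False, "retention": False},
    "Exchange": {"sit": True, "sensitivity": True, "retention": False},
}

def combined(locations):
    """AND each capability across the selected locations ("no" wins)."""
    return {
        method: all(CAPABILITIES[loc][method] for loc in locations)
        for method in ("sit", "sensitivity", "retention")
    }

print(combined(["SharePoint"]))           # all three methods available
print(combined(["SharePoint", "Teams"]))  # only SIT survives the combination
```

This reproduces the SharePoint-plus-Teams example from the text: adding Teams drops sensitivity and retention labels from the combined policy.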
Rules are executed sequentially, starting with the highest-priority rule in each policy.\nHow DLP classification works\nDLP evaluates an item for sensitive information when the item is created, read, or modified. Evaluation is also initiated by\non demand classification\n. However, events such as\nDLP Rule Matched\nonly appear in the audit log or activity explorer when a user attempts an activity that matches a DLP policy.\nHere's a list of some of the user activities that DLP can monitor and take actions on:\nText upload via Microsoft Edge browser using integrated capabilities\nFile upload via Microsoft Edge browser using integrated capabilities\nFile download via Microsoft Edge browser using integrated capabilities\nCut/copy data via Microsoft Edge browser using integrated capabilities\nPaste data via Microsoft Edge browser using integrated capabilities\nPrint data via Microsoft Edge browser using integrated capabilities\nPrint data from other locations\nCopy to removable media\nCopy to network share\nCopy to clipboard\nTransfer using Bluetooth\nFile accessed by unallowed app\nPaste to browsers other than Microsoft Edge\nPrint data\nTransfer using remote desktop\nAn item that is created, read, or modified will match a DLP rule and policy on the client if the conditions and user activity are met. This is audited as file activity, such as FileRead or FileRenamed.\nIf the conditions and user activity are met, a DLP rule match event appears in activity explorer as a 'DLP Rule Matched' event. An event describing the mode of egress is also generated.\nPolicies take actions, and actions are different from conditions. A rule can match on a file even if no actions are performed.\nThe priority by which rules are evaluated and applied\nHosted service locations\nFor the hosted service locations, like Exchange, SharePoint, and OneDrive, each rule is assigned a priority in the order in which it's created. 
This means that the rule created first has first priority, the rule created second has second priority, and so on.\nWhen content is evaluated against rules, the rules are processed in priority order. If content matches multiple rules, the first rule evaluated that has the\nmost\nrestrictive action is enforced. For example, if content matches all of the following rules,\nRule 3\nis enforced because it's the highest-priority, most restrictive rule:\nRule 1: only notifies users\nRule 2: notifies users, restricts access, and allows user overrides\nRule 3: notifies users, restricts access, and doesn't allow user overrides\nRule 4: restricts access\nRules 1, 2, and 4 would be evaluated, but not applied. In this example, matches for all of the rules are recorded in the audit logs and shown in the DLP reports, even though only the most restrictive rule is applied.\nYou can use a rule to meet a specific protection requirement, and then use a DLP policy to group together common protection requirements, such as all of the rules needed to comply with a specific regulation.\nFor example, you might have a DLP policy that helps you detect the presence of information subject to the Health Insurance Portability and Accountability Act (HIPAA). This DLP policy could help protect HIPAA data (the \"what\") across all SharePoint sites and all OneDrive sites (the \"where\") by finding any document containing this sensitive information that's shared with people outside your organization (the conditions) and then blocking access to the document and sending a notification (the actions). These requirements are stored as individual rules and grouped together as a DLP policy to simplify management and reporting.\nFor endpoints\nWhen an item matches multiple DLP rules, DLP uses a complex algorithm to decide which actions to apply. Endpoint DLP applies the aggregate or sum of the most restrictive actions. 
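The endpoint aggregation just described can be sketched as taking, for each egress activity, the most restrictive action across all matched policies in enforce mode. The restrictiveness ordering below (Audit < Block with override < Block) and the `aggregate` helper are assumptions for illustration, not Purview's actual implementation:

```python
# Minimal sketch: per egress activity, apply the most restrictive action
# across all matched, enforce-mode policies. Severity ordering is assumed.
SEVERITY = {"Audit": 0, "Block with override": 1, "Block": 2}

def aggregate(matched_policies):
    """matched_policies: list of {activity: action} dicts, enforce mode only."""
    result = {}
    for actions in matched_policies:
        for activity, action in actions.items():
            current = result.get(activity, "Audit")
            # Keep whichever action is more restrictive for this channel.
            if SEVERITY[action] > SEVERITY[current]:
                result[activity] = action
            else:
                result.setdefault(activity, current)
    return result

# Mirrors the first scenario below: ABC blocks print, MNO blocks copy to USB.
abc = {"Print": "Block", "Copy to USB": "Audit"}
mno = {"Print": "Audit", "Copy to USB": "Block"}
print(aggregate([abc, mno]))  # {'Print': 'Block', 'Copy to USB': 'Block'}
```

Simulation-mode policies would simply be filtered out of `matched_policies` before aggregation, which matches the "different policy state" scenario below.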
DLP uses these factors when making the calculation.\nPolicy priority order\nWhen an item matches multiple policies and those policies have identical actions, the actions from the highest priority policy are applied.\nRule priority order\nWhen an item matches multiple rules in a policy and those rules have identical actions, the actions from the highest priority rule are applied.\nMode of the policy\nWhen an item matches multiple policies and those policies have identical actions, the actions from all policies that are in\nTurn it on\nstate (enforce mode) are applied preferentially over the policies in\nRun the policy in simulation mode with policy tips\nand\nRun the policy in simulation mode\nstate.\nAction\nWhen an item matches multiple policies and those policies differ in actions, the aggregate, or sum, of the most restrictive actions is applied.\nAuthorization groups\nconfiguration\nWhen an item matches multiple policies and those policies differ in action, the aggregate, or sum, of the most restrictive actions is applied.\nOverride options\nWhen an item matches multiple policies and those policies differ in the override option, actions are applied in this order:\nNo override\n>\nAllow override\nHere are scenarios that illustrate the runtime behavior. For the first three scenarios, you have three DLP policies configured like this:\nPolicy name\nCondition to match\nAction\nPolicy priority\nABC\nContent contains credit card number\nBlock print, audit all other user egress activities\n0\nMNO\nContent contains credit card number\nBlock copy to USB, audit all other user egress activities\n1\nXYZ\nContent contains U.S. social security number\nBlock copy to clipboard, audit all other user egress activities\n2\nItem contains credit card numbers\nAn item on a monitored device contains credit card numbers, so it matches policy ABC and policy MNO. 
Both ABC and MNO are in\nTurn it on\nmode.\nPolicy\nCloud egress action\nCopy to clipboard action\nCopy to USB action\nCopy to network share action\nUnallowed apps action\nPrint action\nCopy via Bluetooth action\nCopy to remote desktop action\nABC\nAudit\nAudit\nAudit\nAudit\nAudit\nBlock\nAudit\nAudit\nMNO\nAudit\nAudit\nBlock\nAudit\nAudit\nAudit\nAudit\nAudit\nActions applied at runtime\nAudit\nAudit\nBlock\nAudit\nAudit\nBlock\nAudit\nAudit\nItem contains credit card numbers and U.S. social security numbers\nAn item on a monitored device contains credit card numbers and U.S. social security numbers, so this item matches policy ABC, policy MNO, and policy XYZ. All three policies are in\nTurn it on\nmode.\nPolicy\nCloud egress action\nCopy to clipboard action\nCopy to USB action\nCopy to network share action\nUnallowed apps action\nPrint action\nCopy via Bluetooth action\nCopy to remote desktop action\nABC\nAudit\nAudit\nAudit\nAudit\nAudit\nBlock\nAudit\nAudit\nMNO\nAudit\nAudit\nBlock\nAudit\nAudit\nAudit\nAudit\nAudit\nXYZ\nAudit\nBlock\nAudit\nAudit\nAudit\nBlock\nAudit\nAudit\nActions applied at runtime\nAudit\nBlock\nBlock\nAudit\nAudit\nBlock\nAudit\nAudit\nItem contains credit card numbers, different policy state\nAn item on a monitored device contains credit card numbers, so it matches policy ABC and policy MNO. 
Policy ABC is in\nTurn it on\nmode and policy\nMNO\nis in\nRun the policy in simulation mode\nstate.\nPolicy\nCloud egress action\nCopy to clipboard action\nCopy to USB action\nCopy to network share action\nUnallowed apps action\nPrint action\nCopy via Bluetooth action\nCopy to remote desktop action\nABC\nAudit\nAudit\nAudit\nAudit\nAudit\nBlock\nAudit\nAudit\nMNO\nAudit\nAudit\nBlock\nAudit\nAudit\nAudit\nAudit\nAudit\nActions applied at runtime\nAudit\nAudit\nAudit\nAudit\nAudit\nBlock\nAudit\nAudit\nItem contains credit card numbers, different override configuration\nAn item on a monitored device contains credit card numbers, so it matches policy ABC and policy MNO. Policy ABC is in\nTurn it on\nstate and policy\nMNO\nis in\nTurn it on\nstate. They have different\nOverride\nactions configured.\nPolicy\nCloud egress action\nCopy to clipboard action\nCopy to USB action\nCopy to network share action\nUnallowed apps action\nPrint action\nCopy via Bluetooth action\nCopy to remote desktop action\nABC\nAudit\nAudit\nBlock with override\nAudit\nAudit\nBlock\nAudit\nAudit\nMNO\nAudit\nAudit\nBlock without override\nAudit\nAudit\nAudit\nAudit\nAudit\nActions applied at runtime\nAudit\nAudit\nBlock without override\nAudit\nAudit\nBlock\nAudit\nAudit\nItem contains credit card numbers, different authorization groups configuration\nAn item on a monitored device contains credit card numbers, so it matches policy ABC and policy MNO. Policy ABC is in\nTurn it on\nstate and policy\nMNO\nis in\nTurn it on\nstate. 
They have different\nauthorization group\nactions configured.\nPolicy\nCloud egress action\nCopy to clipboard action\nCopy to USB action\nCopy to network share action\nUnallowed apps action\nPrint action\nCopy via Bluetooth action\nCopy to remote desktop action\nABC\nAudit\nAudit\nAuth group A - Block\nAudit\nAudit\nAuth group A - Block\nAudit\nAudit\nMNO\nAudit\nAudit\nAuth group A - Block with override\nAudit\nAudit\nAuth group B - Block\nAudit\nAudit\nActions applied at runtime\nAudit\nAudit\nAuth group A - Block\nAudit\nAudit\nAuth group A - Block, Auth group B - Block\nAudit\nAudit\nConditions\nConditions are where you define what you want the rule to look for and the context in which those items are being used. They tell the rule: when you find an item that looks like\nthis\nand is being used like\nthat\n—it's a match and the rest of the actions in the policy should be taken on it. You can use conditions to assign different actions to different risk levels. For example, sensitive content shared internally might be lower risk and require fewer actions than sensitive content shared with people outside the organization.\nNote\nUsers who have non-guest accounts in a host organization's Active Directory or Microsoft Entra tenant are considered people inside the organization.\nContent contains\nAll locations support the\nContent contains\ncondition. You can select multiple instances of each content type and further refine the conditions by using the\nAny of these\n(logical OR) or\nAll of these\n(logical AND) operators:\nsensitive information types\nsensitivity labels\nretention labels\nTrainable Classifiers\nThe rule will only look for the presence of any\nsensitivity labels\nand\nretention labels\nyou pick.\nSITs have a predefined\nconfidence level\nwhich you can alter if needed. For more information, see\nMore on confidence levels\n.\nImportant\nSITs have two different ways of defining the maximum unique instance count parameters. 
To learn more, see\nInstance count supported values for SIT\n.\nAdaptive Protection in Microsoft Purview\nAdaptive protection integrates Microsoft Purview Insider Risk Management risk profiles into DLP policies so that DLP can help protect against dynamically identified risky behaviors. When configured in insider risk management, the\nInsider risk level for Adaptive Protection is\ncondition shows up for the Exchange Online, Devices, Teams, and unmanaged cloud apps locations. Refer to\nLearn about Adaptive Protection in Data Loss Prevention\nfor more details.\nConditions that Adaptive Protection supports\nInsider risk level for Adaptive Protection is supported with these values:\nElevated risk level\nModerate risk level\nMinor risk level\nCondition context\nThe available context options change depending on which location you choose. If you select multiple locations, only the conditions that the locations have in common are available.\nConditions Exchange supports\nNote\nDLP policies for Exchange scan non-system-generated emails and journaling emails.\nContent contains\nInsider risk level for Adaptive Protection is\nContent isn't labeled\nContent is shared from Microsoft 365\nContent is received from\nSender IP address is\nHeader contains words or phrases\nSender AD Attribute contains words or phrases\nContent character set contains words\nHeader matches patterns\nSender AD Attribute matches patterns\nRecipient AD Attribute contains words or phrases\nRecipient AD Attribute matches patterns\nRecipient is member of\nDocument property is\nAny email attachment's content could not be scanned\nDocument or attachment is password protected\nHas sender overridden the policy tip\nSender is a member of\nAny email attachment's content didn't complete scanning\nRecipient address contains words\nFile extension is\nRecipient domain is\nRecipient is\nSender is\nSender domain is\nRecipient address matches patterns\nDocument name contains words or phrases\nDocument name matches 
patterns\nSubject contains words or phrases\nSubject matches patterns\nSubject or body contains words or phrases\nSubject or body matches patterns\nSender address contains words\nSender address matches patterns\nDocument size equals or is greater than\nDocument content contains words or phrases\nDocument content matches patterns\nMessage size equals or is greater than\nMessage type is\nMessage importance is\nTip\nFor more information on the conditions that Exchange supports, including PowerShell values, see:\nData loss prevention Exchange conditions and actions reference\n.\nConditions SharePoint supports\nContent contains\nContent is shared from Microsoft 365\nDocument property is\nDocument could not be scanned\nDocument or attachment is password protected\nDocument didn't complete scanning\nFile extension is\nDocument name contains words or phrases\nDocument size equals or is greater than\nDocument created by\nDocument creation date is on or after (preview)\nDocument creation date is on or before (preview)\nDocument modification date is on or after (preview)\nDocument modification date is on or before (preview)\nConditions OneDrive accounts support\nContent contains\nContent is shared from Microsoft 365\nDocument property is\nDocument could not be scanned\nDocument or attachment is password protected\nDocument didn't complete scanning\nFile extension is\nDocument name contains words or phrases\nDocument size equals or is greater than\nDocument created by\nDocument is shared\nDocument creation date is on or after (preview)\nDocument creation date is on or before (preview)\nDocument modification date is on or after (preview)\nDocument modification date is on or before (preview)\nConditions Teams chat and channel messages support\nContent contains\nInsider risk level for Adaptive Protection is\nContent is shared from Microsoft 365\nRecipient domain is\nRecipient is\nSender is\nSender domain is\nConditions managed cloud apps support\nContent contains\nFile extension 
is\nDocument size equals or is greater than\nManaged or unmanaged devices (the cut/copy data, paste data, and print data activities only support the managed or unmanaged devices condition).\nConditions unmanaged cloud apps support\nContent contains\nContent not fully scanned\nFile cannot be scanned\nFile extension is\nFile is not labeled\nFile is password protected\nFile size is greater than\nFile type is\nInsider risk level for Adaptive Protection is\nImportant\nEncrypted files can't be inspected for sensitive info types or trainable classifiers.\nConditions supported for Endpoints\nContent contains:\nSpecifies content to be detected. For details on supported file types, see\nFiles scanned for content\n.\nContent is not labeled:\nDetects content that doesn't have a sensitivity label applied. To help ensure only supported file types are detected, you should use this condition with the\nFile extension is\nor\nFile type is\nconditions. PDF and Office files are supported:\nFile Type\nFormat\nMonitored file extensions\nWord processing\nWord, PDF\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb, .pdf\nSpreadsheet\nExcel, CSV, TSV\n.xls, .xlsx, .xlt, .xlm, .xlsm, .xltx, .xltm, .xlsb, .xlw, .csv, .tsv\nPresentation\nPowerPoint\n.ppt, .pptx, .pos, .pps, .pptm, .potx, .potm, .ppam, .ppsx\nDocument could not be scanned:\nApplies to files that can't be scanned for one of the following reasons:\nFile contains one or more transient text-extraction errors\nFile is password-protected\nFile size exceeds the supported limit (Maximum file sizes: 64 MB for uncompressed files; 256 MB for compressed files)\nMicrosoft classification engine (MCE) timeout or failure\nDocument name contains words or phrases:\nDetects documents with file names that contain any of the words or phrases you specify, for example:\nfile\n,\ncredit card\n,\npatent\n, etc.\nDocument name matches patterns:\nDetects documents where the file name matches specific patterns. 
The evaluation considers the entire path of the document, not just the document’s name. The pattern is checked as a string match, meaning it can match any part of the document path. To define the patterns, use wildcards. For information on regex patterns, see the Regular Expression documentation\nhere\n.\nNote\nDue to potential performance issues, this condition will gradually be phased out from Purview Endpoint DLP. We recommend using the 'Document name contains words or phrases' condition instead.\nDocument or attachment is password protected:\nDetects only protected files that are open. The following files are supported:\nArchive files (ZIP, .7z, RAR)\nOffice files\nPDFs\nSymantec PGP encrypted files\nDocument size equals or is greater than:\nDetects documents with file sizes that are equal to or greater than the specified value. DLP only supports content inspection for files less than 64 MB.\nImportant\nWe recommend setting this condition to detect items that are larger than 10 KB.\nFile type is:\nDetects the following file types:\nFile type\nApps\nMonitored file extensions\nWord processing\nWord, PDF\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb, .pdf\nSpreadsheet\nExcel, CSV, TSV\n.xls, .xlsx, .xlt, .xlm, .xlsm, .xltx, .xltm, .xlsb, .xlw, .csv, .tsv\nPresentation\nPowerPoint\n.ppt, .pptx, .pos, .pps, .pptm, .potx, .potm, .ppam, .ppsx\nEmail\nOutlook\n.msg\nImportant\nThe\nfile extensions\nand\nfile types\noptions\ncan't\nbe used as conditions in the same rule. 
If you want to use them as conditions in the same policy, they must be in separate rules.\nTo use the\nFile type is\ncondition, you must have one of the following versions of Windows:\nWindows Endpoints (X64):\nWindows 10 (21H2, 22H2)\nWindows 10 21H2 update details\nWindows 10 22H2 update details\nWindows Endpoints (ARM64):\nWindows 11 (21H2, 22H2)\nWindows 11 21H2 update details\nWindows 11 22H2 update details\nFile extension is:\nIn addition to detecting sensitive information in files with the same extensions as those covered by the\nFile type is\ncondition, you can use the\nFile extension is\ncondition to detect sensitive information in files with any file extension you need to monitor. To do so, add the necessary file extensions, separated by commas to a rule in your policy. The\nFile extension is\ncondition is supported only for those versions of Windows that support the\nFile type is\ncondition.\nFile extension is\ndoesn't support archive file types.\nWarning\nIncluding any of the following file extensions in your policy rules might significantly increase the CPU load:\n.dll, .exe, .mui, .ost, .pf, .pst.\nScanning did not complete:\nApplies when the scanning of a file started, but stopped before the entire file was scanned. The primary reason for an incomplete scan is that extracted text within the file exceeds the maximum size allowed. (Maximum sizes for extracted text: Uncompressed files: The first 4 MB of extractable text; Compressed files: N=1000 / Extraction Time = 5 minutes.)\nDocument property is:\nDetects documents with custom properties matching specified values. For example:\nDepartment = 'Marketing'\n,\nProject = 'Secret'\n. To specify multiple values for a custom property, use double quotes. For example, \"Department: Marketing, Sales\". 
Supported file types are Office and PDF:\nFile Type\nFormat\nMonitored file extensions\nWord processing\nWord, PDF\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb, .pdf\nSpreadsheet\nExcel, CSV, TSV\n.xls, .xlsx, .xlt, .xlm, .xlsm, .xltx, .xltm, .xlsb, .xlw, .csv, .tsv\nPresentation\nPowerPoint\n.ppt, .pptx, .pos, .pps, .pptm, .potx, .potm, .ppam, .ppsx\nThe user accessed a sensitive website from Microsoft Edge:\nFor more information, see\nScenario 6 Monitor or restrict user activities on sensitive service domains (preview)\n.\nInsider risk level for Adaptive Protection is:\nDetects the insider risk level.\nSee also:\nEndpoint activities you can monitor and take action on\n.\nOperating system requirements for five conditions\nDocument could not be scanned\nDocument name contains words or phrases\nDocument name matches patterns\nDocument size equals or is greater than\nScanning did not complete\nTo use any of these conditions, your endpoint devices must be running one of the following operating systems:\nWindows 11 23H2:\nDecember 4, 2023—KB5032288 (OS Builds 22621.2792 and 22631.2792) Preview\nWindows 11 22H2:\nDecember 4, 2023—KB5032288 (OS Builds 22621.2792 and 22631.2792) Preview - Microsoft Support\nWindows 11 21H2:\nDecember 12, 2023—KB5033369 (OS Build 22000.2652) - Microsoft Support\nWindows 10 22H2:\nNovember 30, 2023—KB5032278 (OS Build 19045.3758) Preview - Microsoft Support\nWindows 10 21H2:\nNovember 30, 2023—KB5032278 (OS Build 19045.3758) Preview - Microsoft Support\nWindows Server 2022/2019:\nNovember 14, 2023—KB5032198 (OS Build 20348.2113) - Microsoft Support\n(or later)\nOperating system requirements for Condition 'Document Property is'\nWindows 11:\nFebruary 29, 2024—KB5034848 (OS Builds 22621.3235 and 22631.3235) Preview - Microsoft Support\n(or later)\nWindows 10:\nFebruary 29, 2024—KB5034843 (OS Build 19045.4123) Preview - Microsoft Support\n(or later)\nImportant\nFor information about the Adobe requirements for using Microsoft Purview Data Loss 
Prevention (DLP) features with PDF files, see this article from Adobe:\nMicrosoft Purview Information Protection Support in Acrobat\n.\nConditions Instances supports\nContent contains\nContent is shared from Microsoft 365\nConditions On-premises repositories support\nContent contains\nFile extension is\nDocument property is\nConditions Fabric and Power BI support\nContent contains\nConditions Microsoft 365 Copilot supports\nThis feature is in preview.\nContent contains (sensitivity labels)\nCondition groups\nSometimes you need a rule to identify only one thing, such as all content that contains a U.S. Social Security Number, which is defined by a single SIT. However, in many scenarios where the types of items you're trying to identify are more complex and therefore harder to define, more flexibility in defining conditions is required.\nFor example, to identify content subject to the U.S. Health Insurance Act (HIPAA), you need to look for:\nContent that contains specific types of sensitive information, such as a U.S. Social Security Number or Drug Enforcement Agency (DEA) Number.\nAND\nContent that's more difficult to identify, such as communications about a patient's care or descriptions of medical services provided. Identifying this content requires matching keywords from large keyword lists, such as the International Classification of Diseases (ICD-9-CM or ICD-10-CM).\nYou can identify this type of data by grouping conditions and using logical operators (AND, OR) between the groups.\nFor the\nU.S. Health Insurance Act (HIPAA)\n, conditions are grouped like this:\nThe first group contains the SITs that identify an individual and the second group contains the SITs that identify medical diagnosis.\nConditions can be grouped and joined by boolean operators (AND, OR, NOT) so that you define a rule by stating what should be included and then defining exclusions in a different group joined to the first by a NOT. 
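The grouping logic described above can be sketched as a small boolean evaluator. This is a hedged illustration, not Purview's actual engine; the SIT names and helper functions are hypothetical examples chosen to mirror the HIPAA scenario:

```python
# Illustrative sketch of DLP condition groups joined by boolean operators.
# Not a Purview API; SIT names and helpers are hypothetical.

def group_matches(group, detected_sits):
    """Within a group, SITs are effectively OR'd: any single match satisfies it."""
    return any(sit in detected_sits for sit in group)

def hipaa_rule_matches(detected_sits):
    """Two groups joined by AND, as in the HIPAA example above:
    one group identifies an individual, the other a medical diagnosis."""
    identity = {"U.S. Social Security Number", "DEA Number"}
    diagnosis = {"ICD-9-CM keywords", "ICD-10-CM keywords"}
    return group_matches(identity, detected_sits) and group_matches(diagnosis, detected_sits)

def rule_with_exclusion(detected_sits, exclusion_group):
    """A group joined by NOT defines exclusions that suppress the match."""
    return hipaa_rule_matches(detected_sits) and not group_matches(exclusion_group, detected_sits)
```

For example, a document that triggers both a DEA Number SIT and an ICD-10-CM keyword list matches the AND of the two groups, while a document triggering only one group does not.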
To learn more about how Purview DLP implements booleans and nested groups, see\nComplex rule design\n.\nDLP platform limitations for conditions\nCondition\nWorkload\nLimit\nCost of Evaluation\nContent Contains\nEXO/SPO/ODB\n125 SITs per rule\nHigh\nContent is shared from Microsoft 365\nEXO/SPO/ODB\n-\nHigh\nSender IP address is\nEXO\nIndividual range length <= 128; Count <= 600\nLow\nHas sender overridden the policy tip\nEXO\n-\nLow\nSender is\nEXO\nIndividual email length <= 256; Count <= 600\nMedium\nSender is a member of\nEXO\nCount <= 600\nHigh\nSender domain is\nEXO\nDomain name length <= 67; Count <= 600\nLow\nSender address contains words\nEXO\nIndividual word length <= 128; Count <= 600\nLow\nSender address matches patterns\nEXO\nRegex length <= 128 char; Count <= 600\nLow\nSender AD attribute contains words\nEXO\nIndividual word length <= 128; Count <= 600\nMedium\nSender AD attribute matches patterns\nEXO\nRegex length <= 128 char; Count <= 600\nMedium\nContent of email attachment(s) can't be scanned\nEXO\nSupported file types\nLow\nIncomplete scan of email attachment content\nEXO\nExtracted content size > 2 MB (2 million characters)\nLow\nAttachment is password-protected\nEXO\nFile types: Office files, .PDF, .ZIP, and 7z\nLow\nAttachment's file extension is\nEXO/SPO/ODB\nCount <= 600 per rule\nHigh\nRecipient is a member of\nEXO\nCount <= 600\nHigh\nRecipient domain is\nEXO\nDomain name length <= 67; Count <= 5000\nLow\nRecipient is\nEXO\nIndividual email length <= 256; Count <= 600\nLow\nRecipient address contains words\nEXO\nIndividual word length <= 128; Count <= 600\nLow\nRecipient address matches patterns\nEXO\nCount <= 300\nLow\nDocument name contains words or phrases\nEXO\nIndividual word length <= 128; Count <= 600\nLow\nDocument Name matches patterns\nEXO\nRegex length <= 128 char; Count <= 300\nLow\nDocument property is\nEXO/SPO/ODB\n-\nLow\nDocument size equals or is greater than\nEXO\n-\nLow\nSubject contains words or phrases\nEXO\nIndividual 
word length <= 128; Count <= 600\nLow\nHeader contains words or phrases\nEXO\nIndividual word length <= 128; Count <= 600\nLow\nSubject or body contains words or phrases\nEXO\nIndividual word length <= 128; Count <= 600\nLow\nContent character set contains words\nEXO\nCount <= 600\nLow\nHeader matches patterns\nEXO\nRegex length <= 128 char; Count <= 300\nLow\nSubject matches patterns\nEXO\nRegex length <= 128 char; Count <= 300\nLow\nSubject or body matches patterns\nEXO\nRegex length <= 128 char; Count <= 300\nLow\nMessage type is\nEXO\n-\nLow\nMessage size over\nEXO\n-\nLow\nWith importance\nEXO\n-\nLow\nSender AD attribute contains words\nEXO\nEach attribute key value pair: has Regex length <= 128 char; Count <= 600\nMedium\nSender AD attribute matches patterns\nEXO\nEach attribute key value pair: has Regex length <= 128 char; Count <= 300\nMedium\nDocument contains words\nEXO\nIndividual word length <= 128; Count <= 600\nMedium\nDocument matches patterns\nEXO\nRegex length <= 128 char; Count <= 300\nMedium\nActions\nAny item that makes it through the\nconditions\nfilter has any\nactions\nthat are defined in the rule applied to it. You have to configure the required options to support the action. For example, if you select Exchange with the\nRestrict access or encrypt the content in Microsoft 365 locations\naction, you need to choose from these options:\nBlock users from accessing shared SharePoint, OneDrive, and Teams content\nBlock everyone. Only the content owner, last modifier, and site admin will continue to have access\nBlock only people from outside your organization. Users inside your organization continue to have access.\nEncrypt email messages (applies only to content in Exchange)\nThe actions that are available in a rule depend on the locations that have been selected. 
The available actions for each individual location are listed below.\nImportant\nFor SharePoint and OneDrive locations, documents will be proactively blocked right after detection of sensitive information (regardless of whether the document is shared or not) for all guests; internal users continue to have access to the document.\nSupported actions: Exchange\nWhen DLP policy rules are applied in Exchange, they may be\nhalting\n,\nnon-halting\n, or\nneither\n. Most of the rules that Exchange supports are non-halting. Non-halting actions are applied, and subsequent rules and policies continue to be processed.\nDLP actions are taken on inbound encrypted emails that are in scope of a policy, such as block,\nbut\nto maintain the confidentiality of the encryption, the event won't appear in Activity Explorer or in the Alert, and the content of the message won't be accessible to anyone other than the recipient.\nHowever, when a\nhalting\naction is triggered by a DLP policy rule, Purview stops processing any subsequent rules. For instance, when the\nRestrict access or encrypt the content in Microsoft 365 locations\naction is triggered, no further rules or policies are processed.\nIf an action is\nneither\nhalting nor non-halting, Purview waits for the result of the action to occur before continuing. So, when an outgoing email triggers the\nForward the message for approval to sender's manager\naction,\nPurview waits to get the manager's decision on whether or not the email may be sent. If the manager approves, the action behaves as a non-halting action and the subsequent rules are processed. 
In contrast, if the manager rejects sending the email,\nForward the message for approval to sender's manager\nbehaves as a halting action and blocks sending the email; no subsequent rules or policies are processed.\nThe following table lists the actions that Exchange supports, and indicates whether they're halting or non-halting.\nAction\nHalting / Non-halting\nRestrict access or encrypt the content in Microsoft 365 locations (Block Everyone, Block only people outside your organization)\nHalting\nRestrict access or encrypt the content in Microsoft 365 locations (Encrypt Email Messages)\nNon-halting\nSet headers\nNon-halting\nRemove header\nNon-halting\nRedirect the message to specific users\nNon-halting\nForward the message for approval to sender's manager\nNeither\nForward the message for approval to specific approvers\nNeither\nAdd recipient to the To box\nNon-halting\nAdd recipient to the Cc box\nNon-halting\nAdd recipient to the Bcc box\nNon-halting\nAdd the sender's manager as recipient\nNon-halting\nRemove message encryption and rights protection\nNon-halting\nPrepend Email Subject\nNon-halting\nAdd HTML Disclaimer\nNon-halting\nModify Email Subject\nNon-halting\nDeliver the message to the hosted quarantine\nHalting\nApply branding to encrypted messages\nNon-halting\nTip\nFor the\nApply branding to encrypted messages\naction, if you already have Microsoft Purview Message Encryption implemented, the templates automatically show up in the drop-down list. 
If you want to implement Microsoft Purview Message Encryption, see\nAdd your organization's brand to your Microsoft Purview Message Encryption encrypted messages\nfor background on message encryption and how to create and configure your branding templates.\nFor more information on the actions that Exchange supports, including PowerShell values, see:\nData loss prevention Exchange conditions and actions reference\n.\nSupported actions: SharePoint\nRestrict access or encrypt the content in Microsoft 365 locations\nSupported actions: OneDrive\nRestrict access or encrypt the content in Microsoft 365 locations\nSupported actions: Teams Chat and Channel Messages\nRestrict access or encrypt the content in Microsoft 365 locations\nSupported actions: Devices\nThe\nDevices\nlocation supports these actions:\nRestrict access or encrypt the content in Microsoft 365 locations\nBlock users from receiving email, or accessing shared SharePoint, OneDrive, Teams files, and Power BI items\nAudit or restrict activities when users access sensitive sites in Microsoft Edge browsers on Windows devices\nSensitive site restrictions\nAudit or restrict activities on devices\nUpload to a restricted cloud service domain or access from an unallowed browser\nPaste to supported browsers\nFile activities for all apps\nCopy to clipboard\nCopy to removable USB device\nCopy to a network share\nPrint\nCopy or move using unallowed Bluetooth app\nCopy or move using RDP\nFile activities for apps in restricted app groups\nApp access restrictions\nAccess by restricted apps\nAccess by apps not included in restricted apps list or any restricted app groups added to rule\nin preview\nRestrictions in Windows Recall in Copilot+ PCs\nApply restrictions to only unsupported file extensions\nStart a Power Automate flow\nImportant\nWhen you select the\nAudit or restrict activities on devices\naction, the\nApply restrictions to only unsupported file extensions\nshows up.\nApply restrictions to only unsupported file 
extensions\nconfiguration option\nDOES NOT\nsupport scoping by\nDevice and device groups\nin the policy location setting.\nYou can tell DLP to\nAllow\n,\nAudit only\n,\nBlock with override\n, or\nBlock\n(the actions) these user activities for onboarded Windows devices.\nYou can tell DLP to\nAudit only\n,\nBlock with override\n, or\nBlock\n(the actions) these user activities for onboarded macOS devices.\nBlock\n: User related activity is blocked, and auditing is enabled. Admins may optionally see alerts.\nBlock with override\n: This option acts as a standard block but permits users to bypass it. By clicking the 'Allow' button on the toast notification or the 'Ok' button on the Microsoft Edge notification, users can proceed. Once allowed, Endpoint DLP will automatically resume for actions including 'Copy to a network share', 'Copy to a removable USB device', and 'Print'. For other actions, users will need to repeat the process after clicking 'Allow' to bypass the policy.\nAudit\n: No blocking of activities, but auditing is enabled, and admins may optionally see alerts.\nAllow\n: Activities are allowed without triggering alerts, but auditing is still enabled.\nOff\n: No blocking or auditing of activities.\nEnforcement mode\nBlock user\nAlert generation\nAuditing record generation\nBlock\nYes\nYes if alert is turned on for the DLP rule\nYes\nBlock with override\nYes\nYes if alert is turned on for the DLP rule\nYes\nAudit\nNo\nYes if alert is turned on for the DLP rule\nYes\nAllow\nNo\nNever triggered\nYes\nOff\nNo\nNo\nNo\nSupported actions: Managed cloud apps\nYou can tell DLP to\nAudit only\nor\nBlock\n(the actions) for user activities in managed cloud apps on Windows and macOS devices.\nSupported actions: Unmanaged cloud apps\nYou can tell DLP to\nAudit only\nor\nBlock\n(the actions) for user activities in unmanaged cloud apps on Windows.\nMore information on supported actions\nYou can find more details here about actions:\nRestrict access or encrypt content in 
Microsoft 365 locations\nAudit or restrict activities when users access sensitive sites in Microsoft Edge browsers on Windows devices\nAudit or restrict activities on devices\nService domain and browser activities\nFile activities for all apps\nRestricted app activities\nFile activities for apps in restricted app groups\nRestrict access or encrypt content in Microsoft 365 locations\nUse this to block users from receiving email, or accessing shared SharePoint, OneDrive, Teams files, and Power BI items. This action can block everyone or block only people who are outside your organization.\nAudit or restrict activities when users access sensitive sites in Microsoft Edge browsers on Windows devices\nUse this action to control when users attempt to:\nActivity\nDescription/options\nPrint the site\nDetects when users try to print a protected site from an onboarded device.\nCopy data from the site\nDetects when users try to copy data from a protected site from an onboarded device.\nSave the site as local files (Save-As)\nDetects when users try to save a protected site as local files from an onboarded device.\nAudit or restrict activities on devices\nUse this to restrict user activities by Service domain and browser activities, File activities for all apps, Restricted app activities. To use\nAudit or restrict activities on devices\n, you have to configure options in\nDLP settings\nand in the policy in which you want to use them. See,\nRestricted apps and app groups\nfor more information.\nDLP rules with the action\nAudit or restrict activities on devices\ncan have\nBlock with override\nconfigured. When this rule is applied to a file, any attempt to perform a restricted action on the file is blocked. A notification is displayed with the option to override the restriction. If the user chooses to override, the action is permitted for a period of 1 minute, during which the user can retry the action without restriction. 
The exception to this behavior is when a sensitive file is dragged and dropped into Microsoft Edge, which will immediately attach the file if the rule is overridden.\nService domain and browser activities\nWhen you configure the\nAllow/Block cloud service domains\nand the\nUnallowed browsers\nlist (see\nBrowser and domain restrictions to sensitive data\n) and a user attempts to upload a protected file to a cloud service domain or access it from an unallowed browser, you can configure the policy action to\nAudit only\n,\nBlock with override\n, or\nBlock\nthe activity.\nActivity\nDescription/options\nUpload to a restricted cloud services domain or access from an unallowed app\nDetects when protected files are blocked or allowed to be uploaded to cloud service domains. See\nBrowser and domain restrictions to sensitive data\nand\nScenario 6 Monitor or restrict user activities on sensitive service domains\n.\nPaste to supported browsers\nDetects when users paste sensitive information into a text field or web form using Microsoft Edge, Google Chrome (with Microsoft Purview extension), or Mozilla Firefox (with Microsoft Purview extension). Evaluation is independent of the classification of the source file. For more information, see:\nEndpoint activities you can monitor and take action on\n.\nPaste to Browser limitations\nOnly certain rule conditions work with Paste to Browser events because the rules are evaluated only on the clipboard data. 
Paste to Browser won't evaluate based on where the text is being copied from.\nRule conditions that work with Paste to Browser:\nContent contains\nContent is not labeled\nAdditional notes:\nPaste to Browser supports SITs, not Sensitivity Labels.\nPaste to Browser doesn't evaluate on text smaller than 30 characters.\nAdvanced Classification isn't supported.\nContextual Summary doesn't show for Paste to Browser events.\nPaste to Browser takes 2 seconds to evaluate before allowing the paste action.\nIf JIT is configured to block on fallback, it blocks pasting.\nPaste to Browser only classifies the first 4 MB of text from the clipboard.\nFile activities for all apps\nWith the\nFile activities for all apps\noption, you select either\nDon't restrict file activities\nor\nApply restrictions to specific activities\n. When you select\nApply restrictions to specific activities\n, the actions that you select here are applied when a user has accessed a DLP protected item.\nActivity\nDescription/options\nCopy to clipboard\nDetects when protected files are copied to the clipboard on an onboarded device. For more information, see\nEndpoint activities you can monitor and take action on\nand\nCopy to clipboard behavior\n.\nCopy to a removable device\nDetects when protected files are copied or moved from an onboarded device to a removable USB device. For more information, see\nRemovable USB device groups\n.\nCopy to a network share\nDetects when protected files are copied or moved from an onboarded device to any network share. For more information, see\nEndpoint activities you can monitor and take action on\nand\nNetwork share coverage and exclusions\n.\nPrint\nDetects when a protected file is printed from an onboarded device. For more information, see\nPrinter groups\n.\nCopy or move using unallowed Bluetooth app\nDetects when a protected file is copied or moved from an onboarded Windows device using an unallowed Bluetooth app. For more information, see\nUnallowed Bluetooth apps\n. 
This isn't supported for macOS.\nCopy or move using RDP\nDetects when users copy or move protected files from an onboarded Windows device to another location using RDP. This isn't supported for macOS.\nRestricted app activities\nPreviously called Unallowed apps,\nrestricted app activities\nare apps that you want to place restrictions on. You define these apps in a list in Endpoint DLP settings. When a user attempts to access a DLP protected file using an app that is on the list, you can either\nAudit only\n,\nBlock with override\n, or\nBlock\nthe activity. DLP actions defined in\nRestricted app activities\nare overridden if the app is a member of restricted app group. Then the actions defined in the restricted app group are applied.\nActivity\nDescription/options\nAccess by restricted apps\nDetects when unallowed apps try to access protected files on an onboarded Windows device. For more information, see\nRestricted apps and app groups\n.\nFile activities for apps in restricted app groups\nYou define your restricted app groups in Endpoint DLP settings and add restricted app groups to your policies. When you add a restricted app group to a policy, you must select one of these options:\nDon't restrict file activity\nApply restrictions to all activity\nApply restrictions to specific activity\nWhen you select either of the\nApply restrictions\noptions, and a user attempts to access a DLP protected file using an app that is in the restricted app group, you can either\nAudit only\n,\nBlock with override\n, or\nBlock\nby activity. DLP actions that you define here override actions defined in\nRestricted app activities\nand\nFile activities for all apps\nfor the app.\nFor more information, see\nRestricted apps and app groups\n.\nNote\nThe devices location provides many subactivities (conditions) and actions. 
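The device enforcement modes described earlier (Block, Block with override, Audit, Allow, Off) can be summarized as a small lookup. This is a minimal sketch of the behavior table, not a Purview API, and it assumes alerting is turned on for the DLP rule:

```python
# Minimal sketch of the device enforcement-mode table described earlier.
# Not a Purview API; assumes alerts are turned on for the DLP rule.
ENFORCEMENT_MODES = {
    # mode: (blocks_user, alert_possible, audit_record)
    "Block":               (True,  True,  True),
    "Block with override": (True,  True,  True),
    "Audit":               (False, True,  True),
    "Allow":               (False, False, True),
    "Off":                 (False, False, False),
}

def outcome(mode, user_overrides=False):
    blocked, alert, audit = ENFORCEMENT_MODES[mode]
    # With "Block with override", the user can bypass the block from the
    # toast or Microsoft Edge notification.
    if mode == "Block with override" and user_overrides:
        blocked = False
    return {"blocked": blocked, "alert_possible": alert, "audited": audit}
```

Note how Allow still writes audit records without ever raising an alert, while Off disables both blocking and auditing.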
To learn more, see\nEndpoint activities you can monitor and take action on\n.\nImportant\nThe\nCopy to clipboard\ncondition detects when a user copies information from a\nprotected file\nto the clipboard. Use\nCopy to clipboard\nto block, block with override, or audit when users copy information from a protected file.\nThe\nPaste to supported browsers\ncondition detects when a user attempts to paste sensitive text into a text field or web form using Microsoft Edge, Google Chrome with Microsoft Purview extension, or Mozilla Firefox with Microsoft Purview\nextension\nregardless of where that information came from\n. Use\nPaste to supported browsers\nto block, block with override, or audit when users paste sensitive information into a text field or web form.\nInstances actions\nRestrict access or encrypt the content in Microsoft 365 locations\nRestrict Third Party Apps\nOn-premises repositories actions\nRestrict access or remove on-premises files.\nBlock people from accessing files stored in on-premises repositories\nSet permissions on the file (permissions inherited from the parent folder)\nMove file from where it's stored to a quarantine folder\nSee,\nDLP On-premises repository actions\nfor full details.\nFabric and Power BI actions\nNotify users with email and policy tips\nSend alerts to Administrator\nRestrict access\nNote\nApplicable to\nsupported item types\nonly.\nMicrosoft 365 Copilot actions\nThis feature is in preview.\nExclude content in the location\nManaged cloud apps actions\nRestrict browser and network activities\nUnmanaged cloud apps actions\nRestrict browser and network activities\nActions available when you combine locations\nIf you select Exchange and any other single location for the policy to be applied to, the\nRestrict access or encrypt the content in Microsoft 365 locations and all actions for the non-Exchange location actions are available.\nIf you select two or more non-Exchange locations for the policy to be applied to, the\nRestrict access 
or encrypt the content in Microsoft 365 locations and all actions for non-Exchange locations actions are available.\nFor example, if you select the Exchange and Devices locations, these actions are available:\nRestrict access or encrypt the content in Microsoft 365 locations\nAudit or restrict activities on Windows devices\nIf you select Devices and Instances, these actions are available:\nRestrict access or encrypt the content in Microsoft 365 locations\nAudit or restrict activities on Windows devices\nRestrict Third Party Apps\nWhether an action takes effect or not depends on how you configure the mode of the policy. You can choose to run the policy in simulation mode with or without showing policy tips by selecting the\nRun the policy in simulation mode\noption. You can choose to run the policy as soon as an hour after it's created by selecting the\nTurn it on right away\noption, or you can choose to just save it and come back to it later by selecting the\nKeep it off\noption.\nDLP platform limitations for actions\nAction Name\nWorkload\nLimits\nRestrict access or encrypt content in Microsoft 365\nEXO/SPO/ODB\nSet headers\nEXO\nRemove header\nEXO\nRedirect the message to specific users\nEXO\nTotal of 100 across all DLP rules. 
Can't be DL/SG\nForward the message for approval to sender's manager\nEXO\nManager should be defined in AD\nForward the message for approval to specific approvers\nEXO\nGroups aren't supported\nAdd recipient to the\nTo\nbox\nEXO\nRecipient count <= 10; Can't be DL/SG\nAdd recipient to the\nCc\nbox\nEXO\nRecipient count <= 10; Can't be DL/SG\nAdd recipient to the\nBcc\nbox\nEXO\nRecipient count <= 10; Can't be DL/SG\nAdd the sender's manager as recipient\nEXO\nManager attribute should be defined in AD\nApply HTML disclaimer\nEXO\nPrepend subject\nEXO\nApply message encryption\nEXO\nRemove message encryption\nEXO\n(preview) Exclude content in Copilot location\nMicrosoft 365 Copilot (preview)\nOnly content in SharePoint and OneDrive for Business can be excluded from being processed by Microsoft 365 Copilot\nUser notifications and policy tips\nWhen a user attempts an activity on a sensitive item in a context that meets the conditions of a rule (for example, content such as an Excel workbook on a OneDrive site that contains personal data (PII) and is shared with a guest), you can let them know about it through user notification emails and in-context policy tip popups. These notifications are useful because they increase awareness and help educate people about your organization's DLP policies.\nAn Alert email, Incident Report email, and User Notification will only be sent once per document. 
If a document with a 'Content is Shared' condition is shared twice, there will still be only one notification.\nImportant\nNotification emails are sent unprotected.\nEmail notifications are only supported for the Microsoft 365 services.\nEmail notification support by selected location\nSelected location\nEmail notifications supported\nDevices\n- Not supported\nExchange + Devices\n- Supported for Exchange\n- Not supported for Devices\nExchange\n- Supported\nSharePoint + Devices\n- Supported for SharePoint\n- Not supported for Devices\nSharePoint\n- Supported\nExchange + SharePoint\n- Supported for Exchange\n- Supported for SharePoint\nDevices + SharePoint + Exchange\n- Not supported for Devices\n- Supported for SharePoint\n- Supported for Exchange\nTeams\n- Not supported\nOneDrive\n- Supported for OneDrive for work or school\n- Not supported for Devices\nFabric and Power BI\n- Not supported\nInstances\n- Not supported\nOn-premises repositories\n- Not supported\nExchange + SharePoint + OneDrive\n- Supported for Exchange\n- Supported for SharePoint\n- Supported for OneDrive\nMicrosoft 365 Copilot (preview)\n- Not supported\nManaged cloud apps\n- Not supported\nUnmanaged cloud apps\n- Not supported\nYou can also give people the option to\noverride the policy\n, so that they're not blocked if they have a valid business need or if the policy is detecting a false positive.\nPolicy tips and user notifications configuration options\nThe user notifications and policy tips configuration options vary depending on the monitoring locations you've selected. 
If you selected:\nExchange\nSharePoint\nOneDrive\nTeams Chat and Channel\nInstances\nYou can enable/disable user notifications for various Microsoft apps; see\nData Loss Prevention policy tips reference\n.\nYou can also enable/disable notifications with a policy tip, and choose whether to send\nemail notifications to the user who sent, shared, or last modified the content\nOR\nnotify specific people\n.\nFurthermore, you can customize the email text, subject, and the policy tip text.\nFor detailed information on customizing end user notification emails, see\nCustom email notifications\n.\nIf you selected Devices only, you get all the same options that are available for Exchange, SharePoint, OneDrive, Teams Chat and Channel, and Instances, plus the option to customize the notification title and content, and add a hyperlink in the form of a\nGet Support\nbutton that appears on the Windows 10/11 device.\nCustom policy tip notification character limits\nPolicy tip notifications are subject to the following character limits:\nVariable\nCharacter Limit\nTITLE\n120\nCONTENT\n250\nJUSTIFICATION\n250\nThe hyperlink has no character limit but is limited to the remaining space available in the entire DLP package. The hyperlink must be a resolvable URL, and it's abstracted behind a selectable control. 
The more information hyperlink is available in Microsoft 365 Office apps.\nYou can customize the title and body of text using the following parameters.\nCommon name\nParameter\nExample\nfile name\n%%FileName%%\nContoso doc 1\nprocess name\n%%ProcessName%%\nWord\npolicy name\n%%PolicyName%%\nContoso highly confidential\naction\n%%AppliedActions%%\npasting document content from the clipboard to another app\n%%AppliedActions%%\nsubstitutes these values into the message body:\naction common name\nvalue substituted in for %%AppliedActions%% parameter\ncopy to removable storage\nwriting to removable storage\ncopy to network share\nwriting to a network share\nprint\nprinting\npaste from clipboard\npasting from the clipboard\ncopy via bluetooth\ntransferring via Bluetooth\nopen with an unallowed app\nopening with this app\ncopy to a remote desktop (RDP)\ntransferring to remote desktop\nuploading to an unallowed website\nuploading to this site\naccessing the item via an unallowed browser\nopening with this browser\nUsing this customized text\n%%AppliedActions%% File name %%FileName%% via %%ProcessName%% isn't allowed by your organization. Select 'Allow' if you want to bypass the policy %%PolicyName%%\nproduces this text in the customized notification:\npasting from the clipboard File Name: Contoso doc 1 via WINWORD.EXE isn't allowed by your organization. Select the 'Allow' button if you want to bypass the policy Contoso highly confidential\nYou can localize your custom policy tips by using the\nSet-DlpComplianceRule -NotifyPolicyTipCustomTextTranslations cmdlet\n.\nCustom Policy Tips show for the most restrictive rule, not necessarily the rule that's performing the restriction.\nNote\nUser notifications and policy tips aren't available for the On-premises location.\nOnly the policy tip from the highest priority, most restrictive rule is shown. 
For example, a policy tip from a rule that blocks access to content will be shown over a policy tip from a rule that simply sends a notification. This prevents people from seeing a cascade of policy tips.\nTo learn more about user notification and policy tip configuration and use, including how to customize the notification and tip text, see\nSend email notifications and show policy tips for DLP policies\n.\nPolicy tip considerations\nPolicy tips aren't generated if the sensitivity label is in a compressed file.\nDLP has difficulty generating policy tips for encrypted files.\nPolicy tip references\nDetails on support for policy tips and notifications for different apps can be found here:\nData loss prevention policy tip reference for Outlook for Microsoft 365\nData loss prevention policy tip reference for Outlook on the Web\nData loss prevention policy tip reference for SharePoint in Microsoft 365 and the OneDrive web client\nBlock users in Exchange\nNote\nIf you have policy tips enabled and the policy's\nBlock only users outside your organization\ncondition is met, the policy tip notification blocks you from sending a message at all if there are external recipients. Therefore, you have to remove any external recipients before you can send the message to internal recipients.\nBlocking and notifications in SharePoint in Microsoft 365 and OneDrive\nThe following table shows the DLP blocking and notification behavior for policies that are scoped to SharePoint in Microsoft 365 and OneDrive. 
This isn't intended to be an exhaustive list, and there are more settings that aren't in scope for this article.\nConditions\nRestrict access setting\nBlocking and notification behavior\nContent is shared from Microsoft 365\n- with people outside my organization\nNot configured\nUser notification emails, policy tips, DLP alerts, and incident reports are sent only when a file is shared with a guest and a guest accesses the file.\nContent is shared from Microsoft 365\n- only with people inside my organization\nNot configured\nUser notification emails, policy tips, DLP alerts, and incident reports are sent when sensitive information is detected in the document.\nContent is shared from Microsoft 365\n- only with people inside my organization\n-\nRestrict access or encrypt the content in Microsoft 365 locations\n-\nBlock users from receiving email or accessing shared SharePoint, OneDrive, and Teams files\n-\nBlock everyone\n- Access to sensitive files is blocked as soon as they're uploaded.\n- User notification emails, policy tips, DLP alerts, and incident reports are sent when a file is uploaded.\nContent is shared from Microsoft 365\n- with people outside my organization\n-\nRestrict access or encrypt the content in Microsoft 365 locations\n-\nBlock users from receiving email or accessing shared SharePoint, OneDrive, and Teams files\n-\nBlock only people outside your organization\n- Access to a sensitive file is blocked as soon as it's uploaded, regardless of whether the document is shared or not for all guests. Policy Tips will be sent when the sensitive information is detected.\n- If the sensitive information is added to a file after it's shared and accessed by a user outside the organization, then alerts and incident reports will be sent.\n- If the document contains sensitive information before it's uploaded, external sharing will be blocked proactively. 
Because external sharing in this scenario is blocked when the file is uploaded, no alerts or incident reports are sent. Suppression of the alerts and incident reports is designed to prevent a flood of alerts to the user for each blocked file. User notification emails, policy tips, DLP alerts, and incident reports are sent when a file is uploaded or sensitive information is added to the file.\n- Proactive blocking will show up as an event in the Audit Log and Activity Explorer.\n-\nContent is shared from Microsoft 365\n-\nwith people outside my organization\n-\nRestrict access or encrypt the content in Microsoft 365 locations\n-\nBlock users from receiving email or accessing shared SharePoint, OneDrive, and Teams files\n-\nBlock everyone\n- When the first user outside the organization accesses the document, the event causes the document to be blocked.\n- It's expected that for a short time, the document is accessible by guests who have the link to the file.\n- User notification emails, policy tips, DLP alerts, and incident reports are sent when a file is shared with a guest and a guest accesses that file.\nLearn more URL\nUsers may want to learn why their activity is being blocked. You can configure a site or a page that explains more about your policies. When you select\nProvide a compliance URL for the end user to learn more about your organization's policies (only available for Exchange)\n, and the user receives a policy tip notification in Outlook Win32, the\nLearn more\nlink points to the site URL that you provide. This URL has priority over the global compliance URL configured with\nSet-PolicyConfig -ComplianceURL\n.\nImportant\nYou must configure the site or page that\nLearn more\npoints to from scratch. 
Microsoft Purview doesn't provide this functionality out of the box.\nUser overrides\nThe intent of\nUser overrides\nis to give users a way to bypass, with justification, DLP policy blocking actions on sensitive items in Exchange, SharePoint, OneDrive, or Teams, so that they can continue their work. User overrides are enabled only when\nNotify users in Office 365 services with a policy tip\nis enabled, so user overrides go hand-in-hand with Notifications and Policy tips.\nNote\nUser overrides aren't available for the On-premises repositories location.\nTypically, user overrides are useful when your organization is first rolling out a policy. The feedback that you get from any override justifications and identifying false positives helps in tuning the policy.\nIf the policy tips in the most restrictive rule allow people to override the rule, then overriding this rule also overrides any other rules that the content matched.\nUser override behavior\nWhen a user selects the\nAllow\noption to override a block action for these activities:\nPrint\nCopy to a removable USB device\nCopy to a network share\nWithin 30 seconds of the popup notification showing, the activity is allowed to continue. If the user doesn't select the\nAllow\noption within 30 seconds, the activity is blocked.\nFor all other activities, the user must repeat the activity after selecting the\nAllow\noption in order for it to complete.\nBusiness justification X-Header\nWhen a user overrides a block-with-override action on an email, the override option and the text that they provide are stored in the\nAudit log\nand in the email X-header. To view the business justification overrides,\nsearch the audit log\nfor the\nExceptionInfo\nvalue for the details. 
Here's an example of the audit log values:\n{\n \"FalsePositive\": false,\n \"Justification\": \"My manager approved sharing of this content\",\n \"Reason\": \"Override\",\n \"Rules\": [\n \"\"\n ]\n}\nIf you have an automated process that makes use of the business justification values, the process can access that information programmatically in the email X-header data.\nNote\nThe\nmsip_justification\nvalues are stored in the following order:\nFalse Positive; Recipient Entitled; Manager Approved; I Acknowledge; JustificationText_[free text]\n.\nNotice that the values are separated by semicolons. The maximum free text allowed is 500 characters.\nIncident reports\nWhen a rule is matched, you can send an\nAlert\nemail to your compliance officer (or any people you choose) with details of the event, and you can view them in the Microsoft Purview Data Loss Prevention\nAlerts\ndashboard and in the\nMicrosoft Defender XDR portal\n. An alert includes information about the item that was matched, the actual content that matched the rule, and the name of the person who last modified the content.\nIn preview, admin alert emails include details such as:\nThe alert severity.\nThe time the alert occurred.\nThe activity.\nThe sensitive data that were detected.\nThe alias of the user whose activity triggered the alert.\nThe policy that was matched.\nThe alert ID.\nThe endpoint operation that was attempted if the\nDevices\nlocation is in the scope of the policy.\nThe app that was being used.\nThe device name, if the match occurred on an endpoint device.\nDLP feeds incident information to other Microsoft Purview Information Protection services, like\ninsider risk management\n. In order to get incident information to insider risk management, you must set the\nIncident reports\nseverity level to\nHigh\n.\nAlert types\nAlerts can be sent every time an activity matches a rule, which can be noisy. 
To help cut down on the noise, they can be aggregated based on the number of matches or the volume of items over a set period of time. There are two types of alerts that can be configured in DLP policies.\nSingle-event alerts\nare typically used in policies that monitor for highly sensitive events that occur in a low volume, like a single email with 10 or more customer credit card numbers being sent outside your organization. In preview,\nuser-based alert aggregation (preview)\nmodifies the behavior of single-event alerts.\nAggregate-event alerts\nare typically used in policies that monitor for events that occur in a higher volume over a period of time. For example, an aggregate alert can be triggered when 10 individual emails, each with one customer credit card number, are sent outside your org over 48 hours.\nNote\nFor rules with alerts configured on SharePoint or OneDrive workloads, we only send one alert per file per rule. This is true even if the same violation has been committed by multiple users.\nOther alert options\nWhen you select\nUse email incident reports to notify you when a policy match occurs\n, you can choose to include:\nThe name of the person who last modified the content.\nThe types of sensitive content that matched the rule.\nThe rule's severity level.\nThe content that matched the rule, including the surrounding text.\nThe item containing the content that matched the rule.\nFor more information on alerts, see:\nAlerts in DLP policies\n: Describes alerts in the context of a DLP policy.\nGet started with data loss prevention alerts\n: Covers the necessary licensing, permissions, and prerequisites for DLP alerts and alert reference details.\nCreate and deploy data loss prevention policies\n: Includes guidance on alert configuration in the context of creating a DLP policy.\nLearn about investigating data loss prevention alerts\n: Covers the various methods for investigating DLP alerts.\nInvestigate data loss incidents with Microsoft Defender XDR\n: How to 
investigate DLP alerts in the Microsoft Defender portal.\nEvidence collection for file activities on devices\nIf you've enabled\nSetup evidence collection for file activities on devices\nand added Azure storage accounts, you can select\nCollect original file as evidence for all selected file activities on Endpoint\nand the Azure storage account you want to copy the items to. You must also choose the activities you want to copy items for. For example, if you select\nPrint\nbut not\nCopy to a network share\n, then only items that are printed from monitored devices will be copied to the Azure storage account.\nAdditional options\nIf you have multiple rules in a policy, you can use the\nAdditional options\nto control further rule processing if there's a match to the rule you're editing, as well as to set the priority for evaluation of the rule. This is only supported for the Exchange and Teams locations.\nSee also\nLearn about data loss prevention\nPlan for data loss prevention (DLP)\nCreate and Deploy data loss prevention policies",
You can try\nsigning in\nor\nchanging directories\n.\nCreate and deploy data loss prevention policies\nMicrosoft Purview Data Loss Prevention (DLP) policies include many configuration options. Each option changes the policy's behavior. The articles in this series cover some of the most common DLP policy scenarios. They walk you through configuring those options to give you hands-on experience with the DLP policy creation process. When you familiarize yourself with these scenarios, you gain the foundational skills that you need to use the DLP policy creation UX to create your own policies.\nHow you deploy a policy is as important as policy design. You have\nmultiple options to control policy deployment\n. This article shows you how to use these options so that the policy achieves your intent while avoiding costly business disruptions.\nIn preview\nYou can change the display name of DLP policies and rules. Once you rename a policy or a rule, any existing records retain their previous name in activity explorer events, in alerts, and in audit records. New records will reflect the new name in activity explorer events, in alerts, and in audit records. 
These names will remain until the items age out of the system.\nOrient yourself to DLP\nAdministrative units\nLearn about Microsoft Purview Data Loss Prevention\nPlan for data loss prevention (DLP)\n- by working through this article you will:\nIdentify stakeholders\nDescribe the categories of sensitive information to protect\nSet goals and strategy\nCollection Policies solution overview\nCollection policy reference\nData Loss Prevention policy reference\n- this article introduces all the components of a DLP policy and how each one influences the behavior of a policy\nDesign a DLP policy\n- this article walks you through creating a policy intent statement and mapping it to a specific policy configuration.\nCreate and Deploy data loss prevention policies\n- This article presents some common policy intent scenarios that you map to configuration options, then it walks you through configuring those options.\nLearn about investigating data loss prevention alerts\n- This article introduces you to the lifecycle of alerts from creation, through final remediation and policy tuning. It also introduces you to the tools you use to investigate alerts.\nSKU/subscriptions licensing\nFor information on licensing, see\nMicrosoft 365 Enterprise Plans\nMicrosoft 365 Service Descriptions\nPermissions\nThe account you use to create and deploy policies must be a member of one of these role groups:\nCompliance administrator\nCompliance data administrator\nInformation Protection\nInformation Protection Admin\nSecurity administrator\nImportant\nBefore you start, make sure you understand the difference between an\nunrestricted administrator\nand an\nadministrative unit restricted administrator\nby reading\nAdministrative units\n.\nGranular Roles and Role Groups\nYou can use these roles and role groups to fine tune your access controls.\nHere's a list of applicable roles. 
To learn more, see\nPermissions in the Microsoft Purview portal\n.\nDLP Compliance Management\nInformation Protection Admin\nInformation Protection Analyst\nInformation Protection Investigator\nInformation Protection Reader\nHere's a list of applicable role groups. To learn more, see\nPermissions in the Microsoft Purview portal\n.\nInformation Protection\nInformation Protection Admins\nInformation Protection Analysts\nInformation Protection Investigators\nInformation Protection Readers\nPolicy creation scenarios for Enterprise applications & devices\nThe previous article,\nDesign a DLP policy\n, introduces you to the methodology of creating a policy intent statement and then mapping that intent statement to policy configuration options.\nHelp prevent sharing credit card numbers through email\nHelp prevent sharing sensitive items via SharePoint and OneDrive with external users\nHelp protect files that Endpoint Data Loss Prevention fails to scan\nHelp protect files that Endpoint Data Loss Prevention doesn't scan\nHelp protect against sharing of a defined set of unsupported files\nDisable Microsoft Purview data loss prevention scanning for some supported files and apply controls\nHelp prevent sharing Power BI reports with credit card numbers\nPolicy creation scenarios for Inline web traffic\nHelp prevent sharing via Microsoft Edge for Business to unmanaged AI apps from managed devices\nHelp Prevent Users from Sharing Sensitive Info with Cloud Apps in Edge for Business\nHelp prevent sharing sensitive information with unmanaged AI apps via network data security\nDeployment\nA successful policy deployment isn't just about getting the policy into your environment to enforce controls on user actions. A haphazard, rushed deployment can negatively impact business processes and annoy your users. Those consequences slow acceptance of DLP technology in your organization and the safer behaviors it promotes. 
Ultimately, those consequences make your sensitive items less safe in the long run.\nBefore you start your deployment, make sure you read through\nPolicy deployment\n. It gives you a broad overview of the policy deployment process and general guidance.\nThis section dives more deeply into the three types of controls you use in concert to manage your policies in production. You can change any of these controls at any time, not just during policy creation.\nThree axes of deployment management\nUse three axes to control the policy deployment process: scope, state, and actions. Always take an incremental approach to deploying a policy, starting from the least impactful\nsimulation mode\nthrough full enforcement.\nRecommended deployment control configurations\nWhen your policy state is\nYour policy scope can be\nImpact of policy actions\nRun the policy in simulation mode\nPolicy scope of locations can be narrow or broad\n- You can configure any action\n- No user impact from configured actions\n- Admin sees alerts and can track activities\nRun the policy in simulation mode with policy tips\nPolicy should be scoped to target a pilot group and then expand the scope as you tune the policy\n- You can configure any action\n- No user impact from configured actions\n- Users can receive policy tips and alerts\n- Admin sees alerts and can track activities\nTurn it on\nAll targeted location instances\n- All configured actions are enforced on user activities\n- Admin sees alerts and can track activities\nKeep it off\nn/a\nn/a\nState\nState is the primary control you use to roll out a policy. When you finish creating your policy, set the state of the policy to\nKeep it off\n. Leave it in this state while you work on the policy configuration and until you get a final review and sign off. Set the state to:\nRun the policy in simulation mode\n: No policy actions are enforced, events are audited. 
While in this state, you can monitor the impact of the policy in the DLP simulation mode overview and the DLP\nActivity explorer\nconsole.\nRun the policy in simulation mode and show policy tips while in simulation mode\n: No actions are enforced, but users receive policy tips and notification emails to raise their awareness and educate them.\nTurn it on right away\n: This is full enforcement mode.\nKeep it off\n: The policy is inactive. Use this state while developing and reviewing your policy before deployment.\nYou can change the state of a policy at any time.\nActions\nActions are what a policy does in response to user activities on sensitive items. Because you can change these actions at any time, you can start with the least impactful,\nAllow\n(for devices) and\nAudit only\n(for all other locations), gather and review the audit data, and use it to tune the policy before moving to more restrictive actions.\nAllow\n: The user activity is allowed to occur, so no business processes are impacted. You get audit data and there aren't any user notifications or alerts.\nNote\nThe\nAllow\naction is only available for policies that are scoped to the\nDevices\nlocation.\nAudit only\n: The user activity is allowed to occur, so no business processes are impacted. You get audit data and you can add notifications and alerts to raise awareness and train your users to know that what they're doing is a risky behavior. If your organization intends to enforce more restrictive actions later on, you can tell your users that too.\nBlock with override\n: The user activity is blocked by default. You can audit the event, raise alerts and notifications. This action impacts the business process, but your users are given the option to override the block and provide a reason for the override. 
Because you get direct feedback from your users, this action can help you identify false positive matches, which you can use to further tune the policy.\nNote\nFor Exchange online and SharePoint in Microsoft 365, you configure overrides in the user notification section.\nBlock\n: The user activity is blocked no matter what. You can audit the event, raise alerts and notifications.\nPolicy scope\nEvery policy is scoped to one or more locations, such as Exchange, SharePoint in Microsoft 365, Teams, and Devices. By default, when you select a location, all instances of that location fall under the scope and none are excluded. You can further refine which instances of the location (such as sites, groups, accounts, distribution groups, mailboxes, and devices) that the policy is applied to by configuring the include/exclude options for the location. To learn more about your include/exclude scoping options, see,\nLocations\n.\nIn general, you have more flexibility with scoping while the policy is in\nRun the policy in simulation mode\nstate because no actions are taken. You can start with just the scope you designed the policy for or go broad to see how the policy would impact sensitive items in other locations.\nWhen you change the state to\nRun the policy in simulation mode and show policy tips\n, narrow your scope to a pilot group that can give you feedback and be early adopters who can be a resource for others when they come onboard.\nWhen you move the policy to\nTurn it on right away\n, you broaden the scope to include all the instances of locations that you intended when the policy was designed.\nPolicy deployment steps\nAfter you've created the policy and set its state to\nKeep it off\n, do a final review with your stakeholders.\nChange the state to\nRun the policy in simulation mode\n. 
The location scope can be broad at this point so you can gather data on the behavior of the policy across multiple locations, or just start with a single location.\nTune the policy based on the behavior data so that it better meets the business intent.\nChange the state to\nRun the policy in simulation mode and show policy tips\n. Refine the scope of locations to support a pilot group if needed, and make use of includes/excludes so that the policy is first rolled out to that pilot group.\nGather user feedback and alert and event data. If needed, tune the policy and your plans. Make sure you address all the issues that your users bring up. Your users will most likely encounter issues and raise questions about things that you didn't think of during the design phase. Develop a group of super users at this point. They can be a resource to help train other users as the scope of the policy is increased and more users come onboard. Before you go to the next stage of deployment, make sure that the policy is achieving your control objectives.\nChange the state to\nTurn it on right away\n. The policy is fully deployed. Monitor DLP alerts and DLP activity explorer. 
Address alerts.",
And when it roams, you want it to do so in a secure, protected way that meets your organization's business and compliance policies.\nSensitivity labels from Microsoft Purview Information Protection let you classify and protect your organization's data, while making sure that user productivity and their ability to collaborate isn't hindered.\nThe following example from Excel shows how users might see an applied sensitivity label from the window bar, and how they can easily change the label by using the\nsensitivity bar\nthat's available with the latest versions of Office. The labels are also available from the\nSensitivity\nbutton on the\nHome\ntab from the ribbon.\nTo apply sensitivity labels, users must be signed in with their Microsoft 365 work or school account.\nNote\nFor US Government tenants, sensitivity labels are supported for all platforms.\nIf you use the Microsoft Purview Information Protection client and scanner, see the\nAzure Information Protection Premium Government Service Description\n.\nYou can use sensitivity labels to:\nProvide protection settings that include encryption and content markings.\nFor example, apply a \"Confidential\" label to a document or email, and that label encrypts the content and applies a \"Confidential\" watermark. Content markings include headers and footers as well as watermarks, and encryption can also restrict what actions specified people can take on the content.\nExtend SharePoint protection when files are downloaded\nwhen you configure a default sensitivity label for SharePoint document libraries and select the option to extend protection for unencrypted files. Then, when these files are downloaded, the current SharePoint permissions travel with the labeled file.\nProtect content in Office apps across different platforms and devices.\nSupported by Word, Excel, PowerPoint, and Outlook on the Office desktop apps and Office for the web. Supported on Windows, macOS, iOS, and Android. 
For other apps, check their documentation.\nProtect content in third-party apps and services\nby using Microsoft Defender for Cloud Apps. With Defender for Cloud Apps, you can detect, classify, label, and protect content in third-party apps and services, such as Salesforce, Box, or Dropbox, even if the third-party app or service doesn't read or support sensitivity labels.\nIdentify content for eDiscovery cases\n. The condition builder to create search queries in eDiscovery supports sensitivity labels that are applied to content. For example, as part of your eDiscovery case, restrict content to files and emails that have a \"Highly Confidential\" sensitivity label. Or conversely, exclude files and emails that have a \"Public\" sensitivity label.\nProtect containers\nthat include Teams, Microsoft 365 Groups, SharePoint sites, and Loop workspaces. For example, set privacy settings, external user access and external sharing, access from unmanaged devices, and control how channels can be shared with other teams.\nProtect meetings and chat\nby labeling (and optionally, encrypting) meeting invites and any responses, and enforce Teams-specific options for the meeting and chat.\nExtend sensitivity labels to Power BI\n: When you turn on this capability, you can apply and view labels in Power BI, and protect data when it's saved outside the service.\nExtend sensitivity labels to assets in Microsoft Purview Data Map\n: When you turn on this capability, currently in preview, you can apply your sensitivity labels to files and schematized data assets in Microsoft Purview Data Map. 
The schematized data assets include SQL, Azure SQL, Azure Synapse, Azure Cosmos DB, and AWS RDS.\nExtend sensitivity labels to third-party apps and services.\nUsing the Microsoft Information Protection SDK, third-party apps can read sensitivity labels and apply protection settings.\nLabel content without using any protection settings.\nYou can also just apply a label as a result of identifying the sensitivity of the data. This action provides users with a visual mapping of your organization's data sensitivity, and the labels can generate usage reports and activity data for data that has different levels of sensitivity. Based on this information, you can always choose to apply protection settings later.\nProtect data when Microsoft 365 Copilot is used.\nCopilot and agents recognize and integrate sensitivity labels into the user interactions to help keep labeled data protected. For more information, see the following section\nSensitivity labels for Microsoft 365 Copilot and Microsoft 365 Copilot Chat\n.\nIn all these cases, sensitivity labels from Microsoft Purview can help you take the right actions on the right content. With sensitivity labels, you can identify the sensitivity of data across your organization, and the label can enforce protection settings that are appropriate for the sensitivity of that data. That protection then stays with the content.\nFor more information about these and other scenarios that are supported by sensitivity labels, see\nCommon scenarios for sensitivity labels\n. 
To see how your sensitivity labels are being used, view the reports from the\nMicrosoft Purview portal\n.\nNew features are being developed all the time that support sensitivity labels, so you might also find it useful to check the\nMicrosoft 365 roadmap\n.\nWhat a sensitivity label is\nWhen you assign a sensitivity label to content, it's like a stamp that's applied and is:\nCustomizable.\nSpecific to your organization and business needs, you can create categories for different levels of sensitive content in your organization. For example, Personal, Public, General, Confidential, and Highly Confidential.\nClear text.\nBecause a label is stored in clear text in the metadata for files and emails, third-party apps and services can read it and then apply their own protective actions, if required.\nPersistent.\nBecause the label is stored in metadata for files and emails, the label stays with the content, no matter where it's saved or stored. The label identification that's unique to your organization becomes the basis for applying and enforcing policies that you configure.\nWhen viewed by users in your organization, an applied sensitivity label appears like a tag on apps and can be easily integrated into their existing workflows. Your sensitivity labels\naren't visible in apps to users from other organizations, or to guests\n.\nThe following example shows an opened email where another user has applied the sensitivity label named\nGeneral\n, which doesn't apply encryption. The label description supplied by the admin displays more detail to users about the category of data identified by this sensitivity label.\nNote\nDon't confuse sensitivity labels with Outlook's built-in\nsensitivity levels\nthat indicate the sender's intention but can't provide data security.\nEach item that supports sensitivity labels can have a single sensitivity label applied to it from your organization. 
Documents and emails can have both a sensitivity label and a\nretention label\napplied to them.\nWhat sensitivity labels can do\nAfter a sensitivity label is applied to content, for example in an email, meeting invite, document, or Loop page, any configured protection settings for that label are enforced on the content. You can configure a sensitivity label to:\nControl access to content, using encryption\nfor emails, meeting invites, and documents to prevent unauthorized people from accessing this data. You can additionally choose which users or groups have permissions to perform which actions and for how long. For example, you can choose to allow all users in your organization to modify a document while a specific group in another organization can only view it. Alternatively, instead of administrator-defined permissions, you can allow your users to assign permissions to the content when they apply the label.\nFor more information about the encryption-based access control settings when you create or edit a sensitivity label, see\nRestrict access to content by using encryption in sensitivity labels\n.\nMark the content\nwhen you use Office apps, by adding watermarks, headers, or footers to email, meeting invites, or documents that have the label applied. Watermarks can be applied to documents and Loop components and pages, but not email or meeting invites. For other apps, check their documentation. Example header and watermark:\nContent markings also support variables. For example, insert the label name or document name into the header, footer, or watermark. For more information, see\nContent markings with variables\n.\nNeed to check when content markings are applied? See\nWhen Office apps apply content marking and encryption\n.\nIf you have templates or workflows that are based on specific documents, test those documents with your chosen content markings before you make the label available for users. 
Some string length restrictions to be aware of:\nWatermarks are limited to 255 characters. Headers and footers are limited to 1024 characters, except in Excel. Excel has a total limit of 255 characters for headers and footers but this limit includes characters that aren't visible, such as formatting codes. If that limit is reached, the string you enter isn't displayed in Excel.\nProtect content in containers such as sites and groups\nwhen you enable the capability to\nuse sensitivity labels with Microsoft Teams, Microsoft 365 groups, and SharePoint sites\n.\nYou can't configure protection settings for groups and sites until you enable this capability. This label configuration doesn't result in documents or emails being automatically labeled but instead, the label settings protect content by controlling access to the container where content can be stored. These settings include privacy settings, external user access and external sharing, access from unmanaged devices, private teams discoverability, and sharing controls for channels.\nApply the label automatically to files and emails, or recommend a label.\nChoose how to identify sensitive information that you want labeled, and the label can be applied automatically, or you can prompt users to apply the label that you recommend with a policy tip. If you recommend a label, you can customize the prompt but the following example shows the automatically generated text:\nFor more information about the\nAuto-labeling for files and emails\nsettings when you create or edit a sensitivity label, see\nAutomatically apply a sensitivity label to Microsoft 365 data\nfor Office apps, and\nLabeling in Microsoft Purview Data Map\n.\nSet the default sharing link type\nfor SharePoint sites and individual documents. 
To help prevent users oversharing, set the\ndefault scope and permissions\nfor when users share documents from SharePoint and OneDrive.\nFor more label configurations, see\nManage sensitivity labels for Office apps\n.\nLabel scopes\nWhen you create a sensitivity label, you're asked to configure the label's scope, which determines two things:\nWhich label settings you can configure for that label\nThe availability of the label to apps and services, which includes whether users can see and select the label\nThis scope configuration lets you have sensitivity labels that are just for items such as files, emails, and meetings, and can't be selected for groups and sites. Similarly, sensitivity labels that are just for groups and sites and can't be selected for items.\nBy default, the\nFiles & other data assets\nscope is always selected for a new label. As well as files for Office, Loop, and Power BI, it includes items from Microsoft Fabric, and data assets for Microsoft Purview Data Map when you\nextend your sensitivity labels beyond Microsoft 365\n. For more information about which items support sensitivity labels:\nFor Office files:\nOffice file types supported\nFor Microsoft Loop:\nSensitivity labels to protect Loop components and pages\nFor Power BI:\nSupported export paths\nFor Microsoft Fabric:\nInformation protection in Microsoft Fabric\nFor Data Map:\nData sources that connect to Data Map\nFor other apps, check their documentation. Support for sensitivity labels is continually expanding.\nYou typically select the\nEmails\nscope together with\nFiles & other data assets\n, because emails often include files as attachments and share the same sensitivity. Many labeling features require both options to be selected, but there might be times when you want a new label to be available for emails only. For more information, see\nScope labels to just files or emails\n.\nThe scope for\nMeetings\nincludes calendar events, Teams meetings options, and Team chat. 
You must also select the\nFiles & other data assets\nscope and\nEmails\nscope for this option. For more information about this labeling scenario, see\nUse sensitivity labels to protect calendar items, Teams meetings, and chat\n.\nThe\nGroups & sites\nscope becomes available and selected by default when you\nenable sensitivity labels for containers and synchronize labels\n. This option lets you protect content in SharePoint sites, Teams sites, and Loop workspaces by labeling those containers but doesn't label the items in them.\nNote\nItems that were previously in the\nSchematized data assets\nscope are now included in\nFiles & other data assets\n.\nIf one or more scopes aren't selected, you see the first page of the configuration settings for these scopes, but you can't configure the settings. For these pages that have unavailable options, select\nNext\nto continue to configure settings for the next scope. Or, select\nBack\nto change the label's scope.\nLabel priority (order matters)\nWhen you create your sensitivity labels in the Microsoft Purview portal, they appear in a list on the\nLabels\npage from\nInformation Protection\n. In this list, the order of the labels is important because it sets their priority. You want your most restrictive sensitivity label, such as Highly Confidential, to appear at the\nbottom\nof the list, and your least restrictive sensitivity label, such as Personal or Public, to appear at the\ntop\n.\nYou can apply just one sensitivity label to an item such as a document, email, or container. If you use the option that requires your users to provide a justification for changing a label to a lower sensitivity for files, emails, and meetings, the order of this list identifies the lower sensitivity. 
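The "lower sensitivity" comparison just described can be sketched in a few lines. This is an illustrative snippet only, not a Microsoft API: the label names and the needs_justification helper are invented for the example, and the sublabel and container exceptions discussed in the surrounding text are ignored.

```python
# Illustrative only: model the label list from least restrictive (top)
# to most restrictive (bottom), as ordered on the Labels page.
LABEL_ORDER = ["Personal", "Public", "General",
               "Confidential", "Highly Confidential"]

def needs_justification(current_label, new_label):
    """True when the new label sits higher in the list (lower priority),
    which is the change that triggers the justification prompt."""
    return LABEL_ORDER.index(new_label) < LABEL_ORDER.index(current_label)

print(needs_justification("Confidential", "Public"))         # True
print(needs_justification("General", "Highly Confidential"))  # False
```

Raising the sensitivity (or keeping it the same) never prompts; only a move toward the top of the list does.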
However, this option doesn't apply to sublabels that share the priority of their parent label, and doesn't apply to data assets or labels that protect containers.\nNote\nIf you're using the\nmodern label scheme\nthat replaces parent labels with label groups, sublabels in the same label group also don't support the justification setting.\nThe priority of sublabels is used with\nautomatic labeling\n, though. When you configure auto-labeling policies, multiple matches can result for more than one label. Then, the last sensitivity label is selected, and then if applicable, the last sublabel. When you configure the auto-labeling label setting (rather than auto-labeling policies) for sublabels themselves, the behavior is a little different when these labels share the same parent label or label group. For example, a sublabel configured for automatic labeling is preferred over a sublabel configured for recommended labeling. For more information, see\nHow multiple conditions are evaluated when they apply to more than one label\n.\nThe priority of sublabels is also used with\nlabel inheritance from email attachments\n.\nWhen you select a sensitivity label, you can change its priority by using the options to move it to the top or bottom of the list if it's not a sublabel, move it up or down by one label, or directly assign a priority number. For example:\nSublabels that use parent labels or label groups\nYou might often want to logically group sensitivity labels in a two-tier hierarchy, to denote the overall sensitivity level but provide labels with different settings. As an example from the\ndefault sensitivity labels\n:\nConfidential\nAnyone (unrestricted)\nAll Employees\nTrusted People\nIn apps, users first see\nConfidential\nand then must select one of the other sensitivity labels, such as\nAll Employees\n. 
The applied sensitivity label is then\nConfidential \\ All Employees\n.\nThe second tier of sensitivity labels doesn't inherit any settings from the first-tier label, except the label color. In our example, the\nConfidential\nlabel is simply a text label with no protection settings, and can't be applied by itself to content.\nThe initial implementation to group labels in this two-tier hierarchy used parent labels and sublabels. The configuration of parent labels looked the same as the configuration of sublabels, but behaved differently when sublabels were published to users.\nTo reduce this complexity, label groups are now replacing parent labels. Label groups look and behave the same to users, but have configuration options only for the settings that they support: the label name, descriptions, color, and priority. You can't publish label groups themselves, only the labels within the groupings.\nIf you're still using parent labels with sublabels: Don't choose a parent label as the default label, or configure a parent label to be automatically applied (or recommended). If you do, the parent label can't be applied.\nExample of how sublabels display for users:\nIf required, you can manually convert parent labels to label groups. 
For more information, see\nMigrate parent sensitivity labels to label groups\n.\nEditing or deleting a sensitivity label\nIf you delete a sensitivity label from the Microsoft Purview portal, the label isn't automatically removed from content, and any protection settings continue to be enforced on content that had that label applied.\nIf you edit a sensitivity label, the version of the label that was applied to content is what's enforced on that content.\nFor detailed information about what happens when you delete a sensitivity label and how this is different from removing it from a sensitivity label policy, see\nRemoving and deleting labels\n.\nSensitivity label limitations per tenant\nA single tenant can have a high number of sensitivity labels (1,000+), with one exception: If the label applies encryption that specifies the users and permissions, there's a maximum of 500 labels per tenant supported with this configuration. However, as a best practice to lower admin overhead and reduce complexity for your users, try to keep the number of labels to a minimum.\nTip\nReal-world deployments show that labeling effectiveness is noticeably reduced when users have more than five main labels or more than five sublabels per main label. You might also find that some applications can't display all your labels when too many are published to the same user.\nWhat label policies can do\nAfter you create your sensitivity labels, you need to publish them to make them available to people and services in your organization. The sensitivity labels can then be applied to Office documents and emails, and other items that support sensitivity labels.\nUnlike retention labels, which are published to locations such as all Exchange mailboxes, sensitivity labels are published to users or groups. 
Apps that support sensitivity labels can then display them to those users and groups as labels that they can apply.\nAlthough the default is to publish labels to all users in your organization, multiple label policies let you publish different sensitivity labels to different users if this is needed. For example, all users see labels that they can apply for Public, General, and Confidential, but only users in your legal department also see a Highly Confidential label that they can apply.\nAll users in the same organization can see the name of a sensitivity label applied to content, even if that label isn't published to them. They won't see sensitivity labels from other organizations.\nWhen you configure a publishing label policy, you can:\nChoose which users and groups see the labels.\nLabels can be published to any specific user or email-enabled security group, distribution group, or Microsoft 365 group (which can have\ndynamic membership\n) in Microsoft Entra ID.\nSpecify a default label\nfor unlabeled documents and Loop components and pages, emails and meeting invites, new containers (when you've\nenabled sensitivity labels for Microsoft Teams, Microsoft 365 groups, and SharePoint sites\n), and also a default label for\nPower BI content\n. You can specify the same label for all five types of items, or different labels. Users can change the applied default sensitivity label to better match the sensitivity of their content or container.\nConsider using a default label to set a base level of protection settings that you want applied to all your content. However, without user training and other controls, this setting can also result in inaccurate labeling. It's usually not a good idea to select a label that applies encryption as a default label to documents. For example, many organizations need to send and share documents with external users who might not have apps that support the encryption or they might not use an account that can be authorized. 
For more information about this scenario, see\nSharing encrypted documents with external users\n.\nImportant\nWhen you have\nsublabels\n, be careful not to configure the parent label as a default label.\nRequire a justification for changing a label.\nFor files, emails, and meetings, but not for groups and sites (used by Teams and SharePoint), if a user tries to remove a label or replace it with a label that has a lower priority, by default the user must provide a justification to perform this action. For example, a user opens a document labeled Confidential (order number 3) and replaces that label with one named Public (order number 1). For Office apps, this justification prompt is triggered once per app session. When you use the Microsoft Purview Information Protection client, the prompt is triggered for each file. Administrators can read the justification reason along with the label change in\nactivity explorer\n.\nRequire users to apply a label\nfor the different types of items and the containers that support sensitivity labels. Also known as mandatory labeling, these options ensure a label must be applied before users can save files and send emails or meeting invites, create new groups or sites, and when they use unlabeled content for Power BI.\nFor documents and emails, a label can be assigned manually by the user, automatically as a result of a condition that you configure, or be assigned by default (the default label option previously described). An example prompt when a user is required to assign a label:\nFor more information about mandatory labeling for documents and emails, see\nRequire users to apply a label to their email and documents\n.\nFor containers, a label must be assigned at the time the group or site is created.\nFor more information about mandatory labeling for Power BI, see\nMandatory label policy for Power BI\n.\nConsider using this option to help increase your labeling coverage. 
However, without user training, these settings can result in inaccurate labeling. In addition, unless you also set a corresponding default label, mandatory labeling can frustrate your users with frequent prompts.\nProvide a help link to a custom help page.\nIf your users aren't sure what your sensitivity labels mean or how they should be used, you can provide a Learn More URL that appears after the list of available sensitivity labels in the Office apps. For example:\nFor more label policy configurations, see\nManage sensitivity labels for Office apps\n.\nAfter you create a publishing label policy that assigns new sensitivity labels to users and groups, users start to see those labels in their Office apps. Allow up to 24 hours for the latest changes to replicate throughout your organization.\nLabel policy priority (order matters)\nYou make your sensitivity labels available to users by publishing them in a sensitivity label policy that appears in a list on the\nLabel policies\npage. Just like sensitivity labels (see\nLabel priority (order matters)\n), the order of the sensitivity label policies is important because it reflects their priority: The label policy with the lowest priority is shown at the top of the list with the\nlowest\norder number, and the label policy with the highest priority is shown at the bottom of the list with the\nhighest\norder number.\nA label policy consists of:\nA set of labels.\nThe users and groups that will be assigned the policy with labels.\nThe scope of the policy and policy settings for that scope (such as default label for files and emails).\nYou can include a user in multiple label policies, and the user will get all the sensitivity labels and settings from those policies. If there's a conflict in settings from multiple policies, the settings from the policy with the highest priority (highest order number) are applied. 
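A minimal sketch of this per-setting conflict rule, for illustration only (the resolve_settings helper and the policy dictionaries are invented, not part of any Microsoft tooling): applying a user's policies in ascending order of their order number means the policy with the highest order number supplies the final value for each setting it defines.

```python
def resolve_settings(policies):
    """policies: one user's label policies, each a dict with an "order"
    number and a "settings" mapping. Per setting, the policy with the
    highest order number (highest priority) wins."""
    effective = {}
    for policy in sorted(policies, key=lambda p: p["order"]):
        # Higher-order policies are applied last, overwriting conflicts.
        effective.update(policy["settings"])
    return effective

standard = {"order": 0, "settings": {"default_label": "General",
                                     "mandatory_labeling": False}}
legal = {"order": 2, "settings": {"default_label": "Highly Confidential"}}

# default_label comes from the legal policy (order 2); mandatory_labeling
# is inherited unchanged from the standard policy (order 0).
print(resolve_settings([legal, standard]))
```

Settings that only one policy defines pass through untouched; only genuine conflicts are decided by the order number.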
In other words, the highest priority wins for each setting.\nIf you're not seeing the label policy setting behavior that you expect for a user or group, check the order of the sensitivity label policies. You might need to move the policy down. To reorder the label policies, select a sensitivity label policy > choose the Actions ellipsis for that entry >\nMove down\nor\nMove up\n. For example:\nFrom our screenshot example that shows three label policies, all users are assigned the standard label policy, so it's appropriate that it has the lowest priority (lowest order number of 0). Only users in the IT department are assigned the second policy that has the order number 1. For these users, if there are any conflicts in settings between their policy and the standard policy, the settings from their policy win because it has a higher order number.\nSimilarly for users in the legal department, who are assigned the third policy with distinct settings. It's likely these users will have more stringent settings, so it's appropriate that their policy has the highest order number. It's unlikely a user from the legal department will be in a group that's also assigned to the policy for the IT department. But if they are, the order number 2 (highest order number) ensures the settings from the legal department always take priority if there's a conflict.\nNote\nRemember: If there is a conflict of settings for a user who has multiple policies assigned to them, the setting from the assigned policy with the highest order number is applied.\nSensitivity labels for Microsoft 365 Copilot and Microsoft 365 Copilot Chat\nThe sensitivity labels that you use to protect your organization's data are recognized and used by Microsoft 365 Copilot, Microsoft 365 Copilot Chat, and Copilot agents to provide an extra layer of protection. 
For example, in Microsoft 365 Copilot Chat conversations that can reference data from different types of items, the sensitivity label with the highest priority (typically, the most restrictive label) is visible to users. Similarly, when sensitivity label inheritance is supported by Copilot and agents, the sensitivity label with the highest priority is selected.\nIf the labels applied encryption from Microsoft Purview Information Protection, Copilot and agents check the usage rights for the user. Only if the user is granted permissions to copy (the EXTRACT usage right) from an item, is data from that item returned by Copilot or agents.\nFor more detailed information about how sensitivity labels help protect your data when you use Copilot and other AI apps, see the following articles:\nMicrosoft Purview data security and compliance protections for generative AI apps\nUse Microsoft Purview to manage data security & compliance for Microsoft 365 Copilot & Microsoft 365 Copilot Chat\nConsiderations to manage Microsoft 365 Copilot for security and compliance\nSensitivity labels and Azure Information Protection\nThe older labeling client, the Azure Information Protection unified labeling client, is now replaced with the\nMicrosoft Purview Information Protection client\nto extend labeling on Windows to File Explorer, PowerShell, the on-premises scanner, and to provide a viewer for encrypted files.\nOffice apps support sensitivity labels with subscription versions of Office, such as Microsoft 365 Apps for enterprise.\nSensitivity labels and the Microsoft Information Protection SDK\nBecause a sensitivity label is stored in the metadata of a document, third-party apps and services can read from and write to this labeling metadata to supplement your labeling deployment. Additionally, software developers can use the\nMicrosoft Information Protection SDK\nto fully support labeling and encryption capabilities across multiple platforms. 
To learn more, see the\nGeneral Availability announcement on the Tech Community blog\n.\nYou can also learn about\npartner solutions that are integrated with Microsoft Purview Information Protection\n.\nDeployment guidance\nFor deployment planning and guidance that includes licensing information, permissions, deployment strategy, a list of supported scenarios, and end-user documentation, see\nGet started with sensitivity labels\n.\nTo learn how to use sensitivity labels to comply with data privacy regulations, see\nDeploy information protection for data privacy regulations with Microsoft 365\n.", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Sensitivity Labels", @@ -629,7 +629,7 @@ "https://learn.microsoft.com/en-us/purview/sensitivity-labels-teams-groups-sites": { "content_hash": "sha256:eb151313731d322d7e6c48e0126105e49bb6b2385c02db1b21c8391de64fbb1f", "normalized_content": "Use sensitivity labels to protect content in Microsoft Teams, Microsoft 365 groups, and SharePoint sites\nMicrosoft Purview service description\nIn addition to using\nsensitivity labels\nto protect documents and emails, you can also use sensitivity labels to protect content in the following containers: Microsoft Teams sites, Microsoft 365 groups (\nformerly Office 365 groups\n), and SharePoint sites. For this container-level protection, use the following label settings:\nPrivacy (public or private) of teams sites and Microsoft 365 groups\nExternal user access\nExternal sharing from SharePoint sites\nAccess from unmanaged devices\nAuthentication contexts\nPrevent discovery of private teams for users who have this capability\nShared channels control for team invitations\nDefault sharing link for a SharePoint site (PowerShell-only configuration)\nSite sharing settings (PowerShell-only configuration)\nDefault label for channel meetings\nImportant\nThe settings for unmanaged devices and authentication contexts work in conjunction with Microsoft Entra Conditional Access. You must configure this dependent feature if you want to use a sensitivity label for these settings. Additional information is included in the instructions that follow.\nWhen you apply this sensitivity label to a supported container, the label automatically applies the sensitivity category and configured protection settings to the site or group.\nBe aware that some label options can extend configuration settings to site owners that are otherwise restricted to administrators. When you configure and publish the label settings for external sharing options and the authentication context, a site owner can now set and change these options for a site by applying or changing the sensitivity label for a team or site. 
Don't configure these specific label settings if you don't want site owners to be able to make these changes.\nContent in these containers, however, doesn't inherit the label's sensitivity category or the settings for files and emails, such as content markings and encryption. So that users can label their documents in SharePoint sites or team sites, make sure you've\nenabled sensitivity labels for Office files in SharePoint and OneDrive\n.\nContainer labels don't support displaying\nother languages\nand display the original language only for the label name and description.\nUsing sensitivity labels for Microsoft Teams, Microsoft 365 groups, and SharePoint sites\nBefore you enable sensitivity labels for containers and configure sensitivity labels for the new settings, users can see and apply sensitivity labels in their apps. For example, from Word:\nAfter you enable and configure sensitivity labels for containers, users can additionally see and apply sensitivity labels to Microsoft team sites, Microsoft 365 groups, and SharePoint sites. For example, when you create a new team site from SharePoint:\nAfter a sensitivity label has been applied to a site, you must have the following role to change that label in SharePoint or Teams:\nFor a group-connected site: Microsoft 365 group\nOwners\nFor a site that isn't group-connected: SharePoint\nsite admin\nNote\nSensitivity labels for containers support\nTeams shared channels\n. 
If a team has any shared channels, they automatically inherit sensitivity label settings from their parent team, and that label can't be removed or replaced with a different label.\nHow to enable sensitivity labels for containers and synchronize labels\nIf you haven't yet enabled sensitivity labels for containers, do the following set of steps as a one-time procedure:\nBecause this feature uses Microsoft Entra functionality, follow the instructions from the Microsoft Entra documentation to enable sensitivity label support:\nAssign sensitivity labels to Microsoft 365 groups in Microsoft Entra ID\n.\nYou now need to synchronize your sensitivity labels to Microsoft Entra ID. First,\nconnect to Security & Compliance PowerShell\n.\nFor example, in a PowerShell session that you run as administrator, sign in with a global administrator account.\nThen run the following command to ensure your sensitivity labels can be used with Microsoft 365 groups:\nExecute-AzureAdLabelSync\nHow to configure groups and site settings\nAfter sensitivity labels are enabled for containers as described in the previous section, you can then configure protection settings for groups and sites in the sensitivity labeling configuration. Until sensitivity labels are enabled for containers, the settings are visible but you can't configure them.\nFollow the general instructions to\ncreate or edit a sensitivity label\nand make sure you select\nGroups & sites\nfor the label's scope:\nWhen only this scope is selected for the label, the label won't be displayed in Office apps that support sensitivity labels and can't be applied to files and emails. Having this separation of labels can be helpful for both users and administrators, but can also add to the complexity of your label deployment.\nFor example, you need to carefully review your\nlabel ordering\nbecause SharePoint detects when a labeled document is uploaded to a labeled site. 
In this scenario, an audit event and email are automatically generated when the document has a higher priority sensitivity label than the site's label. For more information, see the\nAuditing sensitivity label activities\nsection on this page.\nThen, on the\nDefine protection settings for groups and sites\npage, select the options you want to configure:\nPrivacy and external user access settings\nto configure the\nPrivacy\nand\nExternal users access\nsettings.\nExternal sharing and Conditional Access settings\nto configure the\nControl external sharing from labeled SharePoint sites\nand\nUse Microsoft Entra Conditional Access to protect labeled SharePoint sites\nsetting.\nPrivate teams discoverability and shared channel controls\nto configure the settings to prevent users who can discover private teams from finding a private team with this sensitivity label applied, and channel sharing controls for invitations to other teams.\nApply a label to channel meetings\n: This additional option is applicable only if you are editing an existing label where the scope includes meetings, and you've configured\nlabels to protect meetings\n. Select a sensitivity label to automatically apply to channel meetings and all channel chats. For non-channel meetings, you can select a default label as a policy setting.\nIf you selected\nPrivacy and external user access settings\n, now configure the following settings:\nPrivacy\n: Keep the default of\nPublic\nif you want anyone in your organization to access the team site or group where this label is applied.\nSelect\nPrivate\nif you want access to be restricted to only approved members in your organization.\nSelect\nNone\nwhen you want to protect content in the container by using the sensitivity label, but still let users configure the privacy setting themselves.\nThe settings of\nPublic\nor\nPrivate\nset and lock the privacy setting when you apply this label to the container. 
Your chosen setting replaces any previous privacy setting that might be configured for the team or group, and locks the privacy value so it can be changed only by first removing the sensitivity label from the container. After you remove the sensitivity label, the privacy setting from the label remains and users can now change it again.\nExternal user access\n: Control whether the group owner can\nadd guests to the group\n.\nIf you selected\nExternal sharing and Conditional Access settings\n, now configure the following settings:\nControl external sharing from labeled SharePoint sites\n: Select this option to then select either external sharing for anyone, new and existing guests, existing guests, or only people in your organization. For more information about this configuration and settings, see the SharePoint documentation,\nTurn external sharing on or off for a site\n.\nUse Microsoft Entra Conditional Access to protect labeled SharePoint sites\n: Select this option only if your organization has configured and is using\nMicrosoft Entra Conditional Access\n. Then, select one of the following settings:\nDetermine whether users can access SharePoint sites from unmanaged devices\n: This option uses the SharePoint feature that uses Microsoft Entra Conditional Access to block or limit access to SharePoint and OneDrive content from unmanaged devices. For more information, see\nControl access from unmanaged devices\nfrom the SharePoint documentation. 
The option you specify for this label setting is the equivalent of running a PowerShell command for a site, as described in steps 3-5 from the\nBlock or limit access to a specific SharePoint site or OneDrive\nsection from the SharePoint instructions.\nFor additional configuration information, see\nMore information about the dependencies for the unmanaged devices option\nat the end of this section.\nChoose an existing authentication context\n: This option lets you enforce more stringent access conditions when users access SharePoint sites that have this label applied. These conditions are enforced when you select an existing authentication context that has been created and published for your organization's Conditional Access deployment. If users don't meet the configured conditions or if they use apps that don't support authentication contexts, they are denied access.\nFor additional configuration information, see\nMore information about the dependencies for the authentication context option\nat the end of this section.\nExamples for this label configuration:\nYou choose an authentication context that is configured to require\nmultifactor authentication (MFA)\n. This label is then applied to a SharePoint site that contains highly confidential items. As a result, when users from an untrusted network attempt to access a document in this site, they see the MFA prompt that they must complete before they can access the document.\nYou choose an authentication context that is configured for\nterms-of-use (ToU) policies\n. This label is then applied to a SharePoint site that contains items that require a terms-of-use acceptance for legal or compliance reasons. 
As a result, when users attempt to access a document in this site, they see a terms-of-use document that they must accept before they can access the original document.\nIf you selected\nPrivate teams discoverability and shared channel controls\n:\nFor\nPrivate teams discoverability\n, use the\nAllow users to discover private teams that have this label applied\ncheckbox when you've configured a\nTeams policy that allows private teams discovery\n:\nWhen the checkbox is selected (the default setting), a private team with the sensitivity label applied will be discoverable for a user who is allowed to discover private teams.\nWhen the checkbox is cleared, a private team with the sensitivity label applied will remain hidden and won't be discoverable for all users.\nFor\nTeams shared channels\n, when a team has a sensitivity label applied, you can allow or prevent other teams from being invited to the original team's shared channels. For more information about shared channels, see\nShared channels in Microsoft Teams\n.\nImportant\nThese options for Teams shared channels have a dependency on the settings on the previous\nPrivacy and external user access\npage. If you select an option that's not compatible with these previous settings, you see a validation message to change your selection. Alternatively, you can go back in the configuration to change the dependent setting.\nOptions include\nInternal only\n,\nSame label only\n, and\nPrivate team only\n. Only the last option can potentially remove previously invited teams, and none of the options affect invitations to individual users.\nThe site and group settings take effect when you apply the label to a team, group, or site. If the\nlabel's scope\nincludes files and emails, other label settings such as encryption and content marking aren't applied to the content within the team, group, or site.\nIf your sensitivity label isn't already published, now publish it by\nadding it to a sensitivity label policy\n. 
The users who are assigned a sensitivity label policy that includes this label will be able to select it for sites and groups.\nMore information about the dependencies for the unmanaged devices option\nIf you don't configure the dependent conditional access policy for SharePoint as documented in\nUse app-enforced restrictions\n, the option you specify here will have no effect. Additionally, it will have no effect if it's less restrictive than a configured setting at the tenant level. If you have configured an organization-wide setting for unmanaged devices, choose a label setting that's either the same or more restrictive.\nFor example, if your tenant is configured for\nAllow limited, web-only access\n, the label setting that allows full access will have no effect because it's less restrictive. For this tenant-level setting, choose the label setting to block access (more restrictive) or the label setting for limited access (the same as the tenant setting).\nBecause you can configure the SharePoint settings separately from the label configuration, there's no check in the sensitivity label configuration that the dependencies are in place. These dependencies can be configured after the label is created and published, and even after the label is applied. However, if the label is already applied, the label setting won't take effect until after the user next authenticates.\nMore information about the dependencies for the authentication context option\nTo display in the drop-down list for selection, authentication contexts must be created, configured, and published as part of your Microsoft Entra Conditional Access configuration. For more information and instructions, see the\nConfigure authentication contexts\nsection from the Microsoft Entra Conditional Access documentation.\nNot all apps support authentication contexts. 
If a user with an unsupported app connects to the site that's configured for an authentication context, they see either an access denied message or they are prompted to authenticate but rejected. The apps that currently support authentication contexts:\nOffice for the web, which includes Outlook for the web\nMicrosoft Teams for Windows and macOS (excludes Teams web app)\nMicrosoft Planner\nMicrosoft 365 Apps for Word, Excel, and PowerPoint; minimum versions:\nWindows: 2103\nmacOS: 16.45.1202\niOS: 2.48.303\nAndroid: 16.0.13924.10000\nMicrosoft 365 Apps for Outlook; minimum versions:\nWindows: 2103\nmacOS: 16.45.1202\niOS: 4.2109.0\nAndroid: 4.2025.1\nOneDrive sync app, minimum versions:\nWindows: 21.002\nmacOS: 21.002\niOS: 12.30\nAndroid: Not yet supported\nKnown limitations:\nFor the OneDrive sync app, supported for OneDrive only and not for other sites.\nThe following features and apps might be incompatible with authentication contexts, so we encourage you to check that these continue to work after a user successfully accesses a site by using an authentication context:\nWorkflows that use Power Apps or Power Automate\nThird-party apps\nConfigure settings for the default sharing link type for a site by using PowerShell advanced settings\nIn addition to the label settings for sites and groups that you can configure from the Microsoft Purview portal, you can also configure the default sharing link type for a site. Sensitivity labels for documents can also be configured for a default sharing link type. 
These settings that help to prevent over-sharing are automatically selected when users select the\nShare\nbutton in their Office apps.\nFor more information and instructions, see\nUse sensitivity labels to configure the default sharing link type for sites and documents in SharePoint and OneDrive\n.\nConfigure site sharing permissions by using PowerShell advanced settings\nAnother PowerShell advanced setting that you can configure for the sensitivity label to be applied to a SharePoint site is\nMembersCanShare\n. This setting is the equivalent configuration that you can set from the SharePoint admin center >\nSite permissions\n>\nSite Sharing\n>\nChange how members can share\n>\nSharing permissions\n.\nThe three options are listed with the equivalent values for the PowerShell advanced setting\nMembersCanShare\n:\nOption from the SharePoint admin center\nEquivalent PowerShell value for MembersCanShare\nSite owners and members can share files, folders, and the site. People with Edit permissions can share files and folders.\nMemberShareAll\nSite owners and members, and people with Edit permissions can share files and folders, but only site owners can share the site.\nMemberShareFileAndFolder\nOnly site owners can share files, folders, and the site.\nMemberShareNone\nFor more information about these configuration options, see\nChange how members can share\nfrom the SharePoint community documentation.\nExample, where the sensitivity label GUID is\n8faca7b8-8d20-48a3-8ea2-0f96310a848e\n:\nSet-Label -Identity 8faca7b8-8d20-48a3-8ea2-0f96310a848e -AdvancedSettings @{MembersCanShare=\"MemberShareNone\"}\nFor more help in specifying PowerShell advanced settings, see\nPowerShell tips for specifying the advanced settings\n.\nSensitivity label management\nUse the following guidance for when you create, modify, or delete sensitivity labels that are configured for sites and groups.\nCreating and publishing labels that are configured for sites and groups\nUse the following guidance 
to publish a label for your users when that label is configured for site and group settings:\nAfter you create and configure the sensitivity label, add this label to a label policy that applies to just a few test users.\nWait for the change to replicate:\nNew label: Wait at least one hour, unless your configured settings include\nTeams shared channel controls\n. If that's the case, wait at least 24 hours.\nExisting label: Wait at least 24 hours.\nFor more information about the timing of labels, see\nWhen to expect new labels and changes to take effect\n.\nAfter this wait period, use one of the test user accounts to create a team, Microsoft 365 group, or SharePoint site with the label that you created in step 1.\nIf there are no errors during this creation operation, you know it's safe to publish the label to all users in your tenant.\nModifying published labels that are configured for sites and groups\nAs a best practice, don't change the site and group settings for a sensitivity label after the label has been applied to teams, groups, or sites. If you do, remember to wait at least 24 hours for the changes to replicate to all containers that have the label applied.\nIn addition, if your changes include the\nExternal users access\nsetting:\nThe new setting applies to new users but not to existing users. For example, if this setting was previously selected and as a result, guest users accessed the site, these guest users can still access the site after this setting is cleared in the label configuration.\nThe privacy settings for the group properties hiddenMembership and roleEnabled aren't updated.\nDeleting published labels that are configured for sites and groups\nIf you delete a sensitivity label that has the site and group settings enabled, and that label is included in one or more label policies, this action can result in creation failures for new teams, groups, and sites. 
To avoid this situation, use the following guidance:\nRemove the sensitivity label from all label policies that include the label.\nWait at least one hour.\nAfter this wait period, try creating a team, group, or site and confirm that the label is no longer visible.\nIf the sensitivity label isn't visible, you can now safely delete the label.\nHow to apply sensitivity labels to containers\nYou're now ready to apply the sensitivity label or labels to the following containers:\nMicrosoft 365 group in Microsoft Entra ID\nMicrosoft Teams team site\nMicrosoft 365 group in Outlook on the web\nSharePoint site\nYou can use PowerShell if you need to\napply a sensitivity label to multiple sites\n.\nApply sensitivity labels to Microsoft 365 groups\nYou're now ready to apply the sensitivity label or labels to Microsoft 365 groups. Return to the Microsoft Entra documentation for instructions:\nAssign a label to a new group in Azure portal\nAssign a label to an existing group in Azure portal\nRemove a label from an existing group in Azure portal\n.\nApply a sensitivity label to a new team\nUsers can select sensitivity labels when they create new teams in Microsoft Teams. When they select the label from the\nSensitivity\ndropdown, the privacy setting might change to reflect the label configuration. 
Depending on the external users access setting you selected for the label, users can or can't add people outside the organization to the team.\nLearn more about sensitivity labels for Teams\nAfter you create the team, the sensitivity label appears in the upper-right corner of all channels.\nThe service automatically applies the same sensitivity label to the Microsoft 365 group and the connected SharePoint team site.\nApply a sensitivity label to a new group in Outlook on the web\nIn Outlook on the web, when you create a new group, you can select or change the\nSensitivity\noption for published labels:\nApply a sensitivity label to a new site\nAdmins and end users can select sensitivity labels when they\ncreate modern team sites and communication sites\n, and expand\nAdvanced settings\n:\nThe dropdown box displays the label names for the selection, and the help icon displays all the label names with their tooltip, which can help users determine the correct label to apply.\nWhen the label is applied, and users browse to the site, they see the name of the label and applied policies. For example, this site has been labeled as\nConfidential\n, and the privacy setting is set to\nPrivate\n:\nUse PowerShell to apply a sensitivity label to multiple sites\nYou can use the\nSet-SPOSite\nand\nSet-SPOTenant\ncmdlet with the\nSensitivityLabel\nparameter from the current\nSharePoint Online Management Shell\nto apply a sensitivity label to many sites. You can use the same procedure to replace an existing label. 
The sites can be any SharePoint site collection, or a OneDrive site.\nMake sure you have version 16.0.19418.12000 or later of the SharePoint Online Management Shell.\nOpen a PowerShell session with the\nRun as Administrator\noption.\nIf you don't know your label GUID:\nConnect to Security & Compliance PowerShell\nand get the list of sensitivity labels and their GUIDs.\nGet-Label |ft Name, Guid\nNow\nconnect to SharePoint Online PowerShell\nand store your label GUID as a variable. For example:\n$Id = [GUID](\"e48058ea-98e8-4940-8db0-ba1310fd955e\")\nCreate a new variable that identifies multiple sites that have an identifying string in common in their URL. For example:\n$sites = Get-SPOSite -IncludePersonalSite $true -Limit all -Filter \"Url -like 'documents'\"\nRun the following command to apply the label to these sites. Using our examples:\n$sites | ForEach-Object {Set-SPOTenant $_.url -SensitivityLabel $Id}\nThis series of commands lets you label multiple sites across your tenant with the same sensitivity label, which is why you use the Set-SPOTenant cmdlet, rather than the Set-SPOSite cmdlet that's for per-site configuration. However, use the Set-SPOSite cmdlet when you need to apply a different label to specific sites by repeating the following command for each of these sites:\nSet-SPOSite -Identity <SiteUrl> -SensitivityLabel \"<LabelGUID>\"\nView and manage sensitivity labels in the SharePoint admin center\nTo view, sort, and search the applied sensitivity labels, use\nActive sites\nin the SharePoint admin center. 
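As a quick PowerShell complement to the Active sites page, you can list each site with its applied label (a sketch, assuming an existing SharePoint Online Management Shell session opened with Connect-SPOService; the SensitivityLabel property is returned as a label GUID):

```powershell
# Sketch: list every site with its applied sensitivity label GUID.
# Assumes Connect-SPOService has already been run for your tenant.
Get-SPOSite -Limit All |
    Select-Object Url, SensitivityLabel |
    Sort-Object Url
```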
You might need to first add the\nSensitivity\ncolumn:\nFor more information about managing sites from the Active sites page, including how to add a column, see\nManage sites in the SharePoint admin center\n.\nYou can also change and apply a label from this page:\nSelect the site name to open the details pane.\nSelect the\nPolicies\ntab, and then select\nEdit\nfor the\nSensitivity\nsetting.\nFrom the\nEdit sensitivity setting\npane, select the sensitivity label you want to apply to the site. Unlike user apps, where sensitivity labels can be assigned to specific users, the admin center displays all container labels for your tenant. After you've chosen a sensitivity label, select\nSave\n.\nFor information about managing sensitivity labels for containers in SharePoint Embedded, which are the equivalent of sites, see\nManage SharePoint Embedded containers in SharePoint Admin Center\n.\nSupport for sensitivity labels\nWhen you use admin centers that support sensitivity labels, with the exception of the Microsoft Entra admin center, you see all sensitivity labels for your tenant. In comparison, user apps and services that filter sensitivity labels according to publishing policies can result in you seeing a subset of those labels. 
The Microsoft Entra admin center also filters the labels according to publishing policies.\nThe following apps and services support sensitivity labels configured for sites and group settings:\nAdmin centers:\nSharePoint admin center\nTeams admin center\nMicrosoft 365 admin center\nMicrosoft Purview portal\nUser apps and services:\nSharePoint\nTeams\nOutlook on the web and for Windows, macOS, iOS, and Android\nForms\nStream\nPlanner\nThe following apps and services don't currently support sensitivity labels configured for sites and group settings:\nAdmin centers:\nExchange admin center\nUser apps and services:\nDynamics 365\nViva Engage\nProject\nPower BI\nMy Apps portal\nClassic Microsoft Entra group classification\nAfter you enable sensitivity labels for containers, the group classifications from Microsoft Entra ID are no longer supported by Microsoft 365 and won't display on sites that support sensitivity labels. However, you can convert your old classifications to sensitivity labels.\nAs an example of how you might have used the old group classification for SharePoint, see\nSharePoint \"modern\" sites classification\n.\nThese classifications were configured by using Microsoft Graph PowerShell or the PnP Core library and defining values for the\nClassificationList\nsetting.\n($setting[\"ClassificationList\"])\nTo convert your old classifications to sensitivity labels, do one of the following:\nUse existing labels: Specify the label settings you want for sites and groups by editing existing sensitivity labels that are already published.\nCreate new labels: Specify the label settings you want for sites and groups by creating and publishing new sensitivity labels that have the same names as your existing classifications.\nThen:\nUse PowerShell to apply the sensitivity labels to existing Microsoft 365 groups and SharePoint sites by using name mapping. 
See the next section for instructions.\nRemove the old classifications from the existing groups and sites.\nAlthough you can't prevent users from creating new groups in apps and services that don't yet support sensitivity labels, you can run a recurring PowerShell script to look for new groups that users have created with the old classifications, and convert these to use sensitivity labels.\nTo help you manage the coexistence of sensitivity labels and Microsoft Entra classifications for sites and groups, see\nMicrosoft Entra classification and sensitivity labels for Microsoft 365 groups\n.\nUse PowerShell to convert classifications for Microsoft 365 groups to sensitivity labels\nFirst,\nconnect to Security & Compliance PowerShell\nwith a compliance administrator account.\nGet the list of sensitivity labels and their GUIDs by using the\nGet-Label\ncmdlet:\nGet-Label |ft Name, Guid\nMake a note of the GUIDs for the sensitivity labels you want to apply to your Microsoft 365 groups.\nNow\nconnect to Exchange Online PowerShell\nin a separate Windows PowerShell window.\nUse the following command as an example to get the list of groups that currently have the classification of \"General\":\n$Groups= Get-UnifiedGroup | Where {$_.classification -eq \"General\"}\nFor each group, add the new sensitivity label GUID. 
For example:\nforeach ($g in $groups)\n{Set-UnifiedGroup -Identity $g.Identity -SensitivityLabelId \"457fa763-7c59-461c-b402-ad1ac6b703cc\"}\nRepeat steps 5 and 6 for your remaining group classifications.\nAuditing sensitivity label activities\nImportant\nIf you use label separation by selecting just the\nGroups & sites\nscope for labels that protect containers: Because of the\nDetected document sensitivity mismatch\naudit event and email described in this section, consider\nordering labels\nbefore labels that have a scope for\nFiles & other data assets\n.\nIf somebody uploads a document to a site that's protected with a sensitivity label and their document has a\nhigher priority\nsensitivity label than the sensitivity label applied to the site, this action isn't blocked. For example, you've applied the\nGeneral\nlabel to a SharePoint site, and somebody uploads to this site a document labeled\nConfidential\n. Because a sensitivity label with a higher priority identifies content that is more sensitive than content that has a lower priority order, this situation could be a security concern.\nAlthough the action isn't blocked, it automatically generates an email that's sent to the person who uploaded the document, and also sent to site owners and site admins (maximum of 100 in total). As a result, both the user and these administrators can identify documents that have this misalignment of label priority and take action if needed. For example, delete or move the uploaded document from the site.\nImportant\nIf the file is in the Preservation Hold library, interacting with files in this location isn't supported. Files can automatically move into the Preservation Hold library as a result of compliance requirements to automatically retain files that users delete or that are cloud attachments. 
For more information, see\nLearn about retention for SharePoint and OneDrive\n.\nIt wouldn't be a security concern if the document has a lower priority sensitivity label than the sensitivity label applied to the site. For example, a document labeled\nGeneral\nis uploaded to a site labeled\nConfidential\n. In this scenario, an auditing event and email aren't generated.\nNote\nJust as for the policy option that requires users to provide a justification for changing a label to a lower classification, sublabels for the same parent label are all considered to have the same priority.\nTo search the audit log for this event, look for\nDetected document sensitivity mismatch\nfrom the\nFile and page activities\ncategory.\nThe automatically generated email has the subject\nIncompatible sensitivity label detected\nand the email message explains the labeling mismatch with a link to the uploaded document and site. It also contains a line for your own internal documentation:\nHelpLink : Troubleshooting Guide\n. You must configure the hyperlink for the troubleshooting guide by using the\nSet-SPOTenant\ncmdlet with the\nLabelMismatchEmailHelpLink\nparameter. For example:\nSet-SPOTenant -LabelMismatchEmailHelpLink \"https://support.contoso.com\"\nThe email message also has a Microsoft documentation link that provides basic information for users to change the sensitivity label:\nApply sensitivity labels to your files and email in Office\nExcept for the internal URL that you must specify, these automated emails cannot be customized. 
However, you can prevent them from being sent when you use the following PowerShell command from\nSet-SPOTenant\n:\nSet-SPOTenant -BlockSendLabelMismatchEmail $True\nWhen somebody adds or removes a sensitivity label to or from a site or group, these activities are also audited but without automatically generating an email.\nAll these auditing events can be found in the\nSensitivity label activities\nsection from the audit log activities documentation.\nHow to disable sensitivity labels for containers\nYou can turn off sensitivity labels for Microsoft Teams, Microsoft 365 groups, and SharePoint sites by using the same instructions from\nEnable sensitivity label support in PowerShell\n. However, to disable the feature, in step 5, specify\n$setting[\"EnableMIPLabels\"] = \"False\"\n.\nIn addition to making all the settings unavailable for groups and sites when you create or edit sensitivity labels, this action reverts which property the containers use for their configuration. Enabling sensitivity labels for Microsoft Teams, Microsoft 365 groups, and SharePoint sites switches the property used from\nClassification\n(used for\nMicrosoft Entra group classification\n) to\nSensitivity\n. When you disable sensitivity labels for containers, the containers ignore the Sensitivity property and use the Classification property again.\nThis means that any label settings from sites and groups previously applied to containers won't be enforced, and containers no longer display the labels.\nIf these containers have Microsoft Entra classification values applied to them, the containers revert to using the classifications again. Be aware that any new sites or groups that were created after enabling the feature won't display a label or have a classification. For these containers, and any new containers, you can now apply classification values. 
For more information, see\nSharePoint \"modern\" sites classification\nand\nCreate classifications for Office groups in your organization\n.\nAdditional resources\nFor more information about managing Teams connected sites and channel sites, see\nManage Teams connected sites and channel sites\n.",
-      "last_checked": "2026-03-11T12:59:29.613756+00:00",
+      "last_checked": "2026-03-14T06:51:10.083468+00:00",
       "last_status": 200,
       "last_changed": "2026-03-11T12:59:29.613756+00:00",
       "topic": "Sensitivity Labels for Sites",
@@ -638,7 +638,7 @@
       "https://learn.microsoft.com/en-us/purview/audit-solutions-overview": {
         "content_hash": "sha256:2c9242a3cd326b0150e6fc54f2f7bfe2e9800068b96d9472bc836fa13fc4362c",
         "normalized_content": "Learn about auditing solutions in Microsoft Purview\nMicrosoft Purview auditing solutions provide an integrated solution to help organizations effectively respond to security events, forensic investigations, internal investigations, and compliance obligations. Your organization's unified audit log captures, records, and retains thousands of user and admin operations performed in dozens of Microsoft services and solutions. Security ops, IT admins, insider risk teams, and compliance and legal investigators in your organization can search audit records for these events. 
This capability provides visibility into the activities performed across your organization.\nComparison of key capabilities\nThe following table compares the key capabilities available in Audit (Standard) and Audit (Premium). Audit (Premium) includes all Audit (Standard) functionality.\nCapability\nAudit (Standard)\nAudit (Premium)\nEnabled by default\nThousands of searchable audit events\nAudit search tool in the Microsoft Purview portal\nAudit Search Graph API\nSearch-UnifiedAuditLog cmdlet\nExport audit records to CSV file\nAccess to audit logs via Office 365 Management Activity API\n1\n180-day audit log retention\nUp to 1-year audit log retention\n10-year audit log retention\n2\nAudit log retention policies\nIntelligent insights\nNote\n1\nAudit (Premium) includes higher bandwidth access to the Office 365 Management Activity API, which provides faster access to audit data.\n2\nIn addition to the required licensing for Audit (Premium) (described in the next section), a user must be assigned a 10-Year Audit Log Retention add-on license to retain their audit records for 10 years.\nAudit (Standard)\nMicrosoft Purview Audit (Standard) enables you to log and search for audited activities to support your forensic, IT, compliance, and legal investigations.\nEnabled by default\n. Audit (Standard) is enabled by default for all organizations with the appropriate subscription. That configuration captures and makes searchable records for audited activities. You only need to assign the necessary permissions to access the audit log search tool (and the corresponding cmdlet) and ensure that users have the right license for Microsoft Purview Audit (Premium) features.\nThousands of searchable audit events\n. You can search for a wide range of audited activities that occur in most of the Microsoft services in your organization. For a list of the activities you can search for, see\nAudit log activities\n. 
For a list of the services and features that support audited activities, see\nAudit log record type\n.\nAudit search tool in the Microsoft Purview portal\n. Use the Audit log search tool in the portal to search for audit records. You can search for specific activities, for activities performed by specific users, and activities that occurred within a date range.\nAudit Search Graph API\n. Microsoft Graph offers a unified API endpoint for accessing data from multiple Microsoft cloud services in a single response. The\nAudit Search Graph API\nallows you to programmatically access the audit search experience through Microsoft Graph.\nSearch-UnifiedAuditLog cmdlet\n. You can also use the\nSearch-UnifiedAuditLog\ncmdlet in Exchange Online PowerShell (the underlying cmdlet for the search tool) to search for audit events or to use in a script. For more information, see:\nSearch-UnifiedAuditLog cmdlet reference\nUse a PowerShell script to search the audit log\nExport audit records to a CSV file\n. After running the Audit log search tool in the Microsoft Purview portal, you can export the audit records returned by the search to a CSV file. This process lets you use Microsoft Excel to sort and filter on different audit record properties. You can also use Excel Power Query transform functionality to split each property in the AuditData JSON object into its own column. This process lets you effectively view and compare similar data for different events. For more information, see\nExport, configure, and view audit log records\n.\nAccess to audit logs via Office 365 Management Activity API\n. A third method for accessing and retrieving audit records is to use the Office 365 Management Activity API. This method lets organizations retain auditing data for longer periods than the default 180 days and lets them import their auditing data to a SIEM solution. For more information, see\nOffice 365 Management Activity API reference\n.\n180-day audit log retention\n. 
When a user or admin performs an audited activity, the system generates an audit record and stores it in the audit log for your organization. In Audit (Standard), the system retains records for 180 days, which means you can search for activities that occurred within the past six months.\nImportant\nThe default retention period for Audit (Standard) changed from 90 days to 180 days. Audit (Standard) logs generated before October 17, 2023, are retained for 90 days. Audit (Standard) logs generated on or after October 17, 2023, follow the new default retention of 180 days.\nAudit (Premium)\nImportant\nClassic Search retired on November 30, 2023.\nNew Search\nincludes enhancements such as faster search times, additional search options, ability to save searches, and more.\nAudit (Premium) builds on the capabilities of Audit (Standard) by providing audit log retention policies, longer retention of audit records, high-value intelligent insights, and higher bandwidth access to the Office 365 Management Activity API.\nAudit log retention policies\n. Create customized audit log retention policies to retain audit records for longer periods, up to one year (and up to 10 years for users with the required add-on license). Create a policy to retain audit records based on the service where the audited activities occur, specific audited activities, or the user who performs an audited activity.\nLonger retention of audit records\n. Microsoft Entra ID, Exchange, OneDrive, and SharePoint audit records are retained for one year by default. Audit records for all other activities are retained for 180 days by default, or you can use audit log retention policies to configure longer retention periods.\nAudit (Premium) intelligent insights\n. 
Audit records for intelligent insights can help your organization conduct forensic and compliance investigations by providing visibility to events such as when mail items were accessed, or when mail items were replied to and forwarded, or when and what a user searched for in Exchange Online and SharePoint Online. These intelligent insights can help you investigate possible breaches and determine the scope of compromise.\nHigher bandwidth to the Office 365 Management Activity API\n. Audit (Premium) provides organizations with more bandwidth to access auditing logs through the Office 365 Management Activity API. Although all organizations (that have Audit (Standard) or Audit (Premium)) initially receive a baseline of 2,000 requests per minute, this limit dynamically increases depending on an organization's seat count and their licensing subscription. This change results in organizations with Audit (Premium) getting about twice the bandwidth as organizations with Audit (Standard).\nLong-term retention of audit logs\nAudit (Premium) retains all Exchange, SharePoint, and Microsoft Entra audit records for one year. This retention happens through a default audit log retention policy that retains any audit record that contains the value of\nAzureActiveDirectory\n,\nExchange\n,\nOneDrive\n, or\nSharePoint\n, for the\nWorkload\nproperty (which indicates the service in which the activity occurred) for one year. Retaining audit records for longer periods can help with ongoing forensic or compliance investigations. For more information, see the \"Default audit log retention policy\" section in\nManage audit log retention policies\n.\nIn addition to the one-year retention capabilities of Audit (Premium), we also released the capability to retain audit logs for 10 years. 
The 10-year retention of audit logs helps support long running investigations and respond to regulatory, legal, and internal obligations.\nNote\nRetaining audit logs for 10 years requires an additional per-user add-on license. After you assign this license to a user and set an appropriate 10-year audit log retention policy for that user, audit logs covered by that policy start to be retained for the 10-year period. This policy isn't retroactive and can't retain audit logs that were generated before the 10-year audit log retention policy was created.\nAudit log retention policies\nAll audit records that other services generate and that the default audit log retention policy doesn't cover are retained for 180 days. You can create customized audit log retention policies to retain other audit records for longer periods, up to 10 years. You can create a policy to retain audit records based on one or more of the following criteria:\nThe Microsoft service where the audited activities occur.\nSpecific audited activities.\nThe user who performs an audited activity.\nImportant\nThe default retention period for Audit (Standard) changed from 90 days to 180 days. Audit (Standard) logs generated before October 17, 2023, are retained for 90 days. Audit (Standard) logs generated on or after October 17, 2023, follow the new default retention of 180 days.\nYou can also specify how long to retain audit records that match the policy and a priority level so that specific policies take priority over other policies. Any custom audit log retention policy takes precedence over the default audit retention policy if you need to retain Exchange, SharePoint, or Microsoft Entra ID audit records for less than a year or for 10 years for some or all users in your organization. 
For more information, see\nManage audit log retention policies\n.\nImportant\nThe audit item lifetime for data is determined when the auditing pipeline adds the data and is based on the licensing defaults or applicable retention policies. Any changes to licensing or applicable retention policies change the expiration time of the audit data after updating. These changes don't affect any previously committed items.\nAudit (Premium) activity properties\nAudit (Premium) helps organizations conduct forensic and compliance investigations by providing access to important events, such as when users access mail items, reply to and forward mail items, and search in Exchange Online and SharePoint Online. These events can help you investigate possible breaches and determine the scope of compromise. In addition to these events in Exchange and SharePoint, other Microsoft services include important events that require assigning users the\nappropriate Audit (Premium) license\n. Assign an Audit (Premium) license to users so the system generates audit logs when they perform these events.\nThese activities require that you assign users the\nappropriate Audit (Premium) license\n. 
Assign an Audit (Premium) license to users so the system generates audit logs when they perform these activities and properties.\nAudit (Premium) provides access to the following activity properties:\nExchange Online\nActivity\nProperty\nMailItemsAccessed\nSensitivityLabel\nMicrosoft Teams\nActivity\nProperty\nChatCreated\nAppAccessContext\nChatRetrieved\nAppAccessContext\nChatUpdated\nAppAccessContext\nMeetingParticipantDetail\nIsJoinedFromLobby\nArtifactShared\nMessageCreatedNotification\nAppAccessContext\nMessageDeletedNotification\nAppAccessContext\nMessageHostedContentsListed\nAppAccessContext\nMessageHostedContentRead\nAppAccessContext\nMessagesListed\nAppAccessContext\nMessageRead\nAppAccessContext\nMessageSent\nAppAccessContext\nParticipatingDomainInformation\nParticipantInfo\nMessageUpdated\nParticipantInfo\nAppAccessContext\nMessageUpdatedNotification\nAppAccessContext\nSubscribedToMessages\nAppAccessContext\nHigh-bandwidth access to the Office 365 Management Activity API\nOrganizations that access auditing logs through the Office 365 Management Activity API faced throttling limits at the publisher level. This throttling limit meant that if a publisher pulled data for multiple customers, all those customers shared the same limit.\nWith Audit (Premium), this limit changed from a publisher-level limit to a tenant-level limit. Each organization now gets its own fully allocated bandwidth quota to access its auditing data. The bandwidth isn't a static, predefined limit. Instead, it's modeled on a combination of factors, including the number of seats in the organization. E5, A5, and G5 organizations get more bandwidth than non-E5, non-A5, and non-G5 organizations.\nAll organizations initially get a baseline of 2,000 requests per minute. This limit dynamically increases based on an organization's seat count and licensing subscription. E5, A5, and G5 organizations get about twice as much bandwidth as non-E5, non-A5, and non-G5 organizations. 
A cap on the maximum bandwidth protects the health of the service.\nFor more information, see the\nAPI throttling\nsection in\nOffice 365 Management Activity API reference\n.\nLicensing requirements\nBefore you get started, review the\nsubscription requirements\nfor Audit (Standard) and Audit (Premium).\nTraining\nTraining your security operations team, IT administrators, and compliance investigators in the fundamentals for Audit (Standard) and Audit (Premium) can help your organization get started more quickly using auditing to help with your investigations. Microsoft Purview provides the following resource to help these users in your organization get started with auditing:\nDescribe the eDiscovery and audit capabilities of Microsoft Purview\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Audit Logging", @@ -647,7 +647,7 @@ "https://learn.microsoft.com/en-us/purview/audit-copilot": { "content_hash": "sha256:c2b64262c0056402f40919b7fe07a9ccdc0f7f0b057caaae4162a72796d22a25", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nAudit logs for Copilot and AI applications\nFeedback\nSummarize this article for me\nThis article provides an overview of audit logs generated for user interactions and admin activities related to\nMicrosoft Copilot\nand AI applications. 
The system automatically logs these activities as part of\nAudit (Standard)\n. If your organization enables auditing, you don't need to take extra steps to configure auditing support for Copilot and AI applications.\nBilling for auditing non-Microsoft AI applications\nAudit logs for non-Microsoft AI applications use pay-as-you-go billing. This billing model provides user and admin interaction audit logs that the system retains for 180 days. These audit logs cover interactions with non-Microsoft AI applications.\nYour enterprise subscription doesn't include audit logs for this type of user interaction. Instead, it falls under\npay-as-you-go billing\n. The system logs these interactions under the\nAIAppInteraction\nrecordType or\nAIApp\nworkload. Some scenarios logged under the\nConnectedAiAppInteraction\nrecordType are also part of this pay-as-you-go billing model. You need to enable pay-as-you-go features to turn on these logs. When you enable them, the system retains these audit logs for 180 days. Consumption is charged based on the number of audit records ingested for user interactions with these non-Microsoft AI applications.\nPay-as-you-go billing doesn't apply to Microsoft applications. All Microsoft applications, including Microsoft Copilots like\nMicrosoft Security Copilot\n,\nCopilot in Microsoft Fabric\n, and custom applications built using\nMicrosoft Copilot Studio\nand Azure AI Studio are included in Audit Standard.\nAdmin activities with Copilot and AI applications\nThe system generates audit logs when an administrator performs activities related to Copilot settings, plugins, promptbooks, or workspaces. For more information, see\nMicrosoft 365 Copilot activities\n.\nUser activities with Copilot and AI applications\nThe system automatically generates audit logs when a user interacts with Copilot or an AI Application. These audit records contain details about which user interacted with Copilot, when the interaction took place, and where it occurred. 
Audit records also include references to files, sites, or other resources Copilot and AI applications accessed to generate responses to user prompts.\nCommon properties in Copilot audit logs\nThe following table outlines some of the common properties included in audit logs.\nAttribute\nDefinition\nExamples\nAccessedResources\nReferences to all resources (files, documents, emails, etc.) which Copilot accessed in response to the user’s request.\n-\nID\nis the unique identifier for the resource. This could be a fileId on OneDrive, or a messageId in Teams, or email ID in Outlook, etc.\n-\nSiteUrl\nis the URL of the resource that was accessed. This could be the URL of a SharePoint site, full file path of a file, etc.\n-\nListItemUniqueId\nis a unique identifier for an item in SharePoint.\n-\nType\nrefers to the type of resource that was accessed. It can contain values like the filetype extension (pptx, docx, etc.) or describe the type of resource (for non-SharePoint resources).\n-\nName\nis the user-friendly readable name of the resource (for example, fileName).\n-\nSensitivityLabelId\nis the ID of the sensitivity label assigned to the resource. This is helpful in identifying whether Copilot accessed any sensitive information while generating its response.\n-\nAction\nrefers to the nature of access which Copilot performed on the resource. Common values include\nread\n,\ncreate\n,\nmodify\n.\n-\nPolicyDetails\nis used in scenarios where Copilot's access to a particular resource was blocked or restricted based on some policy. 
This property can include details like\nPolicyId\n,\nPolicyName\n, list of rules, etc.\n-\nStatus\nis used to specify whether Copilot's action on a specific resource was a\nsuccess\nor\nfailure\n.\n-\nXPIADetected\nis a boolean that denotes whether there was an XPIA (Cross Prompt Injection Attack) detected from a particular resource which Copilot accessed.\nFor example: \"AccessedResources\":[{\"Action\":\"Read\",\"ID\":\"AAAAAEYE2GAACp1FlnN_CHXStUkHAGWJYgtgcv1eOxe2v4H4jOsAAAQsLLeAAGWJYgtgcv1EoXe2v4H4josAABwvq8gAAA2\",\"Name\":\"Document1.docx\",\"SensitivityLabelId\":\"f41ab342-8706-4188-bd11-ebb85995028c\",\"SiteUrl\":\"\nhttps://microsoft.sharepoint.com/teams/OfficeSerbia/Shared%20Documents/SPOPPE/Document%20transformation%20services/Crawled%20Word%20documents/IW/Document1.docx?web=1\n\",\"Type\":\"docx\",\"listItemUniqueId\":\"AAAAAEYE2GAACp1FlnN_CHXStUkHAGWJYgtgcv1eOxe2v4H4jOsAAAQsLLeAAGWJYgtgcv1EoXe2v4H4josAABwvq8gAAA2\"}],\nAgentId\nUnique identifier for an agent. The string can also include details about the category of agent involved in the interaction.\nFor example, when a user interacts with a Declarative Agent or a Custom-engine Agent created through Microsoft Copilot Studio, then Agent ID contains values like\nCopilotStudio.Declarative.8ad83f3e-b424-4d54-8ddb-15dc19247088\nor\nCopilotStudio.CustomEngine.11fd28b5-4452-4615-be3d-7046a6f31131\n.\nAgentName\nA friendly readable name of the agent.\nJiraStatusAgent\n,\nSalesAgent\n,\nReminderBot\n, and others.\nAgentVersion\nThe version number or version ID of the agent involved.\nValues like\n25.001\n,\n8076fbed-be52-4004-ac89-81181ecd7b33\n, and others.\nAISystemPlugin\nDetails of plugins or extensions enabled for the Copilot interaction.\n-\nName\nis the name of the plugin that Copilot uses in generating the response.\n-\nID\nis the unique identifier for the plugin.\n-\nVersion\nrefers to the version of plugin used.\nAppHost\nThe same Copilot application can be deployed within multiple host applications. 
This property helps identify the application that hosted the interaction between a user and Copilot.\nSome of the common AppHost scenarios are:\n-\nBizChat\n: The Copilot interaction was performed in the Microsoft 365 Copilot Chat client (either via Teams, or the app), or via the website microsoft365.com/copilot or microsoft365.com/chat\n-\nBing\n: The Copilot interaction was performed through the Microsoft Edge browser, Office mobile apps, or copilot.cloud.microsoft.com\n-\nOffice\n: The Copilot interaction was performed through office.com or microsoft365.com\n- Other application-specific values: Values like\nWord\n,\nExcel\n,\nPowerPoint\n,\nOneNote\n,\nStream\n, and others indicate that the interaction was performed within these applications\nAppIdentity\nA detailed string that you use to uniquely identify the specific Copilot or AI Application that the user interacted with. It typically follows the structure\nworkloadName.appGroup.appName\n.\nFor example, interactions with first-party Copilot apps developed by Microsoft use values like\nCopilot.MicrosoftCopilot.Microsoft365Copilot\n,\nCopilot.Fabric.CopilotforPowerBI\n,\nCopilot.Security.SecurityCopilot\n, and others.\nInteractions with custom-built Copilots created through Copilot Studio use values like\nCopilot.Studio.AppId\n.\nInteractions with third-party AI apps deployed within your organization (which use\nConnectedAIApp\nas the workload) use values like\nConnectedAIApp.Entra.AppId\nor\nConnectedAIApp.AzureAI.AzureResourceName\n. Interactions with third-party AI apps that are audited through network/browser Data Loss Prevention (DLP) (which use\nAIApp\nas the workload) use values like\nAIApp.SaaS.AppName\n.\nCapacityId\nUnique identifier for the Microsoft Fabric Capacity in a tenant.\nYou can find the\nCapacityID\nin the URL of the capacity management page. In Microsoft Fabric, go to\nSettings\n>\nGovernance and insights\n>\nAdmin portal\n>\nCapacity settings\nand select a capacity. 
The\nCapacityID\nis shown in the URL after\n/capacities/\n. For example,\n00001111-aaaa-2222-bbbb-3333cccc4444\nis the\nCapacityID\nin the following URL:\nhttps://app.powerbi.com/admin-portal/capacities/00001111-aaaa-2222-bbbb-3333cccc4444\n.\nClientRegion\nThe user’s region when they performed the operation.\nContexts\nContains a collection of attributes to help describe where the user was during the Copilot interaction.\n-\nID\nis the identifier of the resource that was being used during the Copilot interaction.\n-\nType\nis the filetype/name of the app or service where the interaction occurred.\n-\nID\ncontains values like\nFileId\nor\nFilePath\n(for SharePoint scenarios), or Teams Chat ID or Meeting ID (for Teams scenarios), and others.\n-\nType\ncontains values like docx, pptx, xlsx, TeamsMeeting, TeamsChannel, TeamsChat, and others.\nMessages\nContains details about the prompt and response messages within the Copilot interaction. A single audit record typically contains a prompt-response pair but can also include a prompt with multiple response messages (that is, all Copilot responses associated with that prompt).\n-\nID\nis the messageId of the prompt/response message in the Copilot interaction.\n-\nIsPrompt\nis a boolean flag to denote whether this message is a user prompt or Copilot response.\n-\nJailbreakDetected\nis a boolean flag to denote whether a jailbreak attempt was made using this prompt message.\n-\nSize\nis currently not used.\n\"Messages\": [ {\"ID\":\"1715186983849\", \"isPrompt\":true}, {\"ID\":\"1715186984291\", \"isPrompt\":false} ]\nModelTransparencyDetails\nDetails of the AI/GAI model provider.\n-\nModelName\nis the name of the model used.\n-\nModelVersion\nis the version of the model used.\n-\nModelProviderName\nis the publisher of the model.\nOperation\nSpecifies the name of the activity that was audited.\nFor user interactions with Copilot, this property uses values like\nCopilotInteraction\n,\nConnectedAIAppInteraction\n, 
and\nAIAppInteraction\n, as described for RecordType.\nAlso includes Copilot admin operations like\nUpdateTenantSettings\n,\nCreatePlugin\n,\nDeletePlugin\n,\nEnablePromptBook\n, and others.\nRecordType\nIdentifies the category of Copilot or AI application that the user interacted with.\nCopilotInteraction\nrefers to scenarios where a user interacted with a Microsoft-developed Copilot application.\nConnectedAIAppInteraction\nrefers to scenarios where a user interacted with a custom-built Copilot or third-party AI application deployed and registered within your organization.\nAIAppInteraction\nrefers to interactions with third-party AI applications that aren't deployed within your organization.\nWorkload\nIdentifies the app category, similar to RecordType.\nCopilot\n,\nConnectedAIApp\n,\nAIApp\nCommon AppHost scenarios in Copilot\nThe following table lists some of the commonly used values for AppHost and describes the scenarios in which they're used.\nAppHost\nCopilot Scenario\nCopilot product\nBing\nBusiness Chat via Bing/Windows interface.\nRefers to Microsoft 365 Copilot's cross-app\nBusiness Chat\naccess through the Bing Chat experience (for instance, in the Microsoft Edge browser sidebar, Windows Copilot, or the copilot.cloud.microsoft.com web portal). This scenario occurs when a user engages the Copilot in a general-purpose chat outside any specific Office app.\nMicrosoft 365 Copilot Chat\nBookings\nCopilot in Microsoft Bookings.\nMicrosoft 365 Copilot\nCopilot in Azure\nCopilot in Microsoft Azure (Cloud Management).\nThis refers to Copilot usage in the context of\nMicrosoft Azure\n. Though labeled under Microsoft 365 Copilot, the scenario is a cloud admin or developer using Copilot to manage Azure resources. For example, in the Azure portal, ad admin might ask, \"How do I set up an alert for CPU usage on my VMs?\" and Copilot would generate an ARM template or Azure CLI steps. Or \"list untagged resources and suggest a tagging scheme\". 
Essentially, Copilot in Azure is an AI cloud assistant to query and control Azure services via natural language.\nMicrosoft 365 Copilot\nCopilot in Defender\nSecurity Copilot in Microsoft Defender.\nThe user is using Copilot within the\nMicrosoft Defender\nsecurity portal. This assists security operations (SecOps) tasks. For example, an analyst could ask, \"Investigate alert ID 23455 and summarize what happened,\" and Copilot analyzes Defender alerts and incidents to produce an explanation or even recommend next steps. It can also cross-reference threat intelligence. This scenario specifically focuses on\nDefender for Endpoint/Office/Cloud, etc.\ndata via the Security Copilot interface.\nSecurity Copilot\nCopilot in Intune\nSecurity Copilot in Microsoft Intune (Endpoint Management).\nCopilot is used in the\nMicrosoft Intune\nadmin center. It helps IT admins with device management and security posture. For instance, an admin might ask, \"How many devices are noncompliant this week and why?\". Copilot retrieves Intune data on device compliance and summarize reasons. It can also assist with troubleshooting by comparing device configuration or retrieving app deployment status. This is essentially Security Copilot tapping into Intune's information to answer questions and provide insights for IT and security teams.\nSecurity Copilot\nDesigner\nCopilot in Microsoft Designer.\nMicrosoft 365 Copilot\nEdge\nBusiness Chat in Edge sidebar.\nRepresents Microsoft 365 Copilot chat accessed through the\nEdge browser's Copilot (Bing Chat Enterprise) sidebar\n. In this scenario, the user is likely using the Microsoft Edge sidebar Copilot to query organizational data (a BizChat experience within Microsoft Edge).\nMicrosoft 365 Copilot Chat\nExcel\nCopilot in Microsoft Excel.\nThe user is using Copilot inside an Excel spreadsheet. For example, Copilot might be asked to analyze data, create formulas, or generate a summary of a table. 
It's the Excel-integrated Copilot helping with computations or insights in workbooks.\nMicrosoft 365 Copilot\nForms\nCopilot in Microsoft Forms.\nRepresents Copilot being used in the context of Forms. For example, Copilot could help create survey questions, quizzes, or analyze form responses. This would be the AI assistant helping content creation or summarization in Forms.\nMicrosoft 365 Copilot\nLogic App\nCopilot in Azure Logic Apps.\nMicrosoft 365 Copilot\nLoop\nCopilot in Microsoft Loop.\nThis refers to Copilot assisting within the Microsoft Loop application or Loop components. The user might have Copilot generate or summarize content in a Loop workspace. For example, Copilot could help brainstorm in a Loop page, given Loop's collaborative canvas. This integration brings Copilot to the Loop app context.\nMicrosoft 365 Copilot\nM365AdminCenter\nCopilot in Microsoft 365 Admin Center.\nThis refers to an AI assistant for IT administrators. In this scenario, Copilot could help an admin with tasks in the Microsoft 365 admin center. For example, answering questions about settings, generating PowerShell scripts, or summarizing user reports.\nMicrosoft 365 Copilot\nM365App\nBusiness Chat via Microsoft 365 app (desktop or mobile).\nSimilar to the Office apphost, this denotes Business Chat launched from the\nMicrosoft 365 unified app\non Windows or mobile. It covers the scenario where a user uses the Office/Microsoft 365 app itself (outside a specific product like Word) to chat with Copilot across their data.\nMicrosoft 365 Copilot Chat\nMicrosoft Purview\nSecurity Copilot in Purview (Compliance).\nThe user is interacting with Copilot in the\nMicrosoft Purview\ncompliance portal. Copilot helps compliance officers and security teams triage and summarize issues related to data protection and governance. 
For example, Copilot can summarize a batch of Data Loss Prevention (DLP) alerts, highlight inside risk activities, or answer questions like \"Have we had any policy violations in email this week?\". It can work in both an embedded way (inside Microsoft Purview UI, summarizing whatever section you're on) and in a standalone Q&A way for Microsoft Purview data. This scenario brings AI to compliance workflows, making it faster to grasp risks and decide on actions.\nSecurity Copilot\nOffice\nBusiness Chat via Office.com or Microsoft 365 app.\nIndicates the Copilot Business Chat accessed through the Office.com or Microsoft 365 home app (web, desktop, or mobile). For example, when a user opens the \"Copilot\" chat on Office.com (microsoft365.com) or the Microsoft 365 mobile app to ask cross-domain questions.\nMicrosoft 365 Copilot Chat\nOfficeCopilotNotebook\nCopilot Notebook (Microsoft 365).\nRefers to the\nCopilot Notebooks\nfeature in Microsoft 365 Copilot (a central AI-powered notebook). This scenario is when Copilot compiles or interacts with a\ncross-application notebook\nof content. Copilot Notebooks allow users to gather and generate information across multiple sources in a notebook interface. This AppHost appears when that notebook is used on the Office.com/Microsoft 365 side (outside of OneNote).\nMicrosoft 365 Copilot Chat\nOfficeCopilotSearchAnswer\nCopilot answer in Microsoft 365 Search.\nThis value refers to Copilot generating an\nAI-powered answer in the Office/Microsoft 365 search experience\n. For instance, when a user searches on Office.com or SharePoint and Copilot provides a natural language answer (drawing from workplace data) instead of just search results. 
This is a Business Chat-like Q&A feature within the search context.\nMicrosoft 365 Copilot Chat\nOneDrive\nAlso refers to the\nCopilot in SharePoint\nscenario, as described previously.\nMicrosoft 365 Copilot\nOneNote\nCopilot in Microsoft OneNote.\nThe user is using Copilot inside OneNote notebooks. This in-app OneNote Copilot can summarize notes, generate plans or lists, or answer questions based on NoteNote content. Copilot \"\nsupercharges your note-taking\n\" in OneNote, helping to create, recall, and organize information.\nMicrosoft 365 Copilot\nOneNoteCopilotNotebook\nCopilot Notebook in OneNote.\nSimilar to the OfficeCopilotNotebook scenario, but specifically when the Copilot Notebook is accessed within OneNote. Microsoft introduced Copilot Notebooks integrated into OneNote, so this AppHost logs when a user uses the\nAI-powered notebook inside OneNote\n(bringing cross-data Copilot functionality into OneNote).\nMicrosoft 365 Copilot Chat\nOutlook\nCopilot in Microsoft Outlook.\nThe user in engaging Copilot while using Outlook (desktop, web, or mobile). This typically means the Copilot is helping with email tasks. For example, drafting an email reply, summarizing a long email thread, or organizing an inbox.\nMicrosoft 365 Copilot Chat\nOutlookOnCanvas\nCopilot inline in Outlook compose.\nThis refers to Copilot's assistance directly in the email canvas. For example, when composing an email, Copilot might autogenerate text right in the draft. It's the on-canvas helper in Outlook's compose window (as opposed to using a separate Copilot pane).\nMicrosoft 365 Copilot\nOutlookSidepane\nCopilot in Outlook side pane.\nDenotes the classic Outlook Copilot experience via the Copilot pane in Outlook. For example, a user opens the Copilot sidebar in Outlook to draft or summarize messages. 
This value explicitly captures that side pane interaction.\nMicrosoft 365 Copilot\nPlanner\nCopilot in Microsoft Planner.\nIndicates a Copilot scenario in Planner, likely assisting with project plans or tasks. A user might ask Copilot to draft a plan, generate task checklists, or update task descriptions.\nMicrosoft 365 Copilot\nPower BI\nCopilot in Microsoft Power BI.\nMicrosoft 365 Copilot\nPowerPoint\nCopilot in Microsoft PowerPoint.\nCopilot is being used within a PowerPoint presentation. In this scenario, a user could ask Copilot to create slides, generate speaker notes, or redesign content in PowerPoint.\nMicrosoft 365 Copilot\nPowerPointOnCanvas\nCopilot on PowerPoint slides.\nIndicates an on-canvas Copilot experience in PowerPoint, where Copilot inserts or modifies content directly on slides. This could be the scenario of Copilot generating layouts or bulleted points straight into the presentation (without solely relying on the chat pane). It's a more embedded form of the PowerPoint Copilot.\nMicrosoft 365 Copilot\nSecurity Copilot Standalone\nSecurity Copilot standalone experience.\nThe user is interacting with Security Copilot through the standalone experience, as opposed to interacting with Copilot through an embedded experience within Defender, Purview, etc.\nSecurity Copilot\nSharePoint\nCopilot in SharePoint.\nCopilot is used within SharePoint (likely on a SharePoint site or page). For instance, a user might ask Copilot to summarize a SharePoint news post or draft content for a SharePoint page. This scenario corresponds to a SharePoint-integrated Copilot helping with intranet content.\nMicrosoft 365 Copilot\nStream\nCopilot in Microsoft Stream.\nMicrosoft 365 Copilot\nTeams\nCopilot in Microsoft Teams.\nRepresents Copilot usage inside the Microsoft Teams app (Web/Desktop/Mobile). This covers interactions with Copilot in Teams chats or channels. 
For example, asking Copilot to summarize a Teams chat, answer a question in a channel, or assist in a meeting context. This is essentially the Copilot experience within Teams' interface. For example, the \"Chat Copilot\" in a Teams chat thread.\nMicrosoft 365 Copilot\nTeamsAdminPortal\nCopilot in Microsoft Teams Admin Center.\nSimilar to the M365AdminCenter scenario, this indicates a Copilot scenario in the Teams Admin Portal. An admin might use Copilot to configure Teams settings or generate reports.\nMicrosoft 365 Copilot\nVivaEngage\nCopilot in Viva Engage (Yammer).\nThe user is interacting with Copilot within Viva Engage. For example, drafting a post or summarizing conversation threads in a Yammer community. This scenario covers any AI assistance inside Viva Engage, such as helping craft announcements or answers.\nMicrosoft 365 Copilot\nVivaGoals\nCopilot in Viva Goals.\nViva Goals manages OKRs (objectives and key results). A Copilot here could draft OKRs, update progress, or analyze goal attainment.\nMicrosoft 365 Copilot\nVivaPulse\nCopilot in Viva Pulse.\nViva Pulse is a feedback survey tool; Copilot here might draft survey questions or summarize sentiment from responses.\nMicrosoft 365 Copilot\nWhiteboard\nCopilot in Microsoft Whiteboard.\nThe user is using Copilot on a digital whiteboard. Copilot in Whiteboard helps brainstorm and organize ideas on the whiteboard canvas. For example, suggesting ideas, clustering sticky notes, or summarizing the board's content. This scenario covers using Copilot during a Whiteboard session (in Teams or the Whiteboard app) to enhance creativity and structure.\nMicrosoft 365 Copilot\nWord\nCopilot in Microsoft Word.\nThe user is interacting with Copilot within Word. For example, asking it to draft or edit portions of a Word document. This is the in-app\nWord Copilot\n(side-pane chat and commands in Word). 
Copilot can generate content, summarize text, or adjust formatting in the document context.\nMicrosoft 365 Copilot\nWordOnCanvas\nCopilot inline in Word's document.\nThis refers to Copilot assistance directly\non the canvas\nin Word. Instead of the side pane, Copilot acts within the document editing area. For example, the feature where Copilot writes directly into the document or provides inline suggestions. It's essentially Word Copilot's capabilities applied within the document body, rather than via the chat pane.\nMicrosoft 365 Copilot\nExample Copilot scenarios for user activities\nThe following tables list some example scenarios and how they appear in the audit log. These example audit logs come from Copilot activities.\nMicrosoft Copilot\nA user interacts with Microsoft Copilot through the Microsoft 365 Copilot Chat client.\nOperation\nRecordType\nAppIdentity\nAppHost\nCopilotInteraction\nCopilotInteraction\nCopilot.MicrosoftCopilot.BizChat\nBizChat\nSecurity Copilot\nA user interacts with Security Copilot within Microsoft Defender.\nOperation\nRecordType\nAppIdentity\nAppHost\nCopilotInteraction\nCopilotInteraction\nCopilot.Security.SecurityCopilot\nDefender\nCopilot Studio applications\nA user interacts with a custom-built Copilot Studio application (whose appId is the GUID contained in appIdentity). 
The interaction takes place within Microsoft Teams, where this custom-built application is deployed.\nOperation\nRecordType\nAppIdentity\nAppHost\nCopilotInteraction\nCopilotInteraction\nCopilot.Studio.f4d97b45-1deb-40ce-9004-b473b79eab85\nTeams\nMicrosoft Facilitator\nMicrosoft Facilitator updates AI Notes, Live Notes, or Meeting Moderation in Microsoft Teams.\nOperation\nRecordType\nAppIdentity\nAppHost\nAINotesUpdate\nTeamCopilotInteraction\nCopilot.TeamCopilot.AINotes\nTeams\nLiveNotesUpdate\nTeamCopilotInteraction\nCopilot.TeamCopilot.LiveNotes\nTeams\nLiveNotesUpdate\nTeamCopilotInteraction\nCopilot.TeamCopilot.MeetingModerator\nTeams\nTeamCopilotMsgInteraction\nTeamCopilotInteraction\nCopilot.TeamCopilot.Message\nTeams\nIdentifying if Copilot accessed the web\nWhen you enable web search, Microsoft 365 Copilot and Microsoft 365 Copilot Chat parse user prompts and determine whether web search would improve the quality of the response. To identify if Copilot referenced the public web in a user interaction, review the\nAISystemPlugin.Id\nproperty in the\nCopilotInteraction\naudit record.\nAISystemPlugin.Id\ncontains the value\nBingWebSearch\nwhen a user's Copilot request uses the public web via Microsoft Bing for additional data.\nAccessing Copilot audit logs\nAccess Copilot audit logs by using the Microsoft Purview portal and selecting\nAudit\n.\nTo search for specific Copilot or AI application scenarios, use the\nActivities – operation names\nfield in the Microsoft Purview portal to filter audit logs by properties like\nOperation\n,\nRecordType\n, and\nWorkload\n.\nIf you need to search for audit logs containing a specific\nAppIdentity\nvalue or set of values, first search and export all relevant Copilot audit logs by filtering by operation name. 
From the exported search results, apply a filter on the\nAppIdentity\nproperty offline.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Audit Copilot Activities", @@ -656,7 +656,7 @@ "https://learn.microsoft.com/en-us/purview/audit-log-retention-policies": { "content_hash": "sha256:7874cd8fe320c669cb1b6b506d9dc03e02338200ae7b9d24ff0fd67daae68f92", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nManage audit log retention policies\nFeedback\nSummarize this article for me\nYou can create and manage audit log retention policies in the\nMicrosoft Purview portal\n. Audit log retention policies are part of the new Microsoft Purview Audit (Premium) capabilities. An audit log retention policy lets you specify how long to retain audit logs in your organization. You can retain audit logs for up to 10 years. You can create policies based on the following criteria:\nAll activities in one or more Microsoft services\nSpecific activities in a Microsoft service performed by all users or by specific users\nA priority level that specifies which policy takes precedence if you have multiple policies in your organization\nDefault audit log retention policy in Audit (Premium)\nAudit (Premium) in Microsoft Purview provides a default audit log retention policy for all organizations. 
You can't modify this policy. It retains all Exchange Online, SharePoint, OneDrive, and Microsoft Entra audit records for one year. This default policy retains audit records that contain the value of\nAzureActiveDirectory\n,\nExchange\n,\nOneDrive\n, and\nSharePoint\nfor the\nWorkload\nproperty (which is the service in which the activity occurred). Audit records for all other activities are retained for 180 days by default or you can change the retention to a different duration using a custom retention policy.\nNote\nThe default audit log retention policy only applies to audit records for activity performed by users who are assigned an Office 365 or Microsoft 365 E5 license or have a Microsoft Purview Suite (formerly known as Microsoft 365 E5 Compliance) or E5 eDiscovery and Audit add-on license. If you have non-E5 users or guest users in your organization, their corresponding audit records are retained for 180 days.\nImportant\nThe default retention period for Audit (Standard) changed from 90 days to 180 days. Audit (Standard) logs generated before October 17, 2023 are retained for 90 days. Audit (Standard) logs generated on or after October 17, 2023 follow the new default retention of 180 days.\nBefore you create an audit log retention policy\nYou need the\nOrganization Configuration\nrole in the Microsoft Purview portal to create or modify an audit retention policy.\nYour organization can have up to 50 audit log retention policies.\nTo retain an audit log for longer than 180 days (and up to 1 year), the user who generates the audit log (by performing an audited activity) must have an Office 365 E5 or Microsoft 365 E5 license or a Microsoft Purview Suite (formerly known as Microsoft 365 E5 Compliance) or E5 eDiscovery and Audit add-on license. 
To retain audit logs for 10 years, the user who generates the audit log must also have a 10-year audit log retention add-on license in addition to an E5 license.\nNote\nIf the user generating the audit log doesn't meet these licensing requirements, data is retained according to the highest priority retention policy. This retention might be either the default retention policy for the user's license or the highest priority policy that matches the user and its record type.\nAll custom audit log retention policies (created by your organization) take priority over the default retention policy. For example, if you create an audit log retention policy for Exchange mailbox activity that has a retention period that's shorter than one year, audit records for Exchange mailbox activities are retained for the shorter duration specified by the custom policy.\nThe audit item lifetime for data is determined when you add it to the auditing pipeline and is based on the licensing defaults or applicable retention policies. Any changes to licensing or applicable retention policies change the expiration time of the audit data after updating. These changes don't update any previously committed items.\nCreate an audit log retention policy\nComplete the following steps to create an audit retention policy:\nSign in to the\nMicrosoft Purview portal\nwith a user account assigned the\nOrganization Configuration\nrole on the\nRoles & scopes\npage in the Microsoft Purview portal.\nSelect the\nAudit\nsolution card. If the\nAudit\nsolution card isn't displayed, select\nView all solutions\nand then select\nAudit\nfrom the\nCore\nsection.\nSelect\nCreate audit retention policy\n, and then complete the following fields on the flyout page:\nPolicy name\n: The name of the audit log retention policy. 
This name must be unique in your organization, and you can't change it after creating the policy.\nDescription\n: Optional, but helpful to provide information about the policy, such as the record type or workload, users specified in the policy, and the duration.\nUsers:\nSelect one or more users to apply the policy to. If you leave this box blank, the policy applies to all users.\nRecord type\n: The audit record type the policy applies to. If you leave this property blank, the policy applies to all record types. You can select a single record type or multiple record types:\nIf you select a single record type, the\nActivities\nfield is dynamically displayed. Use the drop-down list to select activities from the selected record type to apply the policy to. If you don't choose specific activities, the policy applies to all activities of the selected record type.\nIf you select multiple record types, you don't have the ability to select activities. The policy applies to all activities of the selected record types.\nDuration:\nThe amount of time to retain the audit logs that meet the criteria of the policy. The available options are\n7 Days\n,\n30 Days\n,\n6 Months\n,\n9 Months\n,\n1 Year\n,\n3 Years\n,\n5 Years\n, and\n7 Years\n. Users with the 10-year Audit Log Retention add-on license can select a\n10 Years\noption.\nImportant\nTo retain audit logs for the 7 and 30 days duration options, you must have a Microsoft 365 Enterprise E5 subscription. To retain audit logs for the 3, 5, and 7 years duration options, you must be assigned to a 10-Year Audit Log Retention add-on license in addition to your Microsoft 365 Enterprise E5 subscription. For more information about Audit subscriptions and add-ons, see\nAuditing solutions in Microsoft Purview\nPriority\n: This value determines the order in which audit log retention policies in your organization are processed. A lower value indicates a higher priority. Valid priorities are numerical values between\n1\nand\n10000\n. 
A value of\n1\nis the highest priority, and a value of\n10000\nis the lowest priority. For example, a policy with a value of\n5\ntakes priority over a policy with a value of\n10\n. Any custom audit log retention policy takes priority over the default policy for your organization.\nSelect\nSave\nto create the new audit log retention policy.\nThe new policy appears in the list on the\nPolicies\npage.\nManage audit log retention policies in the Microsoft Purview portal\nThe\nAudit retention policies\ntab (also called the\ndashboard\n) lists audit log retention policies. You can use the dashboard to view, edit, and delete audit retention policies.\nView policies in the dashboard\nThe dashboard lists audit log retention policies. One advantage of viewing policies in the dashboard is that you can select the\nPriority\ncolumn to list the policies in the priority order in which they're applied. As previously explained, a lower value indicates a higher priority.\nYou can also select a policy to display its settings on the flyout page.\nNote\nThe dashboard doesn't display the default audit log retention policy for your organization.\nEdit policies in the dashboard\nTo edit a policy, select it to display the flyout page. You can modify one or more settings and then save your changes.\nImportant\nIf you use the\nNew-UnifiedAuditLogRetentionPolicy\ncmdlet, you might create an audit log retention policy for record types or activities that aren't available in the\nCreate audit retention policy\ntool in the dashboard. In this case, you can't edit the policy (for example, change the retention duration or add and remove activities) from the\nAudit retention policies\ndashboard. You can only view and delete the policy in the Microsoft Purview portal. 
To edit the policy, you need to use the\nSet-UnifiedAuditLogRetentionPolicy\ncmdlet in Security & Compliance PowerShell.\nTip\nA message is displayed at the top of the flyout page for policies that you need to edit by using PowerShell.\nDelete policies in the dashboard\nTo delete a policy, select the\nDelete\nicon and then confirm that you want to delete the policy. The policy is removed from the dashboard, but it might take up to 30 minutes for the policy to be removed from your organization.\nCreate and manage audit log retention policies in PowerShell\nYou can also use Security & Compliance PowerShell to create and manage audit log retention policies. One reason to use PowerShell is to create a policy for a record type or activity that isn't available in the UI.\nCreate an audit log retention policy in PowerShell\nFollow these steps to create an audit log retention policy in PowerShell:\nConnect to Security & Compliance PowerShell\n.\nRun the following command to create an audit log retention policy:\nNew-UnifiedAuditLogRetentionPolicy -Name \"Microsoft Teams Audit Policy\" -Description \"Ten-year retention policy for all Microsoft Teams activities\" -RecordTypes MicrosoftTeams -RetentionDuration TenYears -Priority 100\nThis example creates an audit log retention policy named \"Microsoft Teams Audit Policy\" with these settings:\nA description of the policy.\nRetains all Microsoft Teams activities (as defined by the\nRecordType\nparameter).\nRetains Microsoft Teams audit logs for 10 years.\nA priority of 100.\nHere's another example of creating an audit log retention policy. 
This policy retains audit logs for the \"User logged in\" activity for six months for the user admin@contoso.onmicrosoft.com.\nNew-UnifiedAuditLogRetentionPolicy -Name \"SixMonth retention for admin logons\" -RecordTypes AzureActiveDirectoryStsLogon -Operations UserLoggedIn -UserIds admin@contoso.onmicrosoft.com -RetentionDuration SixMonths -Priority 25\nFor more information, see\nNew-UnifiedAuditLogRetentionPolicy\n.\nView policies in PowerShell\nUse the\nGet-UnifiedAuditLogRetentionPolicy\ncmdlet in Security & Compliance PowerShell to view audit log retention policies.\nThe following command displays the settings for all audit log retention policies in your organization. This command sorts the policies by their Priority value in descending order.\nGet-UnifiedAuditLogRetentionPolicy | Sort-Object -Property Priority -Descending | FL Priority,Name,Description,RecordTypes,Operations,UserIds,RetentionDuration\nNote\nThe\nGet-UnifiedAuditLogRetentionPolicy\ncmdlet doesn't return the default audit log retention policy for your organization.\nEdit policies in PowerShell\nUse the\nSet-UnifiedAuditLogRetentionPolicy\ncmdlet in Security & Compliance PowerShell to edit an existing audit log retention policy.\nDelete policies in PowerShell\nUse the\nRemove-UnifiedAuditLogRetentionPolicy\ncmdlet in Security & Compliance PowerShell to delete an audit log retention policy. 
It might take up to 30 minutes for the policy to be removed from your organization.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Audit Log Retention", @@ -674,7 +674,7 @@ "https://learn.microsoft.com/en-us/purview/ai-microsoft-purview": { "content_hash": "sha256:363b69560df065421151aee1082fd4f057756f587c63781759e01399bd720ca8", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nMicrosoft Purview data security and compliance protections for generative AI apps\nFeedback\nSummarize this article for me\nMicrosoft Purview service description\nUse Microsoft Purview to mitigate and manage the risks associated with AI usage, and implement corresponding protection and governance controls.\nThe following sections on this page provide an overview of Data Security Posture Management for AI and the Microsoft Purview capabilities that provide additional data security and compliance controls to accelerate your organization's adoption of Copilots and agents, and other generative AI apps.\nIn some Microsoft Purview solutions, you might see the supported AI apps grouped by the following category names:\nCopilot experiences and agents\nor\nMicrosoft Copilot experiences\nfor supported Copilots and agents that include:\nMicrosoft 365 Copilot\nSecurity Copilot\nCopilot in Fabric\nCopilot Studio\nEnterprise AI apps\nfor non-Copilot AI apps and agents connected to your organization through Entra registration, data connectors, Microsoft Foundry, and other methods, and include:\nEntra-registered AI apps\nChatGPT Enterprise\nMicrosoft Foundry\nOther AI apps\nthat are detected through browser activity and categorized as \"Generative AI\" in the Defender for Cloud Apps catalog. This category can include AI apps and agents from the other two categories but also uniquely includes AI apps and agents from third-party LLMs, such as:\nChatGPT\nGoogle Gemini\nMicrosoft Copilot (consumer version)\nDeepSeek\nNote\nNow rolling out with the\nFrontier preview program\n, data security and compliance protections from Microsoft Purview also support\nMicrosoft Agent 365\n. Currently, agent instances are identified and managed like other users.\nFor a breakdown of Microsoft Purview security and compliance capabilities supported for AI interactions by app, see the additional pages identified in the following table. 
Where these AI apps support agents, they inherit the same security and compliance capabilities as their parent AI app. However, for a quick summary, see\nUse Microsoft Purview to manage data security & compliance for AI agents\n.\nCopilot experiences and agents\nEnterprise AI apps\nOther AI apps\nMicrosoft 365 Copilot & Microsoft 365 Copilot Chat\nEntra-registered AI apps\nOther AI apps\nMicrosoft Security Copilot\nMicrosoft Foundry\nCopilot in Fabric\nChatGPT Enterprise\nMicrosoft Copilot Studio\nMicrosoft Facilitator\nChannel Agent in Teams\nFor a list of Microsoft Purview security and compliance capabilities supported for Microsoft Agent 365, see\nUse Microsoft Purview to manage data security & compliance for Microsoft Agent 365\n.\nIf you're new to Microsoft Purview, you might also find an overview of the product helpful:\nLearn about Microsoft Purview\n.\nDSPM for AI (classic) and DSPM (preview)\nUse\nData Security Posture Management for AI (classic)\nor\nData Security Posture Management (preview)\nas your front door to discover, secure, and apply compliance controls for AI usage across your enterprise. Both DSPM versions use existing controls from Microsoft Purview information protection and compliance management with easy-to-use graphical tools and reports to quickly gain insights into AI use within your organization. Personalized recommendations and one-click policies help you protect your data and comply with regulatory requirements.\nMicrosoft Purview strengthens information protection for AI apps\nBecause of the power and speed with which AI can proactively surface content, generative AI amplifies the problem and risk of oversharing or leaking data. 
Learn how information protection capabilities from Microsoft Purview can help to strengthen your existing data security solutions.\nSensitivity labels and AI interactions\nAI apps that Microsoft Purview supports use\nexisting controls to ensure that data stored in your tenant is never returned\nto the user or used by a large language model (LLM) if the user doesn't have access to that data. When the data has\nsensitivity labels\nfrom your organization applied to the content, there's an extra layer of protection:\nWhen a file is open in Word, Excel, PowerPoint, or similarly an email or calendar event is open in Outlook, the sensitivity of the data is displayed to users in the app with the label name and content markings (such as header or footer text) that have been configured for the label.\nLoop components and pages also support the same sensitivity labels\n.\nWhen the sensitivity label applies encryption, users must have the\nEXTRACT usage right\n, as well as VIEW, for the AI apps to return the data.\nThis protection extends to data stored outside your Microsoft 365 tenant when it's open in an Office app (data in use). For example, local storage, network shares, and cloud storage.\nTip\nIf you haven't already, we recommend you enable sensitivity labels for SharePoint and OneDrive and also familiarize yourself with the file types and label configurations that these services can process. When sensitivity labels aren't enabled for these services, the encrypted files that Copilot and agents can access are limited to data in use from Office apps on Windows.\nFor instructions, see\nEnable sensitivity labels for Office files in SharePoint and OneDrive\n.\nIf you're not already using sensitivity labels, see\nGet started with sensitivity labels\n.\nEncryption without sensitivity labels and AI interactions\nEven if a sensitivity label isn't applied to content, services and products might use the encryption capabilities from the Azure Rights Management service. 
As a result, AI apps can still check for the VIEW and EXTRACT usage rights before returning data and links to a user, but there's no automatic inheritance of protection for new items.\nTip\nYou'll get the best user experience when you always use sensitivity labels to protect your data, and encryption is applied by a label.\nExamples of products and services that can use the encryption capabilities from the Azure Rights Management service without sensitivity labels:\nMicrosoft Purview Message Encryption\nMicrosoft Information Rights Management (IRM)\nMicrosoft Rights Management connector\nMicrosoft Rights Management SDK\nFor other encryption methods that don't use the Azure Rights Management service:\nS/MIME protected emails won't be returned by Copilot, and Copilot isn't available in Outlook when an S/MIME protected email is open.\nPassword-protected documents can't be accessed by AI apps unless they're already opened by the user in the same app (data in use). Passwords aren't inherited by a destination item.\nAs with other Microsoft 365 services, such as eDiscovery and search, items encrypted with\nMicrosoft Purview Customer Key\nor\nyour own root key (BYOK)\nare supported and eligible to be returned by Copilot.\nData loss prevention and AI interactions\nMicrosoft Purview Data Loss Prevention\n(DLP) helps you identify sensitive items across Microsoft 365 services and endpoints, monitor them, and helps protect against leakage of those items. It uses deep content inspection and contextual analysis to identify sensitive items and it enforces policies to protect sensitive data such as financial records, health information, or intellectual property.\nWindows computers that are\nonboarded to Microsoft Purview\ncan be configured for Endpoint data loss prevention (DLP) policies that warn or block users from sharing sensitive information with third-party generative AI sites that are accessed via a browser. 
For example, a user is prevented from pasting credit card numbers into ChatGPT, or they see a warning that they can override. For more information about the supported DLP actions and which platforms support them, see the first two rows in the table from\nEndpoint activities you can monitor and take action on\n.\nInsider Risk Management and AI interactions\nMicrosoft Purview Insider Risk Management\nhelps you detect, investigate, and mitigate internal risks such as IP theft, data leakage, and security violations. It leverages machine learning models and various signals from Microsoft 365 and third-party indicators to identify potential malicious or inadvertent insider activities. The solution includes privacy controls like pseudonymization and role-based access, ensuring user-level privacy while enabling risk analysts to take appropriate actions.\nUse the\nRisky AI usage policy template\nto detect risky usage that includes prompt injection attacks and accessing protected materials. Insights from these signals are integrated into Microsoft Defender XDR to provide a comprehensive view of AI-related risks.\nData classification and AI interactions\nMicrosoft Purview data classification provides a comprehensive framework for identifying and tagging sensitive data across various Microsoft services, including Office 365, Dynamics 365, and Azure. Classifying data is often the first step to ensure compliance with data protection regulations and safeguard against unauthorized access, alteration, or destruction. You can use built-in system classifications or create your own.\nSensitive information types and trainable classifiers can be used to find sensitive data in user prompts and responses when they use AI apps. 
The resulting information then surfaces in\nMicrosoft Purview Reports overview\nand\nactivity explorer\nin DSPM for AI and the\nAI activities\ntab in activity explorer from the preview version of DSPM.\nMicrosoft Purview supports compliance management for AI apps\nInteractions using supported AI apps can be monitored for each user in your tenant. As such, together with data classification, you can use Microsoft Purview's auditing, communication compliance, eDiscovery with content search, and automatic retention and deletion capabilities from Data Lifecycle Management to manage this AI usage.\nAuditing and AI interactions\nMicrosoft Purview Audit solutions\nprovide comprehensive tools for searching and managing audit records of activities performed across various Microsoft services by users and admins, and help organizations to effectively respond to security events, forensic investigations, internal investigations, and compliance obligations.\nLike other activities, prompts and responses are\ncaptured in the unified audit log\n. Events include how and when users interact with the AI app, and can include in which Microsoft 365 service the activity took place, and references to the files stored in Microsoft 365 that were accessed during the interaction. If these files have a sensitivity label applied, that's also captured.\nThese events flow into\nactivity explorer\nin DSPM for AI and the\nAI activities\ntab in activity explorer from the preview version of DSPM, where the data from prompts and responses can be displayed. 
You can also use the\nAudit\nsolution from the\nMicrosoft Purview portal\nto search and find these auditing events.\nFor more information, see\nAudit logs for Copilot and AI activities\n.\nCommunication compliance and AI interactions\nMicrosoft Purview Communication Compliance\nprovides tools to help you detect and manage regulatory compliance and business conduct violations across various communication channels, which include user prompts and responses for AI apps. It's designed with privacy by default, pseudonymizing usernames and incorporating role-based access controls. The solution helps identify and remediate inappropriate communications, such as sharing sensitive information, harassment, threats, and adult content.\nTo learn more about using communication compliance policies for AI apps, see\nConfigure a communication compliance policy to detect for generative AI interactions\n.\neDiscovery with content search and AI interactions\nMicrosoft Purview eDiscovery\nlets you identify and deliver electronic information that can be used as evidence in legal cases. The eDiscovery tools in Microsoft Purview support searching for content in Exchange Online, OneDrive for Business, SharePoint Online, Microsoft Teams, Microsoft 365 Groups, and Viva Engage teams. You can then prevent the information from deletion and export the information.\nBecause user prompts and responses for AI apps are stored in a user's mailbox, you can create a case and use\nsearch\nwhen a user's mailbox is selected as the source for a search query. For example, select and retrieve this data from the source mailbox by selecting from the query builder\nAdd condition\n>\nType\n>\nContains any of\n>\nEdit\n>\nCopilot activity\n. This query condition includes all Copilot and other AI application activity.\nAfter the search is refined, you can export the results or add to a\nreview set\n. 
You can review and export information directly from the review set.\nTo learn more about identifying and deleting user AI interaction data, see\nSearch for and delete Copilot data in eDiscovery\n.\nData Lifecycle Management and AI interactions\nMicrosoft Purview Data Lifecycle Management\nprovides tools and capabilities to manage the lifecycle of organizational data by retaining necessary content and deleting unnecessary content. These tools ensure compliance with business, legal, and regulatory requirements.\nUse\nretention policies\nto automatically retain or delete user prompts and responses for AI apps. For detailed information about how this retention works, see\nLearn about retention for Copilot & AI apps\n.\nAs with all retention policies and holds, if more than one policy for the same location applies to a user, the\nprinciples of retention\nresolve any conflicts. For example, the data is retained for the longest duration of all the applied retention policies or eDiscovery holds.\nCompliance Manager and AI interactions\nMicrosoft Purview Compliance Manager\nis a solution that helps you automatically assess and manage compliance across your multicloud environment. Compliance Manager can help you throughout your compliance journey, from taking inventory of your data protection risks to managing the complexities of implementing controls, staying current with regulations and certifications, and reporting to auditors.\nTo help you keep compliant with AI regulations, Compliance Manager provides regulatory templates to help you assess, implement, and strengthen your compliance requirements for all generative AI apps. For example, monitoring AI interactions and preventing data loss in AI applications. 
For more information, see\nAssessments for AI regulations\n.\nOther documentation to help you secure and manage generative AI apps\nBlog post announcement:\nAccelerate AI adoption with next-gen security and governance capabilities\nMicrosoft 365 Copilot:\nMicrosoft 365 Copilot documentation\nApply principles of Zero Trust to Microsoft 365 Copilot\nRelated resources:\nSecure Generative AI with Microsoft Entra\nGovern AI apps and data for regulatory compliance\nBlog posts (April 2025):\nHow to deploy Microsoft Purview DSPM for AI to secure your AI apps\nHow to use DSPM for AI Data Risk Assessment to Address Internal Oversharing\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "DSPM for AI", @@ -692,7 +692,7 @@ "https://learn.microsoft.com/en-us/purview/communication-compliance": { "content_hash": "sha256:000a343897848db5130064910fe1ec45a9b16ebd1a727367892b7a0c8f8981db", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nLearn about Communication Compliance\nFeedback\nSummarize this article for me\nImportant\nMicrosoft Purview Communication Compliance\nprovides the tools to help organizations detect regulatory compliance (for example, SEC or FINRA) and business conduct violations such as sensitive or confidential information, harassing or threatening language, and sharing of adult content. Communication Compliance is built with privacy by design. Usernames are pseudonymized by default, role-based access controls are built in, investigators are opted in by an admin, and audit logs are in place to help ensure user-level privacy.\nMicrosoft Purview Communication Compliance is an insider risk solution that helps you minimize communication risks by helping you detect, capture, and act on potentially inappropriate messages in your organization. Predefined and custom policies allow you to check internal and external communications for policy matches so designated reviewers can examine them. Reviewers can investigate email, Microsoft Teams, Microsoft 365 Copilot and Microsoft 365 Copilot Chat, Viva Engage, or third-party communications in your organization and take appropriate actions to make sure they're compliant with your organization's message standards.\nCommunication Compliance policies in Microsoft 365 help you overcome many modern challenges associated with compliance and internal and external communications, including:\nChecking increasing types of communication channels\nThe increasing volume of message data\nRegulatory enforcement and the risk of fines\nAdditionally, there might be a separation of duties between your IT admins and your compliance management team. Communication Compliance supports the separation between configuration of policies and the investigation and review of messages. For example, the IT group for your organization might be responsible for setting up Communication Compliance role permissions, groups, and policies. 
Investigators and reviewers might be responsible for message triage, review, and mitigation actions.\nFor more information and an overview of the planning process to address compliance and risky activities in your organization, see\nStarting an Insider Risk Management program\n.\nWatch the following video to learn how to fulfill regulatory compliance requirements with Communication Compliance:\nImportant\nCommunication Compliance is currently available in tenants hosted in geographical regions and countries supported by Azure service dependencies. To verify that Communication Compliance is supported for your organization, see\nAzure dependency availability by country/region\n.\nScenarios for Communication Compliance\nCommunication Compliance policies can help you review messages in your organization for several important compliance areas:\nCorporate policies\nUsers must comply with acceptable use, ethical standards, and other corporate policies in all their business-related communications. Communication Compliance policies can detect policy matches and help you take corrective actions to help mitigate these types of incidents. For example, you can check user communications in your organization for human resources concerns such as harassment or the use of potentially inappropriate or offensive language.\nRisk management\nOrganizations are responsible for all communications distributed throughout their infrastructure and corporate network systems. By using Communication Compliance policies to help identify and manage potential legal exposure and risk, you can minimize risks before they damage corporate operations. 
For example, you can check messages in your organization for unauthorized communications and conflicts of interest about confidential projects such as upcoming acquisitions, mergers, earnings disclosures, reorganizations, or leadership team changes.\nRegulatory compliance\nMost organizations must comply with some type of regulatory compliance standards as part of their normal operating procedures. These regulations often require organizations to implement some type of scoping or oversight process for messaging that is appropriate for their industry. The Financial Industry Regulatory Authority (FINRA) Rule 3110 is a good example of a requirement for organizations to have scoping procedures in place to check user communications and the types of businesses in which it engages. Another example might be a need to review broker-dealer communications in your organization to safeguard against potential insider trading, collusion, or bribery activities. Communication Compliance policies can help your organization meet these requirements by providing a process to both analyze and report on corporate communications. For more information on support for financial organizations, see\nKey compliance and security considerations for US banking and capital markets\n.\nKey feature areas\nCommunication Compliance offers several important features to help address compliance concerns on your messaging platforms:\nIntelligent customizable templates\nFlexible remediation workflows\nActionable insights\nIntelligent customizable templates\nIntelligent customizable templates in Communication Compliance help you apply machine learning to detect communication violations in your organization.\nCustomizable pre-configured templates\n: Policy templates help address the most common communications risks. 
You can create and update policies faster with predefined templates that analyze and mitigate potentially inappropriate content, sensitive information, conflict of interest, and regulatory compliance issues.\nNew machine learning support\n: Built-in\nclassifiers\nanalyze and mitigate discrimination, threats, harassment, profanity, and potentially inappropriate images. They help reduce misclassified content in communication messages, saving reviewers time during the investigation and remediation process.\nImproved condition builder\n: You can now configure policy conditions through a single, integrated experience in the policy workflow. This update reduces confusion about how conditions apply to policies.\nFlexible remediation workflows\nBuilt-in remediation workflows help you quickly identify and take action on messages with policy matches in your organization. The following new features increase efficiency for investigation and remediation activities:\nFlexible remediation workflow\n: The new remediation workflow helps you quickly take action on policy matches. New options let you escalate messages to other reviewers and send email notifications to users with policy matches.\nConversation policy matching\n: Messages in conversations group by policy matches to give you more visibility about how conversations relate to your communication policies. For example, conversation policy matching in the\nPending\ntab automatically shows all messages in a Teams channel that match your communications policies for analyzing and mitigating potentially inappropriate messages. 
Other messages in conversations that don't match your communications policies don't display.\nKeyword highlighting\n: Terms that match policy conditions highlight in the message text view to help reviewers quickly analyze and remediate policy alerts.\nOptical character recognition (OCR)\n: You can check, detect, and investigate printed and handwritten text within images embedded or attached to email or Microsoft Teams chat messages.\nNew filters\n: You can investigate and remediate policy alerts faster with message filters for several fields, including sender, recipient, date, domains, and many more.\nImproved message views\n: Investigation and remediation actions are now quicker with new message source and text views. You can now view message attachments to provide complete context when taking remediation actions.\nUser history\n: A historical view of all user message remediation activities, such as past notifications and escalations for policy matches, now provides reviewers with more context during the remediation workflow process. First-time or repeat instances of policy matches for users are now archived and easily viewable.\nPattern detected notification\n: Many harassing and bullying actions take place over time and involve recurring instances of the same behavior by a user. The pattern detected notification displayed in alert details helps raise attention to these alerts and this type of behavior.\nTranslation\n: You can quickly investigate message details in eight languages using translate support in the remediation workflow. Messages in other languages automatically convert to the display language of the reviewer.\nAttachment detection\n: You can check, detect, and investigate linked content (Modern attachments) from OneDrive and Microsoft Teams that match policy classifiers and conditions for Microsoft Teams messages. 
Attachment content is automatically extracted to a text file for detailed review and action.\nSummarize message content for policy matches\n: Save time for investigators by\nusing Microsoft Copilot in Microsoft Purview\nto summarize lengthy Teams, email, or Viva Engage messages. Copilot in Microsoft Purview creates a summary of the conversation, including recordings, meeting transcripts, and attachments.\nActionable insights\nNew interactive dashboards for alerts, policy matches, actions, and trends help you quickly view the status of pending and resolved alerts in your organization.\nProactive intelligent alerts\n: Get alerts for policy matches that need immediate attention. New dashboards show pending items sorted by severity. Designated reviewers receive new automatic email notifications.\nInteractive dashboards\n: View new dashboards that display policy matches, pending and resolved actions, and trends by users and policy.\nAuditing support\n: Easily export a full log of policy and review activities from the Microsoft Purview portal to help support audit review requests.\nIntegration with Microsoft 365 services\nCommunication Compliance policies check, detect, and capture messages across several communication channels to help you quickly review and remediate compliance issues:\nGenerative AI\n: Use Communication Compliance policies to analyze interactions (prompts and responses) entered into generative AI applications to help detect inappropriate or risky interactions or sharing of confidential information. Coverage is supported for\nMicrosoft 365 Copilot\n, Copilots built using\nMicrosoft Copilot Studio\n, AI applications connected by\nMicrosoft Entra\nor\nMicrosoft Purview Data Map\nconnectors, and more.\nMicrosoft Teams\n: Communication Compliance supports chat communications for public and private\nMicrosoft Teams\nchannels and individual chats as a standalone channel source or with other Microsoft 365 services. 
You can also detect communications included in meetings transcripts (preview). You need to manually add individual users, distribution groups, or specific Microsoft Teams channels when you select users and groups to apply a Communication Compliance policy to. Teams users can also self-report potentially inappropriate messages in private and group channels and chats for review and remediation.\nExchange Online\n: All mailboxes hosted on\nExchange Online\nin your Microsoft 365 organization are eligible for analyses. Emails and attachments matching Communication Compliance policy conditions are instantly available for investigation and in compliance reports. Exchange Online is now an optional source channel and is no longer required in Communication Compliance policies.\nMicrosoft 365 Copilot and Microsoft 365 Copilot Chat\n: Communication Compliance policies detect interactions (prompts and responses) entered by users into Copilot.\nViva Engage\n: Communication Compliance policies support private messages and public community conversations in\nViva Engage\n. Viva Engage is an optional channel and must be in\nnative mode\nto support checking of messages and attachments.\nThird-party sources\n: You can check messages from\nthird-party sources\nfor data imported into mailboxes in your Microsoft 365 organization. Communication Compliance supports connections to several popular platforms, including Instant Bloomberg and others.\nTo learn more about messaging channel support in Communication Compliance policies, see\nDetect channel signals with Communication Compliance\n.\nWatch the following video to learn how to detect communication risks in Microsoft Teams with Communication Compliance:\nIntegration with Microsoft Purview Insider Risk Management\nWhen users experience employment stressors, they might engage in risky activities. 
Workplace stress can lead to uncharacteristic or malicious behavior by some users that surfaces as potentially inappropriate behavior on your organization's messaging systems. Counterproductive work behavior can be a precursor to more serious violations, such as sabotaging company assets or leaking sensitive information. By integrating Communication Compliance with\nMicrosoft Purview Insider Risk Management\n, you can detect stressors that indicate an unhealthy workplace environment.\nLearn more about integrating Communication Compliance with Insider Risk Management\nGet started with recommended actions\nWhether you're setting up Communication Compliance for the first time or getting started with creating new policies, the new\nrecommended actions\nexperience can help you get the most out of Communication Compliance capabilities. Recommended actions include setting up permissions, creating distribution groups, creating policies, and more.\nWorkflow\nCommunication Compliance helps you address common pain points associated with complying with internal policies and regulatory compliance requirements. With focused policy templates and a flexible workflow, you can use actionable insights to quickly resolve detected compliance issues.\nBefore you create a policy, decide whether you want to apply an\nadaptive scope\n. For more information, see\nAdaptive policy scopes for compliance solutions\n. If you decide to create an adaptive policy, you must create one or more adaptive scopes before you create your policy, then select them during the create new policy process. For instructions, see\nConfiguration information for adaptive scopes\n.\nIdentifying and resolving compliance issues with Communication Compliance uses the following workflow:\nConfigure\nIn this workflow step, you identify your compliance requirements and configure applicable Communication Compliance policies. 
Policy templates are a great way to not only quickly configure a new compliance policy but also quickly modify and update policies as your requirements change. For example, you might want to quickly test a policy for potentially inappropriate content on communications for a small group of users before configuring a policy for all users in your organization.\nImportant\nBy default, Global Administrators don't have access to Communication Compliance features. To enable permissions for Communication Compliance features, see\nAssign permissions in Communication Compliance\n.\nYou can choose from the following policy templates in the Microsoft Purview portal:\nDetect inappropriate text\n: Use this template to quickly create a policy that uses built-in classifiers to automatically detect text in messages that might be considered inappropriate, abusive, or offensive.\nDetect inappropriate images\n: Use this template to quickly create a policy that uses built-in classifiers to automatically detect content that contains adult and racy images that might be considered inappropriate in your organization.\nDetect sensitive info types\n: Use this template to quickly create a policy to check communications containing defined sensitive information types or keywords to help make sure that important data isn't shared with people that shouldn't have access.\nDetect financial regulatory compliance\n: Use this template to quickly create a policy to check communications for references to standard financial terms associated with regulatory standards.\nDetect conflict of interest\n: Use this template to quickly create a policy to detect communications between two groups or two users to help avoid conflicts of interest.\nCustom policy\n: Use this template to configure specific communication channels, individual detection conditions, and the amount of content to detect and review in your organization.\nUser-reported messages policy\n: This system policy supports user reported messages from 
channel, group, and private chat messages. Enabled by default in the Teams admin center.\nTip\nUse\nrecommended actions\nto help you determine if you need a sensitive information type policy or if you need to update existing inappropriate content policies.\nInvestigate\nIn this step, you can look deeper into the issues detected as matching your Communication Compliance policies. This step includes the following actions available in the Microsoft Purview portal:\nAlerts\n: When a group of messages matches a policy condition, an alert is automatically generated. For each alert, you can see the status, the severity, the time detected, and if an eDiscovery (Premium) case is assigned and its status. New alerts are displayed on the Communication Compliance home page and the\nAlerts\npage and are listed in order of severity.\nIssue management\n: For each alert, you can take investigative actions to help remediate the issue detected in the message.\nDocument review\n: During the investigation of an issue, you can use several views of the message to help properly evaluate the detected issue. The views include a conversation summary, text-only, and detail views of the communication conversation.\nReviewing user activity history\n: View the history of user message activities and remediation actions, such as past notifications and escalations, for policy matches.\nFilters\n: Use filters such as sender, recipient, date, and subject to quickly narrow down the message alerts that you want to review.\nRemediate\nRemediate Communication Compliance issues you investigate by using the following options:\nResolve\n: After reviewing an issue, resolve the alert. Resolving an alert removes it from the\nPending\ntab. The action is preserved as an entry on the\nResolved\ntab for the matching policy. 
Alerts are automatically resolved after you mark the alert as misclassified, send a notice to a user about the alert, or open a new case for the alert.\nTag a message\n: As part of the resolution of an issue, tag a policy match as\nCompliant\n,\nNon-compliant\n, or\nQuestionable\nas it relates to the policies and standards for your organization. You can also\ncreate a custom tag\n. Tagging can help you micro-filter policy alerts for escalations or as part of other internal review processes. You can filter on any tag value.\nNotify the user\n: Often, users accidentally or inadvertently violate a Communication Compliance policy. Use the notify feature to provide a warning notice to the user and to resolve the issue.\nEscalate to another reviewer\n: Sometimes, the initial reviewer of an issue needs input from other reviewers to help resolve the incident. You can easily escalate message issues to reviewers in other areas of your organization as part of the resolution process.\nReport as misclassified\n: Messages incorrectly detected as matches of compliance policies will occasionally slip through to the review process. Mark these types of alerts as misclassified to submit feedback to Microsoft about the misclassification to help improve global classifiers and automatically resolve the issue.\nRemove message in Teams\n: Potentially inappropriate messages can be removed from displaying in Microsoft Teams channels or personal and group chat messages. Those identified messages that are removed are replaced with a notification that the message has been removed for a policy violation.\nEscalate for investigation\n: In the most serious situations, you might need to share Communication Compliance information with other reviewers in your organization. Communication Compliance is tightly integrated with other Microsoft Purview features to help you with end-to-end risk resolution. 
Escalating a case for investigation allows you to transfer data and management of the case to Microsoft Purview eDiscovery (Premium). eDiscovery (Premium) provides an end-to-end workflow to preserve, collect, review, analyze, and export content that's responsive to your organization's internal and external investigations. It allows legal teams to manage the entire legal hold notification workflow. To learn more about eDiscovery (Premium) cases, see\nOverview of Microsoft Purview eDiscovery (Premium)\n.\nMaintain\nKeeping track and mitigating compliance issues identified by Communication Compliance policies spans the entire workflow process. As alerts are generated and investigation and remediation actions are implemented, you might need to review and update existing policies or create new policies.\nReview and report\n: Use Communication Compliance dashboard widgets, export logs, and events recorded in the unified audit logs to continually evaluate and improve your compliance posture.\nGet insights on policy health\n: Communication Compliance provides warnings and recommendations to improve policy health.\nLearn more about policy health\nReady to get started?\nFor planning information, see\nPlan for Communication Compliance\n.\nCheck out the\ncase study for Contoso\nand see how they quickly configured a Communication Compliance policy to detect potentially inappropriate content in Microsoft Teams, Exchange Online, and Viva Engage communications.\nTo configure Communication Compliance for your Microsoft 365 organization, see\nConfigure Communication Compliance\n.\nMore resources\nFor the latest Ignite videos about Communication Compliance, see the following resources:\nFoster a culture of safety and inclusion with Communication Compliance\nLearn how to reduce communication risks within your organization\nBetter with Microsoft Teams - Learn more about the latest native Teams integrated features in Communication Compliance\nFor a quick overview of Communication 
Compliance, see the\nDetect workplace harassment and respond with Communication Compliance\nvideo on the\nMicrosoft Mechanics channel\n.\nSee how\nTD Securities is using Communication Compliance\nto address their regulatory obligations and meet their security and stability needs.\nWatch the\nMicrosoft Mechanics video\non how Insider Risk Management and Communication Compliance work together to help minimize data risks from users in your organization.\nTo keep up with the latest Communication Compliance updates, select\nWhat's new\nin the Communication Compliance solution for your organization.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Communication Compliance", @@ -701,7 +701,7 @@ "https://learn.microsoft.com/en-us/purview/communication-compliance-policies": { "content_hash": "sha256:272197b4d118a1df97d8b197a6ec8b19f7dd40e5f4a8713066d2ef0f8fce3e0a", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nCreate and manage Communication Compliance policies\nFeedback\nSummarize this article for me\nImportant\nMicrosoft Purview Communication Compliance\nprovides the tools to help organizations detect regulatory compliance (for example, SEC or FINRA) and business conduct violations such as sensitive or confidential information, harassing or threatening language, and sharing of adult content. Communication Compliance is built with privacy by design. Usernames are pseudonymized by default, role-based access controls are built in, investigators are opted in by an admin, and audit logs are in place to help ensure user-level privacy.\nPolicies\nImportant\nPowerShell isn't supported for creating and managing Communication Compliance policies. To create and manage these policies, use the policy management controls in the Communication Compliance solution.\nCreate Communication Compliance policies for Microsoft 365 organizations in the Microsoft Purview portal. Communication Compliance policies define which communications and users are subject to review in your organization, set custom conditions the communications must meet, and specify who should do reviews. Users assigned the\nCommunication Compliance Admins\nrole can set up policies. Anyone with this role can access the\nCommunication Compliance\npage and global settings in Microsoft Purview. If needed, you can export the history of modifications to a policy to a .csv (comma-separated values) file that also includes the status of alerts pending review, escalated items, and resolved items. You can't rename policies. You can delete policies when no longer needed.\nPolicy templates\nPolicy templates are predefined policy settings that you can use to quickly create policies to address common compliance scenarios. Each of these templates has differences in conditions and scope. All templates use the same types of detection signals. 
You can choose from the following policy templates:\nArea\nPolicy Template\nDetails\nConflict of interest\nDetect conflict of interest\n- Locations: Exchange Online, Microsoft Teams, Viva Engage\n- Direction: Internal\n- Review Percentage: 100%\n- Conditions: None\nCopilot interactions\nDetect Microsoft 365 Copilot and Microsoft 365 Copilot Chat interactions\n- Location: Microsoft 365 Copilot and Microsoft 365 Copilot Chat\n- Direction: Inbound, Outbound, Internal\n- Review Percentage: 100%\n- Conditions: Prompt Shields, Protected material classifiers\nInappropriate content\nDetect inappropriate content\n- Location: Microsoft Teams, Viva Engage\n- Direction: Inbound, Outbound, Internal\n- Review Percentage: 100%\n- Conditions: Hate, Violence, Sexual, Self-harm classifiers\nInappropriate images\nDetect inappropriate images\n- Locations: Exchange Online, Microsoft Teams\n- Direction: Inbound, Outbound, Internal\n- Review Percentage: 100%\n- Conditions: Adult and Racy image classifiers\nInappropriate text\nDetect inappropriate text\n- Locations: Exchange Online, Microsoft Teams, Viva Engage\n- Direction: Inbound, Outbound, Internal\n- Review Percentage: 100%\n- Conditions: Threat, Discrimination, and Targeted harassment classifiers\nRegulatory compliance\nDetect financial regulatory compliance\n- Locations: Exchange Online, Microsoft Teams, Viva Engage\n- Direction: Inbound, Outbound\n- Review Percentage: 10%\n- Conditions: Customer complaints, Gifts & entertainment, Money laundering, Regulatory collusion, Stock manipulation, and Unauthorized disclosure classifiers\nSensitive information\nDetect sensitive info types\n- Locations: Exchange Online, Microsoft Teams, Viva Engage\n- Direction: Inbound, Outbound, Internal\n- Review Percentage: 10%\n- Conditions: Sensitive information, out-of-the-box content patterns and types, custom dictionary option, attachments larger than 1 MB\nUser-reported messages policy\nNote\nThe\nUser-reported messages\npolicy is implemented for 
your organization when you purchase a license that includes Microsoft Purview Communication Compliance. However, it can take up to 30 days for this feature to be available after you purchase the license.\nAs part of a layered defense to detect and remediate inappropriate messages in your organization, you can supplement Communication Compliance policies with user-reported messages in Microsoft Teams and Viva Engage (preview). To help foster a safe and compliant work environment, this feature empowers users in your organization to self-report inappropriate internal Teams chat messages and Viva Engage conversations, such as harassing or threatening language, sharing of adult content, and sharing of sensitive or confidential information.\nMicrosoft Teams\nEnabled by default in the\nTeams admin center\n, the\nReport inappropriate content\noption in Teams messages lets users in your organization submit inappropriate internal personal and group chat messages for review by Communication Compliance reviewers for the policy. A default system policy supports these messages and supports reporting messages in Teams group and private chats.\nAzure AI Content Safety's hate, self-harm, sexual, and violence classifiers evaluate user-reported messages. This evaluation helps Communication Compliance Investigators understand any potential risk in user-reported content more quickly so they can take confident action to mitigate risk.\nWhen a user submits a Teams chat message for review, the message is copied to the User-reported message policy. Reported messages initially remain visible to all chat members and there's no notification to chat members or the submitter that a message has been reported in channel, private, or group chats. 
A user can't report the same message more than once and the message remains visible to all users included in the chat session during the policy review process.\nDuring the review process, Communication Compliance reviewers can perform all the standard\nremediation actions\non the message, including removing the message from the Teams chat. Depending on how the messages are remediated, the message sender and recipients see different notification messages in Teams chats after the review. Any user-reported content that triggers the Content Safety classifiers shows the classifier name that flagged the message and a corresponding severity value is visible in the\nSeverity\ncolumn.\nImportant\nIf a user reports a message that was sent before they were added to a chat, Teams message remediation isn't supported; the Teams message can't be removed from the chat.\nUser-reported messages from Teams chats are the only messages processed by the User-reported message policy and you can only modify the assigned reviewers for the policy. You can't edit all other policy properties. When you create the policy, all members of the\nCommunication Compliance Admins\nrole group (if populated with at least one user) or all members of your organization's\nGlobal Admin\nrole group are assigned as initial reviewers. A randomly selected user from the\nCommunication Compliance Admins\nrole group (if populated with at least one user) or a randomly selected user from your organization's\nGlobal Admin\nrole group is the policy creator.\nImportant\nMicrosoft recommends that you use roles with the fewest permissions. Minimizing the number of users with the Global Administrator role helps improve security for your organization. Learn more about Microsoft Purview\nroles and permissions\n.\nAdmins should immediately assign custom reviewers to this policy as appropriate for your organization. 
This assignment might include reviewers such as your Compliance Officer, Risk Officer, or members of your Human Resources department.\nCustomize the reviewers for chat messages submitted as user-reported messages\nSign in to the\nMicrosoft Purview portal\nusing credentials for an admin account in your Microsoft 365 organization.\nGo to the\nCommunication Compliance\nsolution.\nSelect\nPolicies\nin the left navigation.\nSelect the\nUser-reported messages\npolicy, then select\nEdit\n.\nOn the\nDetect user-reported messages\npane, assign reviewers for the policy. Reviewers must have mailboxes hosted on Exchange Online. When you add reviewers to a policy, they automatically receive an email message that notifies them of the assignment to the policy and provides links to information about the review process.\nSelect\nSave\n.\nThe\nReport inappropriate content\noption is enabled by default and you can control it via Teams messaging policies in the\nTeams admin center\n. Users in your organization automatically receive the global policy, unless you create and assign a custom policy. Edit the settings in the global policy or create and assign one or more custom policies to turn on or turn off the\nReport inappropriate content\noption. For more information, see\nManage messaging policies in Teams\n.\nImportant\nIf you use PowerShell to turn on or turn off the\nEnd user reporting\noption in the Teams admin center, use the\nMicrosoft Teams cmdlets module version 4.2.0\nor later.\nViva Engage\nThe\nReport Conversations\noption is off by default in the Viva Engage admin center. When you turn on this option, you see different options depending on whether you have a license that includes Communication Compliance:\nLicenses that don't include Communication Compliance\n. If you don't have a license that includes Communication Compliance, when you turn on the option, the Viva Engage admin can specify an email address to receive reported conversations. 
The admin can also enter pre-submission instructions and post-submission confirmations for the user.\nLearn more about enabling the\nReport Conversations\noption if you don't have a license that includes Communication Compliance\nLicenses that do include Communication Compliance\n. If you have a license that includes Communication Compliance, when you turn on the option, any reported conversations automatically route through Communication Compliance for investigation.\nHow upgrades and downgrades in licensing affect the reported conversations workflow\nIf you upgrade from a license that doesn't include Communication Compliance to a license that includes Communication Compliance and turn on the\nReport Conversations\noption, reported conversations automatically route to Communication Compliance. The Viva Engage admin can continue to access earlier reported conversations and can view new reported conversations\nif they're added as an investigator in Communication Compliance\n.\nIf you downgrade from a license that includes Communication Compliance to a license that doesn't include Communication Compliance and turn on the\nReport Conversations\noption, the workflow reverts to the process described in this article for customers that don't have a Communication Compliance license. 
To continue to use the feature, an admin must turn on the\nReport Conversations\noption in the Viva Engage admin center again and specify an email address to send reported conversations to.\nUser experience for customers that have a license that includes Communication Compliance\nIf you choose to route user-reported conversations through Communication Compliance, you can create pre-submission instructions for the user.\nTo report a conversation in Viva Engage, the user selects the ellipsis (three dots), then selects\nReport conversation\n.\nNote\nThe command name changes to\nReport comment\n,\nReport question\n, or\nReport answer\ndepending on where the user is in Viva Engage or Answers in Viva.\nIf the user submits the report successfully, they see a confirmation message.\nWhen a conversation is reported, the conversation is copied to the Communication Compliance User-reported message policy. The Communication Compliance investigator can review reported conversations in the User Reported policy inbox and can do all the standard\nremediation actions\nfor the conversation.\nTurn on the Report Conversation feature in the Viva Engage admin center\nGo to the Viva Engage admin center.\nOn the\nSettings\npage, select\nReport Conversations\n.\nTurn on the setting.\nUnder\nReport recipient\n, add an email address to route reports to. This email address is used to notify the admin in case there's any problem with the configuration.\nIn the\nPre-submission details or instructions for users\nbox, enter any instructions you want to display for the user.\nIn the\nPost-submission instructions to user\nbox, enter any instructions (optional) that the user will see after reporting a conversation.\nIntegrate Communication Compliance with Microsoft Purview Insider Risk Management\nWhen users experience employment stressors, they might engage in risky activities. 
Workplace stress can lead to uncharacteristic or malicious behavior by some users that surfaces as potentially inappropriate behavior on your organization's messaging systems. Counterproductive work behavior can be a precursor to more serious violations, such as sabotaging company assets or leaking sensitive information. By integrating Communication Compliance with\nMicrosoft Purview Insider Risk Management\n, you can detect stressors that indicate an unhealthy workplace environment.\nYou can integrate Communication Compliance with Insider Risk Management by automatically creating a Communication Compliance policy from Insider Risk Management. You can create this type of policy in three different ways:\nBy creating an Insider Risk Management trigger for policy templates\nBy selecting Insider Risk Management policy indicators for data-based policy templates\nBy selecting generative AI policy indicators for policy templates\nCreate an Insider Risk Management trigger for policy templates\nCommunication Compliance can provide risk signals detected in applicable messages to Insider Risk Management risky user policies by using a dedicated\nDetect inappropriate text\npolicy. This policy is automatically created if you select it as an option when creating a policy by using the\nData leaks by risky users template\nor the\nSecurity policy violations by risky users template\nin Insider Risk Management. For example, the following screenshot shows the option selected in the\nData leaks by risky users\ntemplate.\nWhen configured for an Insider Risk Management policy, Communication Compliance creates a dedicated policy named\nInsider risk trigger - (date created)\nand automatically includes all organization users in the policy. This policy starts detecting risky behavior in messages by using the Microsoft provided\nThreat, Harassment, and Discrimination trainable classifiers\nand automatically sends these signals to Insider Risk Management. 
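Per this section, users who send five or more messages classified as potentially risky within 24 hours are automatically brought in-scope. That check amounts to a sliding-window count over message timestamps; the sketch below is a minimal illustration of that logic, and the function name and input shape are ours, not any Purview API.

```python
from datetime import datetime, timedelta

# Values taken from the documented in-scope trigger:
# five or more risky messages within a rolling 24-hour window.
IN_SCOPE_THRESHOLD = 5
WINDOW = timedelta(hours=24)

def brought_in_scope(risky_message_times: list[datetime]) -> bool:
    """True if any rolling 24-hour window holds IN_SCOPE_THRESHOLD+ risky messages."""
    times = sorted(risky_message_times)
    start = 0
    for end, sent_at in enumerate(times):
        # Shrink the window until it spans at most 24 hours.
        while sent_at - times[start] > WINDOW:
            start += 1
        if end - start + 1 >= IN_SCOPE_THRESHOLD:
            return True
    return False
```

Five flagged messages inside one hour would trip the threshold; five messages spaced seven hours apart would not, because no 24-hour window contains all five.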
If needed, you can edit this policy to update the scope of included users and the policy conditions and classifiers.\nUsers who send five or more messages classified as potentially risky within 24 hours are automatically brought in-scope for Insider Risk Management policies that include this option. Once in-scope, the Insider Risk Management policy detects potentially risky activities configured in the policy and generates alerts as applicable. It can take up to 48 hours from the time risky messages are sent until the time a user is brought in-scope in an Insider Risk Management policy. If an alert is generated for a potentially risky activity detected by the Insider Risk Management policy, the triggering event for the alert is identified as being sourced from the Communication Compliance risky activity.\nAll users assigned to the\nInsider Risk Management Investigators\nrole group are automatically assigned as reviewers in the dedicated Communication Compliance policy. If Insider Risk Management investigators need to review the associated risky user alert directly on the Communication Compliance alerts page (linked from the Insider Risk Management alert details), you must manually add them to the\nCommunication Compliance Investigators\nrole group.\nBefore integrating Communication Compliance with Insider Risk Management, consider the following guidance when detecting messages containing potentially inappropriate text:\nFor organizations without an existing\nDetect inappropriate text\npolicy\n. The Insider Risk Management policy workflow automatically creates the new\nRisky user in messages - (date created)\npolicy. In most cases, you don't need to take any further action.\nFor organizations with an existing\nDetect inappropriate text\npolicy\n. The Insider Risk Management policy workflow automatically creates the new\nRisky user in messages - (date created)\npolicy. 
Although you have two Communication Compliance policies for potentially inappropriate text in messages, investigators don't see duplicate alerts for the same activity. Insider Risk Management investigators only see alerts for the dedicated integration policy and Communication Compliance investigators only see the alerts for the existing policy. If needed, you can edit the dedicated policy to change the included users or individual policy conditions as applicable.\nSelect Insider Risk Management policy indicators for data-based policy templates\nThe\nPolicy indicators\nsetting in Insider Risk Management provides Communication Compliance indicators to choose from when using the\nData theft\n,\nData leaks\n,\nData leaks by risky users\n, and\nData leaks by priority users\ntemplates:\nSending financial regulatory text that might be risky\nSending inappropriate images\nSending inappropriate content\nSending messages that contain specific sensitive info types\nFor more information on creating a Communication Compliance policy this way, see\nInsider Risk Management policy indicators settings\n.\nSelect generative AI policy indicators for policy templates\nThe\nPolicy indicators\nsetting in Insider Risk Management provides generative AI app indicators to choose from when using the\nData leaks\n,\nData leaks by risky users\n,\nData leaks by priority users\n, or\nRisky AI usage\npolicy templates. These Azure AI Content Safety indicators help detect when prompts and responses in AI apps match the following classifiers:\nPrompt Shields\nProtected material detection\nFor more information on creating a Communication Compliance policy this way, see\nInsider Risk Management policy indicators settings\n. For more information on Azure AI Content Safety, see\nAzure AI Content Safety\n.\nPolicy health (preview)\nThe policy health status gives you insights into potential issues or optimizations for your Communication Compliance policies. 
At the top of the\nPolicies\npage, you see a summary that lists the total number of policy warnings and recommendations, and the total number of healthy policies. If a specific policy has a warning or recommendations, the\nPolicy health\ncolumn lists a link to the warning or recommendations. When you select the link, a details pane opens on the right side of the screen with the\nPolicy Health\ntab selected, which makes it easy to quickly review the warning or recommendation and take action on it.\nThe\nPolicy Health\ntab has two parts. The upper part of the tab shows general information about the policy, including whether the policy is active, whether it's updated, and general tips. The lower part of the tab shows the specific warning or recommendations. If you're a member of the\nCommunication Compliance\nor\nCommunication Compliance Admins\ngroup, you can take action directly from the warning or recommendation. If you're a member of the\nCommunication Compliance Analysts\nor\nCommunication Compliance Investigators\ngroup, you can see the warning or recommendation so you can prompt your admin to take action.\nWarnings\n: If you don't take action on a warning, the policy stops working. For the policy health preview, there's one warning, related to a policy's\nstorage limit size\n.\nRecommendations\n: Recommendations provide suggestions for optimizing policies. If you ignore a policy recommendation, the policy continues to work. If you're a member of the\nCommunication Compliance\nor\nCommunication Compliance Admins\ngroup, you can act on a recommendation as long as the policy is active and isn't updating. If you're a member of the\nCommunication Compliance Analysts\nor\nCommunication Compliance Investigators\ngroup, you can see the recommendation and prompt your admin to take action. 
For the preview, there are two recommendations:\nPotentially reduce false positives by filtering out bulk emails\n: This recommendation prompts you to turn on the\nFilter email blasts setting\n. This setting helps reduce false positives by excluding bulk emails, such as newsletters, spam, phishing, and malware, from being flagged by Communication Compliance policies if the policy conditions are matched.\nReduce user scoping blind spots\n: This recommendation prompts you to\nreduce blind spots\nby turning on the\nShow insights and recommendations for users who match this policy's conditions but weren't included in the policy\ncheckbox (used with the\nSelected users\npolicy option).\nHealthy policies\n: If there are no warnings or recommendations for a specific policy, the policy is considered healthy.\nReduce user scoping blind spots\nYou can reduce user-scoping related blind spots by turning on the\nShow insights and recommendations for users who match this policy's conditions but weren't included in the policy\ncheckbox. When you turn on this checkbox, you see insights and recommendations for users who are sending messages that match the policy condition but aren't included in the scope of the policy. In this case, Communication Compliance recommends that you include those users in the scope of the policy so that you can review the messages sent by those users and take actions. You can select the specific users that you want to add or you can extend the policy to all users.\nInsights for these users are aggregated; you can't see the messages sent by them until you add them to the scope of the policy.\nIf an admin doesn't act on a recommendation, the recommendation recurs. To turn off recommendations for users outside the scope of the policy, turn off the checkbox.\nPause a policy\nAfter you create a Communication Compliance policy, you can temporarily pause the policy. 
Pausing is useful for testing or troubleshooting policy matches, or for optimizing policy conditions. In these circumstances, pausing a policy instead of deleting it preserves existing policy alerts and messages for ongoing investigations and reviews. Pausing a policy prevents inspection and alert generation for all user message conditions defined in the policy for the time the policy is paused. To pause or restart a policy, you must be a member of the\nCommunication Compliance Admins\nrole group.\nTo pause a policy, go to the\nPolicy\npage, select a policy, then select\nPause policy\nfrom the actions toolbar. On the\nPause policy\npane, confirm you'd like to pause the policy by selecting\nPause\n. In some cases, it can take up to 24 hours for a policy to be paused. Once the policy is paused, the system doesn't create alerts for messages matching the policy. However, messages associated with alerts created before pausing the policy remain available for investigation, review, and remediation.\nThe policy status can show several states:\nActive\n: The policy is active.\nPaused\n: The policy is fully paused.\nPausing\n: The policy is in the process of being paused.\nResuming\n: The policy is in the process of being resumed.\nError in resuming\n: An error was encountered when resuming the policy. For the error stack trace, hover your mouse over the\nError in resuming\nstatus in the Status column on the Policy page.\nError in pausing\n: An error was encountered when pausing the policy. For the error stack trace, hover your mouse over the\nError in pausing\nstatus in the Status column on the Policy page.\nTo resume a policy, go to the\nPolicy\npage, select a policy, then select\nResume policy\nfrom the actions toolbar. On the\nResume policy\npane, confirm you'd like to resume the policy by selecting\nResume\n. In some cases, it can take up to 24 hours for a policy to be resumed. 
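The pause and resume lifecycle can be pictured as a small state machine. The state names below come from the status list in this section; the transition table itself is an illustrative assumption for reasoning about the lifecycle, not a documented Purview behavior or API.

```python
# State names are from the documented policy status list; the allowed
# transitions are our own illustrative assumption, not a Purview API.
ALLOWED_TRANSITIONS = {
    "Active": {"Pausing"},
    "Pausing": {"Paused", "Error in pausing"},
    "Paused": {"Resuming"},
    "Resuming": {"Active", "Error in resuming"},
    "Error in pausing": {"Pausing"},    # retry the pause
    "Error in resuming": {"Resuming"},  # retry the resume
}

def can_transition(current: str, target: str) -> bool:
    """Return True if the policy status may move from current to target."""
    return target in ALLOWED_TRANSITIONS.get(current, set())
```

For example, a policy never jumps straight from Active to Paused; it passes through the Pausing state, which can take up to 24 hours or end in Error in pausing.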
Once the policy is resumed, the system creates alerts for messages matching the policy and makes them available for investigation, review, and remediation.\nCopy a policy\nIf your organization has existing Communication Compliance policies, you might find it helpful to create a new policy from an existing policy. When you copy a policy, you create an exact duplicate of an existing policy, including all included users, all assigned reviewers, and all policy conditions. Some scenarios where copying a policy is helpful include:\nPolicy storage limit reached\n: Active Communication Compliance policies have message storage limits. When the storage limit for a policy is reached, the policy is automatically deactivated. If your organization needs to continue detecting, capturing, and acting on inappropriate messages covered by the deactivated policy, you can quickly create a new policy with an identical configuration.\nDetect and review inappropriate messages for different groups of users\n: Some organizations prefer to create multiple policies with the same configuration but include different groups of users and different reviewers for each policy.\nSimilar policies with small changes\n: For policies with complex configurations or conditions, it might save time to create a new policy from a similar policy.\nTo copy a policy, you must be a member of the\nCommunication Compliance\nor\nCommunication Compliance Admins\nrole groups. After you create a new policy from an existing policy, it can take up to 24 hours to view messages that match the new policy configuration.\nTo copy a policy and create a new policy, complete the following steps:\nSelect the policy you want to copy.\nSelect\nCopy policy\non the command bar or select\nCopy policy\nfrom the action menu for the policy.\nIn the\nCopy policy\npane, accept the default name for the policy in the\nPolicy name\nfield or rename the policy. 
The policy name for the new policy can't be the same as an existing active or deactivated policy. Complete the\nDescription\nfield as needed.\nIf you don't need further customization of the policy, select\nCopy policy\nto complete the process. If you need to update the configuration of the new policy, select\nCustomize policy\n. This selection starts the policy workflow to help you update and customize the new policy.\nMark a policy as a favorite\nAfter you create a Communication Compliance policy, you can mark the policy as a favorite. Favorite policies can be filtered and sorted so that they appear at the top of the policies list.\nTo mark a policy as a favorite, you have the following options:\nMark as favorite\n: Enables you to mark selected policies as favorites, so you can easily find the policies that you're most interested in rather than having to search for them.\nSort favorites\n: Sorts the policies by favorites, so your favorite policies appear at the top of the list.\nCustomize columns\n: Choose to list the favorites that you want to see. You can also choose to sort favorite policies in ascending or descending order.\nTo group policies in the list:\nAll policies\n: This is the default view, displaying all the policies in the list.\nOnly favorites\n: Groups policies by favorites at the top of the list.\nPolicy activity detection\nCommunications are scanned every hour from the time you create a policy. For example, if you create an inappropriate content policy at 11:00 AM, the policy gathers Communication Compliance signals every hour starting from when you created the policy. Editing a policy doesn't change this time. To view the last scan date and Coordinated Universal Time (UTC) for a policy, go to the\nLast policy scan\ncolumn on the\nPolicy\npage. 
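The hourly scan cadence described above can be modeled directly: scans fall on an hourly grid anchored at the policy's creation time, and editing the policy doesn't reset that grid. A minimal sketch, assuming the grid starts one hour after creation; the function name is ours, not part of any Purview API.

```python
from datetime import datetime, timedelta

def next_scan_time(created_at: datetime, now: datetime) -> datetime:
    """Return the next hourly scan time for a policy created at created_at.

    Scans run every hour from policy creation; editing the policy
    doesn't change this schedule (per the documentation above).
    """
    if now <= created_at:
        return created_at + timedelta(hours=1)
    elapsed = now - created_at
    hours_done = int(elapsed.total_seconds() // 3600)
    return created_at + timedelta(hours=hours_done + 1)

# A policy created at 11:00 AM is next scanned at 12:00, then 1:00, and so on.
created = datetime(2026, 3, 14, 11, 0)
print(next_scan_time(created, datetime(2026, 3, 14, 11, 30)))  # 2026-03-14 12:00:00
```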
After creating a new policy, you might need to wait up to an hour to view the first policy scan date and time.\nThe following table outlines the time to detection for supported content types:\nContent type\nTime to detection\nEmail attachment\n24 hours\nEmail body content\n1 hour\nEmail metadata\n1 hour\nEmail OCR\n24 hours\nMicrosoft 365 Copilot and Microsoft 365 Copilot Chat body content (prompts and responses)\n1 hour\nTeams attachment\n24 hours\nTeams body content\n1 hour\nTeams metadata\n1 hour\nTeams modern attachment\n24 hours\nTeams OCR\n24 hours\nTeams shared channels\n24 hours\nTeams transcripts\n1 hour\nViva Engage attachment\nUp to 24 hours\nViva Engage body content\n1 hour\nFor existing policies created before July 31, 2022, it might take up to 24 hours to detect messages and review alerts that match these policies. To reduce the latency for these policies,\ncopy the existing policy\nand create a new policy from the copy. If you don't need to retain any data from the older policy, you can pause or delete it.\nTo identify an older policy, review the\nLast policy scan\ncolumn on the\nPolicy\npage. Older policies display a full date for the scan while policies created after July 31, 2022 display\n1 hour ago\nfor the scan. To reduce latency, you can also wait until February 28, 2023 for your existing policies to be automatically migrated to the new detection criteria.\nStorage limit notification\nEach Communication Compliance policy has a storage limit size of 100 GB or 1 million messages, whichever limit you reach first. As the policy approaches these limits, the system automatically sends notification emails to users assigned to the\nCommunication Compliance\nor\nCommunication Compliance Admins\nrole groups. The system sends notification messages when the storage size or message count reaches 80, 90, and 95 percent of the limit. 
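The limits above (100 GB or 1 million messages, whichever comes first, with notifications at 80, 90, and 95 percent) lend themselves to a simple utilization check. A minimal sketch: the constants mirror the documented numbers, but the helper functions are ours, not a Purview API.

```python
# Documented per-policy limits: 100 GB of storage or 1 million messages,
# whichever is reached first; notifications fire at 80/90/95 percent.
STORAGE_LIMIT_BYTES = 100 * 1024**3
MESSAGE_LIMIT = 1_000_000
NOTIFY_THRESHOLDS = (0.80, 0.90, 0.95)

def policy_utilization(used_bytes: int, message_count: int) -> float:
    """Fraction of the policy limit consumed, taking whichever limit is closer."""
    return max(used_bytes / STORAGE_LIMIT_BYTES, message_count / MESSAGE_LIMIT)

def crossed_thresholds(used_bytes: int, message_count: int) -> list[float]:
    """Notification thresholds (80/90/95%) the policy has reached or passed."""
    utilization = policy_utilization(used_bytes, message_count)
    return [t for t in NOTIFY_THRESHOLDS if utilization >= t]
```

For example, a policy holding 920,000 messages sits at 92 percent of the message limit and has crossed the 80 and 90 percent notification thresholds, regardless of its storage footprint.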
When the policy limit is reached, the system automatically deactivates the policy, and the policy stops processing messages for alerts.\nImportant\nIf the system deactivates a policy because it reached the storage or message limit, evaluate how to manage the deactivated policy. If you delete the policy, you permanently delete all messages, associated attachments, and message alerts. If you need to maintain these items for future use, don't delete the deactivated policy.\nTo manage policies that approach the storage and message limits, consider making a copy of the policy to maintain coverage continuity or take the following actions to help minimize current policy storage size and message counts:\nReduce the number of users assigned to the policy. Removing users from the policy or creating different policies for different groups of users can help slow the growth of policy size and total messages.\nExamine the policy for excessive false positive alerts. Add exceptions or make changes to the policy conditions to ignore common false positive alerts.\nIf a policy reaches the storage or message limit and is deactivated, make a copy of the policy to continue detecting and taking action for the same conditions and users.\nPolicy settings\nUsers\nYou can select\nAll users\n, define specific users in a Communication Compliance policy, or select an adaptive scope.\nAll users\n: When you select\nAll users\n, you apply the policy to all users and all groups that any user is included in as a member.\nSelect users\n: When you define specific users, you apply the policy to the defined users and any groups the defined users are included in as a member. If you choose the\nSelected users\noption, the\nShow insights and recommendations for users who match this policy's conditions but weren't included in the policy\ncheckbox automatically appears. 
Leave this checkbox selected if you want to\nreceive recommendations when users outside of the selected users match the policy conditions\n. This setting isn't available if you choose the\nAll users\noption or the\nSelect adaptive scopes\noption.\nSelect adaptive scope\n: An adaptive scope uses a query that you specify to define the membership of users or groups. If you decide to create an adaptive policy, you must\ncreate one or more adaptive scopes\nbefore you create your policy, and then select them when you choose this option. The scopes that you can select depend on the scope types that you add. For example, if you only added a scope type of\nUser\n, you can only select scopes of type\nUser\n, not\nGroups\n.\nLearn more about the advantages of using an adaptive scope\n.\nDirection\nBy default, the\nDirection is\ncondition appears and you can't remove it. Choose communication direction settings in a policy individually or together:\nInbound\n: Detects communications sent\nto\nscoped users from external and internal senders, including other scoped users in the policy.\nOutbound\n: Detects communications sent\nfrom\nscoped users to external and internal recipients, including other scoped users in the policy.\nInternal\n: Detects communications\nbetween\nthe scoped users or groups in the policy.\nSensitive information types\nYou can include sensitive information types as part of your Communication Compliance policy. Sensitive information types are either predefined or custom data types that help identify and protect credit card numbers, bank account numbers, passport numbers, and more. As part of\nLearn about Microsoft Purview Data Loss Prevention\n, the sensitive information configuration can use patterns, character proximity, confidence levels, and even custom data types to help identify and flag content that might be sensitive. 
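As a rough illustration of the pattern-plus-proximity idea behind sensitive information types, the sketch below flags a credit-card-like number only when a supporting keyword appears within a character window. This is a deliberate simplification of how Purview's sensitive information types actually work (they also use checksums and confidence levels); the pattern, keyword list, and window size are all illustrative assumptions.

```python
import re

# Illustrative only: a real sensitive information type also validates
# checksums (e.g., Luhn) and assigns confidence levels.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){15}\d\b")  # 16 digits, optional separators
KEYWORDS = ("card", "credit", "visa", "expiry")
PROXIMITY = 300  # character window around the match, mimicking character proximity

def flag_sensitive(text: str) -> bool:
    """Flag text when a card-like number has a supporting keyword nearby."""
    for match in CARD_PATTERN.finditer(text):
        start = max(0, match.start() - PROXIMITY)
        window = text[start:match.end() + PROXIMITY].lower()
        if any(keyword in window for keyword in KEYWORDS):
            return True
    return False
```

The proximity requirement is what keeps a bare 16-digit number (an order ID, say) from matching while "my credit card is 4111 1111 1111 1111" does.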
The default sensitive information types are:\nFinancial\nMedical and health\nPrivacy\nCustom information type\nImportant\nSensitive info types have two different ways of defining the max unique instance count parameters. To learn more, see\nCreate custom sensitive information types\n.\nThe Communication Compliance solution supports default sensitive information types and bundled named-entity sensitive information types, which are collections of sensitive information types. For more information about sensitive information details and the patterns included in the default types, see\nSensitive information type entity definitions\n. For information on supported bundled named-entity sensitive information types, see the following articles:\nAll credentials\nAll full names\nAll medical terms and conditions\nAll Physical Addresses\nCustom keyword dictionaries\nConfigure custom keyword dictionaries (or lexicons) to easily manage keywords specific to your organization or industry. Keyword dictionaries support up to 100 KB of terms (post-compression) in the dictionary and support any language. The tenant limit is also 100 KB after compression. If needed, you can apply multiple custom keyword dictionaries to a single policy or have a single keyword dictionary per policy. You can source these dictionaries from a file, such as a .CSV or .TXT list, or from\na list you can import\n. Use custom dictionaries when you need to support terms or languages specific to your organization and policies.\nMicrosoft provided trainable classifiers\nCommunication Compliance policies that use Microsoft provided trainable classifiers inspect and evaluate messages that meet a\nminimum word count requirement\n, depending on the language of the content. Custom trainable classifiers aren't supported. 
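The 100 KB post-compression limit for keyword dictionaries described above can be sanity-checked before import. A minimal sketch: the limit value comes from the documentation, but the helper is our own and the service's actual compression scheme may differ, so treat the result as an estimate.

```python
import zlib

DICTIONARY_LIMIT_BYTES = 100 * 1024  # documented 100 KB limit after compression

def compressed_size(terms: list[str]) -> int:
    """Estimate the post-compression size of a keyword list.

    Uses zlib as a stand-in; the service's compression may differ.
    """
    raw = "\n".join(terms).encode("utf-8")
    return len(zlib.compress(raw))

def fits_dictionary_limit(terms: list[str]) -> bool:
    return compressed_size(terms) <= DICTIONARY_LIMIT_BYTES
```

Because the limit applies after compression, a repetitive dictionary compresses far below its raw size, while a list of highly varied terms consumes the budget much faster.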
For a complete list of supported languages, word count requirements, and file types for Microsoft provided trainable classifiers, see\nTrainable classifier definitions\n.\nTo identify and take action on messages containing inappropriate language content that doesn't meet the word count requirement, create a\nsensitive information type\nor\ncustom keyword dictionary\nfor Communication Compliance policies detecting this type of content.\nMicrosoft provided trainable classifier\nDescription\nCorporate sabotage\nDetects messages that might mention acts to damage or destroy corporate assets or property. This classifier can help you manage regulatory compliance obligations such as NERC Critical Infrastructure Protection standards or state-by-state regulations like Chapter 9.05 RCW in Washington state.\nCustomer complaints\nDetects messages that might suggest customer complaints about your organization's products or services, as required by law for regulated industries. This classifier can help you manage regulatory compliance obligations such as FINRA Rule 4530, FINRA 4513, FINRA 2111, Consumer Financial Protection Bureau, Code of Federal Regulations Title 21: Food and Drugs, and the Federal Trade Commission Act.\nDiscrimination\nDetects potentially explicit discriminatory language and is particularly sensitive to discriminatory language against the African American/Black communities when compared to other communities.\nGifts & entertainment\nDetects messages that might suggest exchanging gifts or entertainment in return for service, which violates regulations related to bribery. 
This classifier can help you manage regulatory compliance obligations such as Foreign Corrupt Practices Act (FCPA), UK Bribery Act, and FINRA Rule 2320.\nHarassment\nDetects potentially offensive content in multiple languages that targets people regarding race, color, religion, or national origin.\nMoney laundering\nDetects signs that might suggest money laundering or engagement in acts to conceal or disguise the origin or destination of proceeds. This classifier can help you manage regulatory compliance obligations such as the Bank Secrecy Act, the USA Patriot Act, FINRA Rule 3310, and the Anti-Money Laundering Act of 2020.\nProfanity\nDetects potentially profane content in multiple languages that would likely offend most people.\nRegulatory collusion\nDetects messages that might violate regulatory anti-collusion requirements such as an attempted concealment of sensitive information. This classifier can help you manage regulatory compliance obligations such as the Sherman Antitrust Act, Securities Act of 1933, Securities Exchange Act of 1934, Investment Advisers Act of 1940, Federal Trade Commission Act, and the Robinson-Patman Act.\nStock manipulation\nDetects signs of possible stock manipulation, such as recommendations to buy, sell, or hold stocks that might suggest an attempt to manipulate the stock price. This classifier can help you manage regulatory compliance obligations such as the Securities Exchange Act of 1934, FINRA Rule 2372, and FINRA Rule 5270.\nThreat\nDetects potential threatening content in multiple languages aimed at committing violence or physical harm to a person or property.\nUnauthorized disclosure\nDetects sharing of information containing content that is explicitly designated as confidential or internal to unauthorized individuals. 
This classifier can help you manage regulatory compliance obligations such as FINRA Rule 2010 and SEC Rule 10b-5.\nImportant\nMicrosoft provided trainable classifiers might detect a large volume of bulk sender/newsletter content due to a known issue. You can mitigate the detection of large volumes of bulk sender/newsletter content by selecting the\nFilter email blasts\ncheckbox\nwhen you create the policy. You can also edit an existing policy to turn on this feature.\nContent safety classifiers based on large language models\nCommunication Compliance includes a set of Azure AI classifiers for Microsoft 365 Copilot, Teams, and Viva Engage communications. These classifiers run on large language models (LLMs) and are highly accurate. They evaluate messages containing three or more words. If the evaluated severity is 4 or higher, the solution displays the message as an alert and adds a\nSeverity\ncolumn to the\nAlerts\ndashboard to make it easier to prioritize alerts. Investigators can sort and filter on the\nSeverity\ncolumn.\nNote\nThe\nSeverity\ncolumn only appears if the policy match comes from one of the classifiers listed in the following table. 
For all other classifiers, the column is empty.\nThe following table describes the classifiers:\nClassifier\nDescription\nHate\nHate\nrefers to any content that attacks or uses pejorative or discriminatory language with reference to a person or identity group based on certain differentiating attributes of these groups, including but not limited to race, ethnicity, nationality, gender identity and expression, sexual orientation, religion, immigration status, ability status, personal appearance, and body size.\nSexual\nSexual\ndescribes language related to anatomical organs and genitals, romantic relationships, acts portrayed in erotic or affectionate terms, pregnancy, physical sexual acts, including those portrayed as an assault or a forced sexual violent act against one’s will, prostitution, pornography, and abuse.\nViolence\nViolence\ndescribes language related to physical actions intended to hurt, injure, damage, or kill someone or something; describes weapons, guns, and related entities, such as manufacturers, associations, legislation, and so on.\nSelf-harm\nSelf-harm\ndescribes language related to physical actions intended to purposely hurt, injure, damage one’s body, or kill oneself.\nLearn more about Azure AI Content Safety\nConsiderations when using classifiers\nOnly Microsoft 365 Copilot, Teams, and Viva Engage workloads are supported.\nThe solution evaluates only messages containing five or more words. It doesn't evaluate attachments or OCR images.\nThe solution doesn't evaluate Teams meeting transcripts.\nA feedback loop for submitting misclassified items to Microsoft isn't yet supported.\nThe solution supports up to 10,000 characters per message.\nLearn about supported languages\nDetect risky generative AI interactions\nCommunication Compliance detects harmful user-generated and AI-generated content in applications and services. 
This detection includes evaluation of user prompts submitted to generative AI services and the inclusion of known text content that might be sensitive to your organization.\nThe following table describes the classifiers.\nClassifier\nDescription\nPrompt Shield\nDetects adversarial user input attacks such as user prompt injection attacks (jailbreak). User prompt injection attacks deliberately exploit system vulnerabilities to elicit unauthorized behavior from the large language model.\nProtected material\nDetects known text content that might be protected under copyright or branding laws. Detecting the display of protected material in generative AI responses ensures compliance with intellectual property laws and maintains content originality.\nFor more information, see\nLearn more about Azure AI Content Safety\n.\nOptical character recognition (OCR)\nNote\nMicrosoft Purview includes\nOCR (preview) settings\nfor Microsoft Purview Insider Risk Management, Microsoft Purview Data Loss Prevention, and auto-labeling. Use the OCR (preview) settings to provide image-scanning capabilities for those solutions and technologies. Communication Compliance has its own built-in OCR scanning functionality as described in this section and doesn't support the OCR (preview) settings at this time.\nYou can configure built-in or custom Communication Compliance policies to scan and identify printed or handwritten text from images that might be inappropriate in your organization. Integrated\nAzure Cognitive Services and optical scanning support\nfor identifying text in images helps analysts and investigators detect and act on instances where inappropriate conduct might be missed in communications that are primarily nontextual.\nYou can enable optical character recognition (OCR) in new policies created from templates, in custom policies, or by updating existing policies to expand support for processing embedded images and attachments. 
When you enable OCR in a policy created from a policy template, the solution automatically scans embedded or attached images in email and Microsoft Teams chat messages. For images embedded in document files, OCR scanning isn't supported. For custom policies, configure one or more conditional settings associated with keywords, Microsoft provided trainable classifiers, or sensitive info types in the policy to enable the selection of OCR scanning.\nThe solution scans and processes images from 100 KB to 4 MB in the following image formats:\n.jpg/.jpeg (joint photographic experts group)\n.png (portable network graphics)\n.bmp (bitmap)\n.tiff (tag image file format)\nWhen you review pending policy matches for policies with OCR enabled, you see images identified and matched to policy conditions as child items for associated alerts. You can view the original image to evaluate the identified text in context with the original message. It might take up to 48 hours for detected images to be available with alerts.\nChoose conditions for your policies\nThe conditions you choose for a policy apply to communications from both email and third-party sources in your organization (Instant Bloomberg, for example).\nAt this time, the condition names that you choose depend on whether you're creating a new policy or editing an existing policy:\nFor\nnew policies\n, we recommend using the condition builder. The condition builder supports the\nOR\noperator, which enables you to combine multiple conditions in the same policy to create compound conditions with AND, OR, and NOT operators. This support helps you address complex scenarios for your unique compliance requirements. If you create a new policy with a template, Communication Compliance automatically uses the condition builder.\nLearn more about the condition builder\n.\nNote\nAll exceptions in the condition builder are replaced with a NOT condition. 
You must include the NOT condition within a nested group.\nFor\nexisting policies\n, use the condition names listed in the first column in the following table.\nTip\nTo save time, you can\ntest certain conditions before creating your policy\n.\nCondition names for existing and new policies\nThe following table lists condition names to use for existing policies vs. new policies (using the condition builder). It also lists how to use each condition.\nCondition name for existing policies\nCondition name for new policies\nHow to use this condition\nAttachment contains any of these words\nAttachment contains none of these words\nAttachment contains words or phrases\nTo apply the policy when certain words or phrases are included or excluded in a message attachment (such as a Word document).\nMake sure to use the following syntax when entering conditional text:\n- Remove all leading and trailing spaces.\n- Add quotation marks before and after each keyword or key phrase.\n- Separate each keyword or key phrase with a comma.\n- Don't include spaces between items separated by a comma.\nExample:\n\"banker\",\"insider trading\",\"confidential 123\"\nEach word or phrase you enter is applied separately (only one word must apply for the policy to apply to the attachment). For more information about entering words or phrases, see the next section\nMatching words and phrases to emails or attachments\n.\nAttachment is any of these file types\nAttachment is none of these file types\nAttachment file extension is\nTo bring communications into scope that include or exclude specific types of attachments, enter the file extensions (such as .exe or .pdf). If you want to include or exclude multiple file extensions, enter file types separated by a comma (example\n.exe,.pdf,.zip\n). Don't include spaces between items separated by a comma. 
Only one attachment extension must match for the policy to apply.\nAttachment is larger than\nAttachment is not larger than\nAttachment size equals or is larger than\nTo review messages based on the size of their attachments, specify the maximum or minimum size an attachment can be before the message and its attachments are subject to review. For example, for an existing policy, if you specify\nAttachment is larger than\n>\n2.0 MB\n, all messages with attachments 2.01 MB and over are subject to review. You can choose bytes, kilobytes, megabytes, or gigabytes for this condition.\nContent contains any of these sensitive info types\nContent contains sensitive info types\nTo apply the policy when any sensitive information types are included or excluded in a message. Each sensitive information type you choose is applied separately and only one of these sensitive information types must apply for the policy to apply to the message. For more information about custom sensitive information types, see\nLearn about sensitive information types\n.\nContent matches any of these classifiers\nContent matches trainable classifiers\nTo apply the policy when any trainable classifiers are included or excluded in a message. Some classifiers are predefined in your organization, and custom classifiers must be configured separately before they're available for this condition. For existing policies, only one trainable classifier can be defined as a condition in a policy. If you're using the condition builder for a new policy, you're not limited to a single trainable classifier. 
For more information about configuring trainable classifiers, see\nLearn about trainable classifiers\n.\nMessage contains any of these words\nMessage contains none of these words\nMessage contains words or phrases\nTo apply the policy when certain words or phrases are included or excluded in a message.\nMake sure to use the following syntax when entering conditional text:\n- Remove all leading and trailing spaces.\n- Add quotation marks before and after each keyword or key phrase.\n- Separate each keyword or key phrase with a comma.\n- Don't include spaces between items separated by a comma.\nExample:\n\"banker\",\"insider trading\",\"confidential 123\"\nEach word or phrase you enter is applied separately (only one word must apply for the policy to apply to the message). For more information about entering words and phrases, see the next section\nMatching words and phrases to emails or attachments\n.\nMessage is classified with any of these labels\nMessage is not classified with any of these labels\nMessage has retention labels applied\nTo apply the policy when certain retention labels are included or excluded in a message. Retention labels must be configured separately and configured labels are chosen as part of this condition. Each label you choose is applied separately (only one of these labels must apply for the policy to apply to the message). 
For more information about retention labels, see\nLearn about retention policies and retention labels\n.\nMessage is received from any of these domains\nMessage is not received from any of these domains\nSender domain is\nApply the policy to include or exclude specific domains in received messages.\nMake sure to use the following syntax when entering conditional text:\n- Enter each domain and separate multiple domains with a comma.\n- Don't include spaces between items separated by a comma.\n- Remove all leading and trailing spaces.\nEach domain entered is applied separately; only one domain must apply for the policy to apply to the message. If you want to use\nMessage is received from any of these domains\nto look for messages from specific domains, you need to combine this with another condition like\nMessage contains any of these words\n, or\nContent matches any of these classifiers\n, or you might get unexpected results.\nFor existing policies, if you want to scan all emails but want to exclude messages from a specific domain that don't need review (newsletters, announcements, and so on), you must configure a\nMessage is not received from any of these domains\ncondition that excludes the domain (example 'contoso.com,wingtiptoys.com').\nMessage is received from any of these external email addresses\nMessage is not received from any of these external email addresses\nSender is\nApply the policy to include or exclude messages received or not received from specific external email addresses (example someone@outlook.com).\nUse this condition to detect only messages that come from outside the organization (messages that cross the firewall).\nMake sure to use the following syntax when entering email addresses:\n- Enter each email address and separate multiple email addresses with a comma.\n- Don't include spaces between email addresses separated by a comma.\n- Remove all leading and trailing spaces.\n- Remove any single or double quotation marks.\nMessage is sent to any of 
these domains\nMessage is not sent to any of these domains\nRecipient domain is\nApply the policy to include or exclude specific domains in sent messages.\nMake sure to use the following syntax when entering conditional text:\n- Enter each domain and separate multiple domains with a comma.\n- Don't include spaces between items separated by a comma.\n- Remove all leading and trailing spaces.\nEach domain is applied separately; only one domain must apply for the policy to apply to the message.\nFor existing policies, if you want to exclude all emails sent to two specific domains, configure the\nMessage is not sent to any of these domains\ncondition with the two domains (example 'contoso.com,wingtiptoys.com').\nMessage is sent to any of these external email addresses\nMessage is not sent to any of these external email addresses\nRecipient is\nApply the policy to include or exclude messages sent or not sent to specific external email addresses (example someone@outlook.com).\nUse this condition to detect only messages that are sent outside the organization (messages that cross the firewall).\nMake sure to use the following syntax when entering email addresses:\n- Enter each email address and separate multiple email addresses with a comma.\n- Don't include spaces between email addresses separated by a comma.\n- Remove all leading and trailing spaces.\n- Remove any single or double quotation marks.\nMessage size is larger than\nMessage size is not larger than\nMessage size equals or is larger than\nTo review messages based on a certain size, use these conditions to specify the maximum or minimum size a message can be before it's subject to review. For example, for an existing policy, if you specify\nMessage size is larger than\n>\n1.0 MB\n, all messages that are 1.01 MB and larger are subject to review. You can choose bytes, kilobytes, megabytes, or gigabytes for this condition.\nImportant\nIf a condition includes a list, don't include spaces between list items. 
For example, enter \"bias,harassment\" instead of \"bias, harassment\".\nMatching words and phrases to emails or attachments\nEach word you enter and separate with a comma is applied separately (only one word must apply for the policy condition to apply to the email or attachment). For example, let's use the condition\nMessage contains any of these words\nwith the keywords \"banker\", \"confidential\", and \"insider trading\" separated by a comma (banker,confidential,\"insider trading\"). The policy applies to any message that includes the word \"banker\", \"confidential\", or the phrase \"insider trading\". Only one of these words or phrases must occur for this policy condition to apply. Words in the message or attachment must exactly match what you enter.\nImportant\nWhen importing a custom dictionary file, each word or phrase must be separated with a carriage return and on a separate line. For example:\nbanker\nconfidential\ninsider trading\nTo scan both email messages and attachments for the same keywords, create a\ncustom keyword dictionary\nfor the terms you want to scan. This policy configuration identifies defined keywords that appear in either the email message\nOR\nin the email attachment. Using the standard conditional policy settings (\nMessage contains any of these words\nand\nAttachment contains any of these words\n) to identify terms in messages and in attachments requires the terms to be present in\nBOTH\nthe message and the attachment.\nUse the condition builder to create complex conditions (new policies)\nIf you want to create nested conditions or use the\nOR\noperator in addition to the\nAND\noperator in new policies, use the condition builder. To use multiple operators (both\nAND\nand\nOR\n) in the same condition, you must create a separate group for each operator. Select\nAdd group\nto create a group.\nNote\nTo use the condition builder, you must first opt in. 
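The matching semantics described above (comma-separated quoted terms, any single term is enough to match, exact matches only) can be sketched in Python. This is an illustrative model only; `parse_keywords` and `message_matches` are hypothetical helpers, not part of Communication Compliance:

```python
import re

def parse_keywords(conditional_text: str) -> list[str]:
    """Split conditional text like '"banker","insider trading"' into terms.

    Per the syntax rules above, terms are comma-separated with no spaces
    between items; quotes around each term are stripped here.
    """
    return [item.strip().strip('"') for item in conditional_text.split(",") if item]

def message_matches(message: str, conditional_text: str) -> bool:
    """OR semantics: the condition applies when ANY term occurs as an exact
    (whole-word, case-insensitive) match in the message."""
    text = message.lower()
    return any(
        re.search(rf"\b{re.escape(kw.lower())}\b", text)
        for kw in parse_keywords(conditional_text)
    )

terms = '"banker","insider trading","confidential 123"'
print(message_matches("We discussed insider trading today", terms))  # True
print(message_matches("Routine status update", terms))               # False
```

Note that "confidential" alone would not match the phrase "confidential 123" in this model, mirroring the exact-match rule stated above.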
To do this, in the\nConditions\nsection, above the existing condition builder, turn the\nCondition builder\noption on. If you create a new policy by using a template, you don't have to opt in to the condition builder, however. Communication Compliance automatically uses the condition builder for new policies based on templates.\nAll exceptions in the condition builder are replaced with a NOT condition. For example, the existing condition builder includes the\nMessage is not received from any of these domains\ncondition. This condition is renamed to\nSender domain is\nin the condition builder, so to build the\nMessage is not received from any of these domains\ncondition in the condition builder, use the\nSender domain is\ncondition together with the\nNOT\ncondition. You must include the\nNOT\ncondition within a nested group.\nNote\nYou can use conditions with trainable classifiers and sensitive info types as an exception by using the\nNOT\ncondition in a nested group, but keyword highlighting isn't available in this case.\nTo make it easier to see a summary of a complex condition that includes multiple\nAND\nand\nOR\noperators or groups, the condition builder includes a simplified summary view of the condition. To see the summary for a condition you built, turn on the\nQuick Summary\noption.\nFor scenarios that show how to use conditions in policies, see\nScenarios for using conditions in Communication Compliance policies\n.\nReview percentage\nTo reduce the amount of content to review, specify a percentage of all communications governed by a Communication Compliance policy. The system selects a real-time, random sample of content from the total percentage of content that matches chosen policy conditions. If you want reviewers to review all items, configure\n100%\nin a Communication Compliance policy.\nFilter email blasts\nUse the\nFilter email blasts\nsetting to exclude messages sent from email blast services. 
Messages that match the conditions you specify don't generate alerts. This exclusion includes bulk email (such as newsletters) as well as spam, phishing, and malware. When you select this option, you can view a\nreport\nthat lists the bulk email senders that the system filtered out. Reports are retained for 60 days.\nNote\nThe list of senders is filtered before the content is analyzed, so there might be senders that don't match the content conditions (there might be extra senders in the report).\nThis setting is on by default for new policies. If the\nFilter email blasts setting\nis off for existing policies, a recommendation is generated to turn on the setting.\nAlert policies\nAfter you configure a policy, the system automatically creates a corresponding alert policy and generates alerts for messages that match conditions defined in the policy. It can take up to 24 hours after creating a policy to start receiving alerts from activity indicators. By default, all policy match alert triggers are assigned a severity level of medium in the associated alert policy. The system generates alerts for a Communication Compliance policy once the aggregation trigger threshold level is met in the associated alert policy. A single email notification is sent once every 24 hours for any alerts, regardless of the number of individual messages that match policy conditions. For example, Contoso has an inappropriate content policy enabled and for January 1, there were 100 policy matches that generated six alerts. 
A single email notification for the six alerts is sent at end of January 1.\nFor Communication Compliance policies, the system configures the following alert policy values by default:\nAlert policy trigger\nDefault value\nAggregation\nSimple aggregation\nThreshold\nDefault: 4 activities\nMinimum: 3 activities\nMaximum: 2,147,483,647 activities\nWindow\nDefault: 60 minutes\nMinimum: 60 minutes\nMaximum: 10,000 minutes\nNote\nThe alert policy threshold trigger settings for activities support a minimum value of 3 or higher for Communication Compliance policies.\nYou can change the default settings for triggers on number of activities, period for the activities, and for specific users in alert policies on the\nAlert policies\npage in Microsoft Purview.\nChange the severity level for an alert policy\nTo change the severity level for an alert policy, see\nAlert policies in the Microsoft Defender portal\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Create Policies", @@ -710,7 +710,7 @@ "https://learn.microsoft.com/en-us/purview/communication-compliance-investigate-remediate": { "content_hash": "sha256:4f584e4984659d06c9195cc9b62393f3be208bee1c05fececc3d2fbef5e48901", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nInvestigate and remediate Communication Compliance alerts\nFeedback\nSummarize this article for me\nImportant\nMicrosoft Purview Communication Compliance\nprovides the tools to help organizations detect regulatory compliance (for example, SEC or FINRA) and business conduct violations such as sensitive or confidential information, harassing or threatening language, and sharing of adult content. Communication Compliance is built with privacy by design. Usernames are pseudonymized by default, role-based access controls are built in, investigators are opted in by an admin, and audit logs are in place to help ensure user-level privacy.\nAfter you configure\nCommunication Compliance policies\n, you receive alerts for message issues that match your policy conditions. To view and act on alerts, assign users the following permissions:\nThe\nCommunication Compliance Analysts\nor the\nCommunication Compliance Investigators\nrole group\nReviewer in the policy that is associated with the alert\nAfter you establish required permissions, use the following working instructions to investigate and remediate issues.\nTip\nGet started with Microsoft Security Copilot to explore new ways to work smarter and faster using the power of AI. Learn more about\nMicrosoft Security Copilot in Microsoft Purview\n.\nInvestigate policy matches and alerts\nImportant\nOn June 1, 2025, the new\nPolicy Match Preservation\nsetting goes into effect. On that date, existing policy matches might be permanently deleted based on the time period you select in the\nPolicy Match Preservation\nsetting. New policy matches are preserved for that time period. You have until\nMay 19\nto change the time period if you don't want to use the default setting of\n1 year\n.\nLearn more about the Policy Match Preservation setting\n.\nTo investigate issues detected by your policies, review policy matches and alerts. 
The Communication Compliance area provides several features to help you quickly investigate policy matches and alerts:\nPolicies page\nWhen you sign in to the\nMicrosoft Purview portal\nwith an admin account in your Microsoft 365 organization, select the\nCommunication Compliance\nsolution, then select the\nPolicies\npage. This page shows Communication Compliance policies configured for your Microsoft 365 organization and links to recommended policy templates.\nNote\nIf your role group is\nscoped by one or more admin units\n, you see a message at the top of the page indicating that you can only view and manage the policies that you're scoped for. To learn more about what you can access, select\nView role groups\nin the banner.\nEach policy listed includes the following columns:\nMessages scanned today\n: The total number of messages that the policy scanned on the current day for users and locations that are in scope of the policy. If there's a policy match, the item is added to the\nNew pending today\ncolumn. The number refreshes automatically once per hour; it doesn't update if you refresh the page. At the end of the UTC day, the count resets to zero.\nTip\nIf the value in the column is 0 or a low number, it might be because the policy is too strict. For example, the policy might be focused on just one user or on just one location that the in-scope user doesn't use. You might see a wide variety of aggregate numbers in this column if your policies have different focuses. For example, one policy might be focused on an entire department of users sending messages from multiple locations while another policy might be focused on just one user or just one location.\nNew pending today\n: Shows the number of policy matches for the current day. This value updates whenever you open the page. You can also select the\nRefresh\nbutton to get the latest count. At the end of the UTC day, the count resets to zero.\nTotal pending\n: The count of policy matches that need review. 
This value updates whenever you open the page. You can select the\nRefresh\nbutton to get the latest count. After refreshing, this number matches the number on the\nPending\ntab.\nTotal resolved\n: The total number of resolved policy matches. This value updates whenever you open the page. You can select the\nRefresh\nbutton to get the latest count. After refreshing, this number matches the number on the\nResolved\ntab.\nStatus\n: The status of the policy (Active or Deactivated).\nLast modified\n: The date and Coordinated Universal Time (UTC) of the last policy modification.\nLast policy scan\n: The UTC date for the last policy scan.\nTo start remediation actions, select the policy associated with the alert to launch the\nPolicy details\npage. From the\nPolicy details\npage, you can review a summary of the activities, review, and act on policy matches on the\nPending\ntab, summarize a lengthy message by using Microsoft Copilot in Microsoft Purview, or review the history of closed policy matches on the\nResolved\ntab.\nLearn more about remediation actions\n.\nAlerts page\nGo to\nCommunication Compliance\n>\nAlerts\nto display the last 30 days of alerts grouped by policy matches. This view lets you quickly see which Communication Compliance policies generate the most alerts, ordered by severity. An alert isn't the same thing as a policy match. An alert generally consists of multiple policy matches, not just one policy match. After the required number of policy matches is met for a particular alert, the alert is created and email is sent to the alert recipient.\nReports page\n: Go to\nCommunication Compliance\n>\nReports\nto display Communication Compliance report widgets. 
Each widget provides an overview of Communication Compliance activities and statuses, including access to deeper insights about policy matches and remediation actions.\nTip\nLearn how to analyze interactions entered into generative AI applications\n.\nTips for quickly reviewing policy matches on the Pending or Resolved tab\nWhen you select a message to review on the\nPending\ntab or the\nResolved\ntab, the condition that caused the policy match appears in an alert message bar (yellow banner) at the top of the\nSource\ntab. This alert message bar is a quick way to determine the condition or conditions that caused the policy match. If there are multiple conditions, select\nView all\nin the banner to see all the conditions that caused the policy match. At this time, only Microsoft provided trainable classifiers and sensitive information types are highlighted as conditions in the yellow banner.\nSometimes it's useful to quickly review policy settings without opening a policy. For example, if you're testing multiple policies with different conditions, you might want to save time by reviewing conditions for each policy to determine risk before opening the policy. You can do this by selecting\nPolicy settings\n, which opens a panel where you can view the policy settings. If you're a member of the Communication Compliance or Communication Compliance Admins role group, you can view and change settings from the panel. If you're a member of the Communication Compliance Investigators or Communication Compliance Analysts role group, you can view settings but you can't change them.\nPolicy Match Preservation setting\nWhen a Communication Compliance policy finds a message that matches the policy, the solution stores a\ncopy\nof the message (not the original message). Starting June 1, 2025, you can use the\nPolicy Match Preservation\nsetting to specify how long to save policy matches in Communication Compliance. You can choose\n1 month\n,\n6 months\n,\n1 year\n, or\n7 years\n. 
The default value is\n1 year\n.\nNote\nYou can also specify a preservation period when you create or edit a policy. The preservation period you select within a policy takes precedence over the global\nPolicy Match Preservation\nsetting.\nWhen the setting goes into effect on June 1, 2025:\nThe value in the\nPolicy Match Preservation\nsetting applies to all existing policy matches. The system automatically deletes policy matches that are older than the selected time period. For example, if the\nPolicy Match Preservation\nsetting is\n1 year\n(the default), the system permanently deletes any policy matches older than 1 year as of September 1. Only the policy matches in Communication Compliance are deleted because they're copies of the original messages. This process doesn't delete original copies from user shards. You can use\nMicrosoft Purview Data Lifecycle Management\nto set up a retention policy for original messages.\nNew policy matches as of September 1 receive a timestamp with the preservation period specified in the setting.\nIf you change the time period in the setting after September 1, the preservation period applies going forward. For example, if you select\n1 year\nas the time period on September 1, then change the setting to\n7 years\non September 3, policy matches from September 1 and September 2 are preserved for 1 year, and policy matches starting on September 3 are preserved for 7 years.\nWhat you need to do\nThe\nPolicy Match Preservation\nsetting becomes available for your organization on April 1, 2025. Between April 1 and May 30, select the time period for preserving policy matches for your organization. 
The selection isn't applied until June 1, 2025.\nChange the time period for the global\nPolicy Match Preservation\nsetting\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nSelect the\nSettings\nbutton in the upper-right corner of the page, then select\nCommunication Compliance\nto go to the Communication Compliance global settings.\nSelect the\nPolicy Match Preservation\nsetting, then select a new time period.\nSelect\nConfirm\nin the confirmation dialog box.\nChange the policy match preservation time period for a custom policy\nYou can also select a time period for preserving policy matches in the policy workflow for a custom policy. When you set a time period in a custom policy, it takes precedence over the selection in the global\nPolicy Match Preservation\nsetting. This option is useful for policies that are the exception to the norm.\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nCommunication Compliance\nsolution, then select\nPolicies\nin the left navigation.\nSelect the\nMore actions\n(ellipsis) button in the row for the policy you want to change, then select\nEdit\n.\nOn the\nName and describe your policy\npage in the policy workflow, under\nPreserve policy matches\n, make a selection. If you leave the\nGlobal Setting\nselection, the policy uses the time period selected in the global\nPolicy Match Preservation\nsetting. If you choose any other time period, it takes precedence over the selection in the global setting.\nUsing filters\nThe next step is to sort the messages so it's easier for you to investigate. From the\nPolicy details\npage, Communication Compliance supports multilevel filtering for several message fields to help you quickly investigate and review messages with policy matches. You can filter pending and resolved items for each configured policy. 
You can configure filter queries for a policy or configure and save custom and default filter queries for use in each specific policy. After configuring fields for a filter, you see the filter fields displayed on the top of the message queue that you can configure for specific filter values.\nKey filters (the\nBody/Subject\n,\nDate\n,\nSender\n, and\nTags\nfilters) always display on the\nPending\nand\nResolved\ntabs to make it easy to access those filters.\nFor the\nDate\nfilter, the date and time for events are listed in Coordinated Universal Time (UTC). When filtering messages for views, the requesting user's local date/time determines the results based on the conversion of the user's local date/time to UTC. For example, if a user in U.S. Pacific Daylight Time (PDT) filters a report from [DATE] to [DATE] at 00:00, the report includes messages from [DATE] 07:00 UTC to [DATE] 07:00 UTC. If the same user was in U.S. Eastern Daylight Time (EDT) when filtering at 00:00, the report includes messages from [DATE] 04:00 UTC to [DATE] 04:00 UTC.\nFilter details\nCommunication Compliance filters help you filter and sort messages for quicker investigation and remediation actions. You can filter on the\nPending\nand\nResolved\ntabs for each policy. To save a filter or filter set as a saved filter query, you must configure one or more values as filter selections.\nThe following table outlines filter details:\nFilter\nDetails\nBody/Subject\nThe message body or subject. Use this filter to search for keywords or a keyword phrase in the body or subject of the message. The subject appears in the\nSubject\ncolumn for email messages. For Teams messages, nothing appears in the\nSubject\ncolumn.\nClassifiers\nThe name of built-in and custom classifiers that apply to the message. Some examples include\nTargeted Harassment\n,\nProfanity\n,\nThreat\n, and more.\nDate\nThe date the message was sent or received by a user in your organization. 
To filter for a single day, select a date range that starts with the day you want results for and ends with the following day. For example, to filter results for [DATE], choose a filter date range of [DATE]-[DATE].\nEscalated To\nThe user name of the person included as part of a message escalation action.\nFile class\nThe class of the message based on the message type, either\nmessage\nor\nattachment\n.\nHas attachment\nThe attachment presence in the message.\nItem class\nThe source of the message based on the message type, email, Microsoft Teams chat, Bloomberg, and more. For more information, see\nItem Types and Message Classes\n.\nLanguage\nThe detected language of text in the message. The message is classified according to the language of most of the message text. For example, for a message containing both German and Italian text, but most text is German, the message is classified as German (DE). For a list of supported languages, see\nLearn about trainable classifiers\n.\nYou can also filter by more than one language. For example, to filter messages classified as German and Italian, enter 'DE,IT' (the two-digit language codes) in the Language filter search box. 
To view the detected language classification for a message, select a message, select View message details, and scroll to the\nEmailDetectedLanguage\nfield.\nRecipient domains\nThe domain to which the message was sent; this domain is typically your Microsoft 365 subscription domain by default.\nRecipient\nThe user to whom the message was sent.\nNote\n: The\nRecipient\nfield includes recipients in the\nTo\nand\nCC\nfields.\nBCC\nfields aren't supported.\nSender domain\nThe domain that sent the message.\nSender\nThe person who sent the message.\nSize\nThe size of the message in KB.\nTags\nThe tags assigned to a message, either\nQuestionable\n,\nCompliant\n, or\nNoncompliant\n.\nConfigure a filter\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nCommunication Compliance\nsolution.\nSelect\nPolicies\nin the left navigation, then select a policy to see policy matches (if any) for that policy.\nOn the\nPolicy\npage, select either the\nPending\nor\nResolved\ntab to display the items for filtering.\nSelect\nFilters\n.\nSelect one or more filter checkboxes, then select\nApply\n.\nTo save the selected filters as a filter query, select\nSave the query\nafter you configure at least one filter value. Enter a name for the filter query, then select\nSave\n. This filter is available to use for only this policy and is listed in the\nSaved filter queries\nsection of the\nFilters\npage.\nReview and remediate policy matches and alerts\nNo matter where you start to review policy matches or alerts or the filtering you configure, the next step is to take remediation action. Start your remediation process with the following workflow on the\nPolicy\nor\nAlerts\npages.\nNote\nIf you see a policy prefaced with \"DSPM for AI\" (or \"AI hub\" from the preview name), the policy was created in Data Security Posture Management for AI, not in Communication Compliance. 
For more information about this solution, see\nLearn about Data Security Posture Management (DSPM) for AI\n.\nExamine the message basics\nSometimes it's obvious from the source or subject that you can immediately remediate a message. The message might be spurious or incorrectly matched to a policy and you should resolve it as misclassified. To mark a message as misclassified, select\nResolve\nand\nItem was misclassified\nto leave a note in the item's history that it's misclassified and remove it from the\nPending\nqueue. To share misclassified content with Microsoft, select\nSend message content, attachments, and subject\nto remove it from the\nPending\nqueue and report the items to Microsoft.\nNote\nMisclassified options in the\nResolve\npane don't appear if the condition that caused the policy match isn't based on a trainable classifier.\nReporting misclassified items to Microsoft isn't supported for Content Safety classifiers yet.\nFrom the source or sender information, you might already know how the message should be routed or handled in these circumstances. Consider using\nTag as\nor\nEscalate\nto assign a tag to applicable messages or to send messages to a designated reviewer.\nExamine the message details\nAfter reviewing the message basics, open the message to examine the details and determine further remediation actions.\nTip\nFor lengthy messages, you can\nsave time by summarizing the message (including attachments, transcripts, and recordings) with Copilot in Microsoft Purview\n.\nSelect a message to view the complete message header and body information. Several different options and views are available to help you decide the proper course of action:\nSentiment\n: Messages include a sentiment evaluation (powered by\nAzure Cognitive Service for Language\n) to help investigators quickly prioritize potentially riskier messages to address first. The\nSentiment\ncolumn displays the message sentiment and is enabled in the default view. 
Messages are flagged with one of the following values:\nValue\nDescription\nPositive\nMessages with\nPositive\nsentiment indicate a lower priority for triaging.\nNegative\nMessages with\nNegative\nsentiment indicate messages to prioritize.\nNeutral\nMessages with\nNeutral\nsentiment are determined to be neither positive nor negative.\nNot available\nNot available\nappears in the column for file formats that aren't supported. For example, sentiment analysis isn't available for any image that's attached to a Teams or email message, text extracted from an OCR image, or text or recordings from a Teams transcript.\nNot available\nalso appears if the message includes more than 5,120 words.\nScanning\nScanning\nappears in the column when Communication Compliance is trying to determine the appropriate sentiment value for the message.\nAttachments\n: This option allows you to examine Modern attachments that match policy conditions. Modern attachments content is extracted as text and is viewable on the\nPending\ntab. For more information, see the\nCommunication Compliance feature reference\n.\nSource\n: This view is the standard message view commonly seen in most web-based messaging platforms. The header information is formatted in the normal style and the message body supports embedded graphic files and word-wrapped text. If\noptical character recognition (OCR)\nis enabled for the policy, images containing printed or handwritten text that match policy conditions are viewed as a child item for the associated message in this view.\nPlain text\n: Text view that displays a line-numbered text-only view of the message and includes keyword highlighting in messages and attachments for sensitive info type terms, terms identified by built-in classifiers assigned to a policy, or for terms included in a dedicated keyword dictionary assigned to a policy. 
Keyword highlighting, which is currently available for English language only, can help direct you to the area of interest in long messages and attachments. In some cases, highlighted text might be only in attachments for messages matching policy conditions. Embedded files aren't displayed and the line numbering in this view is helpful for referencing pertinent details among multiple reviewers.\nConversation\n: This view, which is available for Teams chat messages, displays up to five messages before and after a message to help reviewers view the activity in the conversational context. Select\nLoad more\nto load up to 20 messages before and after a message. To download messages, select\nDownload conversation\n. This action downloads an image file of everything you see in the user interface and also a .csv file of all the message metadata (UserId, UserName, and so on).\nThe Conversation view context helps reviewers quickly evaluate messages and make more informed message resolution decisions. Real-time message additions to conversations are displayed, including all inline images, emojis, and stickers available in Teams. Image or text file attachments to messages aren't displayed. Notifications are automatically displayed for messages that have been edited or for messages deleted from the Conversation window. When a message is resolved, the associated conversational messages aren't retained with the resolved message.\nPattern detected\nnotification: Many harassing and bullying actions over time involve recurring instances of the same behavior by a user. The\nPattern detected\nnotification is displayed in the message details and raises attention to the message. Detection of patterns is on a per-policy basis and evaluates behavior over the last 30 days when at least two messages are sent to the same recipient by a sender. 
Investigators and reviewers can use this notification to identify repeated behavior to evaluate the message as appropriate.\nTranslation\n: This view automatically converts message text to the language configured in the\nDisplayed language\nsetting in the Microsoft 365 subscription for each reviewer. This conversion includes the text for the policy match and everything included in the conversation view (up to five messages before and five messages after the policy match). The\nTranslation\nview helps broaden investigative support for organizations with multilingual users and eliminates the need for additional translation services outside of the Communication Compliance review process. Using Microsoft translation services, Communication Compliance automatically detects if the text is in a different language than the user's current system setting and displays alert message text accordingly. For a complete list of supported languages, see\nMicrosoft Translator Languages\n. Languages listed in the\nTranslator Language List\nare supported in the\nTranslation\nview.\nUser activity\n: This view provides risk profile, policy matches, and user activities captured by\nInsider Risk Management\nand Communication Compliance. This integration helps Communication Compliance investigators quickly see risk severity and the associated activities for the user while investigating and triaging pending policy matches.\nThe\nUser activity\nview displays the\nInsider risk severity\nlevel for the user and has two sections, one for Communication Compliance policy matches and activities and one for Insider Risk Management risk activities. View the risk severity of the user from Insider Risk Management in the\nSource\ntab if data sharing is enabled. 
For more information, see\nShare Insider Risk Management data with other solutions\n.\nCommunication Compliance policy matches\n: This section displays the total number of communication\nPolicy matches\nand the total number of\nRemediation actions\nfor the user included in this policy match.\nSelect\nView details\nin this section to display a timeline of Communication Compliance activities for the user. This timeline includes:\nTotals for\nPolicy matches\nand\nRemediation actions\nfor the user\nDetails for each activity associated with the user for the past 30 days.\nInsider risk activity\n: This section displays the total number of\nRisk activities\nand the total number of\nUnusual activities\nfor activities associated with this user for Insider Risk Management policies.\nUnusual activities\nare activities for the user that are considered potentially risky, as they're unusual and a departure from their typical activities.\nSelect\nView details\nin this section to display a timeline of insider risk activity for the user. This timeline includes:\nThe insider risk\nseverity level\n.\nTotals for\nAll activities\nand\nUnusual activities\n.\nDetails for each activity associated with the user from the\nActivity explorer\nin Insider Risk Management.\nTo view the insider risk activities in insider risk management, select\nOpen in insider risk management\n.\nTimeline\n: This section displays a history of all communications for user activities. This timeline allows investigators to review all communications for a user, helping to identify any inappropriate handling of confidential information. Only previous policy matches for the user associated with the alert are displayed.\nFor each Copilot interaction, the details for the user prompt and the Copilot response are available for review. 
The date and time for each interaction are provided in UTC.\nSummarize a message by using Copilot in Microsoft Purview\nYou can use Copilot in Microsoft Purview to get a contextual summary of a Teams, email, or Viva Engage message included in a policy match. The summary is in the context of one or more\nMicrosoft provided trainable classifiers\nthat flag the message (for example, stock manipulation). This feature saves time for investigators if the message content is lengthy.\nCopilot in Microsoft Purview summarizes the entire message, including any video recordings, meetings transcripts, or attachments (docx, pdf, or txt files).\nLimitations\nThe message content must be at least 100 words and less than 15,000 words. If the message content is fewer than 100 words or more than 15,000 words, Copilot in Microsoft Purview doesn't summarize it.\nSummarize items in various languages supported by Security Copilot. For more information, see\nSupported languages in Microsoft Security Copilot\n.\nAt this time, only individual Teams messages can be summarized. If you summarize an individual Teams message, Copilot doesn't pull in the surrounding conversation.\nPrerequisites for using Copilot in Microsoft Purview\nTo summarize a message in Communication Compliance by using Copilot in Microsoft Purview, you must have specific licenses and onboard your organization to Copilot in Microsoft Purview.\nLearn more about Copilot in Microsoft Purview licensing requirements and onboarding\n. You must also be a member of the\nCommunication Compliance\n,\nCommunication Compliance Analysts\n, or\nCommunication Compliance Investigators\nrole.\nSummarize a message\nIn Communication Compliance, go to the\nPolicies\npage, and then open any policy to view policy matches for that policy.\nSelect a message in the list. You can select a parent item or a child item (but it's easier to select the parent item).\nSelect\nSummarize\n. This option appears below the message details. 
It opens the\nCopilot\npanel on the right side of the screen and displays a summary of the message.\nImportant\nYou might see a generic error message that says \"An error occurred\" after selecting\nSummarize\n. This error message can appear for any of the following reasons:\nYou don't have the\nrequired license\nfor Copilot in Microsoft Purview.\nYou don't have the required Communication Compliance role and/or the Security Copilot contributor role. To use Copilot in Microsoft Purview, you must have one of the following Communication Compliance roles:\nCommunication Compliance\n,\nCommunication Compliance Analysts\n,\nCommunication Compliance Investigators\n. You must also have the\nSecurity Copilot contributor role\n, which should be turned on by default for all users in a Microsoft Entra organization.\nThere's an internal error. Microsoft provides more information for these errors.\nSelect one of the suggested prompts at the bottom of the panel to use Copilot to create the summary. For example, if the classifier that flagged a message is Stock manipulation, select\nSummarize this message and supported attachments in the context of Stock manipulation category detected\n. You can also enter an open-ended question in the \"Ask a question\" box.\nScreenshot callouts: (a) the\nSummarize\noption; (b) the summary generated by\nSummarize\n; (c) suggested Copilot prompts; (d) the \"Ask a question\" box for open-ended Copilot questions.\nNote\nOnly message content is summarized. If you ask a question relevant to the message content, Copilot provides a result. 
If the question isn't relevant to message content, Copilot instructs you to ask a different question.\nIf you want to provide feedback on the summarized content, select the drop-down arrow in the lower-right corner below the summary, and then enter your feedback.\nAfter reviewing the summary, you can remediate the policy match like any other policy match (see next section).\nNote\nCopilot in Microsoft Purview responses are also recorded in\nMicrosoft Security Copilot\n.\nUnderstanding hidden matches\nWhen a Communication Compliance policy triggers, it highlights the exact matches in the\nPlain text\ntab. However, policies can also flag messages where the keywords are hidden and embedded in nonvisible metadata, such as HTML tags or encoded strings. By default, the plain text view doesn't show this content.\nThe following scenarios show where hidden content matches might occur:\nHTML tags\n: Hyperlinks embedded in HTML tags, for example:\nClick here.\nAlternative text for images\n: Descriptive text for images, for example:\n\"Sunset\nEncoded strings\n: Encoded strings that represent different content when decoded, for example:\nBase64: U3Vuc2V0IG92ZXIgdGhlIGxha2U=\ndecodes to\nSunset over the lake\nFilename of attachment\n: The file name of the attachment might lead to a policy match but isn't clearly highlighted in the text view.\nOCR:\nText extracted from images using Optical Character Recognition (OCR).\nIf a message doesn't appear to match the conditions you set for the policy, complete the following steps:\nSelect and download the message with the policy match.\nExtract the contents of the zipped folder.\nLocate the .eml file.\nOpen the .eml file with any text editor.\nUse the\nFind\nfunction to search for the individual keywords in the policy conditions to identify where the match occurs.\nReview Microsoft Teams meetings transcripts\nIf you deploy Microsoft Teams in your organization, you can review Teams meetings transcripts for actionable alerts. 
Teams transcripts are automatically included if you choose Teams as a Microsoft 365 location when you create a custom policy or when you create a policy based on a template.\nScheduled meetings\n: Communication Compliance ignores communication direction for Teams transcripts. If an individual is an invitee or is present in a scheduled (nonrecurring) meeting, the policy includes all of the meeting content, regardless of who says what in the meeting. This approach also helps in situations where a user is attending the meeting through a hub device in a conference room, since it's not always possible to tell whether an offending communication comes from an in-scope user. Since all meeting content is included, an investigator can review the recording of the meeting to determine if the offending communication is by an in-scope user.\nRecurring meetings\n: For recurring meetings, the policy evaluates only the following users:\nUsers who were invited to the meeting\nUsers identified by the transcript as having spoken during the meeting\nUnscheduled meetings\n: For unscheduled meetings (Meet now meetings), the policy evaluates only users who are identified by the transcript as having spoken during the meeting.\nRequirements\nTo review Teams meeting transcripts, you must turn on meeting transcripts for your organization, since meeting transcripts aren't turned on by default.\nLearn more about turning on meeting transcripts for your organization\n.\nLimitations\nYou can set one or more of the following policy conditions:\nContent matches any of these classifiers.\nContent contains any of these sensitive info types.\nMessage contains any of these words.\nMessage contains none of these words.\nAny other policy conditions are ignored.\nLearn more about conditional settings\n.\nNote\nIf you don't set any conditions, the policy captures transcripts for all meetings.\nOnly sensitive info types, keyword lists, and regulatory Microsoft provided trainable classifiers are detected. 
Regulatory trainable classifiers include:\nCorporate sabotage\nGifts and entertainment\nMoney laundering\nRegulatory collusion\nStock manipulation\nUnauthorized disclosure\nNote\nAll other classifiers, including business conduct classifiers, aren't detected.\nAd-hoc (unscheduled) meetings aren't captured.\nExternal meetings aren't captured if the meeting organizer is outside your organization.\nMeeting recordings started by an uninvited user aren't captured.\nResolve an alert for a communication in a meeting transcript\nAfter you create a policy, an alert triggers when the policy detects a transcript that contains offending content. The alert brings the offending content and background context to the attention of an investigator.\nTo resolve an alert related to a meeting transcript:\nSelect the\nSource\ntab and then review the transcript for the offending content. The\nSource\ntab shows the entire transcript. When you select the\nSource\ntab, the transcript automatically scrolls to the line that contains the policy match. The offensive keyword or phrase is highlighted.\nUse the\nPlain text\ntab to do a line-by-line review of the text, including start and stop times in relation to the overall meeting time. Text is captured 30 seconds before and after the offending communication.\nSelect the\nTranslation\ntab to review translations in up to eight languages for the\nPlain text\ntab. Messages in other languages are automatically converted to the display language of the reviewer.\nSelect the\nUser history\ntab to see a historical view of all user message remediation activities, such as past notifications and escalations for policy matches.\nUse\nResolve\n,\nNotify\n,\nTag as\n,\nEscalate\n, and\nEscalate for investigation\noptions to resolve the alert. 
To learn more about these options, see\nDecide on a remediation action\n.\nDecide on a remediation action\nAfter reviewing message details, choose from several remediation actions:\nResolve\n: Selecting\nResolve\nimmediately removes the message from the\nPending\nqueue and prevents any further action on the message. When you select\nResolve\n, you close the message without further classification. You can also mark the message as misclassified if the alerting process or any Microsoft provided trainable classifiers incorrectly generated it. The\nResolved\ntab displays all resolved messages.\nTo save investigators time when duplicate policy matches appear across multiple policies, the solution turns on the\nCross-policy resolution\nsetting (preview) by default. When you resolve a policy match, Communication Compliance automatically resolves all instances of the same policy match in any policy where it's detected, regardless of the reviewer's scope. The item history for each policy notes cross-policy resolution actions, and the number of\nResolved\nitems increments in the\nItems and actions per policy\nand\nItems and actions per location\nreports for all related policies.\nWhen the\nCross-policy resolution\nsetting is turned on, this message appears in the\nResolve\npane.\nIf you don't want to automatically resolve all instances of the same policy match when an investigator resolves a policy match, turn off the\nCross-policy resolution\nsetting. Anyone who can resolve messages (anyone with the\nCommunication Compliance\n,\nCommunication Compliance Investigators\n, or\nCommunication Compliance Analyst\nrole) knows if the setting is turned off because the message in the\nResolve\npane disappears. 
To turn off the setting, in the\nMicrosoft Purview portal\n, select\nSettings\nin the upper-right corner of the page, select\nCommunication Compliance\n, then select the\nCross-policy resolution\nsetting.\nPower Automate\n: Use a Power Automate flow to automate process tasks for a message. By default, Communication Compliance includes the\nNotify manager when a user has a Communication Compliance alert\nflow template that reviewers can use to automate the notification process for users with message alerts. For more information about creating and managing Power Automate flows in Communication Compliance, see\nConsider Power Automate flows\n.\nTag as\n: Tag the message as\nCompliant\n,\nNoncompliant\n, or\nQuestionable\nas it relates to the policies and standards for your organization. You can also create a\ncustom tag\n. Adding tags and tagging comments helps you micro-filter messages for escalations or as part of other internal review processes. After tagging is complete, you can also choose to resolve the message to move it out of the pending queue. You can filter on any tag value.\nNotify\n: Use\nNotify\nto assign a custom notice template to the message and send a warning notice to the user. Choose the appropriate notice template configured in the\nCommunication Compliance settings\narea and select\nSend\nto email a reminder to the user that sent the message and to resolve the issue.\nEscalate\n: Use\nEscalate\nto choose other people in your organization who should review the message. Choose from a list of reviewers configured in the Communication Compliance policy to send an email notification requesting additional review of the message. The selected reviewer can use a link in the email notification to go directly to items escalated to them for review.\nEscalate for investigation\n: Use\nEscalate for investigation\nto create a new\neDiscovery (Premium) case\nfor single or multiple messages. Provide a name and notes for the new case. 
The custodian is automatically filled in for you. You don't need any additional permissions to manage the case. Creating a case doesn't resolve or create a new tag for the message. You can select a total of 100 messages when creating an eDiscovery (Premium) case during the remediation process. Messages in all communication channels included in Communication Compliance are supported. For example, you could select 50 Microsoft Teams chats, 25 Exchange Online email messages, and 25 Viva Engage messages when you open a new eDiscovery (Premium) case for a user.\nRemove message in Teams\n: Use\nRemove message in Teams\nto block potentially inappropriate messages and content identified in messages from Microsoft Teams channels and 1:1 and group chats. This action includes Teams chat messages reported by users and chat messages detected using machine-learning and classifier-based Communication Compliance policies. Removed messages and content are replaced with a policy tip that explains that it's blocked and the policy that applies to its removal from view. Recipients are provided a link in the policy tip to learn more about the applicable policy and the review process. The sender receives a policy tip for the blocked message and content but can review the details of the blocked message and content for context regarding the removal.\nCreate a custom tag\nYou can tag a policy match as\nCompliant\n,\nNon-compliant\n, or\nQuestionable\nas it relates to the policies and standards for your organization. If you need more flexibility than the standard tags provide, create a custom tag. For example, you might want to create an\nEscalated\ntag so investigators can let other team members know that a policy match is already escalated. 
Since you can apply multiple tags to any policy match, using the same example, investigators could apply that tag and also apply the\nNon-compliant\ntag for a policy match.\nKeep the following points in mind for custom tags:\nYou can apply tags to policy matches that appear on the\nPending\ntab.\nYou can apply multiple tags to a policy match.\nYou can create up to 10 custom tags per policy.\nYou create custom tags at the policy level so if you want to use the same custom tag for multiple policies, you must recreate the tag for each policy.\nWhen you apply a custom tag to a policy match:\nThe tag appears in the\nTags\ncolumn in the list of policy matches. If you apply more than one tag, each tag appears in the\nTags\ncolumn.\nThe tag appears in the\nTags\nfilter. You can then select the checkbox for that tag to filter all policy matches by that tag.\nA\nCustom tag\ncolumn appears in the following reports:\nItems and actions per policy\n,\nItems and actions per location\n, and\nActivity by user\n.\nIf you delete a custom tag for a policy:\nIt's removed from any previous items that you tagged with that custom tag.\nYou can no longer filter by that tag.\nTip\nAny action you take with custom tags is tracked in the\nHistory of remediation actions\npane. Select\nView item history\nto display this pane.\nCreate and apply a custom tag for a policy match\nOn\nPolicies\n, select a policy to view the policy matches for that policy.\nSelect the checkbox for the policy match that you want to create a custom tag for. Select the checkboxes for multiple policy matches if you want to apply a custom tag in bulk.\nSelect\nTag as\non the command bar.\nIn the\nTag item\npane, select\nAdd a new tag\n.\nEnter the name of the new tag, then press Enter to add the new tag.\nNote\nA custom tag must be between 3 and 64 characters and can't include special characters. 
Spaces and hyphens are allowed.\nSelect the checkbox for the new tag to apply it to the policy match.\nIn the\nComment\nbox, add a comment to describe the purpose of the tag (optional).\nSelect\nSave\n.\nEdit or delete a custom tag\nOn\nPolicies\n, select a policy to view the policy matches for that policy.\nSelect the checkbox for the policy match that has the custom tag you want to edit.\nSelect\nTag as\non the command bar.\nIn the\nTag item\npane, select the ellipsis next to the custom tag.\nSelect\nEdit\n, make your changes, or select\nDelete\nto delete the tag.\nImportant\nIf you delete a custom tag, it's removed from any previous items that you tagged with that custom tag. You can no longer filter by that tag.\nSelect\nSave\n.\nUnresolve messages\nWhen you resolve messages, the messages are removed from the\nPending\ntab and displayed in the\nResolved\ntab. Investigation and remediation actions aren't available for messages in the\nResolved\ntab. However, you might need to take extra action on a message that you mistakenly resolved or that needs further investigation after initial resolution. Use the\nUnresolve\ncommand to move one or more messages from the\nResolved\ntab back to the\nPending\ntab.\nUnresolve a message\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nCommunication Compliance\nsolution.\nSelect\nPolicies\nin the left navigation, then select a policy that contains the resolved message to view the policy matches.\nSelect the\nResolved\ntab.\nOn the\nResolved\ntab, select one or more messages.\nOn the command bar, select\nUnresolve\n.\nOn the\nUnresolve item\npane, add any comments, then select\nSave\n.\nSelect the\nPending\ntab to verify that the selected items are displayed.\nArchive message details outside of Communication Compliance (optional)\nExport or download message details if you need to archive the messages in a separate storage solution. 
If you select one or more policy matches and then select\nDownload\nin the bar over the list, you automatically add the selected messages to a .zip file that you can save in storage outside of Microsoft 365. The size limit for files downloaded by using\nDownload\nis 3 MB.\nExport a collection of message details that exceeds 3 MB in size\nTo download messages that cumulatively exceed 3 MB in size, use\nExport files\n(displayed in the upper-right corner of the\nPolicies\npage). For example, you might want to download policy matches each week for archiving purposes. The download size limit for the\nExport files\ncommand is 3 GB. The maximum number of items depends on whether you select documents (limit of 1,000 items) or users (limit of 50,000). Files are exported to a .zip file that you can then download. The .zip file contains all the message files as well as a summary .TXT file that includes data for the following fields: Document, Doc ID, Custodian, Subject/title, File name, File type, Error, and Message.\nOn the\nExports\ntab, the export has a status of\nIn progress\nor\nReady to download\n. To download a file that has a\nReady to download\nstatus, select the files batch in the\nName\nlist, then select\nDownload export(s)\nover the list.\nYou can either select the policy matches that you want to export, select\nExport files\n, and then select the list of documents or users you want to export, or you can select\nExport files\nand then select the documents, users, and date ranges that you want to export. After you select\nExport\n, it takes a few minutes for the job to complete. You receive an email notification with a link to the\nExports\ntab when the export is ready to download. To download a file that has a\nReady to download\nstatus, select the files batch in the\nName\nlist, then select\nDownload export(s)\nover the list.\nNote\nTo include attachments in exported files, you must select them manually in the list of policy matches. 
They're not automatically included with their parent message file.\nTip\nYou can also\ncreate and download a Message details report\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Investigate Alerts", @@ -719,45 +719,45 @@ "https://learn.microsoft.com/en-us/purview/insider-risk-management": { "content_hash": "sha256:177fe4d9d8598f39fc1d3a48766b2fc0a7bd6f0da7dcf569617f382007c58246", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nLearn about Insider Risk Management\nFeedback\nSummarize this article for me\nImportant\nMicrosoft Purview Insider Risk Management\ncorrelates various signals to identify potential malicious or inadvertent insider risks, such as IP theft, data leakage, and security violations. Insider Risk Management enables customers to create policies to manage security and compliance. Built with privacy by design, users are pseudonymized by default, and role-based access controls and audit logs are in place to help ensure user-level privacy.\nMicrosoft Purview Insider Risk Management is a compliance solution that helps minimize internal risks by enabling you to detect, investigate, and act on malicious and inadvertent activities in your organization. 
With insider risk policies, you can define the types of risks to identify and detect in your organization. You can also set up processes for acting on cases and escalating cases to Microsoft eDiscovery (Premium) if needed. Risk analysts in your organization can quickly take appropriate actions to make sure users are compliant with your organization's compliance standards.\nFor more information and an overview of the planning process to address potentially risky activities in your organization that might lead to a security incident, see\nStarting an Insider Risk Management program\n.\nWatch the following videos to learn how Insider Risk Management can help your organization prevent, detect, and contain risks while prioritizing your organization values, culture, and user experience:\nInsider Risk Management solution & development\n:\nInsider Risk Management workflow\n:\nCheck out the\nMicrosoft Mechanics video\non how Insider Risk Management and Communication Compliance work together to help minimize data risks from users in your organization.\nImportant\nInsider Risk Management is currently available in tenants hosted in geographical regions and countries supported by Azure service dependencies. To verify that Insider Risk Management is supported for your organization, see\nAzure dependency availability by country/region\n.\nModern risk pain points\nManaging and minimizing risk in your organization starts with understanding the types of risks found in the modern workplace. Some risks come from external events and factors that you can't directly control. Other risks come from internal events and user actions that you can minimize and avoid. Some examples are risks from illegal, inappropriate, unauthorized, or unethical behavior and actions by users in your organization. 
These behaviors include a broad range of internal risks from users:\nLeaks of sensitive data and data spillage\nConfidentiality violations\nIntellectual property (IP) theft\nFraud\nInsider trading\nRegulatory compliance violations\nUsers in the modern workplace can create, manage, and share data across a broad spectrum of platforms and services. In most cases, organizations have limited resources and tools to identify and mitigate organization-wide risks while also meeting user privacy standards.\nInsider Risk Management uses the full breadth of service and third-party indicators to help you quickly identify, triage, and act on risk activity. By using logs from Microsoft 365 and Microsoft Graph, Insider Risk Management enables you to define specific policies to identify risk indicators. These policies help you identify risky activities and act to mitigate these risks.\nInsider Risk Management centers around the following principles:\nTransparency\n: Balance user privacy versus organization risk with privacy-by-design architecture.\nConfigurable\n: Configurable policies based on industry, geographical, and business groups.\nIntegrated\n: Integrated workflow across Microsoft Purview solutions.\nActionable\n: Provides insights to enable reviewer notifications, data investigations, and user investigations.\nIdentifying potential risks with analytics\nInsider risk analytics enables you to evaluate potential insider risks in your organization without configuring any insider risk policies. This evaluation can help your organization identify potential areas of higher user risk and help determine the type and scope of Insider Risk Management policies you might consider configuring. 
This evaluation can also help you determine needs for additional licensing or future optimization of existing insider risk policies.\nFor more information about insider risk analytics, see\nInsider Risk Management settings: Analytics\n.\nGet started with recommended actions\nWhether you're setting up Insider Risk Management for the first time or getting started with creating new policies, the new\nrecommended actions\nexperience can help you get the most out of Insider Risk Management capabilities. Recommended actions include setting up permissions, choosing policy indicators, creating a policy, and more.\nWorkflow\nThe Insider Risk Management workflow helps you identify, investigate, and take action to address internal risks in your organization. With focused policy templates, comprehensive activity signaling across the Microsoft 365 service, and alert and case management tools, you can use actionable insights to quickly identify and act on risky behavior.\nIdentifying and resolving internal risk activities and compliance issues with Insider Risk Management uses the following workflow:\nPolicies\nYou create\nInsider Risk Management policies\nby using predefined templates and policy conditions that define which triggering events and risk indicators to examine in your organization. 
These conditions include how risk indicators are used for alerts, which users are included in the policy, which services are prioritized, and the detection time period.\nTo quickly get started with Insider Risk Management, select from the following policy templates:\nData theft by departing users\nData leaks\nData leaks by priority users\nData leaks by risky users\nPatient data misuse (preview)\nRisky Agents\nRisky AI usage\nRisky browser usage (preview)\nSecurity policy violations\nSecurity policy violations by departing users\nSecurity policy violations by risky users\nSecurity policy violations by priority users\nAlerts\nRisk indicators automatically generate alerts when they match policy conditions. You can see these alerts in the\nAlerts dashboard\nor the\nTriage Agent dashboard\n. These dashboards provide a quick view of all alerts that need review, open alerts over time, and alert statistics for your organization. All policy alerts display the following information to help you quickly identify the status of existing alerts and new alerts that need action:\nID\nUsers\nAlert\nStatus\nAlert severity\nTime detected\nCase\nCase status\nRisk factors\nTriage\nNew user activities that need investigation automatically generate alerts and assign them a\nNeeds review\nstatus. Reviewers can quickly identify, review, evaluate, and triage these alerts.\nResolve alerts by opening a new case, assigning the alert to an existing case, or dismissing the alert. With alert filters, you can quickly identify alerts by status, severity, or time detected. As part of the triage process, reviewers can view alert details for the activities identified by the policy, view user activity associated with the policy match, see the severity of the alert, and review user profile information.\nInvestigate\nQuickly investigate all risky activities for a selected user with\nUser activity reports (preview)\n. 
These reports let investigators in your organization examine activities for specific users during a defined time period without temporarily or explicitly assigning them to an Insider Risk Management policy. After examining activities for a user, investigators can dismiss individual activities as benign, share or email a link to the report with other investigators, or choose to assign the user temporarily or explicitly to an Insider Risk Management policy.\nCases\nare created for alerts that require deeper review and investigation of the activity details and circumstances around the policy match. The\nCase dashboard\nprovides an all-up view of all active cases, open cases over time, and case statistics for your organization. Reviewers can quickly filter cases by status, the date the case was opened, and the date the case was last updated.\nSelecting a case on the case dashboard opens the case for investigation and review. This step is the heart of the Insider Risk Management workflow. This area synthesizes risk activities, policy conditions, alert details, and user details into an integrated view for reviewers. The primary investigation tools in this area are:\nUser activity\n: An interactive chart automatically displays user risk activity. It plots activities over time and by risk level for current or past risk activities. Reviewers can quickly filter and view the entire risk history for the user and drill into specific activities for more details.\nContent explorer\n: The Content explorer automatically captures and displays all data files and email messages associated with alert activities. Reviewers can filter and view files and messages by data source, file type, tags, conversation, and many more attributes.\nCase notes\n: Reviewers can provide notes for a case in the Case Notes section. 
This list consolidates all notes in a central view and includes reviewer and date submitted information.\nAdditionally, the new\nAudit log (preview)\nenables you to stay informed of the actions taken on Insider Risk Management features. This resource allows an independent review of the actions taken by users assigned to one or more Insider Risk Management role groups.\nAction\nAfter investigating cases, reviewers can quickly act to resolve the case or collaborate with other risk stakeholders in your organization. If users accidentally or inadvertently violate policy conditions, reviewers can send a simple reminder notice to the user from notice templates you can customize for your organization. These notices might serve as simple reminders or might direct the user to refresher training or guidance to help prevent future risky behavior. For more information, see\nInsider Risk Management notice templates\n.\nIn more serious situations, you might need to share the Insider Risk Management case information with other reviewers or services in your organization. Insider Risk Management tightly integrates with other Microsoft Purview solutions to help you with end-to-end risk resolution.\neDiscovery (Premium)\n: Escalating a case for investigation allows you to transfer data and management of the case to Microsoft Purview eDiscovery (Premium). eDiscovery (Premium) provides an end-to-end workflow to preserve, collect, review, analyze, and export content that's responsive to your organization's internal and external investigations. It allows legal teams to manage the entire legal hold notification workflow. To learn more about eDiscovery (Premium) cases, see\nOverview of Microsoft Purview eDiscovery (Premium)\n.\nOffice 365 Management APIs integration (preview)\n: Insider Risk Management supports exporting alert information to security information and event management (SIEM) services via the Office 365 Management APIs. 
Having access to alert information in the platform that best fits your organization's risk processes gives you more flexibility in how to act on risk activities. To learn more about exporting alert information with Office 365 Management APIs, see\nExport alerts\n.\nScenarios\nInsider Risk Management can help you detect, investigate, and take action to mitigate internal risks in your organization in several common scenarios:\nData theft by departing users\nWhen users leave an organization, either voluntarily or as the result of termination, legitimate concerns often arise that company, customer, and user data are at risk. Users might innocently assume that project data isn't proprietary, or they might be tempted to take company data for personal gain and in violation of company policy and legal standards. Insider Risk Management policies that use the\nData theft by departing users\npolicy template automatically detect activities typically associated with this type of theft. With this policy, you automatically receive alerts for suspicious activities associated with data theft by departing users so you can take appropriate investigative actions. You need to configure a\nMicrosoft 365 HR connector\nfor your organization for this policy template.\nIntentional or unintentional leak of sensitive or confidential information\nIn most cases, users try their best to properly handle sensitive or confidential information. But occasionally users make mistakes and information is accidentally shared outside your organization or in violation of your information protection policies. In other circumstances, users might intentionally leak or share sensitive and confidential information with malicious intent and for potential personal gain. 
Insider Risk Management policies created using the following policy templates automatically detect activities typically associated with sharing sensitive or confidential information:\nData leaks\nData leaks by priority users\nData leaks by risky users\nRisky AI usage\nIntentional or unintentional security policy violations (preview)\nUsers typically have a large degree of control when managing their devices in the modern workplace. This control might include permissions to install or uninstall applications needed in the performance of their duties or the ability to temporarily disable device security features. Whether this risk activity is inadvertent, accidental, or malicious, this conduct can pose risk to your organization and is important to identify and act to minimize. To help identify these risky security activities, the following Insider Risk Management security policy violation templates score security risk indicators and use Microsoft Defender for Endpoint alerts to provide insights for security-related activities:\nSecurity policy violations\nSecurity policy violations by departing users\nSecurity policy violations by priority users\nSecurity policy violations by risky users\nPolicies for users based on position, access level, or risk history\nUsers in your organization might have different levels of risk depending on their position, level of access to sensitive information, or risk history. This structure might include members of your organization's executive leadership team, IT administrators that have extensive data and network access privileges, or users with a past history of risky activities. In these circumstances, closer inspection and more aggressive risk scoring are important to help surface alerts for investigation and quick action. 
To help identify risky activities for these types of users, you can create priority user groups and create policies from the following policy templates:\nSecurity policy violations by priority users\nData leaks by priority users\nHealthcare (preview)\nFor organizations in the healthcare industry, recent studies found a very high rate of insider-related data breaches. Detecting misuse of patient data and health record information is a critical component of safeguarding patient privacy and complying with compliance regulations such as the Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health (HITECH) Act. Patient data misuse can range from accessing privileged patient records to accessing records of patients from family or neighbors with malicious intent. To help identify these types of risky activities, the following Insider Risk Management policy template uses the Microsoft 365 HR connector and a healthcare-specific data connector to start scoring risk indicators relating to behaviors that can occur within your electronic health record (EHR) systems:\nPatient data misuse (preview)\nActions and behaviors by risky users\nEmployment stressor events can impact user behavior in several ways that relate to insider risks. These stressors might be a poor performance review, a position demotion, or the user being placed on a performance review plan. Stressors might also result in potentially inappropriate behavior such as users sending potentially threatening, harassing, or discriminatory language in email and other messages. Though most users don't respond maliciously to these events, the stress of these actions can result in some users behaving in ways that they might not normally consider during normal circumstances. 
To help identify these types of potentially risky activities, the following Insider Risk Management policy templates can use the HR connector and/or integration with a\ndedicated Communication Compliance policy\nto bring users into scope for Insider Risk Management policies and start scoring risk indicators relating to behaviors that might occur:\nData leaks by risky users\nRisky AI usage\nRisky browser usage (preview)\nSecurity policy violations by risky users\nVisual context for potentially risky user activities with forensic evidence\nHaving visual context is crucial for security teams during forensic investigations to get better insights into potentially risky user activities that might lead to a security incident. This insight might include visual capturing of these activities to help evaluate if they're indeed risky or taken out of context and not potentially risky. For activities that are determined to be risky, having forensic evidence captures can help investigators and your organization better mitigate, understand, and respond to these activities. 
To help with this scenario,\nenable forensic evidence capturing\nfor online and offline devices in your organization.\nReady to get started?\nSee\nPlan for Insider Risk Management\nto prepare for enabling Insider Risk Management policies in your organization.\nSee\nGet started with Insider Risk Management settings\nto configure global settings for insider risk policies.\nSee\nGet started with Insider Risk Management\nto configure prerequisites, create policies, and start receiving alerts.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Insider Risk Management", "section": "Microsoft Purview" }, "https://learn.microsoft.com/en-us/purview/insider-risk-management-policies": { - "content_hash": "sha256:ba949ac2a4d17069a0a0c5ff6aa27a13187431937184700758c081e58b9fd97a", - "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nCreate and manage Insider Risk Management policies\nFeedback\nSummarize this article for me\nImportant\nMicrosoft Purview Insider Risk Management\ncorrelates various signals to identify potential malicious or inadvertent insider risks, such as IP theft, data leakage, and security violations. Insider Risk Management enables customers to create policies to manage security and compliance. 
Built with privacy by design, users are pseudonymized by default, and role-based access controls and audit logs are in place to help ensure user-level privacy.\nInsider Risk Management policies determine which users are in scope and which types of risk indicators are configured for alerts. You can quickly create a security policy that applies to all users in your organization or define individual users or groups for management in a policy. Policies support content priorities to focus policy conditions on multiple or specific Microsoft Teams, SharePoint sites, data sensitivity types, and data labels. By using templates, you can select specific risk indicators and customize event thresholds for policy indicators, effectively customizing risk scores, and level and frequency of alerts.\nYou can also configure quick data leak and data theft policies by departing user policies that automatically define policy conditions based on results from the latest analytics. Also, risk score boosters and anomaly detections help identify potentially risky user activity that is of higher importance or unusual. Policy windows allow you to define the time frame to apply the policy to alert activities and are used to determine the duration of the policy once activated.\nFor an overview of how policies created with built-in policy templates can help you quickly act on potential risks, see the\nInsider Risk Management Policies Configuration video\n.\nTip\nGet started with Microsoft Security Copilot to explore new ways to work smarter and faster using the power of AI. Learn more about\nMicrosoft Security Copilot in Microsoft Purview\n.\nPolicy dashboard\nThe\nPolicy dashboard\nlets you quickly see the user and agent policies in your organization, the health of each policy, manually add users to security policies, and view the status of alerts associated with each policy. Your policies are separated into two categories:\nUser policy\nand\nAgent policy\n. 
Switch between the two to locate relevant policy details.\nAgent policy\ncovers agents from Microsoft Copilot Studio and Microsoft Foundry.\nPolicy name\n: Name assigned to the policy in the policy workflow.\nStatus\n: Health status for each policy. Displays number of policy warnings and recommendations, or a status of\nHealthy\nfor policies without issues. You can select the policy to see the health status details for any warnings or recommendations.\nActive alerts\n: Number of active alerts for each policy.\nConfirmed alerts\n: Total number of alerts that resulted in cases from the policy in the last 365 days.\nActions taken on alerts\n: Total number of alerts that were confirmed or dismissed for the last 365 days.\nPolicy alert effectiveness\n: Percentage determined by total confirmed alerts divided by total actions taken on alerts (which is the sum of alerts that were confirmed or dismissed over the past year).\nPolicy recommendations from analytics\nInsider risk analytics gives you an aggregate view of anonymized user activities related to security and compliance. With this view, you can evaluate potential insider risks in your organization without configuring any insider risk policies. This evaluation helps your organization identify potential areas of higher risk and helps determine the type and scope of Insider Risk Management policies you might consider configuring. If you decide to act on analytics scan results for\ndata leaks\nor\ndata theft\nby departing users policies, you can even configure a quick policy based on these results.\nFor more information about insider risk analytics and policy recommendations, see\nInsider Risk Management settings: Analytics\n.\nNote\nYou must be an unrestricted administrator to access analytics insights.\nLearn how administrative groups affect permissions\n.\nQuick policies\nFor many organizations, getting started with an initial policy can be a challenge. 
If you're new to Insider Risk Management or are using the recommended actions to get started, use a quick policy to create and configure a new policy. Quick policy settings automatically populate from recommended best practices or from results of the latest analytics scan in your organization. For example, if the analytics check detected potential data leak activities in your organization, the\nData leaks\nquick policy automatically includes the indicators used to detect those activities.\nYou can choose from the following quick policies:\nCritical assets protection\n: Detects activities involving your organization's most valuable assets. Loss of these assets could result in legal liability, financial loss, or reputational damage.​\nData leaks\n: Detects potential data leaks from all users in your organization, which can range from accidental oversharing of sensitive info to data theft with malicious intent. ​\nData theft from Microsoft 365 apps by users leaving your organization\n: Detects potential data theft from Microsoft 365 cloud apps by users leaving your organization or whose account was deleted from Microsoft Entra ID.\nData theft from non-Microsoft 365 apps by users leaving your organization\n: (preview) Detects potential data theft from non-Microsoft 365 cloud apps, including Microsoft Fabric, by users leaving your organization or whose account was deleted from Microsoft Entra ID.\nEmail exfiltration\n: Detects when users email sensitive assets outside your organization. For example, users emailing sensitive assets to their personal email address.​\nTo get started, go to\nInsider Risk Management\n>\nPolicies\nand select\nCreate policy\n>\nQuick policy\n. If you're reviewing analytics reports, select\nView details\n>\nGet started\nto get started with a quick policy for the applicable area.\nWhen you start the quick policy workflow, review the policy settings and configure the policy with a single selection. 
If you need to customize a quick policy, change the conditions after the policy is created. Stay up to date with the detection results for a quick policy by configuring email notifications each time you have a policy warning or each time the policy generates a high severity alert.\nNote\nYou must be an unrestricted administrator to create quick policies.\nLearn how administrative groups affect permissions\n.\nPrioritize content in policies\nInsider Risk Management policies support specifying a higher priority for content depending on where you store it, the type of content, or how you classify it. You can also choose whether to assign risk scores to all activities detected by a policy or only activities that include priority content. Specifying content as a priority increases the risk score for any associated activity, which in turn increases the chance of generating a high severity alert. However, some activities don't generate an alert at all unless the related content contains built-in or custom sensitive info types or you specify it as a priority in the policy.\nFor example, your organization has a dedicated SharePoint site for a highly confidential project. Data leaks for information in this SharePoint site could compromise the project and would have a significant impact on its success. By prioritizing this SharePoint site in a Data leaks policy, you automatically increase risk scores for qualifying activities. This prioritization increases the likelihood that these activities generate an insider risk alert and raises the severity level for the alert.\nAdditionally, you can choose to focus this policy for SharePoint site activity that only includes priority content for this project. Risk scores are assigned and alerts are generated only when specified activities include priority content. 
Activities without priority content aren't scored, but you can still review them if an alert is generated.\nImportant\nCumulative exfiltration activity always receives a risk score, regardless of whether or not it contains prioritized content.\nThe following policy templates support priority content selection:\nData leaks\nData leaks by priority users\nData leaks by risky users\nData theft by departing users\nRisky AI usage\nNote\nIf you configure a policy to generate alerts only for activity that includes priority content, no changes are applied to risk score boosters.\nWhen you create an Insider Risk Management policy in the policy workflow, you can choose from the following priorities:\nSharePoint sites\n: Assign a higher risk score to any activity associated with all file types in defined SharePoint sites. Users configuring the policy and selecting priority SharePoint sites can select SharePoint sites that they have permission to access. If SharePoint sites aren't available for selection in the policy by the current user, another user with the required permissions can select the sites for the policy later, or the current user should be given access to the required sites.\nSensitive information types\n: Assign a higher risk score to any activity associated with content that contains\nsensitive information types\n.\nSensitivity labels\n: Assign a higher risk score to any activity associated with content that has specific\nsensitivity labels\napplied.\nFile extensions\n: Assign a higher risk score to any activity associated with content that has specific file extensions. Users configuring a data theft/leak policy that selects\nFile extensions to prioritize\nin the policy workflow can define up to 50 file extensions to prioritize in the policy. Entered extensions can include or omit a '.' 
as the first character of the prioritized extension.\nTrainable classifiers\n: Assign a higher risk score to any activity associated with content that is included in a\ntrainable classifier\n. Users configuring a policy that selects Trainable classifiers in the policy workflow can select up to 5 trainable classifiers to apply to the policy. These classifiers can be existing classifiers that identify patterns of sensitive information like social security, credit card, or bank account numbers or custom classifiers created in your organization.\nSequence detection\nRisk management activities might not occur as isolated events. These risks are frequently part of a larger sequence of events. A sequence is a group of two or more potentially risky activities performed one after the other that might suggest an elevated risk. Identifying these related user activities is an important part of evaluating overall risk. When you select sequence detection for data theft or data leaks policies, you see insights from sequence information activities on the\nUser activity\ntab within an Insider Risk Management case. The following policy templates support sequence detection:\nData leaks\nData leaks by priority users\nData leaks by risky users\nData theft by departing users\nRisky AI usage\nThese Insider Risk Management policies can use specific indicators and the order that they occur to detect each step in a sequence of risk. Selected sequence detections only detect sequences of risk in the order displayed in policy setup workflow. For policies created from the\nData leaks\nand\nData leaks by priority user\ntemplates, you can also select which sequences trigger the policy. File names are used when mapping activities across a sequence. These risks are organized into four main categories of activity:\nCollection\n: Detects download activities by in-scope policy users. 
Example risk management activities include downloading files from SharePoint sites, third-party cloud services, unallowed domains, or moving files into a compressed folder.\nExfiltration\n: Detects sharing or extraction activities to internal and external sources by in-scope policy users. An example risk management activity includes sending emails with attachments from your organization to external recipients.\nObfuscation\n: Detects the masking of potentially risky activities by in-scope policy users. An example risk management activity includes renaming files on a device.\nClean-up\n: Detects deletion activities by in-scope policy users. An example risk management activity includes deleting files from a device.\nNote\nSequence detection uses indicators that you enable in the global settings for Insider Risk Management. If you don't select appropriate indicators, you can turn on these indicators in the sequence detection step in the policy workflow.\nYou can customize individual threshold settings for each sequence detection type when you configure it in the policy. These threshold settings adjust alerts based on the volume of files associated with the sequence type.\nNote\nA\nsequence\nmight contain one or more events that are excluded from risk scoring based on your settings configuration. For example, your organization might use the\nGlobal exclusions\nsetting\nto exclude .png files from risk scoring since .png files aren't normally risky. But a .png file could be used to obfuscate a malicious activity. 
For this reason, if an event that's excluded from risk scoring is part of a sequence due to an obfuscation activity, the event is included in the sequence since it might be interesting in the context of the sequence.\nLearn more about how exclusions that are part of a sequence are shown in the Activity explorer\n.\nTo learn more about sequence detection management in the\nUser activity\nview, see\nInsider Risk Management cases: User activity\n.\nCumulative exfiltration detection\nWith privacy on by default, insider risk indicators help identify unusual levels of risk activities when evaluated daily for users that are in-scope for insider risk policies. Cumulative exfiltration detection uses machine learning models to help you identify when exfiltration activities that a user performs over a certain time exceed the normal amount performed by users in your organization for the past 30 days over multiple exfiltration activity types. For example, if a user shared more files than most users over the past month, this activity is detected and classified as a cumulative exfiltration activity.\nInsider Risk Management analysts and investigators might use cumulative exfiltration detection insights to help identify exfiltration activities that might not typically generate\nalerts\nbut are more than what is typical for their organization. Examples include a departing user who slowly exfiltrates data over a range of days, or a user who repeatedly shares data across multiple channels more often than is typical for your organization or for their peer groups.\nNote\nBy default, cumulative exfiltration detection generates risk scores based on a user's cumulative exfiltration activity compared to their organization norms. 
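The product's cumulative exfiltration detection uses machine learning models, but the core idea described above, comparing a user's recent exfiltration volume against the organization's 30-day norm, can be sketched with a simple z-score. This is purely illustrative: the function name, the inputs, and the 2-standard-deviation threshold are assumptions for the sketch, not the actual model.

```python
from statistics import mean, stdev

def cumulative_exfiltration_score(user_daily_counts, org_user_totals):
    """Illustrative sketch only (not the product's ML model).

    user_daily_counts: one user's recent daily exfiltration event counts.
    org_user_totals: per-user 30-day exfiltration totals across the org.
    Returns 0.0 for users at or below the norm, otherwise the z-score.
    """
    user_total = sum(user_daily_counts)
    mu, sigma = mean(org_user_totals), stdev(org_user_totals)
    if sigma == 0:
        return 0.0  # no variation in the org baseline; nothing stands out
    z = (user_total - mu) / sigma
    # Treat anything well above the org norm (e.g. > 2 sigma) as elevated.
    return max(0.0, z)

# A user who shared far more files than most users over the past month:
org = [10, 12, 8, 11, 9, 10, 13, 9]   # hypothetical 30-day totals per user
user = [5, 40, 35, 60]                # the user's recent daily counts
score = cumulative_exfiltration_score(user, org)
assert score > 2  # well above org norms, so risk scoring would be boosted
```

In the real feature, the baseline also accounts for peer groups and multiple exfiltration activity types, which a single z-score deliberately ignores here.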
You can enable\nCumulative exfiltration detection\noptions in the\nPolicy indicators\nsection of the Insider Risk Management settings page.\nHigher risk scores are assigned to cumulative exfiltration activities for SharePoint sites, sensitive information types, and content with\nsensitivity labels\nconfigured as priority content in a policy or for activity involving labels configured as high priority in\nMicrosoft Purview Information Protection\n.\nCumulative exfiltration detection is enabled by default when using the following policy templates:\nData leaks\nData leaks by priority users\nData leaks by risky users\nData theft by departing users\nImportant\nThe option to only score activities containing priority content doesn't apply to cumulative exfiltration activities. Cumulative exfiltration activity always receives a risk score, regardless of whether or not it contains prioritized content.\nPeer groups for cumulative exfiltration detection\nInsider Risk Management identifies three types of peer groups for analyzing exfiltration activity performed by users. It defines peer groups for users based on the following criteria:\nSharePoint sites\n: Insider Risk Management identifies peer groups based on users who access similar SharePoint sites.\nSimilar organization\n: Users with reports and team members based on organization hierarchy. This option requires that your organization uses Microsoft Entra ID to maintain organization hierarchy.\nSimilar job title\n: Users with a combination of organizational distance and similar job titles. For example, a user with a Senior Sales Manager title with a similar role designation as a Lead Sales Manager in the same organization is identified as similar job title. This option requires that your organization uses Microsoft Entra ID to maintain organization hierarchy, role designations, and job titles. 
If you don't have Microsoft Entra ID configured for organization structure and job titles, then Insider Risk Management identifies peer groups based on common SharePoint sites.\nWhen you enable cumulative exfiltration detection, your organization agrees to share Microsoft Entra data with the Microsoft Purview portal, including organization hierarchy and job titles. If your organization doesn't use Microsoft Entra ID to maintain this information, detection might be less accurate.\nNote\nCumulative exfiltration detection uses exfiltration indicators that you enable in the global settings for Insider Risk Management and exfiltration indicators that you select in a policy. As such, cumulative exfiltration detection only evaluates the necessary exfiltration indicators. Cumulative exfiltration activities for\nsensitivity labels\nconfigured in priority content generate higher risk scores.\nWhen you enable cumulative exfiltration detection for data theft or data leak policies, insights from cumulative exfiltration activities appear on the\nUser activity\ntab within an Insider Risk Management case. For more information about user activity management, see\nInsider Risk Management cases: User activities\n.\nPolicy health\nNote\nIf your policy is\nscoped by one or more administrative units\n, you can only view policy health for the policies you're scoped for. If you're an unrestricted administrator, you can view policy health for all policies in the tenant.\nThe policy health status gives you insights into potential issues with your Insider Risk Management policies. The\nStatus\ncolumn on the\nPolicies\ntab alerts you to policy issues that might prevent user activity from being reported or explain why the number of activity alerts is unusual. 
The policy health status also confirms that the policy is healthy and doesn't need attention or configuration changes.\nImportant\nYou must have the\nInsider Risk Management\nor the\nInsider Risk Management Admins\nrole to access policy health.\nIf there are issues with a policy, the policy health status displays notification warnings and recommendations to help you take action to resolve policy issues. These notifications can help you resolve the following issues:\nPolicies with incomplete configuration\n. These issues might include missing users or groups in the policy or other incomplete policy configuration steps.\nPolicies with indicator configuration issues\n. Indicators are an important part of each policy. If you don't configure indicators, or if you select too few indicators, the policy might not evaluate risky activities as expected.\nPolicy triggers aren't working, or policy trigger requirements aren't properly configured\n. Policy functionality might depend on other services or configuration requirements to effectively detect triggering events to activate risk score assignment to users in the policy. These dependencies might include issues with connector configuration, Microsoft Defender for Endpoint alert sharing, or data loss prevention policy configuration settings.\nVolume limits are nearing or over limits\n. Insider Risk Management policies use numerous Microsoft 365 services and endpoints to aggregate risk activity signals. Depending on the number of users in your policies, volume limits might delay identification and reporting of risk activities. Learn more about these limits in the Policy template limits section of this article.\nTo quickly view the health status for a policy, go to the\nPolicy\ntab and check the\nStatus\ncolumn. 
You see the following policy health status options for each policy:\nHealthy\n: No issues are identified with the policy.\nRecommendations\n: An issue with the policy that might prevent the policy from operating as expected.\nWarnings\n: An issue with the policy that might prevent it from identifying potentially risky activities.\nFor more details about any recommendations or warnings, select a policy on the\nPolicy\ntab to open the policy details card. The\nNotifications\nsection of the details card displays more information about the recommendations and warnings, including guidance on how to address these issues.\nNotification messages\nUse the following table to learn more about recommendations and warning notifications and actions to take to resolve potential issues.\nNotification messages\nPolicy templates\nCauses / Try this action to fix\nDLP policy doesn't meet requirements\n- Data leaks\n- Data leaks by priority users\nDLP policies used as triggering events must be configured to generate high severity alerts.\n1. Edit your DLP policy to assign applicable alerts as\nHigh severity\n.\nOR\n2. Edit this policy and select\nUser performs an exfiltration activity\nas the triggering event.\nDLP policy isn't selected as the triggering event\n- Data leaks\n- Data leaks by priority users\nA DLP policy wasn't selected as a triggering event or the selected DLP policy was deleted.\nEdit the policy and either select an active DLP policy or 'User performs an exfiltration activity' as the triggering event in the policy configuration.\nDLP policy used in this policy is turned off\n- Data leaks\n- Data leaks by priority users\nDLP policy used in this policy is turned off.\n1. Turn the DLP policy assigned to this policy on.\nOR\n2. 
Edit this policy and either select a new DLP policy or 'User performs an exfiltration activity' as the triggering event in the policy configuration.\nHR connector hasn't uploaded data recently\n- Data theft by departing user\n- Security policy violations by departing user\n- Data leaks by risky users\n- Security policy violations by risky users\nHR connector didn't import data in more than 7 days.\nCheck that your HR connector is configured correctly and sending data.\nHR connector isn't configured or working as expected\n- Data theft by departing user\n- Security policy violations by departing user\n- Data leaks by risky users\n- Security policy violations by risky users\nThere's an issue with the HR connector.\n1. If you're using an HR connector, check that your HR connector is sending correct data.\nOR\n2. Select the Microsoft Entra account deleted triggering event.\nMicrosoft Defender for Endpoint alerts aren't being shared with the Microsoft Purview portal\n- Security policy violations\n- Security policy violations by departing users\n- Security policy violations by risky users\n- Security policy violations by priority users\nMicrosoft Defender for Endpoint alerts aren't being shared with the Microsoft Purview portal.\nConfigure sharing of Microsoft Defender for Endpoint alerts.\nNo devices are onboarded\n- Data theft by departing users\n- Data leaks\n- Data leaks by risky users\n- Data leaks by priority users\nDevice indicators are selected but there aren't any devices onboarded to the Microsoft Purview portal.\nCheck whether devices are onboarded and meet requirements.\nNo indicators have been selected for this policy\nAll policy templates\nIndicators weren't selected for the policy.\nEdit your policy and select appropriate policy indicators for the policy.\nNo priority user groups are included in this policy\n- Data leaks by priority users\n- Security policy violations by priority users\nPriority user groups aren't assigned to the policy.\nConfigure priority 
user groups in Insider Risk Management settings and assign priority user groups to the policy.\nNo triggering event has been selected for this policy\nAll policy templates\nA triggering event isn't configured for the policy\nRisk scores won't be assigned to user activities until you edit the policy and select a triggering event.\nNo users or groups are included in this policy\nAll policy templates\nUsers or groups aren't assigned to the policy.\nEdit your policy and select users or groups for the policy.\nPolicy hasn't generated any alerts\nAll policy templates\nYou might want to review your policy configuration so that you're analyzing the most relevant scoring activity.\n1. Confirm that you selected indicators that you want to score. The more indicators selected, the more activities are assigned risk scores.\n2. Review threshold customization for policy. If the thresholds selected don't align with your organization's risk tolerance, adjust the selections so that alerts are created based on your preferred thresholds.\n3. Review the users and groups selected for the policy. Confirm you selected all of the applicable users and groups.\n4. For security violation policies, confirm you selected the alert triage status that you want to score for Microsoft Defender for Endpoint alerts in Intelligent Detections in settings.\nPolicy isn't assigning risk scores to activity\nAll policy templates\nYou might want to review your policy scope and triggering event configuration so that the policy can assign risk scores to activities\n1. Review the users that are selected for the policy. If you have few users selected, you might want to select additional users.\n2. If you're using an HR connector, check that your HR connector is sending the correct data.\n3. If you're using a DLP policy as your triggering event, check your DLP policy configuration to ensure it's configured to be used in this policy.\n4. 
For security violation policies, review the Microsoft Defender for Endpoint alert triage status selected in Insider risk settings > Intelligent detections. Confirm that the alert filter isn't too narrow.\nTriggering event is repeatedly occurring for over 15% of users in this policy\nAll policy templates\nAdjust the triggering event to help reduce how often users are brought into the policy scope.\nWe're unable to check the status of your HR connector right now, please check again later\n- Data theft by departing user\n- Security policy violations by departing user\n- Data leaks by risky users\n- Security policy violations by risky users\nThe Insider Risk Management solution is unable to check the status of your HR connector.\nCheck that your HR connector is configured correctly and sending data, or come back and check the policy status.\nYou're approaching the maximum limit of users being actively scored for this policy template\nAll policy templates\nEach policy template has a maximum number of included users. See the template limit section details.\nReview the users in the Users tab and remove any users who don't need to be scored anymore.\nYour organization doesn't have a Microsoft Defender for Endpoint subscription\n- Security policy violations\n- Security policy violations by departing users\n- Security policy violations by risky users\n- Security policy violations by priority users\nAn active Microsoft Defender for Endpoint subscription wasn't detected for your organization.\nUntil a Microsoft Defender for Endpoint subscription is added, these policies won't assign risk scores to user activity.\nCreate a new policy\nTo create a new Insider Risk Management policy, generally use the policy workflow in the\nInsider Risk Management\nsolution in the Microsoft Purview portal. 
If applicable, you can also create quick policies for general data leaks and for data theft by departing users directly from analytics checks.\nComplete\nStep 6: Create an Insider Risk Management policy\nto configure new insider risk policies.\nUpdate a policy\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nPolicies\nin the left navigation.\nOn the policy dashboard, select the policy you want to update.\nOn the policy details page, select\nEdit policy\n.\nOn the\nName and description\npage, update the description for the policy if you want.\nNote\nYou can't edit the\nPolicy template\nor\nName\nfield.\nSelect\nNext\nto continue.\nFollow\nstep 7 of the Create a policy procedure\n.\nCopy a policy\nYou might need to create a new policy that's similar to an existing policy but needs just a few configuration changes. Instead of creating a new policy from scratch, you can copy an existing policy and then modify the areas that need to be updated in the new policy.\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nPolicies\nin the left navigation.\nOn the policy dashboard, select the policy you want to copy.\nOn the policy details page, select\nCopy\n.\nIn the policy workflow, name the new policy, and then update the policy configuration as needed.\nImmediately start scoring user activity\nSome scenarios require assigning risk scores to users with insider risk policies outside of the Insider Risk Management triggering event workflow. Use\nStart scoring activity for users\non the\nPolicies\ntab to manually add one or more users to one or more insider risk policies for a specific amount of time. 
This action starts assigning risk scores to their activity and bypasses the requirement for a user to have a triggering indicator, like a DLP policy match or an Employment End Date from the HR Connector.\nThe value in the\nReason for scoring activity\nfield appears on the users' activity timeline. The\nUsers\ndashboard displays users you manually add to policies, and alerts are created if the activity meets the policy alert thresholds. You can have up to 4,000 users in scope that you manually add by using the\nStart scoring activity for users\nfeature.\nSome scenarios where you might want to immediately start scoring user activities include:\nYou identify users with risk concerns and want to immediately start assigning risk scores to their activity for one or more of your policies.\nThere's an incident that might require you to immediately start assigning risk scores to involved users' activity for one or more of your policies.\nYou haven't configured your HR connector yet, but you want to start assigning risk scores to user activities for HR events by uploading a .csv file.\nNote\nIt might take several hours for manually added users to appear in the\nUsers\ndashboard. Activities for the previous 90 days for these users might take up to 24 hours to display. 
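One of the scenarios above is starting risk scoring for HR events by uploading a .csv file of user principal names. As a minimal sketch, such a single-column file can be produced with Python's standard csv module. The file name and user list here are hypothetical; the 'user principal name' header follows the import format shown in this article.

```python
import csv

def write_upn_csv(path, upns):
    """Write user principal names in the single-column .csv format
    that the import step in the policy workflow expects."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["user principal name"])  # required header row
        for upn in upns:
            writer.writerow([upn])

# Hypothetical example users:
write_upn_csv("users_to_score.csv", ["user1@domain.com", "user2@domain.com"])
```

Generating the file programmatically (for example, from an HR export) avoids hand-editing errors such as a missing header row or stray columns.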
To view activities for manually added users, go to the\nUsers\ntab, select the user on the\nUsers\ndashboard, and then open the\nUser activity\ntab on the details pane.\nManually start scoring activity for users in one or more Insider Risk Management policies\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nPolicies\nin the left navigation.\nOn the policy dashboard, select the policies where you want to add users.\nSelect\nStart scoring activity for users\n.\nIn the\nAdd users to multiple policies\npane, enter a reason for adding the users in the\nReason\nfield.\nDefine the number of days to score the user's activity in the\nThis should last for (choose between 5 and 30 days)\nfield.\nEnter the name of the user you want to add, or use the\nSearch user to add to policies\nfield to search for a user, then select the user name. Repeat this process to assign additional users. The list of users you select appears in the users section of the\nAdd users to multiple policies\npane.\nNote\nIf the policy is scoped by\none or more administrative units\n, you can only see users that you're scoped for.\nSelect\nImport\nto upload a .csv (comma-separated values) file containing a list of users. The file must be in the following format and must list the user principal names:\nuser principal name\nuser1@domain.com\nuser2@domain.com\nSelect\nAdd users to policies\nto accept the changes.\nStop scoring users in a policy\nTo stop scoring users in a policy, see the\nInsider Risk Management users: Remove users from in-scope assignment to policies\narticle.\nDelete a policy\nImportant\nYou can't undo a policy deletion.\nWhen you delete a policy, you have two options. 
You can:\nDelete just the policy.\nDelete the policy and all associated alerts and users.\nIf you choose the second option:\nAll alerts generated by that policy are deleted unless they're associated with a case. Associated cases are never deleted when you delete a policy.\nAny user associated with an alert from that policy is removed from the\nUsers\npage.\nIf a user is in scope of more than one policy, you remove the user only from the policy that you're deleting. You don't remove the user from other active policies.\nFor example, you might create a policy for test purposes before rolling it out to your organization. After you finish testing, you can quickly delete the policy and all associated test data so that you can start fresh when you're ready to push the policy live.\nIt can take up to 72 hours to complete a policy deletion.\nNote\nIf you delete a policy associated with Insider Risk Management\nAdaptive Protection\n, you see a warning that Adaptive Protection stops assigning insider risk levels to users until you choose a different policy in Adaptive Protection. 
This warning appears because Adaptive Protection must be associated with a policy to be in effect.\nTo delete a policy\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nPolicies\nin the left navigation.\nOn the policy dashboard, select the policy you want to delete.\nSelect\nDelete\non the dashboard toolbar.\nChoose one of the following options:\nSelect\nDelete only the policy\n.\nSelect\nDelete the policy and all associated alerts and users\n.\nImportant\nYou can't undo a policy deletion.\nSelect\nConfirm\n.\nYou see a message in the upper part of the screen that tells you if the deletion was successful or whether it's pending.", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:ba5a7ce655653e0bc97898a7d6fd720d485d98a9b75cf5c7a6ce777f55318b6f", + "normalized_content": "Create and manage Insider Risk Management policies\nImportant\nMicrosoft Purview Insider Risk Management\ncorrelates various signals to identify potential malicious or inadvertent insider risks, such as IP theft, data leakage, and security violations. Insider Risk Management enables customers to create policies to manage security and compliance. 
Insider Risk Management is built with privacy by design: users are pseudonymized by default, and role-based access controls and audit logs are in place to help ensure user-level privacy.\nInsider Risk Management policies determine which users are in scope and which types of risk indicators are configured for alerts. You can quickly create a security policy that applies to all users in your organization or define individual users or groups for management in a policy. Policies support content priorities to focus policy conditions on multiple or specific Microsoft Teams, SharePoint sites, data sensitivity types, and data labels. By using templates, you can select specific risk indicators and customize event thresholds for policy indicators, effectively customizing risk scores and the level and frequency of alerts.\nYou can also configure quick data leak and data theft by departing users policies that automatically define policy conditions based on results from the latest analytics. In addition, risk score boosters and anomaly detections help identify potentially risky user activity that is unusual or of higher importance. Policy windows allow you to define the time frame in which the policy applies to alert activities and are used to determine the duration of the policy once activated.\nFor an overview of how policies created with built-in policy templates can help you quickly act on potential risks, see the\nInsider Risk Management Policies Configuration video\n.\nTip\nGet started with Microsoft Security Copilot to explore new ways to work smarter and faster using the power of AI. Learn more about\nMicrosoft Security Copilot in Microsoft Purview\n.\nPolicy dashboard\nThe\nPolicy dashboard\nlets you quickly see the user and agent policies in your organization, the health of each policy, manually add users to security policies, and view the status of alerts associated with each policy. Your policies are separated into two categories:\nUser policy\nand\nAgent policy\n. 
Switch between the two to locate relevant policy details.\nAgent policy\ncovers agents from Microsoft Copilot Studio and Microsoft Foundry.\nPolicy name\n: Name assigned to the policy in the policy workflow.\nStatus\n: Health status for each policy. Displays number of policy warnings and recommendations, or a status of\nHealthy\nfor policies without issues. You can select the policy to see the health status details for any warnings or recommendations.\nActive alerts\n: Number of active alerts for each policy.\nConfirmed alerts\n: Total number of alerts that resulted in cases from the policy in the last 365 days.\nActions taken on alerts\n: Total number of alerts that were confirmed or dismissed for the last 365 days.\nPolicy alert effectiveness\n: Percentage determined by total confirmed alerts divided by total actions taken on alerts (which is the sum of alerts that were confirmed or dismissed over the past year).\nPolicy recommendations from analytics\nInsider risk analytics gives you an aggregate view of anonymized user activities related to security and compliance. With this view, you can evaluate potential insider risks in your organization without configuring any insider risk policies. This evaluation helps your organization identify potential areas of higher risk and helps determine the type and scope of Insider Risk Management policies you might consider configuring. If you decide to act on analytics scan results for\ndata leaks\nor\ndata theft\nby departing users policies, you can even configure a quick policy based on these results.\nFor more information about insider risk analytics and policy recommendations, see\nInsider Risk Management settings: Analytics\n.\nNote\nYou must be an unrestricted administrator to access analytics insights.\nLearn how administrative groups affect permissions\n.\nQuick policies\nFor many organizations, getting started with an initial policy can be a challenge. 
If you're new to Insider Risk Management or are using the recommended actions to get started, use a quick policy to create and configure a new policy. Quick policy settings automatically populate from recommended best practices or from results of the latest analytics scan in your organization. For example, if the analytics check detected potential data leak activities in your organization, the\nData leaks\nquick policy automatically includes the indicators used to detect those activities.\nYou can choose from the following quick policies:\nCritical assets protection\n: Detects activities involving your organization's most valuable assets. Loss of these assets could result in legal liability, financial loss, or reputational damage.\nData leaks\n: Detects potential data leaks from all users in your organization, which can range from accidental oversharing of sensitive info to data theft with malicious intent.\nData theft from Microsoft 365 apps by users leaving your organization\n: Detects potential data theft from Microsoft 365 cloud apps by users leaving your organization or whose account was deleted from Microsoft Entra ID.\nData theft from non-Microsoft 365 apps by users leaving your organization\n: Detects potential data theft from non-Microsoft 365 cloud apps, including Microsoft Fabric, by users leaving your organization or whose account was deleted from Microsoft Entra ID.\nEmail exfiltration\n: Detects when users email sensitive assets outside your organization. For example, users emailing sensitive assets to their personal email address.\nTo get started, go to\nInsider Risk Management\n>\nPolicies\nand select\nCreate policy\n>\nQuick policy\n. If you're reviewing analytics reports, select\nView details\n>\nGet started\nto begin a quick policy for the applicable area.\nWhen you start the quick policy workflow, review the policy settings and configure the policy with a single selection. 
If you need to customize a quick policy, change the conditions after the policy is created. Stay up to date with the detection results for a quick policy by configuring email notifications each time you have a policy warning or each time the policy generates a high severity alert.\nNote\nYou must be an unrestricted administrator to create quick policies.\nLearn how administrative groups affect permissions\n.\nPrioritize content in policies\nInsider Risk Management policies support specifying a higher priority for content depending on where you store it, the type of content, or how you classify it. You can also choose whether to assign risk scores to all activities detected by a policy or only activities that include priority content. Specifying content as a priority increases the risk score for any associated activity, which in turn increases the chance of generating a high severity alert. However, some activities don't generate an alert at all unless the related content contains built-in or custom sensitive info types or you specify it as a priority in the policy.\nFor example, your organization has a dedicated SharePoint site for a highly confidential project. Data leaks for information in this SharePoint site could compromise the project and would have a significant impact on its success. By prioritizing this SharePoint site in a Data leaks policy, you automatically increase risk scores for qualifying activities. This prioritization increases the likelihood that these activities generate an insider risk alert and raises the severity level for the alert.\nAdditionally, you can choose to focus this policy for SharePoint site activity that only includes priority content for this project. Risk scores are assigned and alerts are generated only when specified activities include priority content. 
Activities without priority content aren't scored, but you can still review them if an alert is generated.\nImportant\nCumulative exfiltration activity always receives a risk score, regardless of whether it contains prioritized content.\nThe following policy templates support priority content selection:\nData leaks\nData leaks by priority users\nData leaks by risky users\nData theft by departing users\nRisky AI usage\nNote\nIf you configure a policy to generate alerts only for activity that includes priority content, no changes are applied to risk score boosters.\nWhen you create an Insider Risk Management policy in the policy workflow, you can choose from the following priorities:\nSharePoint sites\n: Assign a higher risk score to any activity associated with all file types in defined SharePoint sites. Users configuring the policy can select only SharePoint sites that they have permission to access. If the required sites aren't available for selection, another user with the required permissions can add them to the policy later, or the current user can be granted access to those sites.\nSensitive information types\n: Assign a higher risk score to any activity associated with content that contains\nsensitive information types\n.\nSensitivity labels\n: Assign a higher risk score to any activity associated with content that has specific\nsensitivity labels\napplied.\nFile extensions\n: Assign a higher risk score to any activity associated with content that has specific file extensions. Users configuring a data theft or data leak policy can define up to 50 file extensions to prioritize by selecting\nFile extensions to prioritize\nin the policy workflow. Entered extensions can include or omit a '.' 
as the first character of the prioritized extension.\nTrainable classifiers\n: Assign a higher risk score to any activity associated with content that is included in a\ntrainable classifier\n. Users configuring a policy that selects Trainable classifiers in the policy workflow can select up to 5 trainable classifiers to apply to the policy. These classifiers can be existing classifiers that identify patterns of sensitive information, like Social Security, credit card, or bank account numbers, or custom classifiers created in your organization.\nSequence detection\nRisk management activities might not occur as isolated events. These risks are frequently part of a larger sequence of events. A sequence is a group of two or more potentially risky activities performed one after the other that might suggest an elevated risk. Identifying these related user activities is an important part of evaluating overall risk. When you select sequence detection for data theft or data leak policies, you see insights from sequence activities on the\nUser activity\ntab within an Insider Risk Management case. The following policy templates support sequence detection:\nData leaks\nData leaks by priority users\nData leaks by risky users\nData theft by departing users\nRisky AI usage\nThese Insider Risk Management policies can use specific indicators and the order in which they occur to detect each step in a sequence of risk. Selected sequence detections only detect sequences of risk in the order displayed in the policy setup workflow. For policies created from the\nData leaks\nand\nData leaks by priority users\ntemplates, you can also select which sequences trigger the policy. File names are used when mapping activities across a sequence. These risks are organized into four main categories of activity:\nCollection\n: Detects download activities by in-scope policy users. 
Example risk management activities include downloading files from SharePoint sites, third-party cloud services, unallowed domains, or moving files into a compressed folder.\nExfiltration\n: Detects sharing or extraction activities to internal and external sources by in-scope policy users. An example risk management activity includes sending emails with attachments from your organization to external recipients.\nObfuscation\n: Detects the masking of potentially risky activities by in-scope policy users. An example risk management activity includes renaming files on a device.\nClean-up\n: Detects deletion activities by in-scope policy users. An example risk management activity includes deleting files from a device.\nNote\nSequence detection uses indicators that you enable in the global settings for Insider Risk Management. If you don't select appropriate indicators, you can turn on these indicators in the sequence detection step in the policy workflow.\nYou can customize individual threshold settings for each sequence detection type when you configure it in the policy. These threshold settings adjust alerts based on the volume of files associated with the sequence type.\nNote\nA\nsequence\nmight contain one or more events that are excluded from risk scoring based on your settings configuration. For example, your organization might use the\nGlobal exclusions\nsetting\nto exclude .png files from risk scoring since .png files aren't normally risky. But a .png file could be used to obfuscate a malicious activity. 
For this reason, if an event that's excluded from risk scoring is part of a sequence due to an obfuscation activity, the event is included in the sequence since it might be interesting in the context of the sequence.\nLearn more about how exclusions that are part of a sequence are shown in the Activity explorer\n.\nTo learn more about sequence detection management in the\nUser activity\nview, see\nInsider Risk Management cases: User activity\n.\nCumulative exfiltration detection\nWith privacy on by default, insider risk indicators help identify unusual levels of risk activities, evaluated daily, for users who are in scope for insider risk policies. Cumulative exfiltration detection uses machine learning models to identify when the exfiltration activities a user performs over a period of time exceed, across multiple exfiltration activity types, the normal amount performed by users in your organization over the past 30 days. For example, if a user shared more files than most users over the past month, this activity is detected and classified as a cumulative exfiltration activity.\nInsider Risk Management analysts and investigators might use cumulative exfiltration detection insights to help identify exfiltration activities that might not typically generate\nalerts\nbut are more than what is typical for their organization. Examples include departing users who slowly exfiltrate data over a range of days, or users who repeatedly share data across multiple channels more than is usual for your organization or for their peer groups.\nNote\nBy default, cumulative exfiltration detection generates risk scores based on a user's cumulative exfiltration activity compared to their organization norms. 
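The comparison described in this section, a user's exfiltration volume against the organization's 30-day norm, can be sketched as follows. The mean-plus-two-standard-deviations threshold and the sample counts are assumptions for illustration, not the product's actual machine learning model.

```python
# Illustrative sketch only: flags a user whose total exfiltration events over
# a window exceed the organization's norm (here, mean + 2 standard deviations
# of per-user 30-day totals). The statistic and data are assumptions.
from statistics import mean, stdev

def is_cumulative_exfiltration(user_total: int, org_totals: list[int]) -> bool:
    threshold = mean(org_totals) + 2 * stdev(org_totals)
    return user_total > threshold

org_30_day_totals = [12, 8, 15, 10, 9, 11, 14, 7]   # per-user event counts (made up)
print(is_cumulative_exfiltration(60, org_30_day_totals))  # True: well above the norm
print(is_cumulative_exfiltration(13, org_30_day_totals))  # False: within the norm
```

The same comparison could be run per peer group rather than per organization, which mirrors the peer-group options described later in this section.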
You can enable\nCumulative exfiltration detection\noptions in the\nPolicy indicators\nsection of the Insider Risk Management settings page.\nHigher risk scores are assigned to cumulative exfiltration activities for SharePoint sites, sensitive information types, and content with\nsensitivity labels\nconfigured as priority content in a policy, or for activity involving labels configured as high priority in\nMicrosoft Purview Information Protection\n.\nCumulative exfiltration detection is enabled by default when using the following policy templates:\nData leaks\nData leaks by priority users\nData leaks by risky users\nData theft by departing users\nImportant\nThe option to only score activities containing priority content doesn't apply to cumulative exfiltration activities. Cumulative exfiltration activity always receives a risk score, regardless of whether it contains prioritized content.\nPeer groups for cumulative exfiltration detection\nInsider Risk Management identifies three types of peer groups for analyzing exfiltration activity performed by users. It defines peer groups for users based on the following criteria:\nSharePoint sites\n: Insider Risk Management identifies peer groups based on users who access similar SharePoint sites.\nSimilar organization\n: Users with reports and team members based on organization hierarchy. This option requires that your organization uses Microsoft Entra ID to maintain organization hierarchy.\nSimilar job title\n: Users with a combination of organizational distance and similar job titles. For example, a user with a Senior Sales Manager title and a user with a Lead Sales Manager title in the same organization are identified as having similar job titles. This option requires that your organization uses Microsoft Entra ID to maintain organization hierarchy, role designations, and job titles. 
If you don't have Microsoft Entra ID configured for organization structure and job titles, then Insider Risk Management identifies peer groups based on common SharePoint sites.\nWhen you enable cumulative exfiltration detection, your organization agrees to share Microsoft Entra data with the Microsoft Purview portal, including organization hierarchy and job titles. If your organization doesn't use Microsoft Entra ID to maintain this information, detection might be less accurate.\nNote\nCumulative exfiltration detection uses exfiltration indicators that you enable in the global settings for Insider Risk Management and exfiltration indicators that you select in a policy. As such, cumulative exfiltration detection only evaluates the necessary exfiltration indicators. Cumulative exfiltration activities for\nsensitivity labels\nconfigured in priority content generate higher risk scores.\nWhen you enable cumulative exfiltration detection for data theft or data leak policies, insights from cumulative exfiltration activities appear on the\nUser activity\ntab within an Insider Risk Management case. For more information about user activity management, see\nInsider Risk Management cases: User activities\n.\nPolicy health\nNote\nIf your policy is\nscoped by one or more administrative units\n, you can only view policy health for the policies you're scoped for. If you're an unrestricted administrator, you can view policy health for all policies in the tenant.\nThe policy health status gives you insights into potential issues with your Insider Risk Management policies. The\nStatus\ncolumn on the\nPolicies\ntab alerts you to policy issues that might prevent user activity from being reported or explain why the number of activity alerts is unusual. 
The policy health status also confirms that the policy is healthy and doesn't need attention or configuration changes.\nImportant\nYou must have the\nInsider Risk Management\nor the\nInsider Risk Management Admins\nrole to access policy health.\nIf there are issues with a policy, the policy health status displays notification warnings and recommendations to help you take action to resolve policy issues. These notifications can help you resolve the following issues:\nPolicies with incomplete configuration\n. These issues might include missing users or groups in the policy or other incomplete policy configuration steps.\nPolicies with indicator configuration issues\n. Indicators are an important part of each policy. If you don't configure indicators, or if you select too few indicators, the policy might not evaluate risky activities as expected.\nPolicy triggers aren't working, or policy trigger requirements aren't properly configured\n. Policy functionality might depend on other services or configuration requirements to effectively detect triggering events to activate risk score assignment to users in the policy. These dependencies might include issues with connector configuration, Microsoft Defender for Endpoint alert sharing, or data loss prevention policy configuration settings.\nVolume limits are nearing or over limits\n. Insider Risk Management policies use numerous Microsoft 365 services and endpoints to aggregate risk activity signals. Depending on the number of users in your policies, volume limits might delay identification and reporting of risk activities. Learn more about these limits in the Policy template limits section of this article.\nTo quickly view the health status for a policy, go to the\nPolicy\ntab and check the\nStatus\ncolumn. 
You see the following policy health status options for each policy:\nHealthy\n: No issues are identified with the policy.\nRecommendations\n: An issue with the policy that might prevent the policy from operating as expected.\nWarnings\n: An issue with the policy that might prevent it from identifying potentially risky activities.\nFor more details about any recommendations or warnings, select a policy on the\nPolicy\ntab to open the policy details card. The\nNotifications\nsection of the details card displays more information about the recommendations and warnings, including guidance on how to address these issues.\nNotification messages\nUse the following table to learn more about recommendations and warning notifications and actions to take to resolve potential issues.\nNotification messages\nPolicy templates\nCauses / Try this action to fix\nDLP policy doesn't meet requirements\n- Data leaks\n- Data leaks by priority users\nDLP policies used as triggering events must be configured to generate high severity alerts.\n1. Edit your DLP policy to assign applicable alerts as\nHigh severity\n.\nOR\n2. Edit this policy and select\nUser performs an exfiltration activity\nas the triggering event.\nDLP policy isn't selected as the triggering event\n- Data leaks\n- Data leaks by priority users\nA DLP policy wasn't selected as a triggering event or the selected DLP policy was deleted.\nEdit the policy and either select an active DLP policy or 'User performs an exfiltration activity' as the triggering event in the policy configuration.\nDLP policy used in this policy is turned off\n- Data leaks\n- Data leaks by priority users\nDLP policy used in this policy is turned off.\n1. Turn the DLP policy assigned to this policy on.\nOR\n2. 
Edit this policy and either select a new DLP policy or 'User performs an exfiltration activity' as the triggering event in the policy configuration.\nHR connector hasn't uploaded data recently\n- Data theft by departing user\n- Security policy violations by departing user\n- Data leaks by risky users\n- Security policy violations by risky users\nThe HR connector hasn't imported data in more than 7 days.\nCheck that your HR connector is configured correctly and sending data.\nHR connector isn't configured or working as expected\n- Data theft by departing user\n- Security policy violations by departing user\n- Data leaks by risky users\n- Security policy violations by risky users\nThere's an issue with the HR connector.\n1. If you're using an HR connector, check that your HR connector is sending the correct data.\nOR\n2. Select the Microsoft Entra account deleted triggering event.\nMicrosoft Defender for Endpoint alerts aren't being shared with the Microsoft Purview portal\n- Security policy violations\n- Security policy violations by departing users\n- Security policy violations by risky users\n- Security policy violations by priority users\nMicrosoft Defender for Endpoint alerts aren't being shared with the Microsoft Purview portal.\nConfigure sharing of Microsoft Defender for Endpoint alerts.\nNo devices are onboarded\n- Data theft by departing users\n- Data leaks\n- Data leaks by risky users\n- Data leaks by priority users\nDevice indicators are selected but there aren't any devices onboarded to the Microsoft Purview portal.\nCheck whether devices are onboarded and meet requirements.\nNo indicators have been selected for this policy\nAll policy templates\nIndicators weren't selected for the policy.\nEdit your policy and select appropriate policy indicators for the policy.\nNo priority user groups are included in this policy\n- Data leaks by priority users\n- Security policy violations by priority users\nPriority user groups aren't assigned to the policy.\nConfigure priority 
user groups in Insider Risk Management settings and assign priority user groups to the policy.\nNo triggering event has been selected for this policy\nAll policy templates\nA triggering event isn't configured for the policy\nRisk scores won't be assigned to user activities until you edit the policy and select a triggering event.\nNo users or groups are included in this policy\nAll policy templates\nUsers or groups aren't assigned to the policy.\nEdit your policy and select users or groups for the policy.\nPolicy hasn't generated any alerts\nAll policy templates\nYou might want to review your policy configuration so that you're analyzing the most relevant scoring activity.\n1. Confirm that you selected indicators that you want to score. The more indicators selected, the more activities are assigned risk scores.\n2. Review threshold customization for policy. If the thresholds selected don't align with your organization's risk tolerance, adjust the selections so that alerts are created based on your preferred thresholds.\n3. Review the users and groups selected for the policy. Confirm you selected all of the applicable users and groups.\n4. For security violation policies, confirm you selected the alert triage status that you want to score for Microsoft Defender for Endpoint alerts in Intelligent Detections in settings.\nPolicy isn't assigning risk scores to activity\nAll policy templates\nYou might want to review your policy scope and triggering event configuration so that the policy can assign risk scores to activities\n1. Review the users that are selected for the policy. If you have few users selected, you might want to select additional users.\n2. If you're using an HR connector, check that your HR connector is sending the correct data.\n3. If you're using a DLP policy as your triggering event, check your DLP policy configuration to ensure it's configured to be used in this policy.\n4. 
For security violation policies, review the Microsoft Defender for Endpoint alert triage status selected in Insider risk settings > Intelligent detections. Confirm that the alert filter isn't too narrow.\nTriggering event is repeatedly occurring for over 15% of users in this policy\nAll policy templates\nAdjust the triggering event to help reduce how often users are brought into the policy scope.\nWe're unable to check the status of your HR connector right now, please check again later\n- Data theft by departing user\n- Security policy violations by departing user\n- Data leaks by risky users\n- Security policy violations by risky users\nThe Insider Risk Management solution is unable to check the status of your HR connector.\nCheck that your HR connector is configured correctly and sending data, or come back and check the policy status.\nYou're approaching the maximum limit of users being actively scored for this policy template\nAll policy templates\nEach policy template has a maximum number of included users. See the template limit section details.\nReview the users in the Users tab and remove any users who don't need to be scored anymore.\nYour organization doesn't have a Microsoft Defender for Endpoint subscription\n- Security policy violations\n- Security policy violations by departing users\n- Security policy violations by risky users\n- Security policy violations by priority users\nAn active Microsoft Defender for Endpoint subscription wasn't detected for your organization.\nUntil a Microsoft Defender for Endpoint subscription is added, these policies won't assign risk scores to user activity.\nCreate a new policy\nTo create a new Insider Risk Management policy, generally use the policy workflow in the\nInsider Risk Management\nsolution in the Microsoft Purview portal. 
You can also create quick policies for general data leaks and data theft by departing users from Analytics checks if applicable.\nComplete\nStep 6: Create an Insider Risk Management policy\nto configure new insider risk policies.\nUpdate a policy\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nPolicies\nin the left navigation.\nOn the policy dashboard, select the policy you want to update.\nOn the policy details page, select\nEdit policy\n.\nOn the\nName and description\npage, update the description for the policy if you want.\nNote\nYou can't edit the\nPolicy template\nor\nName\nfield.\nSelect\nNext\nto continue.\nFollow\nstep 7 of the Create a policy procedure\n.\nCopy a policy\nYou might need to create a new policy that's similar to an existing policy but needs just a few configuration changes. Instead of creating a new policy from scratch, you can copy an existing policy and then modify the areas that need to be updated in the new policy.\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nPolicies\nin the left navigation.\nOn the policy dashboard, select the policy you want to copy.\nOn the policy details page, select\nCopy\n.\nIn the policy workflow, name the new policy, and then update the policy configuration as needed.\nImmediately start scoring user activity\nSome scenarios require assigning risk scores to users with insider risk policies outside of the Insider Risk Management triggering event workflow. Use\nStart scoring activity for users\non the\nPolicies\ntab to manually add one or more users to one or more insider risk policies for a specific amount of time. 
This action starts assigning risk scores to their activity and bypasses the requirement for a user to have a triggering indicator, like a DLP policy match or an Employment End Date from the HR Connector.\nThe value in the\nReason for scoring activity\nfield appears on the users' activity timeline. The\nUsers\ndashboard displays users you manually add to policies, and alerts are created if the activity meets the policy alert thresholds. You can have up to 4,000 users in scope that you manually add by using the\nStart scoring activity for users\nfeature.\nSome scenarios where you might want to immediately start scoring user activities include:\nYou identify users with risk concerns and want to immediately start assigning risk scores to their activity for one or more of your policies.\nThere's an incident that might require you to immediately start assigning risk scores to involved users' activity for one or more of your policies.\nYou haven't configured your HR connector yet, but you want to start assigning risk scores to user activities for HR events by uploading a .csv file.\nNote\nIt might take several hours for manually added users to appear in the\nUsers\ndashboard. Activities for the previous 90 days for these users might take up to 24 hours to display. 
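The manual scoring feature described here accepts a one-column .csv of user principal names. A minimal sketch that produces such a file, with an illustrative file name and user list:

```python
# Illustrative sketch: build the one-column .csv of user principal names that
# "Start scoring activity for users" accepts. File name and users are made up.
import csv

users = ["user1@domain.com", "user2@domain.com"]

with open("users_to_score.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["user principal name"])  # required header row
    writer.writerows([u] for u in users)

print(open("users_to_score.csv").read())
```

The header row plus one user principal name per line matches the file format shown in the import step of this procedure.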
To view activities for manually added users, go to the\nUsers\ntab, select the user on the\nUsers\ndashboard, and then open the\nUser activity\ntab on the details pane.\nManually start scoring activity for users in one or more Insider Risk Management policies\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nPolicies\nin the left navigation.\nOn the policy dashboard, select the policies where you want to add users.\nSelect\nStart scoring activity for users\n.\nIn the\nAdd users to multiple policies\npane, enter a reason for adding the users in the\nReason\nfield.\nDefine the number of days to score the user's activity in the\nThis should last for (choose between 5 and 30 days)\nfield.\nEnter the name of the user you want to add, or use the\nSearch user to add to policies\nfield to search for a user, then select the user name. Repeat this process to assign additional users. The list of users you select appears in the users section of the\nAdd users to multiple policies\npane.\nNote\nIf the policy is scoped by\none or more administrative units\n, you can only see users that you're scoped for.\nSelect\nImport\nto upload a .csv (comma-separated values) file containing a list of users. The file must be in the following format and must list the user principal names:\nuser principal name\nuser1@domain.com\nuser2@domain.com\nSelect\nAdd users to policies\nto accept the changes.\nStop scoring users in a policy\nTo stop scoring users in a policy, see the\nInsider Risk Management users: Remove users from in-scope assignment to policies\narticle.\nDelete a policy\nImportant\nYou can't undo a policy deletion.\nWhen you delete a policy, you have two options. 
You can:\nDelete just the policy.\nDelete the policy and all associated alerts and users.\nIf you choose the second option:\nAll alerts generated by that policy are deleted unless they're associated with a case. Associated cases are never deleted when you delete a policy.\nAny user associated with an alert from that policy is removed from the\nUsers\npage.\nIf a user is in scope of more than one policy, you remove the user only from the policy that you're deleting. You don't remove the user from other active policies.\nFor example, you might create a policy for test purposes before rolling it out to your organization. After you finish testing, you can quickly delete the policy and all associated test data so that you can start fresh when you're ready to push the policy live.\nIt can take up to 72 hours to complete a policy deletion.\nNote\nIf you delete a policy associated with Insider Risk Management\nAdaptive Protection\n, you see a warning that Adaptive Protection stops assigning insider risk levels to users until you choose a different policy in Adaptive Protection. 
This warning appears because Adaptive Protection must be associated with a policy to be in effect.\nTo delete a policy\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nPolicies\nin the left navigation.\nOn the policy dashboard, select the policy you want to delete.\nSelect\nDelete\non the dashboard toolbar.\nChoose one of the following options:\nSelect\nDelete only the policy\n.\nSelect\nDelete the policy and all associated alerts and users\n.\nImportant\nYou can't undo a policy deletion.\nSelect\nConfirm\n.\nYou see a message in the upper part of the screen that tells you if the deletion was successful or whether it's pending.", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Create Insider Risk Policies", "section": "Microsoft Purview" }, "https://learn.microsoft.com/en-us/purview/insider-risk-management-settings-policy-indicators": { - "content_hash": "sha256:0558bbe30f14f7662444b30d344c93494e33ead4d367b41c5b4edfa9efa47f1e", - "normalized_content": "
Configure policy indicators in Insider Risk Management\nImportant\nMicrosoft Purview Insider Risk Management\ncorrelates various signals to identify potential malicious or inadvertent insider risks, such as IP theft, data leakage, and security violations. Insider Risk Management enables customers to create policies to manage security and compliance. Built with privacy by design, users are pseudonymized by default, and role-based access controls and audit logs are in place to help ensure user-level privacy.\nInsider risk policy templates in Microsoft Purview Insider Risk Management define the type of risk activities that you want to detect and investigate. Each policy template is based on specific indicators that correspond to specific triggers and risk activities. All global indicators are disabled by default;\nyou must select one or more indicators to configure an Insider Risk Management policy\n.\nPolicies collect signals and trigger alerts when users perform activities related to the indicators.\nTypes of events and indicators\nInsider Risk Management uses different types of events and indicators to collect signals and create alerts:\nTriggering events\n: Events that determine if a user is active in an Insider Risk Management policy. If you add a user to an Insider Risk Management policy that doesn't have a triggering event, the policy doesn't evaluate the user as a potential risk. For example, User A is added to a policy created from the\nData theft by departing users\npolicy template and the policy and Microsoft 365 HR connector are properly configured. Until User A has a termination date reported by the HR connector, the policy doesn't evaluate User A for potential risk. 
Another example of a triggering event is if a user has a\nHigh\nseverity data loss prevention (DLP) policy alert when using\nData leaks\npolicies.\nGlobal settings indicators\n: Indicators enabled in global settings for Insider Risk Management define both the indicators available for configuration in policies and the types of event signals collected by Insider Risk Management. For example, if a user copies data to personal cloud storage services or portable storage devices and you select these indicators only in global settings, you can review the user's potentially risky activity in the Activity explorer. If you don't define this user in an Insider Risk Management policy, the policy doesn't evaluate the user as a potential risk and therefore doesn't assign a risk score or generate an alert.\nPolicy indicators\n: Indicators included in Insider Risk Management policies determine a risk score for an in-scope user. You enable policy indicators from indicators defined in global settings. The policy indicators activate only after a triggering event occurs for a user. Examples of policy indicators include:\nA user copies data to personal cloud storage services or portable storage devices.\nA user account is removed from Microsoft Entra ID.\nA user shares internal files and folders with unauthorized external parties.\nYou can use certain policy indicators and sequences to customize triggering events for specific policy templates. When you configure these indicators or sequences in the policy workflow for the\nGeneral data leaks\nor\nData leaks by priority users\ntemplates, you get more flexibility and customization for your policies and for when users are in scope for a policy. 
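The relationship described above, where policy indicators contribute risk scores only after a triggering event occurs for the user, can be sketched as follows; the event names and scores are illustrative, not actual product identifiers.

```python
# Illustrative sketch: a policy indicator contributes a risk score only after
# a triggering event has occurred for the user. Events and scores are made up.

TRIGGERING_EVENTS = {"hr_termination_date", "high_severity_dlp_alert"}
POLICY_INDICATORS = {"copy_to_personal_cloud": 30, "share_with_external": 40}

def score_user(events: list[str]) -> int:
    triggered = False
    total = 0
    for event in events:
        if event in TRIGGERING_EVENTS:
            triggered = True                   # user becomes in scope
        elif triggered and event in POLICY_INDICATORS:
            total += POLICY_INDICATORS[event]  # scored only after the trigger
    return total

# Indicator activity before the trigger isn't scored:
print(score_user(["copy_to_personal_cloud", "hr_termination_date", "share_with_external"]))  # 40
print(score_user(["copy_to_personal_cloud", "share_with_external"]))                          # 0
```

This mirrors the example in the text: until a termination date (or other trigger) is reported, the policy doesn't evaluate the user's activity.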
You can also define risk management activity thresholds for these triggering indicators for more fine-grained control in a policy.\nDefine the insider risk policy indicators that are enabled in all insider risk policies\nSelect\nSettings\n, then select\nPolicy indicators\n.\nSelect one or more policy indicators.\nThe indicators you select on the\nPolicy indicators\nsettings page can't be individually configured when creating or editing an insider risk policy in the policy workflow.\nNote\nIt might take several hours for new manually added users to appear in the\nUsers dashboard\n. Activities for the previous 90 days for these users might take up to 24 hours to display. To view activities for manually added users, select the user on the\nUsers dashboard\nand open the\nUser activity\ntab in the details pane.\nIndicators and pay-as-you-go billing\nSome indicators included in Insider Risk Management require that you enable the\npay-as-you-go billing model\nfor your organization. Depending on your configured billing model, a notification might be displayed prompting you to configure pay-as-you-go billing to use these indicators.\nBuilt-in indicators vs. custom indicators\nPolicy indicators are organized into two tabs:\nBuilt-in indicators\n: Insider Risk Management includes many built-in indicators for various scenarios that you can use right away in your policies. Choose the indicators that you want to activate, then customize indicator thresholds for each indicator level when you create an insider risk policy. This article describes the built-in indicators in more detail.\nCustom indicators\n: Use custom indicators together with the\nInsider Risk Indicators (preview) connector\nto bring non-Microsoft detections to Insider Risk Management. 
For example, you might want to extend your detections to include Salesforce and Dropbox and use them alongside the built-in detections provided by the Insider Risk Management solution, which focuses on Microsoft workloads (for example, SharePoint Online and Exchange Online). Learn more about creating a custom indicator.

## Built-in indicators

Insider Risk Management includes the following built-in indicators.

### Office indicators

These indicators include policy indicators for SharePoint sites, Microsoft Teams, and email messaging.

### Cloud storage indicators

**Important**: To use this indicator, enable pay-as-you-go billing in your organization.

These indicators include policy indicators for Google Drive, Box, and Dropbox that you can use to detect techniques used to determine the environment, gather and steal data, and disrupt the availability or compromise the integrity of a system. To select from cloud storage indicators, you must first connect to the relevant cloud storage apps in Microsoft Defender.

After configuring these indicators, you can turn off indicators for the apps you don't want to use in settings. For example, you can select a content download indicator for Box and Google Drive, but not Dropbox.

### Cloud service indicators

**Important**: To use this indicator, enable pay-as-you-go billing in your organization.

These indicators include policy indicators for Amazon S3 and Azure (SQL Server and Storage) that you can use to detect techniques used to avoid detection or risky activities.
These techniques might include:

- Disabling trace logs
- Updating or deleting SQL Server firewall rules
- Techniques used to steal data, such as sensitive documents
- Techniques used to disrupt the availability or compromise the integrity of a system
- Techniques used to gain higher-level permissions to systems and data

To select from cloud service indicators, you must first connect to the relevant source service apps in Microsoft Defender.

### Network indicators

**Important**: To use this indicator, enable pay-as-you-go billing in your organization.

These indicators include HTTP and HTTPS network traffic from third-party network security solutions. You can identify sensitive items that are being shared through these interactions. Detecting these activities requires creating a collection policy as a prerequisite. Learn more about network data security.

### Microsoft Entra ID indicators

These indicators include risk detections from Microsoft Entra ID Protection. Risk detections are a powerful resource that can include any suspicious or anomalous activity related to a user account in the directory. Microsoft Entra ID Protection risk detections can be linked to an individual user or sign-in event.

User risk detections might flag a legitimate user account as at risk when a potential threat actor gains access to the account by compromising its credentials, or when some type of anomalous user activity is detected. Sign-in risk detections represent the probability that a given authentication request isn't from the authorized owner of the account.

To keep the indicators relevant to Insider Risk Management policies, only Microsoft Entra alerts in a **ConfirmedCompromised** or **Remediated** state are evaluated.
To learn more about the risk detections in Microsoft Entra ID Protection, see **Risk detections in Microsoft Entra ID Protection**.

### Microsoft Fabric indicators

**Important**: To use this indicator, enable pay-as-you-go billing in your organization.

These indicators include policy indicators for Microsoft Fabric workloads such as Power BI and Lakehouse (preview). They help you detect techniques used to:

- Figure out the environment (for example, viewing Power BI reports and dashboards).
- Gather data of interest (for example, downloading Power BI reports).
- Obfuscate the gathered data or change its protection (for example, downgrading or removing sensitivity labels of Power BI or Lakehouse assets).
- Exfiltrate the data (for example, sharing Lakehouse data with people outside the organization).

### Generative AI apps indicators (preview)

These indicators include policy indicators for numerous generative AI applications. Use these indicators in policies to analyze interactions (prompts and responses) entered into these applications and to help detect inappropriate or risky interactions or sharing of confidential information. These indicators cover the following generative AI applications:

- **Microsoft Copilot experiences**: Support for user interactions in Copilot in Microsoft Fabric, Microsoft Security Copilot, Microsoft Copilot Studio, and any connected or cloud AI application.

  **Important**: To use this indicator for non-Microsoft 365 AI data, enable pay-as-you-go billing in your organization. Non-Microsoft 365 AI data includes information from other generative AI applications from Microsoft and other connected external AI applications. This data type includes Copilot in Microsoft Fabric, Microsoft Security Copilot, Microsoft Copilot Studio, and any connected or cloud AI application.
  There aren't any pay-as-you-go billing requirements or charges for detecting inappropriate or risky interactions in Microsoft 365 Copilot data.

- **Enterprise AI apps**: Non-Copilot AI applications connected using Microsoft Entra and Microsoft Purview Data Map connectors.

  **Important**: To use this indicator, enable pay-as-you-go billing in your organization.

- **Other AI applications**: AI applications that users in your organization discover from their browser activity.

- **Azure AI Content Safety indicators**: Support for Communication Compliance indicators to identify prompts and responses matching classifiers provided by Azure AI Content Safety, such as Prompt shields and Protected Materials.

  **Important**: When you select this indicator, you create a Communication Compliance policy. If you modify this policy in Communication Compliance, you might need to pay for pay-as-you-go billing.

### Communication Compliance indicators

These indicators include policy indicators that detect employment stressor events, such as emotional outbursts, bullying, failure to take criticism, inability to work or communicate with a team or group, discrimination, violent threats, extremist behavior, and so on. Insider Risk Management works together with the Microsoft Purview Communication Compliance solution to detect these types of stressors that indicate an unhealthy workplace environment. Employment stressor events can impact user behavior for risky personas (whether initiators or targets of bad behavior) in several ways that relate to insider risks. Counterproductive work behavior can be a precursor to more serious violations, such as sabotaging company assets or leaking sensitive information.

Additionally, you can choose to detect messages matching specific sensitive information types (SITs).
Adding sensitive information that was inadvertently or maliciously included in messages to user risk scores and activity histories gives investigators more information to help them quickly take action to mitigate potential data leakage. You can select up to 30 SITs for a policy. Some scenarios might include helping to detect:

- Foreign recruitment
- State actor poaching
- Sharing sensitive information like secret formulas, financial reports, and other proprietary property
- Sharing passwords

**Note**: You can also use a Communication Compliance policy as a trigger.

#### How it works

You can choose from these Communication Compliance indicators:

- Sending inappropriate content
- Sending financial regulatory text that might be risky
- Sending inappropriate images

You can also choose to detect sensitive information types included in messages.

When you select **Create policy** from the **Communication Compliance indicators** section:

- A single policy is created in Communication Compliance that detects messages in Microsoft Exchange Online, Microsoft Teams, Microsoft Viva Engage, Microsoft 365 Copilot, and Microsoft 365 Copilot Chat. The Communication Compliance policy is based on the indicators and SITs you select. Each indicator is associated with specific trainable classifiers used by Communication Compliance.
For more information, see **Content safety classifiers based on large language models**.

**Tip**: Select the information icon next to each indicator to see the trainable classifiers that the indicator uses.

- In Communication Compliance, the trainable classifiers and SITs are listed as conditions for the policy.
- The Communication Compliance policy is named "Insider risk indicator" plus a timestamp, for example: "Insider risk indicator 24-05-01T09.27.17Z" or "Insider risk SIT indicator 24-05-01T09.27.17Z".
- Anyone with the **Insider Risk Investigators** role in Insider Risk Management is automatically added as a reviewer for the Communication Compliance policy.

  **Note**: After creating the Communication Compliance policy, to add a reviewer to the policy, you must add the reviewer manually to the **Communication Compliance Investigators** role group.

- If you turn off all of the indicators in the **Policy indicators** setting, you pause the Communication Compliance policy. The policy is reenabled if you turn any of the indicators back on.
- The Communication Compliance indicators are available for new and existing policies in Insider Risk Management that are based on the **Data theft** or **Data leaks** templates.
- If content sent in a message matches any of the trainable classifiers, it results in a policy match in Communication Compliance that can be remediated from the **Policies** page.
- Indicators included in Insider Risk Management policies determine a risk score for an in-scope user. They activate only after a triggering event occurs for a user. **Communication Risk** insights appear in the **Activity explorer** and **User activity** tabs in Insider Risk Management. If you drill down into a policy match from the **Activity explorer** or **User activity** tab, you can learn more about the activity and access a link that opens the Communication Compliance policy.
In the Communication Compliance policy, you can see the content of the messages that were sent.

**Note**: You must have the **Communication Compliance** role or the **Communication Compliance Investigators** role to access the Communication Compliance link.

#### Enable Communication Compliance indicators in Insider Risk Management

1. In Insider Risk Management, go to **Settings** > **Policy indicators**, then scroll to the **Communication Compliance indicators (preview)** section.
2. Under **Detect messages matching specific trainable classifiers (preview)**, select **Create policy**. The policy you create is in Communication Compliance, and the Communication Compliance indicators become available in the **Policy indicators** setting.

   **Note**: If you already created a Communication Compliance policy but paused it, selecting **Create policy** resumes the Communication Compliance policy. In this case, the **Status** column in the Communication Compliance **Policies** list shows "Resuming".

3. Select one or more of the Communication Compliance indicators in the **Policy indicators** setting.

   **Note**: If you already created a Communication Compliance policy and you select different indicators, the Communication Compliance policy changes to reflect the appropriate trainable classifiers. Turning off all indicators pauses the Communication Compliance policy.

4. Select **Save**.
5. To use the indicators, create a new insider risk policy or edit an existing policy. The indicators appear on the **Indicators** page of the policy workflow. You can adjust thresholds for the indicators as you would for any other indicators in an Insider Risk Management policy.

**Note**: At this time, real-time analytics for indicator threshold settings aren't available for the Communication Compliance indicators.

### Data loss prevention alerts indicators

These indicators include policy indicators that integrate with data loss prevention (DLP) policies.
By selecting DLP policies as indicators in Insider Risk Management policies, you can automatically detect whether a user has existing alerts in connected DLP policies. DLP policies help protect sensitive information and reduce the risks of oversharing data with inappropriate users or organizations.

When an Insider Risk Management alert is generated for a user, you can quickly determine whether the user has any high-risk alerts associated with DLP policies in your organization without having to navigate to the DLP solution in the Microsoft Purview portal. You can review and evaluate the Insider Risk Management activity and associated DLP alerts within Insider Risk Management in a unified view.

#### Configure DLP alerts indicators

**Step 1**: To enable DLP alerts as indicators, complete the following steps:

1. In Insider Risk Management settings, select **Policy indicators**, then select the **Built-in Indicators** tab.
2. Navigate to **Data loss prevention (DLP) indicators**.
3. Select **Add DLP policies**.
4. Select the DLP policies that you want to see alerts for in Insider Risk Management.
5. Select **Add**.
6. Select the **Generating alerts from selected DLP policies** checkbox.
7. Select **Save**.

**Step 2**: To assign DLP alerts indicators to a specific Insider Risk Management policy, complete the following steps:

1. Create a custom policy using one of the following templates:
   - Data theft by departing users
   - Data leaks
   - Data leaks by priority users
   - Data leaks by risky users
   - Risky AI usage
2. Configure the policy as applicable until you reach the **Indicators** page.
3. On the **Indicators** page, navigate to **Data loss prevention (DLP) indicators**.
4. Select the **Generating alerts from selected DLP policies** checkbox.
5. Complete the policy configuration workflow and save the new policy.

### Device indicators

These policy indicators include activities such as sharing files over the network or with devices.
Indicators include activities involving all file types, excluding executable (.exe) and dynamic link library (.dll) file activity. If you select **Device indicators**, the system processes activity for devices running Windows 10 Build 1809 or higher and for macOS devices (the three latest released versions). For both Windows and macOS devices, you must first onboard devices. Device indicators also include browser signal detection to help your organization detect and act on exfiltration signals for nonexecutable files viewed, copied, shared, or printed in Microsoft Edge and Google Chrome. For more information on configuring Windows devices for integration with insider risk, see **Enable device indicators and onboard Windows devices** in this article. For more information on configuring macOS devices for integration with insider risk, see **Enable device indicators and onboard macOS devices** in this article. For more information about browser signal detection, see **Learn about and configure Insider Risk Management browser signal detection**.

**Important**: Device indicators are included in collection policy evaluations. When you configure and deploy a collection policy, if there's a mismatch between an Insider Risk Management policy that includes device indicators and a collection policy in your organization, the collection policy configuration takes precedence. This means that if you configure an Insider Risk Management policy to monitor a specific activity for devices, but you configure the collection policy to filter out that device activity, the device activity isn't collected and isn't available for review in Insider Risk Management.

### Microsoft Defender for Endpoint indicators (preview)

These indicators come from Microsoft Defender for Endpoint and relate to unapproved or malicious software installation or to bypassing security controls.
To receive alerts in Insider Risk Management, you must have an active Defender for Endpoint license and insider risk integration enabled. For more information on configuring Defender for Endpoint for Insider Risk Management integration, see **Configure advanced features in Microsoft Defender for Endpoint**.

### Health record access indicators

These policy indicators cover patient medical record access. For example, attempted access to patient medical records in your electronic medical records (EMR) system logs can be shared with Insider Risk Management healthcare policies. To receive these types of alerts in Insider Risk Management, you must have a healthcare-specific data connector and the HR data connector configured.

### Physical access indicators

These policy indicators cover physical access to sensitive assets. For example, attempted access to a restricted area in your physical badging system logs can be shared with Insider Risk Management policies. To receive these types of alerts in Insider Risk Management, you must have priority physical assets enabled in Insider Risk Management and the **Physical badging data connector** configured. To learn more about configuring physical access, see the **Priority physical access** section in this article.

### Microsoft Defender for Cloud Apps indicators

These policy indicators come from shared alerts from Defender for Cloud Apps. Automatically enabled anomaly detection in Defender for Cloud Apps immediately starts detecting and collating results, targeting numerous behavioral anomalies across your users and the machines and devices connected to your network. To include these activities in Insider Risk Management policy alerts, select one or more indicators in this section.
To learn more about Defender for Cloud Apps analytics and anomaly detection, see **Get behavioral analytics and anomaly detection**.

### Risky Agents indicators (preview)

These policy indicators cover agent interactions with users and with sensitive or risky resources. They include risky agent prompts, agent-generated sensitive responses, access to sensitive or priority SharePoint files, agents accessing risky websites, and agents using tools with sensitive information.

### Risky AI usage indicators (preview)

These policy indicators cover Microsoft AI tools and applications. They include both risky prompt behavior from users and AI-generated responses that contain sensitive information. For example, a user attempting to share sensitive information in an AI tool or application is considered risky activity. Similarly, an AI tool or application returning a response that contains sensitive information is also considered risky behavior.

### Risky browsing indicators (preview)

These policy indicators cover browsing activity related to websites that are considered malicious or risky and that pose potential insider risk that might lead to a security or compliance incident. Risky browsing activity refers to users visiting potentially risky websites, such as those associated with malware, pornography, violence, and other unallowed activities. To include these risk management activities in policy alerts, select one or more indicators in this section. To learn about configuring browser exfiltration signals, see **Insider Risk Management browser signal detection**.

### Cumulative exfiltration detection indicators

These indicators detect when a user's exfiltration activities across all exfiltration channels over the last 30 days exceed organization or peer group norms. For example, if a user is in a sales role and communicates regularly with customers and partners outside of the organization, their external email activity is likely higher than the organization's average.
However, the user's activity might not be unusual compared to the user's teammates or others with similar job titles. A risk score is assigned if the user's cumulative exfiltration activity is unusual and exceeds organization or peer group norms.

**Note**: Peer groups are defined based on organization hierarchy, access to shared SharePoint resources, and job titles in Microsoft Entra ID. If you enable cumulative exfiltration detection, your organization agrees to share Microsoft Entra data with the Microsoft Purview portal, including organization hierarchy and job titles. If your organization doesn't use Microsoft Entra ID to maintain this information, detection might be less accurate.

### Risk score boosters

These indicators raise the risk score for activity for the following reasons:

- **Activity that is above the user's usual activity for that day**: Scores are boosted if the detected activity deviates from the user's typical behavior.
- **User had a previous case resolved as a policy violation**: Scores are boosted if the user had a previous case in Insider Risk Management that was resolved as a policy violation.
- **User is a member of a priority user group**: Scores are boosted if the user is a member of a priority user group.
- **User is detected as a potential high impact user**: When you enable this indicator, users are automatically flagged as potential high-impact users based on the following criteria:
  - The user interacts with more sensitive content compared to others in the organization.
  - The user's level in the organization's Microsoft Entra hierarchy.
  - The total number of users reporting to the user based on the Microsoft Entra hierarchy.
  - The user is a member of a Microsoft Entra built-in role with elevated permissions.

**Note**: When you enable the potential high impact user risk score booster, you agree to share Microsoft Entra data with the Microsoft Purview portal.
If your organization doesn't use sensitivity labels or hasn't configured organization hierarchy in Microsoft Entra ID, this detection might be less accurate. If a user is detected as both a member of a priority user group and a potential high-impact user, their risk score is only boosted once.

In some cases, you might want to limit the insider risk policy indicators that apply to insider risk policies in your organization. You can turn off the policy indicators for specific areas by disabling them from all insider risk policies in global settings. You can only modify triggering events for policies created from the **Data leaks** or **Data leaks by priority users** templates. Policies created from all other templates don't have customizable triggering indicators or events.

## Custom indicators

Use the **Custom Indicators** tab to create a custom indicator to use as a trigger or as a policy indicator in your policies.

**Note**: To create a custom indicator to import third-party indicator data, you must first create an Insider Risk Indicators connector (preview).

1. In Insider Risk Management settings, select **Policy indicators**, then select the **Custom Indicators** tab.
2. Select **Add custom indicator**.
3. Enter an indicator name and a description (optional).
4. In the **Data connector** list, select the Insider Risk Indicator connector that you created previously. When you select a data connector, the name of the source column that you selected when you created the connector appears in the **Source column from mapping file** field. If you didn't select a source column when you created the connector, **None** appears in this field and you don't need to make a selection.
5. In the **Values in source column** list, select the value that you want to assign to the custom indicator. These values relate to the source column that you specified when you created the connector. For example, if you created a single connector that includes data for two indicators (Salesforce and Dropbox), you see those values in the list.
6. If you want to use a column to set threshold values, in the **Data from mapping file** list, select the column that you want to use for the threshold setting; otherwise, select the **Use only as a triggering event without any thresholds** option.

   **Note**: Only fields that have a **Number** data type appear in the **Data from mapping file** list, since a **Number** data type is required to set a threshold value. The data type is specified when you set up the connector.

7. Select **Add indicator**. The indicator is added to the **Custom Indicators** list.

Now you can use the custom indicator in any **Data theft** or **Data leaks** policies that you create or edit:

- If you use the custom indicator as a trigger, select your custom trigger on the **Triggers** page when you create or edit the policy.
- If you use the custom indicator as a policy indicator, select your custom indicator on the **Indicators** page when you create or edit the policy.

**Note**: After selecting your custom trigger or indicator, make sure to set a custom threshold (don't use the default thresholds). You can't set trigger thresholds on a custom indicator if you select the **Use only as a triggering event without any thresholds** option.

After adding the custom indicator to your policies, the triggers and insights generated based on the custom indicators appear in the **Alerts dashboard**, **Activity explorer**, and **User timeline**.

**Important**: Wait 24 hours after updating the custom indicators and the associated policies before uploading the data. It can take several hours to sync all components. If you upload the data while the updates are still syncing, some data might not be scored for risk.

## Create a variant of a built-in indicator

You can create detection groups and use them with variants of built-in indicators to tailor detections for different sets of users.
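The exclusion/inclusion choice behind variant scoping can be sketched as a simple filter. This is a hypothetical illustration only (the `variant_matches` helper is not a Purview API): an "ignore" variant captures everything except items in the detection group, while an "only" variant captures just those items.

```python
# Hypothetical sketch of variant scoping -- not a Purview API.

def variant_matches(item: str, detection_group: set, mode: str) -> bool:
    """mode='ignore': capture everything except the group (exclusions).
    mode='only': capture just the group (inclusions)."""
    in_group = item in detection_group
    return not in_group if mode == "ignore" else in_group

personal_domains = {"outlook.com", "gmail.com"}

# Exclusion variant: all outgoing email except mail to specific domains.
print(variant_matches("partner.com", personal_domains, "ignore"))  # True
# Inclusion variant: only email sent to personal domains.
print(variant_matches("gmail.com", personal_domains, "only"))      # True
print(variant_matches("partner.com", personal_domains, "only"))    # False
```

The same detection group yields opposite capture sets depending on the mode, which is why the two options in the variant pane are mutually exclusive.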
For example, to reduce the number of false positives for email activities, you might want to create a variant of the **Sending email with attachments to recipients outside the organization** built-in indicator that only detects email sent to personal domains. A variant inherits all the properties of the built-in indicator. You can modify the variant with exclusions or inclusions.

1. In Insider Risk Management settings, select **Policy indicators**.
2. Select **New indicator variant (preview)**. This step opens the **New indicator variant (preview)** pane on the right side of the screen.
3. In the **Base indicator** list, select the indicator that you want to create a variant for.

   **Note**: You can create up to ten variants for each built-in indicator and a total of 100 variants across all indicators. If you already created ten variants for a particular built-in indicator, the built-in indicator appears grayed out in the list. Some built-in indicators (Microsoft Defender for Endpoint indicators, for example) don't support variants.

4. Add a name for the variant (or accept the suggested name). A variant name can't be more than 110 characters.
5. Add a description for the variant (optional). The description appears in the policy to help you differentiate it from other indicators or indicator variants. A description for a variant can't be more than 256 characters.
6. Under **Detection group**, select one of the following options:
   - **Ignore activity involving items in selected groups**: Select this option if you want to capture everything except for a few exclusions. For example, you might want to use this option to capture all outgoing email except for email sent to specific domains.
   - **Only detect activity involving items in selected groups**: Select this option if you want to specify inclusions to capture. For example, select this option if you want to capture only email sent to certain domains.

   **Note**: If you didn't already create a detection group, you can't select an option in the **Detection group** section.

7. In the **Select one or more detection groups** list, select the detection groups that you want to apply to the variant. Detection groups are listed under the appropriate detection type heading to help you find the appropriate group. For a single variant, you can add up to five detection groups of a single type. For example, you can add up to five groups of domains, five groups of file types, and so on.

   **Note**: Only detection groups that are applicable to the variant appear in the list. For example, a file type detection group won't appear for the **Sharing SharePoint folders with people outside the organization** indicator, since it's not applicable.

8. Select **Save**.
9. In the **Next steps** dialog box, if you want to apply the new variant to a specific policy, select the **Policies page** link.

**Tip**: To make sure that a variant captures all the important activities that you want to detect, you can apply both the built-in indicator and its variant in the same policy. You can then observe the activities that each indicator captures in alerts, and switch to using only the variant indicator after making sure everything is detected.

### Use a variant in a policy

1. Go to the **Indicators** page of the policy workflow.
2. Find the built-in indicator that includes one or more variants. A small blue box in the variant checkbox marks built-in indicators that have variants. A list appears at the end of the indicator description text to show the number of selected variants. Open the list to see the variants.

   **Note**: If you select one or more checkboxes in the variant list, the first-level checkbox for the built-in indicator becomes a solid blue checkbox.
If you don't select any of the boxes in the variant list, the first-level checkbox is blank.

3. Select **Next**.
4. On the **Customize thresholds** page, you can customize threshold values for variants individually.

### Investigate insights provided by variants

When you add variants to policies, the dashboard generates alerts. An investigator can view more details in the **Activity explorer** and **User activity** tabs.

### Edit a variant

1. Select the blue text at the end of the indicator description text, for example, **+2 variants**.
2. On the **View/edit indicators** page, select **Edit**.
3. Make your changes.

### Variant limitations

- You can create up to three variants for each built-in indicator.
- You can add up to five detection groups of a single type for a single variant. For example, you can add a maximum of five groups of domains, five groups of file types, and so on.
- Variants don't support sequences, cumulative exfiltration activities, the risk score booster, or real-time analytics for the detection groups preview.

### How variants are prioritized against global exclusions and priority content

Insider Risk Management scopes activities in the following priority order:

1. Global exclusions
2. Variant scoping exclusion/inclusion
3. Priority content

## Enable device indicators and onboard Windows devices

To enable the detection of risk activities on Windows devices and include policy indicators for these activities, your Windows devices must meet the following requirements, and you must complete the following onboarding steps. Learn more about device onboarding requirements.

### Step 1: Prepare your endpoints

Make sure that the Windows 10 devices you plan to report in Insider Risk Management meet these requirements:

- The device must run Windows 10 x64 build 1809 or later and have the **Windows 10 update (OS Build 17763.1075)** from February 20, 2020 installed.
- The user account used to sign in to the Windows 10 device must be an active Microsoft Entra account. The Windows 10 device can be Microsoft Entra joined, Microsoft Entra hybrid joined, or registered.
- Install the Microsoft Edge browser on the endpoint device to detect actions for the cloud upload activity. See **Download the new Microsoft Edge based on Chromium**.

**Note**: Endpoint DLP now supports virtualized environments, which means that the Insider Risk Management solution supports virtualized environments through endpoint DLP. Learn more about support for virtualized environments in endpoint DLP.

### Step 2: Onboard devices

To detect Insider Risk Management activities on a device, you must enable device checking and onboard your endpoints. You perform both actions in Microsoft Purview. To enable devices that you haven't onboarded yet, download the appropriate script and deploy it as outlined in this article. If you already onboarded devices into **Microsoft Defender for Endpoint**, they appear in the managed devices list.

#### Onboard devices

Use this deployment scenario to enable devices that aren't onboarded yet when you want to detect insider risk activities on Windows devices.

1. Sign in to the **Microsoft Purview portal** with an admin account in your Microsoft 365 organization.
2. Select **Settings** in the upper-right corner of the page.
3. Under **Device onboarding**, select **Devices**. The list is empty until you onboard devices.
4. Select **Turn on device onboarding**.

   **Note**: While it usually takes about 60 seconds to enable device onboarding, allow up to 30 minutes before engaging with Microsoft Support.

5. Select how you want to deploy to these devices from the **Deployment method** list, then select **Download package**.
6. Follow the appropriate procedures in **Onboarding tools and methods for Windows machines**.
This link takes you to a landing page where you can access Microsoft Defender for Endpoint procedures that match the deployment package you selected in step 5:\nOnboard Windows machines using Group Policy\nOnboard Windows machines using Microsoft Endpoint Configuration Manager\nOnboard Windows machines using Mobile Device Management tools\nOnboard Windows machines using a local script\nOnboard non-persistent virtual desktop infrastructure (VDI) machines\nWhen you finish and onboard the endpoint device, it appears in the devices list. The endpoint device starts reporting audit activity logs to Insider Risk Management.\nNote\nThis experience is under license enforcement. Without the required license, data isn't visible or accessible.\nIf devices are already onboarded to Microsoft Defender for Endpoint\nIf you already deployed Microsoft Defender for Endpoint and endpoint devices are reporting in, the endpoint devices appear in the managed devices list. To expand coverage, onboard new devices into Insider Risk Management by going to\nStep 2: Onboard devices\n.\nEnable device indicators and onboard macOS devices\nYou can onboard macOS devices (Catalina 10.15 or later) into Microsoft 365 to support Insider Risk Management policies by using either Intune or JAMF Pro. For more information and configuration guidance, see\nOnboard macOS devices into Microsoft 365 overview (preview)\n.\nIndicator level settings\nWhen you create a policy by using the policy workflow, you can configure how the daily number of risk events influences the risk score for insider risk alerts. These indicator settings help you control how the number of occurrences of risk events in your organization affects the risk score and the associated alert severity for these events.\nFor example, suppose you decide to enable SharePoint indicators in the insider risk policy settings and select custom thresholds for SharePoint events when configuring indicators for a new insider risk\nData leaks\npolicy.
In the insider risk policy workflow, you configure three different daily event levels for each SharePoint indicator to influence the risk score for alerts associated with these events.\nSet the three daily event levels as follows:\n10 or more events per day\nfor a lower impact to the risk score for the events\n20 or more events per day\nfor a medium impact to the risk score for the events\n30 or more events per day\nfor a higher impact to the risk score for the events\nThese settings mean:\nIf there are 1-9 SharePoint events that take place after the triggering event, risk scores are minimally impacted and tend not to generate an alert.\nIf there are 10-19 SharePoint events that take place after a triggering event, the risk score is inherently lower and alert severity levels tend to be at a low level.\nIf there are 20-29 SharePoint events that take place after a triggering event, the risk score is inherently higher and alert severity levels tend to be at a medium level.\nIf there are 30 or more SharePoint events that take place after a triggering event, the risk score is inherently higher and alert severity levels tend to be at a high level.\nAnother option for policy thresholds is to assign the policy triggering event to risk management activity that is above the typical daily number of users. Instead of being defined by specific threshold settings, each threshold is dynamically customized for anomalous activities detected for in-scope policy users. If threshold activity for anomalous activities is supported for an individual indicator, you can select\nActivity is above user's usual activity for the day\nin the policy workflow for that indicator. If this option isn't listed, anomalous activity triggering isn't available for the indicator.
If the\nActivity is above user's usual activity for the day\noption is listed for an indicator, but isn't selectable, you need to enable this option in\nInsider risk settings\n>\nPolicy indicators\n.\nUse real-time analytics recommendations to set thresholds\nUse real-time analytics (preview) to get a guided, data-driven threshold configuration experience. With this experience, you can quickly select the right thresholds for policy indicators. The guided experience helps you efficiently adjust the selection of indicators and thresholds for activity occurrences so you don't have too few or too many policy alerts.\nWhen you turn on analytics:\nThe\nApply thresholds specific to your users' activity\noption is enabled on the\nIndicators\npage of the policy workflow. Select this option if you want Insider Risk Management to provide indicator threshold recommendations based on the previous 10 days of user activity in your organization.\nNote\nTo use this option, you must select at least one built-in policy indicator. Insider Risk Management doesn't provide recommended thresholds for\ncustom indicators\nor\nvariants of built-in indicators\n.\nIf you select the\nChoose your own thresholds\noption on the\nIndicators\npage of the policy workflow, the defaults for the threshold settings are based on recommended threshold values (based on activity in your organization) instead of the built-in default values. You also see a gauge, a list of the top five indicators, and insights for each indicator.\nA.\nThe gauge shows the approximate number of scoped users whose activities from the previous 10 days exceeded the\nlowest daily thresholds\nfor at least one of the selected built-in indicators for the policy. This gauge helps you estimate the number of alerts that might be generated if all users included in the policy were assigned risk scores.\nB.\nThe top five indicators list is sorted by the number of users exceeding the\nlowest daily thresholds\n. 
If your policies generate too many alerts, focus on these indicators to reduce \"noise.\"\nC.\nInsights for each indicator are displayed for the set of threshold settings for that indicator. The insight shows the approximate number of users whose activities from the previous 10 days exceeded the specified\nlow thresholds\nfor the indicator. For example, if the low threshold setting for\nDownloading content from SharePoint\nis set to 100, the insight shows the number of users in the policy who performed more than 100 download activities on average in the previous 10 days.\nNote\nGlobal exclusions (intelligent detections)\nare taken into account for real-time analytics.\nAdjust threshold settings manually\nIf you select the\nChoose your own thresholds\noption and manually adjust a threshold setting for a specific indicator, the insight for the indicator updates in real time. This feature helps you configure the appropriate thresholds for each indicator to achieve the highest level of alert effectiveness before activating your policies.\nTo save time and make it easier to understand the impact of manual changes to threshold values, select the\nView impact\nlink in the insight to display the\nUsers exceeding daily thresholds for indicator\ngraph. This graph provides sensitivity analysis for each policy indicator.\nImportant\nThis graph isn't available if you select the\nInclude specific users\noption when you create the policy. You must select the\nInclude all users and groups\noption.\nUse this graph to analyze the activity patterns of users in your organization for the selected indicator. For example, in the preceding illustration, the threshold indicator for\nSharing SharePoint files with people outside the organization\nis set to\n38\n. The graph shows how many users performed actions that exceeded that threshold value and the distribution of low, medium, and high severity alerts for those actions. Select a bar to see insights for each value. 
For example, in the previous illustration, the bar for the\n50\nvalue is selected and the insight shows that at that threshold value, approximately 22 users each performed at least 50 events on at least one day in the past 10 days.\nPrerequisites for using real-time analytics\nTo use real-time analytics (preview), you must\nenable insider risk analytics insights\n. After you enable analytics, it can take 24 to 48 hours for insights and recommendations to appear.", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:a4a1a267f07e6b5bcc6b651372dbcb36c08edb3637a146093385a21287d00ccd", + "normalized_content": "Configure policy indicators in Insider Risk Management\nImportant\nMicrosoft Purview Insider Risk Management\ncorrelates various signals to identify potential malicious or inadvertent insider risks, such as IP theft, data leakage, and security violations. Insider Risk Management enables customers to create policies to manage security and compliance. Built with privacy by design, users are pseudonymized by default, and role-based access controls and audit logs are in place to help ensure user-level privacy.\nInsider risk policy templates in Microsoft Purview Insider Risk Management define the type of risk activities that you want to detect and investigate.
Each policy template is based on specific indicators that correspond to specific triggers and risk activities. All global indicators are disabled by default;\nyou must select one or more indicators to configure an Insider Risk Management policy\n.\nPolicies collect signals and trigger alerts when users perform activities related to the indicators.\nTypes of events and indicators\nInsider Risk Management uses different types of events and indicators to collect signals and create alerts:\nTriggering events\n: Events that determine if a user is active in an Insider Risk Management policy. If you add a user to an Insider Risk Management policy that doesn't have a triggering event, the policy doesn't evaluate the user as a potential risk. For example, User A is added to a policy created from the\nData theft by departing users\npolicy template and the policy and Microsoft 365 HR connector are properly configured. Until User A has a termination date reported by the HR connector, the policy doesn't evaluate User A for potential risk. Another example of a triggering event is if a user has a\nHigh\nseverity data loss prevention (DLP) policy alert when using\nData leaks\npolicies.\nGlobal settings indicators\n: Indicators enabled in global settings for Insider Risk Management define both the indicators available for configuration in policies and the types of events signals collected by Insider Risk Management. For example, if a user copies data to personal cloud storage services or portable storage devices and you select these indicators only in global settings, you can review the user's potentially risky activity in the Activity explorer. If you don't define this user in an Insider Risk Management policy, the policy doesn't evaluate the user as a potential risk and therefore doesn't assign a risk score or generate an alert.\nPolicy indicators\n: Indicators included in Insider Risk Management policies determine a risk score for an in-scope user. 
You enable policy indicators from indicators defined in global settings. The policy indicators activate only after a triggering event occurs for a user. Examples of policy indicators include:\nA user copies data to personal cloud storage services or portable storage devices.\nA user account is removed from Microsoft Entra ID.\nA user shares internal files and folders with unauthorized external parties.\nYou can use certain policy indicators and sequences to customize triggering events for specific policy templates. When you configure these indicators or sequences in the policy workflow for the\nGeneral data leaks\nor\nData leaks by priority users\ntemplates, you get more flexibility and customization for your policies and when users are in-scope for a policy. You can also define risk management activity thresholds for these triggering indicators for more fine-grained control in a policy.\nDefine the insider risk policy indicators that are enabled in all insider risk policies\nSelect\nSettings\n, then select\nPolicy indicators\n.\nSelect one or more policy indicators.\nThe indicators you select on the\nPolicy indicators\nsettings page can't be individually configured when creating or editing an insider risk policy in the policy workflow.\nNote\nIt might take several hours for new manually added users to appear in the\nUsers dashboard\n. Activities for the previous 90 days for these users might take up to 24 hours to display. To view activities for manually added users, select the user on the\nUsers dashboard\nand open the\nUser activity\ntab in the details pane.\nTwo types of policy indicators: built-in indicators and custom indicators\nIndicators and pay-as-you-go billing\nSome indicators included in Insider Risk Management require that you enable the\npay-as-you-go billing model\nfor your organization.
Depending on your configured billing model, a notification might be displayed prompting you to configure pay-as-you-go billing to use these indicators.\nBuilt-in indicators vs. custom indicators\nPolicy indicators are organized into two tabs:\nBuilt-in indicators\n: Insider Risk Management includes many built-in indicators for various scenarios that you can use right away in your policies. Choose the indicators that you want to activate, then customize indicator thresholds for each indicator level when you create an insider risk policy. This article describes the built-in indicators in more detail.\nCustom indicators\n: Use custom indicators together with the\nInsider Risk Indicators (preview) connector\nto bring non-Microsoft detections to Insider Risk Management. For example, you might want to extend your detections to include Salesforce and Dropbox and use them alongside the built-in detections provided by the Insider Risk Management solution, which is focused on Microsoft workloads (SharePoint Online and Exchange Online, for example).\nLearn more about creating a custom indicator\nBuilt-in indicators\nInsider Risk Management includes the following built-in indicators.\nOffice indicators\nThese indicators include policy indicators for SharePoint sites, Microsoft Teams, and email messaging.\nCloud storage indicators\nImportant\nTo use this indicator, enable\npay-as-you-go billing\nin your organization.\nThese indicators include policy indicators for Google Drive, Box, and Dropbox that you can use to detect techniques used to determine the environment, gather and steal data, and disrupt the availability or compromise the integrity of a system. To select from\ncloud storage indicators\n, you must\nfirst connect to the relevant cloud storage apps in Microsoft Defender\n.\nAfter configuring these indicators, you can turn off indicators for the apps you don't want to use in settings. 
For example, you can select a content download indicator for Box and Google Drive, but not Dropbox.\nCloud service indicators\nImportant\nTo use this indicator, enable\npay-as-you-go billing\nin your organization.\nThese indicators include policy indicators for Amazon S3 and Azure (SQL Server and Storage) that you can use to detect techniques used to avoid detection or risky activities. These techniques might include:\nDisabling trace logs\nUpdating or deleting SQL Server firewall rules\nTechniques used to steal data, such as sensitive documents\nTechniques used to disrupt the availability or compromise the integrity of a system\nTechniques used to gain higher-level permissions to systems and data\nTo select from\ncloud service indicators\n, you must\nfirst connect to the relevant source service apps in Microsoft Defender\n.\nNetwork indicators\nImportant\nTo use this indicator, enable\npay-as-you-go billing\nin your organization.\nThese indicators include HTTP and HTTPS network traffic from third-party network security solutions. You can identify sensitive items that are being shared through these interactions. Detection of these activities requires creation of a\ncollection policy\nas a prerequisite. Learn more about\nnetwork data security\n.\nMicrosoft Entra ID indicators\nThese indicators include risk detections from\nMicrosoft Entra ID Protection\n. Risk detections are a powerful resource that can include any suspicious or anomalous activity related to a user account in the directory. Microsoft Entra ID Protection risk detections can be linked to an individual user or sign-in event.\nUser risk detections might flag a legitimate user account as at risk when a potential threat actor gains access to an account by compromising their credentials or when they detect some type of anomalous user activity.
Sign-in risk detections represent the probability that a given authentication request isn't authorized by the owner of the account.\nTo maintain relevance of the indicators to Insider Risk Management policies, only Microsoft Entra alerts in a\nConfirmedCompromised\nor\nRemediated\nstatus are evaluated. To learn more about the risk detections in Microsoft Entra ID Protection, see\nRisk detections in Microsoft Entra ID Protection\n.\nMicrosoft Fabric indicators\nImportant\nTo use this indicator, enable\npay-as-you-go billing\nin your organization.\nThese indicators include policy indicators for Microsoft Fabric workloads such as Power BI and Lakehouse. They help you detect techniques used to:\nFigure out the environment (for example, viewing Power BI reports and dashboards).\nGather data of interest (for example, downloading Power BI reports).\nObfuscate the data gathered or change protection (for example, downgrading or removing sensitivity labels of Power BI or Lakehouse assets).\nExfiltrate the data (for example, sharing Lakehouse data with people outside the organization).\nGenerative AI apps indicators (preview)\nThese indicators include policy indicators for numerous generative AI applications. Use these indicators in policies to analyze interactions (prompts and responses) entered into these applications and help detect inappropriate or risky interactions or sharing of confidential information. These indicators include the following generative AI applications:\nMicrosoft Copilot experiences\n: Support for user interactions in\nCopilot in Microsoft Fabric\n,\nMicrosoft Security Copilot\n,\nMicrosoft Copilot Studio\n, and any connected or cloud AI application.\nImportant\nTo use this indicator for non-Microsoft 365 AI data, enable\npay-as-you-go billing\nin your organization. Non-Microsoft 365 AI data includes information from other generative AI applications from Microsoft and other connected external AI applications.
This data type includes\nCopilot in Microsoft Fabric\n,\nMicrosoft Security Copilot\n,\nMicrosoft Copilot Studio\n, and any connected or cloud AI application. There aren't any pay-as-you-go billing requirements or charges for detecting inappropriate or risky interactions in Microsoft 365 Copilot data.\nEnterprise AI apps\n: Non-Copilot AI applications connected using\nMicrosoft Entra\nand\nMicrosoft Purview Data Map\nconnectors.\nImportant\nTo use this indicator, enable\npay-as-you-go billing\nin your organization.\nOther AI applications\n: AI applications that users in your organization discover from their browser activity.\nAzure AI Content Safety indicators\n: Support for\nCommunication Compliance indicators\nto identify prompts and responses matching classifiers provided by Azure AI Content Safety like\nPrompt shields\nand\nProtected Materials\n.\nImportant\nWhen you select this indicator, you create a Communication Compliance policy. If you modify this policy in Communication Compliance, you might need to pay for\npay-as-you-go billing\n.\nCommunication Compliance indicators\nThese indicators include policy indicators that detect employment stressor events, such as emotional outbursts, bullying, failure to take criticism, inability to work or communicate with a team or group, discrimination, violent threats, extremist behavior, and so on. Insider Risk Management works together with the\nMicrosoft Purview Communication Compliance solution\nto detect these types of stressors that indicate an unhealthy workplace environment. Employment stressor events can impact user behavior for risky personas (whether initiators or targets of bad behavior) in several ways that relate to insider risks. Counterproductive work behavior can be a precursor to more serious violations, such as sabotaging company assets or leaking sensitive information.\nAdditionally, you can choose to detect messages matching specific\nsensitive information types (SITs)\n.
Adding sensitive information that's inadvertently or maliciously included in messages to user risk scores and activity history provides investigators with more information to help them quickly take action to mitigate potential data leakage. You can select up to 30 SITs for a policy. Some scenarios might include helping to detect:\nForeign recruitment\nState actor poaching\nSharing sensitive information like secret formulas, financial reports, and other proprietary property\nSharing passwords\nNote\nYou can also\nuse a Communication Compliance policy as a trigger\n.\nHow it works\nYou can choose from these Communication Compliance indicators:\nSending inappropriate content\nSending financial regulatory text that might be risky\nSending inappropriate images\nYou can also choose to detect sensitive information types included in messages.\nWhen you select\nCreate policy\nfrom the\nCommunication Compliance indicators\nsection:\nA single policy is created in Communication Compliance that detects messages in Microsoft Exchange Online, Microsoft Teams, Microsoft Viva Engage, and Microsoft 365 Copilot and Microsoft 365 Copilot Chat. The Communication Compliance policy is based on the indicators and SITs you select. Each indicator is associated with specific\ntrainable classifiers\nused by Communication Compliance.
For more information, see\nContent safety classifiers based on large language models\n.\nTip\nSelect the information icon next to each indicator to see the trainable classifiers that the indicator uses.\nIn Communication Compliance, the trainable classifiers and SITs are listed as conditions for the policy.\nThe Communication Compliance policy is named \"Insider risk indicator\" plus the timestamp, for example: \"Insider risk indicator 24-05-01T09.27.17Z\" or \"Insider risk SIT indicator 24-05-01T09.27.17Z\".\nAnyone with the\nInsider Risk Investigators\nrole in Insider Risk Management is automatically added as a reviewer for the Communication Compliance policy.\nNote\nAfter creating the Communication Compliance policy, to add a reviewer to the policy, you must\nadd the reviewer manually to the\nCommunication Compliance Investigators\nrole group\n.\nIf you turn off all of the indicators in the\nPolicy indicators\nsetting, you pause the Communication Compliance policy. The policy is reenabled if you turn any of the indicators back on.\nYou make the Communication Compliance indicators available for new and existing policies in Insider Risk Management that are based on the\nData theft\nor\nData leaks\ntemplates.\nIf content sent in a message matches any of the trainable classifiers, it results in a policy match in Communication Compliance that can be\nremediated from the\nPolicies\npage\n.\nIndicators included in Insider Risk Management policies determine a risk score for an in-scope user. They activate only after a triggering event occurs for a user.\nCommunication Risk\ninsights appear in the\nActivity explorer\nand\nUser activity\ntabs in Insider Risk Management. If you drill down into a policy match from the\nActivity explorer\nor\nUser activity\ntab, you can learn more about the activity and access a link that opens the Communication Compliance policy. 
In the Communication Compliance policy, you can see the content of the messages that were sent.\nNote\nYou must have the\nCommunication Compliance\nrole or the\nCommunication Compliance Investigators\nrole to access the Communication Compliance link.\nEnable Communication Compliance indicators in Insider Risk Management\nIn Insider Risk Management, go to\nSettings\n>\nPolicy indicators\n, then scroll to the\nCommunication Compliance indicators (preview)\nsection.\nUnder\nDetect messages matching specific trainable classifiers (preview)\n, select\nCreate policy\n.\nThe policy you create is in Communication Compliance, and the Communication Compliance indicators become available in the\nPolicy indicators\nsetting.\nNote\nIf you already created a Communication Compliance policy but paused it, selecting\nCreate policy\nresumes the Communication Compliance policy. In this case, the\nStatus\ncolumn in the Communication Compliance\nPolicies\nlist shows \"Resuming\".\nSelect one or more of the Communication Compliance indicators in the\nPolicy indicators\nsetting.\nNote\nIf you already created a Communication Compliance policy and you select different indicators, the Communication Compliance policy changes to reflect the appropriate trainable classifiers. Turning off all indicators pauses the Communication Compliance policy.\nSelect\nSave\n.\nTo use the indicators,\ncreate a new insider risk policy\nor\nedit an existing policy\n. The indicators appear on the\nIndicators\npage of the policy workflow. You can adjust thresholds for the indicators as you would for any other indicators in an Insider Risk Management policy.\nNote\nAt this time, real-time analytics for indicator threshold settings aren't available for the Communication Compliance indicators.\nData loss prevention alerts indicators\nThese indicators include policy indicators that integrate with\ndata loss prevention (DLP)\npolicies. 
By selecting DLP policies as indicators in Insider Risk Management policies, you can automatically detect if a user has existing alerts in connected DLP policies. DLP policies help protect sensitive information and reduce the risks of oversharing data with inappropriate users or organizations.\nWhen an Insider Risk Management alert is generated for a user, you can quickly determine if the user has any high-risk alerts associated with DLP policies in your organization without having to navigate to the DLP solution in the Microsoft Purview portal. You can review and evaluate the Insider Risk Management activity and associated DLP alerts within Insider Risk Management in a unified view.\nConfigure DLP alerts indicators\nStep 1\n: To enable the DLP alerts as indicators, complete the following steps:\nIn Insider Risk Management settings, select\nPolicy indicators\nand then select the\nBuilt-in Indicators\ntab.\nNavigate to\nData loss prevention (DLP) indicators\n.\nSelect\nAdd DLP policies\n.\nSelect the DLP policies that you want to see alerts for in Insider Risk Management.\nSelect\nAdd\n.\nSelect the\nGenerating alerts from selected DLP policies\ncheckbox.\nSelect\nSave\n.\nStep 2\n: To assign DLP alerts indicators to a specific Insider Risk Management policy, complete the following steps:\nCreate a custom policy\nusing one of the following templates:\nData theft by departing users\nData leaks\nData leaks by priority users\nData leaks by risky users\nRisky AI usage\nConfigure the policy as applicable until you reach the\nIndicators\npage.\nOn the\nIndicators\npage, navigate to\nData loss prevention (DLP) indicators\n.\nSelect the\nGenerating alerts from selected DLP policies\ncheckbox.\nComplete the policy configuration workflow and save the new policy.\nDevice indicators\nThese policy indicators include activities such as sharing files over the network or with devices.
Indicators include activities involving all file types, excluding executable (.exe) and dynamic link library (.dll) file activity. If you select\nDevice indicators\n, the system processes activity for devices with Windows 10 Build 1809 or higher and macOS (three latest released versions) devices. For both Windows and macOS devices, you must\nfirst onboard devices\n. Device indicators also include browser signal detection to help your organization detect and act on exfiltration signals for nonexecutable files viewed, copied, shared, or printed in Microsoft Edge and Google Chrome. For more information on configuring Windows devices for integration with insider risk, see\nEnable device indicators and onboard Windows devices\nin this article. For more information on configuring macOS devices for integration with insider risk, see\nEnable device indicators and onboard macOS devices\nin this article. For more information about browser signal detection, see\nLearn about and configure Insider Risk Management browser signal detection\n.\nImportant\nDevice indicators are included in\ncollection policy\nevaluations. When you configure and deploy a collection policy, if there's a mismatch between an Insider Risk Management policy that includes device indicators and a collection policy in your organization, the collection policy configuration takes precedence. This configuration means that if you configure an Insider Risk Management policy to monitor a specific activity for devices, but you configure the collection policy to filter out that device activity, the device activity isn't collected and isn't available for review in Insider Risk Management.\nMicrosoft Defender for Endpoint indicators (preview)\nThese indicators come from Microsoft Defender for Endpoint and relate to unapproved or malicious software installation or bypassing security controls. 
To receive alerts in Insider Risk Management, you must have an active Defender for Endpoint license and insider risk integration enabled. For more information on configuring Defender for Endpoint for Insider Risk Management integration, see\nConfigure advanced features in Microsoft Defender for Endpoint\n.\nHealth record access indicators\nThese policy indicators cover patient medical record access. For example, attempted access to patient medical records in your electronic medical records (EMR) system logs can be shared with Insider Risk Management healthcare policies. To receive these types of alerts in Insider Risk Management, you must have a healthcare-specific data connector and the\nHR data connector\nconfigured.\nPhysical access indicators\nThese policy indicators cover physical access to sensitive assets. For example, attempted access to a restricted area in your physical badging system logs can be shared with Insider Risk Management policies. To receive these types of alerts in Insider Risk Management, you must have priority physical assets enabled in Insider Risk Management and the\nPhysical badging data connector\nconfigured. To learn more about configuring physical access, see the\nPriority physical access section\nin this article.\nMicrosoft Defender for Cloud Apps indicators\nThese policy indicators come from shared alerts from Defender for Cloud Apps. Automatically enabled anomaly detection in Defender for Cloud Apps immediately starts detecting and collating results, targeting numerous behavioral anomalies across your users and the machines and devices connected to your network. To include these activities in Insider Risk Management policy alerts, select one or more indicators in this section. 
To learn more about Defender for Cloud Apps analytics and anomaly detection, see\nGet behavioral analytics and anomaly detection\n.\nRisky Agents indicators (preview)\nThese policy indicators cover agent interactions with users and sensitive or risky resources. They include risky agent prompts, agent-generated sensitive responses, agents accessing sensitive or priority SharePoint files, agents accessing risky websites, and agents using tools with sensitive information.\nRisky AI usage indicators (preview)\nThese policy indicators cover Microsoft AI tools and applications. Risky prompt behavior from users and AI-generated responses that include sensitive information are both included in these indicators. For example, attempted sharing of sensitive information in an AI tool or application by a user is considered risky activity. Similarly, an AI tool or application returning a response that contains sensitive information is also considered risky behavior.\nRisky browsing indicators (preview)\nThese policy indicators cover browsing activity related to websites that are considered malicious or risky and pose potential insider risk that might lead to a security or compliance incident. Risky browsing activity refers to users who visit potentially risky websites, such as those associated with malware, pornography, violence, and other unallowed activities. To include these risk management activities in policy alerts, select one or more indicators in this section. To learn about configuring browser exfiltration signals, see\nInsider Risk Management browser signal detection\n.\nCumulative exfiltration detection indicators\nThese indicators detect when a user's exfiltration activities across all exfiltration channels over the last 30 days exceed organization or peer group norms. For example, if a user is in a sales role and communicates regularly with customers and partners outside of the organization, their external email activity is likely higher than the organization's average.
However, the user's activity might not be unusual compared to the user's teammates, or others with similar job titles. A risk score is assigned if the user's cumulative exfiltration activity is unusual and exceeds organization or peer group norms.\nNote\nPeer groups are defined based on organization hierarchy, access to shared SharePoint resources, and job titles in Microsoft Entra ID. If you enable cumulative exfiltration detection, your organization agrees to share Microsoft Entra data with the Microsoft Purview portal, including organization hierarchy and job titles. If your organization doesn't use Microsoft Entra ID to maintain this information, detection might be less accurate.\nRisk score boosters\nThese indicators raise the risk score for activity for the following reasons:\nActivity that is above the user's usual activity for that day\n: Scores are boosted if the detected activity deviates from the user's typical behavior.\nUser had a previous case resolved as a policy violation\n: Scores are boosted if the user had a previous case in Insider Risk Management that was resolved as a policy violation.\nUser is a member of a priority user group\n: Scores are boosted if the user is a member of a priority user group.\nUser is detected as a potential high impact user\n: When you enable this indicator, users are automatically flagged as potential high-impact users based on the following criteria:\nThe user interacts with more sensitive content compared to others in the organization.\nThe user's level in the organization's Microsoft Entra hierarchy.\nThe total number of users reporting to the user based on the Microsoft Entra hierarchy.\nThe user is a member of a Microsoft Entra built-in role with elevated permissions.\nNote\nWhen you enable the potential high impact user risk score booster, you agree to share Microsoft Entra data with the Microsoft Purview portal. 
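As a rough, hypothetical sketch of how the risk score boosters above could combine (the field names and boost weights are invented for illustration; the actual Insider Risk Management scoring model isn't public), including the rule that priority-group membership and high-impact detection together boost a score only once:

```python
# Hypothetical sketch of the risk score boosters described above.
# Field names and boost weights are invented; the real scoring
# model used by Insider Risk Management is not public.
from dataclasses import dataclass

@dataclass
class UserContext:
    above_usual_activity: bool       # activity above the user's usual level that day
    prior_confirmed_violation: bool  # previous case resolved as a policy violation
    in_priority_user_group: bool
    potential_high_impact_user: bool

def boosted_score(base_score: float, ctx: UserContext) -> float:
    """Apply each booster at most once and cap the result at 100."""
    score = base_score
    if ctx.above_usual_activity:
        score += 10  # invented weight
    if ctx.prior_confirmed_violation:
        score += 10  # invented weight
    # A user detected as both a priority-group member and a potential
    # high-impact user is boosted only once, per the note above.
    if ctx.in_priority_user_group or ctx.potential_high_impact_user:
        score += 15  # invented weight
    return min(score, 100.0)
```

The single `or` branch is what encodes the "boosted only once" rule from the note above.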
If your organization doesn't use sensitivity labels or has not configured organization hierarchy in Microsoft Entra ID, this detection might be less accurate. If a user is detected as both a member of a priority user group and also a potential high-impact user, their risk score is only boosted once.\nIn some cases, you might want to limit the insider risk policy indicators that apply to insider risk policies in your organization. You can turn off the policy indicators for specific areas by disabling them from all insider risk policies in global settings. You can only modify triggering events for policies created from the\nData leaks\nor\nData leaks by priority users\ntemplates. Policies created from all other templates don't have customizable triggering indicators or events.\nCustom indicators\nUse the\nCustom Indicators\ntab to create a custom indicator to use as a trigger or as a policy indicator in your policies.\nNote\nTo create a custom indicator to import third-party indicator data, you must first\ncreate an Insider Risk Indicators connector\n(preview).\nIn Insider Risk Management settings, select\nPolicy indicators\nand then select the\nCustom Indicators\ntab.\nSelect\nAdd custom indicator\n.\nEnter an indicator name and a description (optional).\nIn the\nData connector\nlist, select the Insider Risk Indicator connector that you created previously.\nWhen you select a data connector:\nThe name of the source column that you select when you create the connector appears in the\nSource column from mapping file\nfield. If you don't select a source column when you create the connector,\nNone\nappears in this field and you don't need to make a selection.\nIn the\nValues in source column\nlist, select the value that you want to assign to the custom indicator. These values relate to the source column that you specified when you created the connector. 
For example, if you create a single connector that includes data for two indicators (Salesforce and Dropbox), you see those values in the list.\nIf you want to use a column to set threshold values, in the\nData from mapping file\nlist, select the column that you want to use for the threshold setting; otherwise, select the\nUse only as a triggering event without any thresholds\noption.\nNote\nOnly fields that have a\nNumber\ndata type appear in the\nData from mapping file\nlist, since a\nNumber\ndata type is required to set a threshold value. The data type is specified when you set up the connector.\nSelect\nAdd indicator\n. The indicator is added to the\nCustom Indicators\nlist.\nNow you can\nuse the custom indicator\nin any\nData theft\nor\nData leaks\npolicies that you create or edit.\nIf you use the custom indicator as a trigger, select your custom trigger on the\nTriggers\npage when you create or edit the policy.\nIf you use the custom indicator as a policy indicator, select your custom indicator on the\nIndicators\npage when you create or edit the policy.\nNote\nAfter selecting your custom trigger or indicator, make sure to set a custom threshold (don't use the default thresholds). You can't set trigger thresholds on a custom indicator if you select the\nUse only as a triggering event without any thresholds\noption.\nAfter adding the custom indicator to your policies, the triggers and insights generated based on the custom indicators appear in the\nAlerts dashboard\n,\nActivity explorer\n, and\nUser timeline\n.\nImportant\nWait 24 hours before\nuploading the data\nafter you update the custom indicators and the associated policies. It can take several hours to sync all components. If you immediately upload the data while the updates are syncing, some data might not be scored for risk.\nCreate a variant of a built-in indicator\nYou can\ncreate detection groups\nand use them with variants of built-in indicators to tailor detections for different sets of users. 
For example, to reduce the number of false positives for email activities, you might want to create a variant of the\nSending email with attachments to recipients outside the organization\nbuilt-in indicator to only detect email sent to personal domains. A variant inherits all the properties of the built-in indicator. You can modify the variant with exclusions or inclusions.\nIn Insider Risk Management settings, select\nPolicy indicators\n.\nSelect\nNew indicator variant (preview)\n. This step opens the\nNew indicator variant (preview)\npane on the right side of the screen.\nIn the\nBase indicator\nlist, select the indicator that you want to create a variant for.\nNote\nYou can create up to ten variants for each built-in indicator and a total of 100 variants across all indicators. If you already created ten variants for a particular built-in indicator, the built-in indicator appears grayed out in the list. Some built-in indicators (Microsoft Defender for Endpoint indicators, for example) don't support variants.\nAdd a name for the variant (or accept the suggested name). A variant name can't be more than 110 characters.\nAdd a description for the variant (optional). The description appears in the policy to help you differentiate it from other indicators or indicator variants. A description for a variant can't be more than 256 characters.\nUnder\nDetection group\n, select one of the following options:\nIgnore activity involving items in selected groups\n. Select this option if you want to capture everything except for a few\nexclusions\n. For example, you might want to use this option to capture all outgoing email except for email sent to specific domains.\nOnly detect activity involving items in selected groups\n. Select this option if you want to specify\ninclusions\nto capture. 
For example, select this option if you want to capture only email sent to certain domains.\nNote\nIf you didn't already\ncreate a detection group\n, you can't select an option in the\nDetection group\nsection.\nIn the \nSelect one or more detection groups\n list, select the detection groups that you want to apply to the variant. Detection groups are listed under the appropriate detection type heading to help you find the appropriate group. For a single variant, you can add up to five detection groups of a single type. For example, you can add up to five groups of domains, five groups of file types, and so on.\nNote\nOnly detection groups that are applicable to the variant appear in the list. For example, a file type detection group won't appear for the\nSharing SharePoint folders with people outside the organization\nindicator since it's not applicable.\nSelect\nSave\n.\nIn the\nNext steps\ndialog box, if you want to apply the new variant to a specific policy, select the\nPolicies page\nlink.\nTip\nTo make sure that a variant captures all the important activities that you want to detect, you can apply the built-in indicator and the variant of the built-in indicator in the same policy. You can then observe the activities that each indicator captures in alerts and then use only the variant indicator after making sure everything is detected.\nUse a variant in a policy\nGo to the\nIndicators\npage of the policy workflow.\nFind the built-in indicator that includes one or more variants. A small blue box in the variant checkbox marks built-in indicators that have variants. A list appears at the end of the indicator descriptor text to show the number of selected variants. Open the list to see the variants.\nNote\nIf you select one or more checkboxes in the variant list, the first-level checkbox for the built-in indicator becomes a solid blue checkbox. 
If you don't select any of the boxes in the variant list, the first-level checkbox is blank.\nSelect\nNext\n.\nIn the\nCustomize thresholds\npage, you can customize threshold values for variants individually.\nInvestigate insights provided by variants\nWhen you add variants to policies, the dashboard generates alerts. An investigator can view more details in the\nActivity explorer\nand\nUser activity\ntabs.\nEdit a variant\nSelect the blue text at the end of the indicator description text. For example, select\n+2\nvariants as shown in the following screenshot example.\nIn the\nView/edit indicators\npage, select\nEdit\n.\nMake your changes.\nVariant limitations\nYou can create up to ten variants for each built-in indicator and a total of 100 variants across all indicators.\nYou can add up to five detection groups of a single type for a single variant. For example, you can add a maximum of five groups of domains, five groups of file types, and so on.\nVariants don't support sequences, cumulative exfiltration activities, the risk score booster, or\nreal-time analytics\nfor the detections group preview.\nHow variants are prioritized against global exclusions and priority content\nInsider Risk Management scopes activities in the following priority order:\nGlobal exclusions\nVariant scoping exclusion/inclusion\nPriority content\nEnable device indicators and onboard Windows devices\nTo enable the detection of risk activities on Windows devices and include policy indicators for these activities, your Windows devices must meet the following requirements, and you must complete the following onboarding steps.\nLearn more about device onboarding requirements\nStep 1: Prepare your endpoints\nMake sure that the Windows 10 devices you plan to report in Insider Risk Management meet these requirements.\nThe device must run Windows 10 x64 build 1809 or later and have the\nWindows 10 update (OS Build 17763.1075)\nfrom February 20, 2020 installed.\nThe user account used to sign in to the Windows 10 device must be an active 
Microsoft Entra account. The Windows 10 device might be\nMicrosoft Entra joined\n, Microsoft Entra hybrid joined, or Microsoft Entra registered.\nInstall the Microsoft Edge browser on the endpoint device to detect actions for the cloud upload activity. See\nDownload the new Microsoft Edge based on Chromium\n.\nNote\nEndpoint DLP now supports virtualized environments, which means that the Insider Risk Management solution supports virtualized environments through endpoint DLP.\nLearn more about support for virtualized environments in endpoint DLP\nStep 2: Onboard devices\nTo detect Insider Risk Management activities on a device, you must enable device checking and onboard your endpoints. You perform both actions in Microsoft Purview.\nTo enable devices that you didn't onboard yet, download the appropriate script and deploy it as outlined in this article.\nIf you already onboarded devices into\nMicrosoft Defender for Endpoint\n, they appear in the managed devices list.\nOnboard devices\nUse this deployment scenario to enable devices that aren't onboarded yet when you want to detect insider risk activities on Windows devices.\nSign in to the\nMicrosoft Purview portal\nwith an admin account in your Microsoft 365 organization.\nSelect\nSettings\nin the upper-right corner of the page.\nUnder\nDevice onboarding\n, select\nDevices\n. The list is empty until you onboard devices.\nSelect\nTurn on device onboarding\n.\nNote\nWhile it usually takes about 60 seconds to enable device onboarding, allow up to 30 minutes before engaging with Microsoft Support.\nSelect how you want to deploy to these devices from the\nDeployment method\nlist, then select\nDownload package\n.\nFollow the appropriate procedures in\nOnboarding tools and methods for Windows machines\n. 
This link takes you to a landing page where you can access Microsoft Defender for Endpoint procedures that match the deployment package you selected in step 5:\nOnboard Windows machines using Group Policy\nOnboard Windows machines using Microsoft Endpoint Configuration Manager\nOnboard Windows machines using Mobile Device Management tools\nOnboard Windows machines using a local script\nOnboard non-persistent virtual desktop infrastructure (VDI) machines\nWhen you finish and onboard the endpoint device, it appears in the devices list. The endpoint device starts reporting audit activity logs to Insider Risk Management.\nNote\nThis experience is under license enforcement. Without the required license, data isn't visible or accessible.\nIf devices are already onboarded to Microsoft Defender for Endpoint\nIf you already deployed Microsoft Defender for Endpoint and endpoint devices are reporting in, the endpoint devices appear in the managed devices list. To expand coverage, onboard new devices into Insider Risk Management by going to\nStep 2: Onboarding devices\n.\nEnable device indicators and onboard macOS devices\nYou can onboard macOS devices (Catalina 10.15 or later) into Microsoft 365 to support Insider Risk Management policies by using either Intune or JAMF Pro. For more information and configuration guidance, see\nOnboard macOS devices into Microsoft 365 overview (preview)\n.\nIndicator level settings\nWhen you create a policy by using the policy workflow, you can configure how the daily number of risk events influences the risk score for insider risk alerts. These indicator settings help you control how the number of occurrences of risk events in your organization affect the risk score and the associated alert severity for these events.\nFor example, suppose you decide to enable SharePoint indicators in the insider risk policy settings and select custom thresholds for SharePoint events when configuring indicators for a new insider risk\nData leaks\npolicy. 
In the insider risk policy workflow, you configure three different daily event levels for each SharePoint indicator to influence the risk score for alerts associated with these events.\nSet the three daily event thresholds to:\n10 or more events per day\nfor a lower impact to the risk score for the events\n20 or more events per day\nfor a medium impact to the risk score for the events\n30 or more events per day\nfor a higher impact to the risk score for the events\nThese settings mean:\nIf there are 1-9 SharePoint events that take place after the triggering event, risk scores are minimally impacted and tend not to generate an alert.\nIf there are 10-19 SharePoint events that take place after a triggering event, the risk score is inherently lower and alert severity levels tend to be at a low level.\nIf there are 20-29 SharePoint events that take place after a triggering event, the risk score is inherently higher and alert severity levels tend to be at a medium level.\nIf there are 30 or more SharePoint events that take place after a triggering event, the risk score is inherently higher and alert severity levels tend to be at a high level.\nAnother option for policy thresholds is to assign the policy triggering event to risk management activity that is above a user's typical daily activity level. Instead of being defined by specific threshold settings, each threshold is dynamically customized for anomalous activities detected for in-scope policy users. If threshold activity for anomalous activities is supported for an individual indicator, you can select\nActivity is above user's usual activity for the day\nin the policy workflow for that indicator. If this option isn't listed, anomalous activity triggering isn't available for the indicator. 
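To make the example thresholds concrete, here is a minimal sketch of the tiering they describe (the function name and tier labels are invented; actual risk scoring is more nuanced than a simple threshold lookup):

```python
# Sketch of how a daily SharePoint event count maps to score impact,
# using the example thresholds above (10 / 20 / 30 events per day).
# Tier names are illustrative only.
def alert_severity_tier(events_per_day: int,
                        low: int = 10, medium: int = 20, high: int = 30) -> str:
    if events_per_day >= high:
        return "high"      # 30+ events: higher impact, high-severity alerts
    if events_per_day >= medium:
        return "medium"    # 20-29 events: medium impact
    if events_per_day >= low:
        return "low"       # 10-19 events: lower impact, low-severity alerts
    return "minimal"       # 1-9 events: minimal impact, usually no alert
```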
If the\nActivity is above user's usual activity for the day\noption is listed for an indicator, but isn't selectable, you need to enable this option in\nInsider risk settings\n>\nPolicy indicators\n.\nUse real-time analytics recommendations to set thresholds\nUse real-time analytics (preview) to get a guided, data-driven threshold configuration experience. With this experience, you can quickly select the right thresholds for policy indicators. The guided experience helps you efficiently adjust the selection of indicators and thresholds for activity occurrences so you don't have too few or too many policy alerts.\nWhen you turn on analytics:\nThe\nApply thresholds specific to your users' activity\noption is enabled on the\nIndicators\npage of the policy workflow. Select this option if you want Insider Risk Management to provide indicator threshold recommendations based on the previous 10 days of user activity in your organization.\nNote\nTo use this option, you must select at least one built-in policy indicator. Insider Risk Management doesn't provide recommended thresholds for\ncustom indicators\nor\nvariants of built-in indicators\n.\nIf you select the\nChoose your own thresholds\noption on the\nIndicators\npage of the policy workflow, the defaults for the threshold settings are based on recommended threshold values (based on activity in your organization) instead of the built-in default values. You also see a gauge, a list of the top five indicators, and insights for each indicator.\nA.\nThe gauge shows the approximate number of scoped users whose activities from the previous 10 days exceeded the\nlowest daily thresholds\nfor at least one of the selected built-in indicators for the policy. This gauge helps you estimate the number of alerts that might be generated if all users included in the policy were assigned risk scores.\nB.\nThe top five indicators list is sorted by the number of users exceeding the\nlowest daily thresholds\n. 
If your policies generate too many alerts, focus on these indicators to reduce \"noise.\"\nC.\nInsights for each indicator are displayed for the set of threshold settings for that indicator. The insight shows the approximate number of users whose activities from the previous 10 days exceeded the specified\nlow thresholds\nfor the indicator. For example, if the low threshold setting for\nDownloading content from SharePoint\nis set to 100, the insight shows the number of users in the policy who performed more than 100 download activities on average in the previous 10 days.\nNote\nGlobal exclusions (intelligent detections)\nare taken into account for real-time analytics.\nAdjust threshold settings manually\nIf you select the\nChoose your own thresholds\noption and manually adjust a threshold setting for a specific indicator, the insight for the indicator updates in real time. This feature helps you configure the appropriate thresholds for each indicator to achieve the highest level of alert effectiveness before activating your policies.\nTo save time and make it easier to understand the impact of manual changes to threshold values, select the\nView impact\nlink in the insight to display the\nUsers exceeding daily thresholds for indicator\ngraph. This graph provides sensitivity analysis for each policy indicator.\nImportant\nThis graph isn't available if you select the\nInclude specific users\noption when you create the policy. You must select the\nInclude all users and groups\noption.\nUse this graph to analyze the activity patterns of users in your organization for the selected indicator. For example, in the preceding illustration, the threshold indicator for\nSharing SharePoint files with people outside the organization\nis set to\n38\n. The graph shows how many users performed actions that exceeded that threshold value and the distribution of low, medium, and high severity alerts for those actions. Select a bar to see insights for each value. 
For example, in the previous illustration, the bar for the\n50\nvalue is selected and the insight shows that at that threshold value, approximately 22 users each performed at least 50 events on at least one day in the past 10 days.\nPrerequisites for using real-time analytics\nTo use real-time analytics (preview), you must\nenable insider risk analytics insights\n. After you enable analytics, it can take 24 to 48 hours for insights and recommendations to appear.", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Insider Risk Indicators", "section": "Microsoft Purview" }, "https://learn.microsoft.com/en-us/purview/insider-risk-management-activities": { "content_hash": "sha256:204bdf6736deacb071ef8ef0677725e6b51f4f02e2dab5e71803238509a5d9a4", "normalized_content": "Investigate Insider Risk Management activities\nImportant\nMicrosoft Purview Insider Risk Management\ncorrelates various signals to identify potential malicious or inadvertent insider risks, such as IP theft, data leakage, and security violations. Insider Risk Management enables customers to create policies to manage security and compliance. 
Built with privacy by design, users are pseudonymized by default, and role-based access controls and audit logs are in place to help ensure user-level privacy.\nTo minimize insider risks in your organization, investigate potentially risky user activities. These risks might be activities that generate alerts from Insider Risk Management policies. They can also be risks from compliance-related activities that the policies detect but don't immediately create Insider Risk Management alerts for users.\nYou can investigate these types of activities by using\nUser activity reports\nor with the\nTriage Agent in Insider Risk Management\nand\nStandard alert\ndashboards.\nTriaging alerts\nInvestigate and act on alerts in Insider Risk Management by following these steps:\nReview the dashboards for alerts\n. On the Standard dashboard,\nfilter\nby alert\nStatus\nto locate\nNeeds review\nalerts. You can also use\nspotlighted alerts\non the dashboard to quickly see prioritized alerts. On the Alert Triage Agent dashboard, select the\nNeeds attention\nfilter to view alerts with the highest prioritization.\nStart with the alerts with the highest severity\n.\nFilter\nby alert\nSeverity\nif needed to help locate these types of alerts.\nSelect an alert to discover more information and to review the alert details\n. Review details and connections associated with the alert:\nUse the\nActivity explorer tab\nto review a timeline of the associated potentially risky behavior and to identify all risk activities for the alert.\nUse the\nData risk graph\nto review connections and details about users and files for the alert.\nAct on the alert\n. You can either confirm and\ncreate a case\nfor the alert or dismiss and resolve the alert.\nYou can triage alerts by going to the\nAlert details\npage for an alert in either dashboard. On the\nAlert details\npage, you can review information about the alert. 
You can confirm the alert and create a new case, confirm the alert and add to an existing case, or dismiss the alert.\nThis page also includes the current status for the alert and the alert risk severity level, listed as\nHigh\n,\nMedium\n, or\nLow\n. The severity level might increase or decrease over time if you don't triage the alert. For false positives, you can select multiple alerts and dismiss them in bulk by selecting\nDismiss alerts\n.\nUse Copilot to summarize an alert\nSelect\nSummarize with Copilot\nor the Copilot icon to quickly summarize an alert and prioritize the alerts that need further investigation. You can summarize selected alerts without opening the alert or after viewing the details of the alert. When you summarize an alert with Microsoft Copilot in Microsoft Purview, a\nCopilot\npane appears on the right side of the screen with an alert summary.\nThe alert summary includes all the essential details about the alert, such as the policy that triggered the alert, the activity that generated the alert, the triggering event, the user involved, their last working date (if applicable), any key user attributes, and the user's top risk factors. Copilot in Microsoft Purview consolidates information about the user from all their alerts and in-scope policies and emphasizes the user's top risk factors.\nSuggested prompts automatically appear to help you further refine your summary and provide additional insights into the activities associated with the alert. 
Choose from the following suggested prompts:\nList all the data exfiltration activities involving this user\n.\nList all the sequential activities involving this user\n.\nDid the user engage in any unusual behavior?\nShow key actions performed by the user in the last 10 days\n.\nSummarize user's last 30 days of activity\n.\nTip\nYou can also use the\nstandalone version of Microsoft Security Copilot to investigate Insider Risk Management, Microsoft Purview Data Loss Prevention (DLP), and Microsoft Defender XDR alerts\n.\nSpotlight (preview)\nThe alert\nSpotlight\non the\nAlerts\ndashboard helps you prioritize alerts to triage. Every generated alert has a risk score, a list of activities performed, tags, and triggers. Alert spotlighting uses this information to determine if an alert is spotlighted.\nAn alert is automatically spotlighted if it has a risk score of 85 or higher and if at least three of the following conditions are met:\nOne insight from the following table matches the highest insight of the alert.\nThe alert contains priority content or the user is detected as a\npotential high impact user\n.\nThe user is\nmanually brought into scope\n.\nAlert activity contains two or more high-confidence insights from the following table.\nCategory\nIndicator (high confidence insights)\nDevice indicator\n- Creating or transferring files to a network share\n- Creating or copying files to USB\n- Using a browser to upload files to the web\nOffice indicator\n- Sharing SharePoint files with people outside the organization\n- Sharing SharePoint folders with people outside the organization\n- Sharing file links with people outside organization in a Teams chat\nPolicy\n- Content to prioritize\n- Start scoring activity for users\nRisk score booster\n- User is detected as a potential high impact user\nSequence detection\n- Download from Microsoft 365 location then exfiltrate\n- Download from Microsoft 365 location, exfiltrate, then delete\n- Download from Microsoft 365 location, 
obfuscate, then exfiltrate\nTriage Agent in Insider Risk Management dashboard\nWhen you enable the\nTriage Agent in Insider Risk Management\nin your organization, the agent reviews alerts and displays them on the Triage Agent dashboard. This dashboard automatically includes prioritized alerts and filters to help Insider Risk Management analysts quickly investigate and resolve issues.\nYou can customize the dashboard columns and filter alerts triaged by the agent by priority, date, alert status, or alert scope. To view the Triage Agent dashboard, select\nTriage Agent\nat the top of the dashboard page.\nSelect an alert to display the Agent summary and overview details for the alert. Select\nView details\nto view details for the alert and for access to the details sections like the risk factors, the Activity explorer, and more.\nIf you disagree with the\nAgent categorization\n, select\nIs this incorrect?\n(preview) to provide feedback on why the categorization is incorrect.\nImportant\nThe file risk section of the Triage Agent is deprecated.\nFor an overview of how alerts provide details, context, and related content for risky activity and how to make your investigation process more effective, see the\nInsider Risk Management Alerts Triage Experience video\n.\nStandard alert dashboard\nInsider Risk Management policies automatically generate alerts based on risk indicators you define, and display them on the Standard alert dashboard. These alerts give compliance analysts and investigators an all-up view of the current risk status and allow your organization to triage and take actions for discovered potential risks. By default, policies generate a certain number of low, medium, and high severity alerts, but you can\nincrease or decrease the alert volume\nto suit your needs. Additionally, you can configure the\nalert threshold for policy indicators\nwhen creating a new policy with the policy creation tool. 
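The Spotlight (preview) rule described earlier (a risk score of 85 or higher plus at least three of the four listed conditions) can be sketched as follows; the function and flag names are invented, and the real evaluation draws on the high-confidence insight tables rather than simple booleans:

```python
# Hedged sketch of the Spotlight (preview) selection rule: an alert is
# spotlighted when its risk score is 85+ AND at least three of the four
# documented conditions hold. Flag names are simplified inventions.
def is_spotlighted(risk_score: int,
                   matches_top_insight: bool,
                   priority_content_or_high_impact_user: bool,
                   manually_in_scope: bool,
                   two_plus_high_confidence_insights: bool) -> bool:
    conditions = [
        matches_top_insight,
        priority_content_or_high_impact_user,
        manually_in_scope,
        two_plus_high_confidence_insights,
    ]
    return risk_score >= 85 and sum(conditions) >= 3
```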
To view the Standard dashboard, select\nStandard\nat the top of the dashboard page.\nNote\nFor any generated alerts, Insider Risk Management generates a single aggregated alert per user. The system adds any new insights for that user to the same alert.\nImportant\nIf you\nscope your policies by one or more administrative units\n, you can only see alerts for the users you're scoped for. For example, if an administrative scope applies to just users in Germany, you can only see alerts for users in Germany. Unrestricted administrators can see all alerts for all users in the organization.\nRestricted administrators can't access alerts for the users assigned to them through security groups or distribution groups added in administrative units. Such user alerts are visible only to unrestricted administrators. Microsoft recommends adding users directly to administrative units to ensure their alerts are also visible to restricted administrators with administrative units assigned.\nFilter alerts, save a view of a filter set, customize columns, or search for alerts\nDepending on the number and type of active Insider Risk Management policies in your organization, reviewing a large queue of alerts can be challenging. To help you keep track of alerts, you can:\nFilter alerts by various attributes.\nSave a view of a filter set to reuse later.\nDisplay or hide columns.\nSearch for an alert.\nView alert reports.\nFilter alerts\nSelect\nAdd filter\n.\nSelect one or more of the following attributes:\nAttribute\nDescription\nActivity that generated the alert\nDisplays the top potentially risky activity and policy match during the activity evaluation period that leads to the alert. This value can update over time.\nAlert dismissal reason\nThe reason for dismissing the alert.\nAssigned to\nThe admin that the alert is assigned to for triaging (if assigned).\nPolicy\nThe name of the policy.\nRisk factors\nThe risk factors that help determine how risky a user's activity might be. 
The possible values are\nCumulative exfiltration activities\n,\nActivities include priority content\n,\nSequence activities\n,\nActivities include unallowed domains\n,\nMember of a priority user group\n, and\nPotential high impact user\n.\nSeverity\nThe user's risk severity level. The options are\nHigh\n,\nMedium\n, and\nLow\n.\nStatus\nStatus of the alert. The options are\nConfirmed\n,\nDismissed\n,\nNeeds review\n, and\nResolved\n.\nTime detected (UTC)\nThe start and end dates for when the alert was created. The filter searches for alerts between UTC 00:00 on the start date and UTC 00:00 on the end date.\nTriggering event\nThe event that brought the user into scope of the policy. The triggering event can change over time.\nThe attributes that you select are added to the filter bar.\nSelect an attribute in the filter bar, and then select a value to filter by. For example, select the\nTime detected (UTC)\nattribute, enter or select the dates in the\nStart date\nand\nEnd date\nfields, and then select\nApply\n.\nTip\nTo start over at any point, select\nReset all\non the filter bar.\nSave a view of a filter set to reuse later\nAfter applying the filters as described in the preceding procedure, select\nSave\n, enter a name for the filter set, and then select\nSave\n.\nThe filter set is added as a card. It includes a number that shows the count of alerts that meet the criteria in the filter set.\nNote\nYou can save up to five filter sets. 
To delete a filter set, select the ellipsis (three dots) in the upper-right corner of the card, and then select\nDelete\n.\nTo reapply a saved filter set, select the card for the filter set.\nDisplay or hide columns\nOn the right side of the page, select\nCustomize columns\n.\nSelect or clear the checkboxes for the columns you want to display or hide.\nThe column settings are saved across sessions and across browsers.\nSearch for alerts\nUse the\nSearch\ncontrol to search for a user principal name (UPN), an assigned admin name, or an Alert ID.\nView alert reports\nGo to\nInsider Risk Management\n>\nReports\n>\nAlerts\nto view\nreports\nfor generated alerts, alerts by region, alerts by triggering event, and more.\nHeader/summary section of the Alert details page\nThis section in the\nAlert details\npage is only available when selecting an alert from the Standard dashboard. It contains general information about the user and alert. Use this information for context while reviewing detailed information about the detected risk management activity included in the alert for the user:\nActivity that generated this alert\n: Displays the top potentially risky activity and policy match during the activity evaluation period that led to the alert being generated.\nTriggering event\n: Displays the most recent triggering event that prompted the policy to start assigning risk scores to the user's activity. If you configure\nintegration with Communication Compliance\nfor\nData leaks by risky users\nor\nSecurity policy violations by risky users\npolicies, the triggering event for these alerts is scoped to Communication Compliance activity.\nUser details\n: Displays general information about the user assigned to the alert. If anonymization is enabled, the username, email address, alias, and organization fields are anonymized.\nUser alert history\n: Displays a list of alerts for the user for the last 30 days. 
Includes a link to view the complete alert history for the user.\nNote\nWhen a user is detected as a potential high impact user, this information is highlighted in the alert header in the\nUser details\npage. The user details also include a summary with the reasons the user is detected as such. To learn more about setting policy indicators for potential high impact users, see\nInsider Risk Management settings\n.\nAlerts generated from policies scoped to only activities that include\npriority content\ninclude the\nOnly activity with priority content was scored for this alert\nnotification in this section.\nTip\nTo get a quick overview of an alert, select\nSummarize\non the alert details page. When you select\nSummarize\n, a\nCopilot\npane appears on the right side of the page with an alert summary. The alert summary includes all the essential details about the alert, such as the policy that was triggered, the activity that generated the alert, the triggering event, the user involved, their last working date (if applicable), any key user attributes, and the user's top risk factors. Copilot in Microsoft Purview consolidates information about the user from all their alerts and in-scope policies and emphasizes the user's top risk factors. You can also\nsummarize the alert from the Alerts queue without having to open the alert by using\nCopilot\n. Or use the\nstandalone version of Microsoft Security Copilot to investigate Insider Risk Management, Microsoft Purview Data Loss Prevention (DLP), and Microsoft Defender XDR alerts\n.\nAgent summary tab\nThis section in the\nAlert details\npage is only available when selecting an alert from the Triage Agent in Insider Risk Management dashboard. Agent summary information includes details on the categorization for the alert and details about the associated risks used in the triage process.\nAll risk factors tab\nYou can access this tab in\nAlert details\nfor alerts in both dashboard views. 
It opens the summary of risk factors for the user's alert activity. Risk factors help you determine how risky the user's risk management activity is during your review. The risk factors include summaries for:\nCumulative exfiltration activities\n: Events associated with cumulative exfiltration activities.\nHealth record access\n: Potentially risky activities for events associated with accessing health records.\nPriority content\n: Potentially risky activities associated with priority content.\nRisky browser usage\n: Potentially risky activities for events associated with browsing to potentially inappropriate websites.\nSequences of activities\n: Detected potentially risky activities associated with risk sequences.\nTop exfiltration activities\n: Exfiltration activities with the highest number of events for the alert.\nUnallowed domains\n: Potentially risky activities for events associated with unallowed domains.\nUnusual activity for this user\n: Specific activities for the user that are considered potentially risky, as they're unusual and a departure from their typical activities.\nWith these filters, you see only alerts with these risk factors, but the activity that generated an alert might not fall into any of these categories. For example, an alert containing sequence activities might be generated simply because the user copied a file to a USB device.\nContent detected\nThis section on the\nAll risk factors\ntab includes content associated with the risk activities for the alert and summarizes activity events by key areas. When you select an activity link, you open the Activity explorer and see more details about the activity.\nActivity explorer tab\nNote\nUsers can access Activity explorer in the alert management area when their organization enables the feature and after the users trigger events.\nThe\nActivity explorer\ntab is available for alerts in both dashboard views. 
It provides risk investigators and analysts with a comprehensive analytics tool that provides detailed information about alerts. With the Activity explorer, reviewers can quickly review a timeline of detected potentially risky activity and identify and filter all risk activities associated with alerts.\nUse the Activity explorer\nWhen reviewing activities in the Activity explorer, investigators and analysts can select a specific activity and open the activity details pane. The pane displays detailed information about the activity that investigators and analysts can use during the alert triage process. Detailed information might provide context for the alert and assist with identifying the full scope of the risk activity that triggered the alert.\nWhen selecting an activity's events from the activity timeline, the number of activities displayed in the explorer might not match the number of activity events listed in the timeline. Examples of why this difference might occur include:\nCumulative exfiltration detection\n: Cumulative exfiltration detection analyzes event logs but applies a model that includes deduplicating similar activities to compute cumulative exfiltration risk. Additionally, you might see a difference in the number of potentially risky activities displayed in the Activity explorer if you make changes to your existing policy or settings. For example, if you modify allowed/unallowed domains or add new file type exclusions after a policy is created and potentially risky activity matches occur, the cumulative exfiltration detection activities differ from the results before the policy or settings changes. 
Cumulative exfiltration detection activity totals are based on the policy and settings configuration at the time of computation and don't include activities prior to the policy and settings changes.\nEmails to external recipients\n: Potentially risky activity for emails sent to external recipients is assigned a risk score based on the number of emails sent, which might not match the activity event logs.\nSequences that contain events excluded from risk scoring\nA\nsequence\nmight contain one or more events that are excluded from risk scoring based on your settings configuration. For example, your organization might use the\nGlobal exclusions\nsetting\nto exclude .png files from risk scoring since .png files aren't normally risky. But a .png file could be used to obfuscate a malicious activity. For this reason, if an event that's excluded from risk scoring is part of a sequence due to an obfuscation activity, the event is included in the sequence since it might be interesting in the context of the sequence.\nThe Activity explorer displays the following information for excluded events in sequences:\nIf a sequence contains a step where\nall\nevents are excluded, the insight includes just the activity name and date. Select the\nView the excluded events\nlink to filter for the excluded events in the Activity explorer. The User activity scatter plot icon has a risk score of 0 if all events are excluded.\nIf a sequence has an insight where\nsome\nevents are excluded, the event information for the nonexcluded events is displayed, but the event count doesn't include the excluded events. Select the\nView the excluded events\nlink to filter for the excluded events in the Activity explorer.\nIf you select a\nsequence link\nfor an insight, you can drill down into the sequence of events in the activity details pane, including any events that were excluded from scoring. 
An event excluded from scoring is marked as\nExcluded\n.\nFilter alerts in the Activity explorer\nTo filter alerts in the Activity explorer for column information, select\nFilters\n. You can filter alerts by one or more attributes listed in the details pane for the alert. Activity explorer also supports customizable columns to help investigators and analysts focus the dashboard on the information most important to them.\nUse the\nActivity scope\n,\nRisk factor\n, and\nReview status\nfilters to display and sort activities and insights for the following areas.\nActivity scope\n: Filters all scored activities for the user.\nAll scored activity for this user\nOnly scored activity in this alert\nRisk factor\n: Filters for risk factor activity applicable for all policies assigning risk scores. This filter includes all activity for all policies for included users.\nUnusual activity\nIncludes events with priority content\nIncludes events with unallowed domain\nSequence activities\nCumulative exfiltration activities\nHealth record access activities\nRisky browser usage\nReview status\n: Filters activity review status.\nAll\nNot yet reviewed (filters out any activity that was part of a dismissed or resolved alert)\nUser activity tab\nBoth dashboard views provide the\nUser activity\ntab for alerts. It's one of the most powerful tools for internal risk analysis and investigation for alerts and cases in the Insider Risk Management solution. You can quickly review all activities for a user, including a historical timeline of all alerts, alert details, the current risk score for the user, and the sequence of risk events.\nCase actions\n: The case action toolbar provides options for resolving the case. 
When viewing a case, you can resolve the case, send an email notice to the user, or escalate the case for a data or user investigation.\nRisk activity chronology\n: The full chronology of all risk alerts associated with the case, including all the details available in the corresponding alert bubble.\nFilters and sorting (preview)\n:\nRisk category\n: Filter activities by the following risk categories:\nActivities with risk scores > 15 (unless in a sequence)\nand\nSequence activities\n.\nActivity Type\n: Filter activities by the following types:\nAccess\n,\nDeletion\n,\nCollection\n,\nExfiltration\n,\nInfiltration\n,\nObfuscation\n,\nSecurity\n,\nCustom Indicator\n,\nDefense Evasion\n,\nPrivilege Escalation\n,\nCommunication Risk\n,\nUser Compromise Risk\n, and\nAI Usage\n.\nSort by\n: List the timeline of potentially risky activities by\nDate occurred\nor\nRisk score\n.\nTime filters\n: By default, the User activity chart displays the last three months of potentially risky activities. You can easily filter the chart view by selecting the\n6 Months\n,\n3 Months\n, or\n1 Month\ntabs on the bubble chart.\nRisk sequence\n: The chronological order of potentially risky activities is an important aspect of risk investigation. Identifying these related activities is an important part of evaluating overall risk for your organization. Alert activities that are related are displayed with connecting lines to highlight that these activities are associated with a larger risk area. An icon positioned over the sequence activities identifies sequences in this view relative to the risk score for the sequence. Hover over the icon to see the date and time of the risky activity associated with this sequence. This view of activities can help investigators 'connect the dots' for risk activities that could have been viewed as isolated or one-off events. Select the icon or any bubble in the sequence to display details for all the associated risk activities. 
Details include:\nName\nof the sequence.\nDate\nor\nDate range\nof the sequence.\nRisk score\nfor the sequence. This score is the numerical score for the sequence of the combined alert risk severity levels for each related activity in the sequence.\nNumber of events associated with each alert in the sequence\n. Links to each file or email associated with each potentially risky activity are also available.\nShow activities in sequence\n. Displays the sequence as a highlight line on the bubble chart and expands the alert details to display all related alerts in the sequence.\nRisk alert activity and details\n: The User activity chart visually displays potentially risky activities as colored bubbles. The chart creates bubbles for different categories of risk. Select a bubble to display the details for each potentially risky activity. Details include:\nDate\nof the risk activity.\nThe\nrisk activity category\n. For example,\nEmail(s) with attachments sent outside the organization\nor\nFile(s) downloaded from SharePoint Online\n.\nRisk score\nfor the alert. This score is the numerical score for the alert risk severity level.\nNumber of events associated with the alert. Links to each file or email associated with the risk activity are also available.\nCumulative exfiltration activities\n: Select to view a visual chart of how activity is building over time for the user.\nRisk activity legend\n: Across the bottom of the user activity chart, a color-coded legend helps you quickly determine risk category for each alert.\nData risk graph tab\nImportant\nYou can't use this feature with the\nanonymized usernames privacy setting\nenabled.\nBoth dashboard views provide access to the\nData risk graph\ntab for alerts. 
By integrating with\nMicrosoft Sentinel\n, the data risk graph helps you view connections between impacted assets, users, and their activities in an interactive graph experience.\nFor more information about data risk graphs, see\nData risk graph in Insider Risk Management\n.\nSave a view of a filter to reuse later\nIf you create a filter and customize columns for the filter, you can save a view of your changes so that you or others can quickly filter for the same changes again later. When you save a view, you save both the filters and columns. When you load the view, it loads both saved filters and columns.\nCreate a filter and customize columns.\nTip\nTo start over at any point, select\nReset\n. To change columns that you customized, select\nReset columns\n.\nWhen you have the filter the way you want it, select\nSave this view\n, enter a name for the view, and then select\nSave\n.\nNote\nThe maximum length for a view name is 40 characters and you can't use any special characters.\nTo reuse the view of the filter later, select\nViews\n, and then select the view you want to open from the\nRecommended views\ntab (shows the most-used views) or the\nCustom views\ntab (displays the most frequently used filters).\nWhen you select a view this way, it resets all the existing filters and replaces them with the view that you selected.\nAlert status and severity\nNote\nInsider Risk Management throttles trigger processing to help protect and optimize your risk investigation and review experience. This throttling guards against issues that might result in an overload of policy alerts, such as misconfigured data connectors or data loss prevention policies. 
Insider Risk Management doesn't process signals received beyond the throttling limits.\nLearn more about limits in Insider Risk Management\n.\nYou can triage alerts into one of the following statuses:\nConfirmed\n: An alert you confirm and assign to a new or existing case.\nDismissed\n: An alert you dismiss as benign in the triage process. You can provide a reason for the alert dismissal and include notes that are available in the user's alert history to provide additional context for future reference or for other reviewers. Reasons can include expected activities, nonimpactful events, reducing the number of alert activities for the user, or reasons related to the alert notes. Reason classification choices include\nActivity is expected for this user\n,\nActivity isn't impactful enough for me to investigate further\n, and\nAlerts for this user contain too much activity\n.\nNeeds review\n: A new alert where triage actions haven't yet been taken.\nResolved\n: An alert that's part of a closed and resolved case.\nThe system automatically calculates alert risk scores from several risk activity indicators. These indicators include the type of risk activity, the number and frequency of the activity occurrence, the history of users' risk activity, and the addition of activity risks that might boost the seriousness of the potentially risky activity. The alert risk score drives the programmatic assignment of a risk severity level for each alert and can't be customized. If you don't triage alerts and risk activities continue to accrue to the alert, the risk severity level can increase. Risk analysts and investigators can use alert risk severity to help triage alerts in accordance with your organization's risk policies and standards.\nAlert risk severity levels are:\nHigh severity\n: The potentially risky activities and indicators for the alert pose significant risk. 
The associated risk activities are serious, repetitive, and correlate strongly to other significant risk factors.\nMedium severity\n: The potentially risky activities and indicators for the alert pose a moderate risk. The associated risk activities are moderate, frequent, and have some correlation to other risk factors.\nLow severity\n: The potentially risky activities and indicators for the alert pose a minor risk. The associated risk activities are minor, more infrequent, and don't correlate to other significant risk factors.\nDismiss multiple alerts (preview)\nTo save triage time, analysts and investigators can dismiss multiple alerts at once. With the\nDismiss alerts\ncommand bar option, you can select one or more alerts with a\nNeeds review\nstatus on the dashboard and quickly dismiss these alerts as benign. You can select up to 400 alerts to dismiss at one time.\nDismiss an insider risk alert\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nAlerts\nin the left navigation.\nOn the\nAlerts dashboard\n, select the alerts that have a\nNeeds review\nstatus.\nOn the Alerts command bar, select\nDismiss alerts\n.\nOn the\nDismiss alerts\ndetail pane, review the user and policy details associated with the selected alerts.\nSelect\nDismiss alerts\nto resolve the alerts as benign.\nReports for alerts\nTo see reports for alerts, go to the\nReports\npage. 
Each report widget on the\nReports\npage displays information for the last 30 days:\nTotal alerts that need review\n: The total number of alerts needing review and triage, including a breakdown by alert severity.\nOpen alerts over past 30 days\n: The total number of alerts created by policy matches over the last 30 days, sorted by high, medium, and low alert severity levels.\nAverage time to resolve alerts\n: A summary of useful alert statistics:\nAverage time to resolve high severity alerts, listed in hours, days, or months.\nAverage time to resolve medium severity alerts, listed in hours, days, or months.\nAverage time to resolve low severity alerts, listed in hours, days, or months.\nAssign an alert\nIf you're an administrator and a member of the\nInsider Risk Management\n,\nInsider Risk Management Analysts\n, or\nInsider Risk Management Investigators\nrole group, you can assign ownership of an alert to yourself or to an Insider Risk Management user with one of the same roles. After you assign an alert, you can reassign it to a user with any of the same roles. You can only assign an alert to one admin at a time.\nNote\nIf you\nscope your policies by one or more administrative units\n, you can only give ownership of an alert to Insider Risk Management users with the appropriate role group permissions. The user highlighted in the alert must be in scope of the admin unit. For example, if an administrative scope applies to just users in Germany, the Insider Risk Management user can only see alerts for users in Germany. Unrestricted administrators can see all alerts for all users in the organization.\nAfter you assign an admin, you can search by admin.\nNote\nAdmins contained within a Microsoft Entra security group aren't supported for alert assignment. Admins must be directly assigned to one of the required roles.\nIf you're using a custom group, make sure that the custom group contains the\nCase management\nrole\n. 
The\nInsider Risk Management Analysts\nand the\nInsider Risk Management Investigators\nrole groups both contain the\nCase management\nrole. If you're using a custom group, you must explicitly add the\nCase management\nrole to the group.\nAssign an alert from the Alerts dashboard\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nAlerts\nin the left navigation.\nOn the\nAlerts dashboard\n, select the alerts that you want to assign.\nIn the command bar over the alerts queue, select\nAssign\n.\nIn the\nAssign owner\npane on the right side of the screen, search for an admin with the appropriate permissions, and then select the checkbox for that admin.\nSelect\nAssign\n.\nAssign an alert from the Alerts detail page\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nAlerts\nin the left navigation.\nSelect an alert.\nIn the detail pane for the alert, in the upper-right corner of the page, select\nAssign\n.\nIn the\nSuggested contacts\nlist, select the appropriate admin.\nCreate a case for an alert\nCreate a case for an alert to investigate potentially risky activity.\nSign in to the\nMicrosoft Purview portal\nwith credentials for an admin account in your Microsoft 365 organization.\nGo to the\nInsider Risk Management\nsolution.\nSelect\nAlerts\nin the left navigation.\nOn the\nAlerts dashboard\n, select the alert you want to confirm and create a new case for.\nOn the\nAlerts details pane\n, select\nActions\n>\nConfirm alerts & create case\n.\nIn the\nConfirm alert and create insider risk case\ndialog box, enter a name for the case, select users to add as contributors, and add comments as applicable. 
Comments are automatically added to the case as a case note.\nTo download the content for the case, select\nCreate case with content download on\n(preview). After you reach the content download limit, you can still create cases, but you must leave the box unchecked. If you don't enable content download when you create the case, you can enable downloads when the case is active.\nSelect\nCreate case\nto create a new case.\nAfter you create the case, investigators and analysts can manage and act on the case. For more information, see the\nInsider Risk Management case\narticle.\nRetention and item limits\nAs Insider Risk Management alerts age, their value to minimize potentially risky activity diminishes for most organizations. Conversely, active cases and associated artifacts (alerts, insights, activities) always provide value and don't have an automatic expiration date. This retention policy includes all future alerts and artifacts in an active status for any user associated with an active case.\nTo minimize the number of older items that provide limited current value, the following retention and limits apply for Insider Risk Management alerts, cases, and user reports:\nItem\nRetention/Limit\nActive cases (and associated artifacts)\n2,000 active cases, retained indefinitely and never expire\nAlerts with Needs review status\n120 days from alert creation, then automatically deleted\nMaximum number of active cases\n100\nResolved cases (and associated artifacts)\n120 days from case resolution, then automatically deleted\nUser activities reports\n120 days from report creation, then automatically deleted\nBest practices for managing your alert volume\nTo minimize insider risks in your organization, regularly review, investigate, and act on potentially risky insider alerts. 
Quickly taking action to reduce these risks can potentially save your organization time, money, and potential regulatory or legal consequences.\nLearn about best practices for managing your Insider Risk Management alert queue\n.\nSee also\nBest practices for managing your Insider Risk Management alert queue\nTake action on insider risk cases\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Investigate Alerts", "section": "Microsoft Purview" }, "https://learn.microsoft.com/en-us/purview/import-hr-data": { - "content_hash": "sha256:5c8cfb9ad7a98bdc611a7746806c1bbcc35ddfb760d84b2bf1d73b7b871cfdfa", - "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nSet up a connector to import HR data\nFeedback\nSummarize this article for me\nYou can set up a data connector to import human resources (HR) data related to events such as a user's resignation or a change in a user's job level. 
The insider risk management solution uses the HR data to generate risk indicators that help you identify possible malicious activity or data theft by users inside your organization.\nSetting up a connector for HR data that insider risk management policies use to generate risk indicators involves creating a CSV file that contains the HR data, creating an app in Microsoft Entra for authentication, creating an HR data connector in the\nMicrosoft Purview portal\n, and running a script (on a scheduled basis) that ingests the HR data in CSV files to the Microsoft cloud so it's available to the insider risk management solution.\nBefore you begin\nDetermine which HR scenarios and data to import to Microsoft 365. This determination helps you decide how many CSV files and HR connectors you need to create, and how to generate and structure the CSV files. The insider risk management policies you want to implement determine the HR data you import. For more information, see Step 1.\nDetermine how to retrieve or export the data from your organization's HR system (regularly) and add it to the CSV files that you create in Step 1. The script that you run in Step 4 uploads the HR data in the CSV files to the Microsoft cloud.\nAssign the Data Connector Admin role to the user who creates the HR connector in Step 3. This role is required to add connectors on the\nData connectors\npage in the Microsoft Purview portal. Multiple role groups include this role by default. For a list of these role groups, see\nRoles in Microsoft Defender for Office 365 and Microsoft Purview compliance\n. Alternatively, an admin in your organization can create a custom role group, assign the Data Connector Admin role, and add the appropriate users as members. 
For instructions, see:\nPermissions in the Microsoft Purview portal\nRoles and role groups in Microsoft Defender for Office 365 and Microsoft Purview compliance\nUnderstand that the sample script you run in Step 4 uploads your HR data to the Microsoft cloud so that the insider risk management solution can use it. This sample script isn't supported under any Microsoft standard support program or service. It's provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. You assume all risk arising from the use or performance of the sample script and documentation. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.\nKnow that this connector is available in GCC environments in the Microsoft 365 US Government cloud. Third-party applications and services might involve storing, transmitting, and processing your organization's customer data on third-party systems that are outside of the Microsoft 365 infrastructure and therefore aren't covered by the Microsoft Purview and data protection commitments. Microsoft makes no representation that use of this product to connect to third-party applications implies that those third-party applications are FEDRAMP compliant. 
For step-by-step instructions for setting up an HR connector in a GCC environment, see\nSet up a connector to import HR data in US Government\n.\nAdd the\nwebhook.ingestion.office.com\ndomain to your firewall allowlist for your organization.\nStep 1: Prepare a CSV file with your HR data\nFirst, create a CSV file that contains the HR data the connector imports to Microsoft 365. The insider risk solution uses this data to generate potential risk indicators. You can import data for the following HR scenarios to Microsoft 365:\nEmployee resignation. Information about employees who leave your organization.\nJob level changes. Information about job level changes for employees, such as promotions and demotions.\nPerformance reviews. Information about employee performance.\nPerformance improvement plans. Information about performance improvement plans for employees.\nEmployee profile (preview). General information about an employee.\nThe type of HR data to import depends on the insider risk management policy and corresponding policy template you want to implement. The following table shows which HR data type each policy template requires:\nPolicy template\nHR data type\nData theft by departing users\nEmployee resignations\nData leaks\nNot applicable\nData leaks by priority users\nNot applicable\nData leaks by risky users\nJob level changes, Performance reviews, Performance improvement plans\nSecurity policy violations\nNot applicable\nSecurity policy violations by departing users\nEmployee resignations\nSecurity policy violations by priority users\nNot applicable\nSecurity policy violations by risky users\nJob level changes, Performance reviews, Performance improvement plans\nOffensive language in email\nNot applicable\nHealthcare policy\nEmployee profile\nFor more information about policy templates for insider risk management, see\nInsider risk management policies\n.\nFor each HR scenario, provide the corresponding HR data in one or more CSV files. 
The number of CSV files to use for your insider risk management implementation is discussed later in this section.\nAfter you create the CSV file with the required HR data, store it on the local computer where you run the script in Step 4. Implement an update strategy to make sure the CSV file always contains the most current information so that whenever you run the script, the most current HR data is uploaded to the Microsoft cloud and accessible to the insider risk management solution.\nImportant\nThe column names described in the following sections aren't required parameters, but only examples. You can use any column name in your CSV files. However, you\nmust\nmap the column names you use in a CSV file to the data type when you create the HR connector in Step 3. Also note that the sample CSV files in the following sections are shown in Notepad view. It's much easier to view and edit CSV files in Microsoft Excel.\nThe following sections describe the required CSV data for each HR scenario.\nCSV file for employee resignation data\nHere's an example of a CSV file for employee resignation data.\nUserPrincipalName,ResignationDate,LastWorkingDate\nsarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,2019-04-29T15:18:02.4675041+05:30\npilarp@contoso.com,2019-04-24T09:15:49Z,2019-04-29T15:18:02.7117540\nThe following table describes each column in the CSV file for employee resignation data.\nColumn\nDescription\nUserPrincipalName\nThe Microsoft Entra UserPrincipalName (UPN) used to identify the terminated user.\nResignationDate\nSpecifies the date the user's employment is officially terminated or the user resigns from your organization. For example, this date might be when the user gives their notice about leaving your organization. This date might be different from the date of the person's last day of work.
Use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nLastWorkingDate\nSpecifies the last day of work for the terminated user. This date can't be more than six months prior or one year in advance from the time of upload. Use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nCSV file for job level changes data\nHere's an example of a CSV file for job level changes data.\nUserPrincipalName,EffectiveDate,OldLevel,NewLevel\nsarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,Level 61 - Sr. Manager,Level 60 - Manager\npillar@contoso.com,2019-04-23T15:18:02.4675041+05:30,Level 62 - Director,Level 60 - Sr. Manager\nThe following table describes each column in the CSV file for job level changes data.\nColumn\nDescription\nUserPrincipalName\nThe Microsoft Entra UserPrincipalName (UPN) used to specify the user's email address.\nEffectiveDate\nSpecifies the date that the user's job level is officially changed. Use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nRemarks\nSpecifies the remarks that the evaluator provides about the change of job level. This is a text parameter with a limit of 200 characters. This parameter is optional. You don't have to include it in the CSV file.\nOldLevel\nSpecifies the user's job level before it was changed. This is a free-text parameter and can contain hierarchical taxonomy for your organization. This parameter is optional. You don't have to include it in the CSV file.\nNewLevel\nSpecifies the user's job level after it was changed. This is a free-text parameter and can contain hierarchical taxonomy for your organization. This parameter is optional.
You don't have to include it in the CSV file.\nCSV file for performance review data\nHere's an example of a CSV file for performance review data.\nUserPrincipalName,EffectiveDate,Remarks,Rating\nsarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,Met expectations but bad attitude,2-Below expectation\npillar@contoso.com,2019-04-23T15:18:02.4675041+05:30, Multiple conflicts with the team\nThe following table describes each column in the CSV file for performance review data.\nColumn\nDescription\nUserPrincipalName\nThe Microsoft Entra UserPrincipalName (UPN) used to specify the user's email address.\nEffectiveDate\nSpecifies the date that the user is officially informed about the result of their performance review. This date can be the date when the performance review cycle ends. Use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nRemarks\nSpecifies any remarks that the evaluator provides to the user for the performance review. This is a text parameter with a limit of 200 characters. This parameter is optional. You don't have to include it in the CSV file.\nRating\nSpecifies the rating provided for the performance review. This is a text parameter and can contain any free-form text that your organization uses to recognize the evaluation. For example, \"3 Met expectations\" or \"2 Below average\". This is a text parameter with a limit of 25 characters. This parameter is optional.
You don't have to include it in the CSV file.\nCSV file for performance improvement plan data\nHere's an example of a CSV file for performance improvement plan data.\nUserPrincipalName,EffectiveDate,ImprovementRemarks,PerformanceRating\nsarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,Met expectation but bad attitude,2-Below expectation\npillar@contoso.com,2019-04-23T15:18:02.4675041+05:30, Multiple conflicts with the team\nThe following table describes each column in the CSV file for performance improvement plan data.\nColumn\nDescription\nUserPrincipalName\nThe Microsoft Entra UserPrincipalName (UPN) used to specify the user's email address.\nEffectiveDate\nSpecifies the date when the user is officially informed about their performance improvement plan. You must use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nRemarks\nSpecifies any remarks that the evaluator provides about the performance improvement plan. This is a text parameter with a limit of 200 characters. This is an optional parameter. You don't have to include it in the CSV file.\nRating\nSpecifies any rating or other information related to the performance improvement plan. This is a text parameter and can contain any free-form text that your organization uses to recognize the evaluation. For example, \"3 Met expectations\" or \"2 Below average\". This is a text parameter with a limit of 25 characters. This is an optional parameter. You don't have to include it in the CSV file.\nCSV file for employee profile data (preview)\nNote\nThe capability to create an HR connector for employee profile data is in public preview. To create an HR connector that supports employee profile data, go to the\nData connectors\npage in the Microsoft Purview portal, select the\nConnectors\ntab, then select\nAdd a connector\n>\nHR\n.
Follow the steps to create a connector in\nStep 3: Create the HR connector\n.\nHere's an example of a CSV file for employee profile data.\nUserPrincipalName,UserName,EmployeeFirstName,EmployeeLastName,EmployeeAddLine1,EmployeeAddLine2,EmployeeCity,EmployeeState,EmployeeZipCode,EmployeeDept,EmployeeType,EmployeeRole\njackq@contoso.com,jackq,jack,qualtz,50 Oakland Ave,#206,City,Florida,32104,Orthopaedic,Regular,Nurse\nThe following table describes each column in the CSV file for employee profile data.\nColumn\nDescription\nUserPrincipalName\n*\nThe Microsoft Entra UserPrincipalName (UPN) used to specify the user's email address.\nEmployeeFirstName\n*\nFirst name of the employee.\nEmployeeLastName\n*\nLast name of the employee.\nEmployeeAddressLine1\n*\nStreet address of the employee.\nEmployeeAddressLine2\nSecondary address information, such as apartment number, for the employee.\nEmployeeCity\nCity of residence for the employee.\nEmployeeState\nState of residence for the employee.\nEmployeeZipCode\n*\nZip code of residence for the employee.\nEmployeeCountry\nCountry of residence for the employee.\nEmployeeDepartment\nEmployee's department in the organization.\nEmployeeType\nEmployment type for the employee, such as Regular, Exempt, or Contractor.\nEmployeeRole\nEmployee's role, designation, or job title in the organization.\nNote\n*\nThis column is mandatory. If a mandatory column is missing, the CSV file isn't validated and other data in the file isn't imported.\nWe recommend that you create an HR connector that only imports employee profile data. For this connector, refresh the employee profile data frequently, preferably every 15 to 20 days. Employee profile records are deleted if they aren't updated within the past 30 days.\nDetermining how many CSV files to use for HR data\nIn Step 3, you can choose to create separate connectors for each HR data type or a single connector for all data types.
You can use separate CSV files that contain data for one HR scenario (like the examples of the CSV files described in the previous sections). Alternatively, you can use a single CSV file that contains data for two or more HR scenarios. Here are some guidelines to help you determine how many CSV files to use for HR data.\nIf the insider risk management policy that you want to implement requires multiple HR data types, consider using a single CSV file that contains all the required data types.\nThe method for generating or collecting the HR data might determine the number of CSV files. For example, if the different types of HR data used to configure an HR connector are located in a single HR system in your organization, then you might be able to export the data to a single CSV file. But if data is distributed across different HR systems, then it might be easier to export data to different CSV files. For example, employee resignation data might be located in a different HR system than job level or performance review data. In this case, it might be easier to have separate CSV files rather than having to manually combine the data into a single CSV file. So, how you retrieve or export data from your HR systems might determine how many CSV files you need.\nAs a general rule, the data types in a CSV file determine the number of HR connectors that you need to create. For example, if a CSV file contains all the data types required to support your insider risk management implementation, then you only need one HR connector. But if you have two separate CSV files that each contain a single data type, then you have to create two HR connectors. An exception to this rule is that if you add an\nHRScenario\ncolumn to a CSV file (see the next section), you can configure a single HR connector that can process different CSV files.\nFor each CSV file, you can ingest up to 500 records at once. 
To ingest a larger number of records, upload multiple CSV files, each with fewer than 500 records.\nConfiguring a single CSV file for multiple HR data types\nYou can add multiple HR data types to a single CSV file. This configuration is useful if the insider risk management solution you're implementing requires multiple HR data types or if the data types are located in a single HR system in your organization. Using fewer CSV files also means you have fewer HR connectors to create and manage.\nHere are the requirements for configuring a CSV file with multiple data types:\nAdd the required columns (and optional columns if you use them) for each data type and the corresponding column name in the header row. If a column doesn't apply to a row's data type, leave the value blank.\nAdd an\nHRScenario\ncolumn to the CSV file. The HR connector uses this column to identify the type of HR data in each row. For example, values that correspond to the HR scenarios could be `Resignation`, `Job level change`, `Performance review`, `Performance improvement plan`, and `Employee profile`.\nIf you have multiple CSV files that contain an\nHRScenario\ncolumn, make sure that each file uses the same column name and the same values that identify the specific HR scenarios.\nThe following example shows a CSV file that contains the\nHRScenario\ncolumn. The values in the HRScenario column identify the type of data in the corresponding row.
The following sample covers four HR scenarios `Resignation`, `Job level change`, `Performance review`, and `Performance improvement plan`.\nHRScenario,EmailAddress,ResignationDate,LastWorkingDate,EffectiveDate,Remarks,Rating,OldLevel,NewLevel\nResignation,sarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,2019-04-29T15:18:02.4675041+05:30,,,,\nResignation,pilarp@contoso.com,2019-04-24T09:15:49Z,2019-04-29T15:18:02.7117540,,,,\nJob level change,sarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,,,,,Level 61 Sr. Manager, Level 60 Manager\nJob level change,pillarp@contoso.com,2019-04-23T15:18:02.4675041+05:30,,,,,Level 62 Director,Level 60 Sr Manager\nPerformance review,sarad@contoso.com,,,2019-04-23T15:18:02.4675041+05:30,Met expectation but bad attitude,2 Below expectations,,\nPerformance review,pillarp@contoso.com,,,2019-04-23T15:18:02.4675041+05:30, Multiple conflicts with the team,,\nPerformance improvement plan,sarad@contoso.com,,,2019-04-23T15:18:02.4675041+05:30,Met expectations but bad attitude,2 Below expectations,,\nPerformance improvement plan,pillarp@contoso.com,,,2019-04-23T15:18:02.4675041+05:30,Multiple conflicts with the team,,\nNote\nYou can use any name for the column that identifies HR data type because you map the name of the column in your CSV file as the column that identifies the HR data type when you set up the connector in Step 3. You also map the values used for the data type column when you set up the connector.\nAdding the HRScenario column to a CSV file that contains a single data type\nBased on your organization's HR systems and how you export HR data to a CSV file, you might need to create multiple CSV files that each contain a single HR data type. In this case, you can still create a single HR connector to import data from different CSV files. To do this, add an HRScenario column to the CSV file and specify the HR data type. Then, you can run the script for each CSV file but use the same job ID for the connector. 
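A multi-scenario file like the sample above can be checked before upload: group rows by their HRScenario value, and split large exports so each file stays under the 500-record ingestion limit mentioned earlier in this section. A minimal Python sketch (illustrative only; the documented upload script is PowerShell, and these column names are the sample's, not required):

```python
import csv
import io

# Abbreviated multi-scenario CSV using the HRScenario column from the sample above.
raw = (
    "HRScenario,EmailAddress,ResignationDate,LastWorkingDate,"
    "EffectiveDate,Remarks,Rating,OldLevel,NewLevel\n"
    "Resignation,sarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,"
    "2019-04-29T15:18:02.4675041+05:30,,,,,\n"
    "Job level change,pillarp@contoso.com,,,"
    "2019-04-23T15:18:02.4675041+05:30,,,Level 62 Director,Level 60 Sr Manager\n"
    "Performance review,sarad@contoso.com,,,"
    "2019-04-23T15:18:02.4675041+05:30,Met expectations,2 Below expectations,,\n"
)

rows = list(csv.DictReader(io.StringIO(raw)))

# The connector routes each row by its HRScenario value; grouping locally is a
# quick sanity check that every row carries a recognized scenario value.
by_scenario = {}
for row in rows:
    by_scenario.setdefault(row["HRScenario"], []).append(row)

# Each CSV file may carry at most 500 records, so larger exports must be
# split into multiple files before upload.
def split_into_batches(records, limit=500):
    return [records[i:i + limit] for i in range(0, len(records), limit)]

batches = split_into_batches(rows)
```

Columns that don't apply to a row's scenario stay empty, exactly as in the sample rows, so a generic CSV reader handles all scenarios with one header.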
For more information, see\nStep 4\n.\nStep 2: Create an app in Microsoft Entra ID\nNext, create and register a new app in Microsoft Entra ID. The app corresponds to the HR connector that you create in Step 3. When you create this app, Microsoft Entra ID can authenticate the HR connector when it runs and attempts to access your organization. This app also authenticates the script that you run in Step 4 to upload your HR data to the Microsoft cloud. During the creation of this Microsoft Entra app, save the following information. You use these values in Step 3 and Step 4.\nMicrosoft Entra application ID (also called the\napp ID\nor\nclient ID\n)\nMicrosoft Entra application secret (also called the\nclient secret\n)\nTenant ID (also called the\ndirectory ID\n)\nFor step-by-step instructions for creating an app in Microsoft Entra ID, see\nRegister an application with the Microsoft identity platform\n.\nStep 3: Create the HR connector\nThe next step is to create an HR connector in the Microsoft Purview portal. After you run the script in Step 4, the HR connector that you create ingests the HR data from the CSV file to your Microsoft 365 organization. Before you create a connector, be sure that you have a list of the HR scenarios and the corresponding CSV column names for each one. You have to map the data required for each scenario to the actual column names in your CSV file when configuring the connector. Alternatively, you can upload a sample CSV file when configuring the connector, and the workflow helps you map the names of the columns to the required data types.\nAfter you complete this step, be sure to copy the job ID that's generated when you create the connector.
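To make the authentication flow concrete: the app registered in Step 2 lets the script obtain a token through the standard Microsoft Entra client-credentials grant before posting the CSV data, and the job ID from this step ties the uploaded rows to the connector. The sketch below (Python, for illustration only; the published script is PowerShell) just assembles these pieces without sending anything. The `resource` audience and the way the job ID is attached are assumptions, not documented API details; the ingestion URI is the one listed later in this article.

```python
# Hypothetical stand-ins for the values saved in Step 2 and the job ID
# from this step; none of these are real identifiers.
tenant_id = "aaaabbbb-0000-cccc-1111-dddd2222eeee"
app_id = "00001111-aaaa-2222-bbbb-3333cccc4444"
app_secret = "placeholder-secret"  # store securely; never hard-code in real use
job_id = "00001111-aaaa-2222-bbbb-3333cccc4444"

# Standard Microsoft Entra client-credentials token request
# (form fields only; no request is actually sent here).
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
token_form = {
    "grant_type": "client_credentials",
    "client_id": app_id,
    "client_secret": app_secret,
    # Assumed audience: the ingestion service the script uploads to.
    "resource": "https://webhook.ingestion.office.com",
}

# The upload targets the ingestion domain named elsewhere in this article;
# the job ID associates the uploaded rows with this connector. Exactly how
# the published script combines these is internal to that script.
upload = {
    "endpoint": "https://webhook.ingestion.office.com/api/signals",
    "job_id": job_id,
}
```

This is why the firewall allowlist entry for webhook.ingestion.office.com matters: the token exchange and the data upload both happen from the machine that runs the script.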
You use the job ID when you run the script.\nSign in to the\nMicrosoft Purview portal\n.\nSelect\nSettings\n>\nData connectors\n.\nSelect\nMy connectors\n, then select\nAdd connector\n.\nFrom the list, choose\nHR (preview)\n.\nOn the\nSetup the connection\npage, do the following and then select\nNext\n:\nType or paste the Microsoft Entra application ID for the Azure app that you created in Step 2.\nType a name for the HR connector.\nOn the HR scenarios page, select one or more HR scenarios that you want to import data for and then select\nNext\n.\nOn the file mapping method page, select a file type if necessary, and then select one of the following options and then select\nNext\n.\nUpload a sample file\n. If you select this option, select\nUpload sample file\nto upload the CSV file that you prepared in Step 1. This option allows you to quickly select column names in your CSV file from a drop-down list to map them to the data types for the HR scenarios that you previously selected.\nOR\nManually provide the mapping details\n. If you select this option, you have to type the name of the columns in your CSV file to map them to the data types for the HR scenarios that you previously selected.\nOn the File mapping details page, do one of the following, depending on whether you uploaded a sample CSV file and whether you're configuring the connector for a single HR scenario or for multiple scenarios. If you uploaded a sample file, you don't have to type the column names. You pick them from a dropdown list.\nIf you selected a single HR scenario in the previous step, then type the column header names (also called\nparameters\n) from the CSV file that you created in Step 1 in each of the appropriate boxes. The column names that you type aren't case-sensitive, but be sure to include spaces if the column names in your CSV file include spaces. As previously explained, the names you type in these boxes must match the parameter names in your CSV file. 
For example, the following screenshot shows the parameter names from the sample CSV file for the employee resignation HR scenario shown in Step 1.\nIf you selected multiple data types in the previous step, then you need to enter the identifier column name that identifies the HR data type in your CSV file. After entering the identifier column name, type the value that identifies each HR data type, and then type the column header names for each selected data type from the CSV file(s) that you created in Step 1 in the appropriate boxes. As previously explained, the names that you type in these boxes must match the column names in your CSV file.\nOn the\nReview\npage, review your settings and then select\nFinish\nto create the connector.\nA status page is displayed that confirms the connector was created. This page contains two important items that you need to complete the next step to run the sample script to upload your HR data.\nJob ID.\nYou'll need this job ID to run the script in the next step. You can copy it from this page or from the connector flyout page.\nLink to sample script.\nSelect the\nhere\nlink to go to the GitHub site to access the sample script (the link opens a new window). Keep this window open so that you can copy the script in Step 4. Alternatively, you can bookmark the destination or copy the URL so you can access it again when you run the script. This link is also available on the connector flyout page.\nSelect\nDone\n.\nThe new connector is displayed in the list on the\nConnectors\ntab.\nSelect the HR connector that you just created to display the flyout page, which contains properties and other information about the connector.\nIf you haven't already done so, copy the values for the\nAzure App ID\nand\nConnector job ID\n. You'll need these to run the script in the next step.
You can also download the script from the flyout page (or download it using the link in the next step.)\nYou can also select\nEdit\nto change the Azure App ID or the column header names that you defined on the\nFile mapping\npage.\nStep 4: Run the sample script to upload your HR data\nImportant\nYou must add the\nwebhook.ingestion.office.com\ndomain to your firewall allowlist for your organization. If you block this domain, the script doesn't run.\nThe last step in setting up an HR connector is to run a sample script that uploads the HR data in the CSV file (that you created in Step 1) to the Microsoft cloud. Specifically, the script uploads the data to the HR connector. After you run the script, the HR connector that you created in Step 3 imports the HR data to your Microsoft 365 organization where other compliance tools, such as the Insider risk management solution, can access it. After you run the script, consider scheduling a task to run it automatically on a daily basis so the most current employee termination data is uploaded to the Microsoft cloud. For more information, see\nSchedule the script to run automatically\n.\nGo to the window that you left open from the previous step to access the GitHub site with the sample script. Alternatively, open the bookmarked site or use the URL that you copied. You can also access the script\nhere\n.\nSelect the\nRaw\nbutton to display the script in text view.\nCopy all the lines in the sample script and save them to a text file.\nModify the sample script for your organization, if necessary.\nSave the text file as a Windows PowerShell script file by using a filename suffix of\n.ps1\n; for example,\nHRConnector.ps1\n. 
Alternatively, you can use the GitHub filename for the script, which is\nupload_termination_records.ps1\n.\nOpen a command prompt on your local computer, and go to the directory where you saved the script.\nRun the following command to upload the HR data in the CSV file to the Microsoft cloud; for example:\n.\\HRConnector.ps1 -tenantId -appId -appSecret -jobId -filePath ''\nThe following table describes the parameters to use with this script and their required values. The information you obtained in the previous steps is used in the values for these parameters.\nParameter\nDescription\ntenantId\nThis is the ID for your Microsoft 365 organization that you obtained in Step 2. You can also obtain the tenant ID for your organization on the\nOverview\nblade in the Microsoft Entra admin center. This value identifies your organization.\nappId\nThis is the Microsoft Entra application ID for the app that you created in Microsoft Entra ID in Step 2. This value is used by Microsoft Entra ID for authentication when the script attempts to access your Microsoft 365 organization.\nappSecret\nThis is the Microsoft Entra application secret for the app that you created in Microsoft Entra ID in Step 2. This value is also used for authentication.\njobId\nThis is the job ID for the HR connector that you created in Step 3. This value associates the HR data that is uploaded to the Microsoft cloud with the HR connector.\nfilePath\nThis is the file path for the file (stored on the same system as the script) that you created in Step 1. 
Avoid spaces in the file path if possible; otherwise, enclose the path in single quotation marks.\nHere's an example of the syntax for the HR connector script using actual values for each parameter:\n.\\HRConnector.ps1 -tenantId aaaabbbb-0000-cccc-1111-dddd2222eeee -appId 00001111-aaaa-2222-bbbb-3333cccc4444 -appSecret Aa1Bb~2Cc3.-Dd4Ee5Ff6Gg7Hh8Ii9_Jj0Kk1Ll2 -jobId 00001111-aaaa-2222-bbbb-3333cccc4444 -filePath 'C:\\Users\\contosoadmin\\Desktop\\Data\\employee_termination_data.csv'\nIf the upload is successful, the script displays the\nUpload Successful\nmessage.\nNote\nIf you have problems running the previous command because of execution policies, see\nAbout Execution Policies\nand\nSet-ExecutionPolicy\nfor guidance about setting execution policies.\nStep 5: Monitor the HR connector\nAfter you create the HR connector and run the script to upload your HR data, you can view the connector and upload status in the Microsoft Purview portal. If you schedule the script to run automatically on a regular basis, you can also view the current status after the last time the script ran.\nSign in to the\nMicrosoft Purview portal\n.\nSelect\nSettings\n>\nData connectors\n.\nSelect\nMy connectors\n, then select the HR connector that you created to display the flyout page. This page contains the properties and information about the connector.\nUnder\nProgress\n, select the\nDownload log\nlink to open (or save) the status log for the connector. This log contains information about each time the script runs and uploads the data from the CSV file to the Microsoft cloud.\nThe\nRecordsSaved\nfield indicates the number of rows in the CSV file that you uploaded. For example, if the CSV file contains four rows, the value of the\nRecordsSaved\nfield is 4 if the script successfully uploads all the rows in the CSV file.\nIf you didn't run the script in Step 4, a link to download the script is displayed under\nLast import\n.
You can download the script and then follow the steps to run the script.\n(Optional) Step 6: Schedule the script to run automatically\nTo make sure tools like the insider risk management solution always have access to the latest HR data from your organization, schedule the script to run automatically on a recurring basis, such as once a day. This schedule requires updating the HR data in the CSV file on a similar (if not the same) schedule so that it contains the latest information about employees who leave your organization. The goal is to upload the most current HR data so that the HR connector can make it available to the insider risk management solution.\nUse the Task Scheduler app in Windows to automatically run the script every day.\nOn your local computer, select the Windows\nStart\nbutton and then type\nTask Scheduler\n.\nSelect the\nTask Scheduler\napp to open it.\nIn the\nActions\nsection, select\nCreate Task\n.\nOn the\nGeneral\ntab, type a descriptive name for the scheduled task, such as\nHR Connector Script\n. You can also add an optional description.\nUnder\nSecurity options\n, do the following steps:\nDecide whether to run the script only when you're logged on to the computer or run it when you're logged on or not.\nMake sure that the\nRun with the highest privileges\ncheckbox is selected.\nSelect the\nTriggers\ntab, select\nNew\n, and then do the following steps:\nUnder\nSettings\n, select the\nDaily\noption, then choose a date and time to run the script for the first time. 
The script runs every day at the same specified time.\nUnder\nAdvanced settings\n, make sure the\nEnabled\ncheckbox is selected.\nSelect\nOk\n.\nSelect the\nActions\ntab, select\nNew\n, and then do the following steps:\nIn the\nAction\ndropdown list, make sure that\nStart a program\nis selected.\nIn the\nProgram/script\nbox, select\nBrowse\n, go to the following location, and select it so the path displays in the box:\nC:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe\n.\nIn the\nAdd arguments (optional)\nbox, paste the same script command that you ran in Step 4. For example,\n.\\HRConnector.ps1 -tenantId \"aaaabbbb-0000-cccc-1111-dddd2222eeee\" -appId \"00001111-aaaa-2222-bbbb-3333cccc4444\" -appSecret \"Aa1Bb~2Cc3.-Dd4Ee5Ff6Gg7Hh8Ii9_Jj0Kk1Ll2\" -jobId \"00001111-aaaa-2222-bbbb-3333cccc4444\" -filePath \"C:\\Users\\contosoadmin\\Desktop\\Data\\employee_termination_data.csv\"\nIn the\nStart in (optional)\nbox, paste the folder location of the script that you ran in Step 4. For example,\nC:\\Users\\contosoadmin\\Desktop\\Scripts\n.\nSelect\nOk\nto save the settings for the new action.\nIn the\nCreate Task\nwindow, select\nOk\nto save the scheduled task. You might be prompted to enter your user account credentials.\nThe new task appears in the Task Scheduler Library.\nThe last time the script ran and the next time it's scheduled to run are displayed. You can double-select the task to edit it.\nYou can also verify the last time the script ran on the flyout page of the corresponding HR connector in the compliance center.\n(Optional) Step 7: Upload data by using Power Automate templates\nYou can upload HR data by using Power Automate templates and define triggers. For example, you can configure a Power Automate template to trigger when new HR connector files are available in SharePoint or OneDrive locations.
You can also streamline this process by storing confidential information like the Microsoft Entra application secret (created in\nStep 2\n) in Azure Key Vault and using it with Power Automate for authentication.\nComplete the following steps to automatically upload HR data when new files become available on OneDrive for Business:\nDownload the\nImportHRDataforIRM.zip\npackage from the\nGitHub site\n.\nIn\nPower Automate\n, go to\nMy flows\n.\nSelect\nImport\nand upload the\nImportHRDataforIRM.zip\npackage.\nAfter the package is uploaded, update the content (name and OneDrive for Business connection), and select\nImport\n.\nSelect\nOpen flow\nand update the parameters. The following table describes the parameters to use in this Power Automate flow and their required values. Use the information you obtained in the previous steps for these values.\nParameter\nDescription\nApp ID\nThis is the Microsoft Entra application ID for the app that you created in Microsoft Entra ID in\nStep 2\n. Microsoft Entra ID uses this application ID for authentication when the script attempts to access your Microsoft 365 organization.\nApp Secret\nThis is the Microsoft Entra application secret for the app that you created in Microsoft Entra ID in\nStep 2\n. This application secret is used for authentication.\nFile location\nThis is the OneDrive for Business location where Power Automate monitors for 'new file created' activities to trigger this flow.\nJob ID\nIdentifier for the HR connector created in\nStep 3\n. This identifier associates the HR data uploaded to the Microsoft cloud with the HR connector.\nTenant ID\nIdentifier for your Microsoft 365 organization obtained in\nStep 2\n. You can also obtain the tenant ID for your organization on the\nOverview\nblade in the Microsoft Entra admin center.
This identifier is used to identify your organization.\nURI\nVerify that the value for this parameter is\nhttps://webhook.ingestion.office.com/api/signals\n.\nSelect\nSave\n.\nGo to\nFlow overview\nand select\nTurn on\n.\nTest the flow manually by uploading a new file to your OneDrive for Business folder and verify that it ran successfully. This process might take a few minutes after the upload before the flow is triggered.\nYou can now monitor the HR connector as described in\nStep 5\n.\nIf needed, you can update the flow to create triggers based on file availability and modification events on SharePoint and other data sources supported by Power Automate flows.\nExisting HR connectors\nOn December 13, 2021, we released the employee profile data scenario for HR connectors. If you created an HR connector before this date, we migrated the existing instances of your organization's HR connectors so your HR data continues to be imported to the Microsoft cloud. You don't have to do anything to maintain this functionality. You can keep using these connectors without disruption.\nIf you want to implement the employee profile data scenario, create a new HR connector and configure it as required.
After you create a new HR connector, run the script by using the job ID of the new connector and CSV files with\nemployee profile data\npreviously described in this article.", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:4f85e521065add98626a220ea63b68abfec146dd62f1e1ad3334f0e596905aae", + "normalized_content": "Set up a connector to import HR data\nYou can set up a data connector to import human resources (HR) data related to events such as a user's resignation or a change in a user's job level. The insider risk management solution uses the HR data to generate risk indicators that help you identify possible malicious activity or data theft by users inside your organization.\nSetting up a connector for HR data that insider risk management policies use to generate risk indicators involves creating a CSV file that contains the HR data, creating an app in Microsoft Entra for authentication, creating an HR data connector in the\nMicrosoft Purview portal\n, and running a script (on a scheduled basis) that ingests the HR data in CSV files to the Microsoft cloud so it's available to the insider risk management solution.\nBefore you begin\nDetermine which HR scenarios and data to import to Microsoft 365.
This determination helps you decide how many CSV files and HR connectors you need to create, and how to generate and structure the CSV files. The insider risk management policies you want to implement determine the HR data you import. For more information, see Step 1.\nDetermine how to retrieve or export the data from your organization's HR system (regularly) and add it to the CSV files that you create in Step 1. The script that you run in Step 4 uploads the HR data in the CSV files to the Microsoft cloud.\nAssign the Data Connector Admin role to the user who creates the HR connector in Step 3. This role is required to add connectors on the\nData connectors\npage in the Microsoft Purview portal. Multiple role groups include this role by default. For a list of these role groups, see\nRoles in Microsoft Defender for Office 365 and Microsoft Purview\n. Alternatively, an admin in your organization can create a custom role group, assign the Data Connector Admin role, and then add the appropriate users as members. For instructions, see:\n- Permissions in the Microsoft Purview portal\n- Roles and role groups in Microsoft Defender for Office 365 and Microsoft Purview compliance\nUnderstand that the sample script you run in Step 4 uploads your HR data to the Microsoft cloud so that the insider risk management solution can use it. This sample script isn't supported under any Microsoft standard support program or service. The sample script is provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. You assume all risk arising from the use or performance of the sample script and documentation.
In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.\nThis data connector is available in GCC environments in the Microsoft 365 US Government cloud. Third-party applications and services might involve storing, transmitting, and processing your organization's customer data on third-party systems that are outside of the Microsoft 365 infrastructure and therefore aren't covered by the Microsoft Purview and data protection commitments. Microsoft makes no representation that use of this product to connect to third-party applications implies that those third-party applications are FedRAMP compliant. For step-by-step instructions for setting up an HR connector in a GCC environment, see\nSet up a connector to import HR data in US Government\n.\nAdd the\nwebhook.ingestion.office.com\ndomain to your firewall allowlist for your organization.\nStep 1: Prepare a CSV file with your HR data\nFirst, create a CSV file that contains the HR data the connector imports to Microsoft 365. The insider risk solution uses this data to generate potential risk indicators. You can import data for the following HR scenarios to Microsoft 365:\nEmployee resignation. Information about employees who leave your organization.\nJob level changes. Information about job level changes for employees, such as promotions and demotions.\nPerformance reviews. Information about employee performance.\nPerformance improvement plans. Information about performance improvement plans for employees.\nEmployee profile (preview). 
General information about an employee.\nThe type of HR data to import depends on the insider risk management policy and corresponding policy template you want to implement. The following table shows which HR data type each policy template requires:\nPolicy template\nHR data type\nData theft by departing users\nEmployee resignations\nData leaks\nNot applicable\nData leaks by priority users\nNot applicable\nData leaks by risky users\nJob level changes, Performance reviews, Performance improvement plans\nSecurity policy violations\nNot applicable\nSecurity policy violations by departing users\nEmployee resignations\nSecurity policy violations by priority users\nNot applicable\nSecurity policy violations by risky users\nJob level changes, Performance reviews, Performance improvement plans\nOffensive language in email\nNot applicable\nHealthcare policy\nEmployee profile\nFor more information about policy templates for insider risk management, see\nInsider risk management policies\n.\nFor each HR scenario, provide the corresponding HR data in one or more CSV files. The number of CSV files to use for your insider risk management implementation is discussed later in this section.\nAfter you create the CSV file with the required HR data, store it on the local computer where you run the script in Step 4. Implement an update strategy to make sure the CSV file always contains the most current information so that whenever you run the script, the most current HR data is uploaded to the Microsoft cloud and accessible to the insider risk management solution.\nImportant\nThe column names described in the following sections aren't required parameters, but only examples. You can use any column name in your CSV files. However, you\nmust\nmap the column names you use in a CSV file to the data type when you create the HR connector in Step 3. Also note that the sample CSV files in the following sections are shown in Notepad view.
It's much easier to view and edit CSV files in Microsoft Excel.\nThe following sections describe the required CSV data for each HR scenario.\nCSV file for employee resignation data\nHere's an example of a CSV file for employee resignation data.\nUserPrincipalName,ResignationDate,LastWorkingDate\nsarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,2019-04-29T15:18:02.4675041+05:30\npilarp@contoso.com,2019-04-24T09:15:49Z,2019-04-29T15:18:02.7117540\nThe following table describes each column in the CSV file for employee resignation data.\nColumn\nDescription\nUserPrincipalName\nThe Microsoft Entra UserPrincipalName (UPN) used to identify the terminated user.\nResignationDate\nSpecifies the date the user's employment is officially terminated or the user resigns from your organization. For example, this date might be when the user gives their notice about leaving your organization. This date might be different from the date of the person's last day of work. Use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nLastWorkingDate\nSpecifies the last day of work for the terminated user. This date can't be more than six months prior or one year in advance from the time of upload. Use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nCSV file for job level changes data\nHere's an example of a CSV file for job level changes data.\nUserPrincipalName,EffectiveDate,OldLevel,NewLevel\nsarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,Level 61 - Sr. Manager,Level 60- Manager\npillar@contoso.com,2019-04-23T15:18:02.4675041+05:30,Level 62 - Director,Level 60- Sr.
Manager\nThe following table describes each column in the CSV file for job level changes data.\nColumn\nDescription\nUserPrincipalName\nThe Microsoft Entra UserPrincipalName (UPN) used to specify the user's email address.\nEffectiveDate\nSpecifies the date that the user's job level is officially changed. Use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nRemarks\nSpecifies the remarks that the evaluator provides about the change of job level. This is a text parameter with a limit of 200 characters. This parameter is optional. You don't have to include it in the CSV file.\nOldLevel\nSpecifies the user's job level before it was changed. This is a free-text parameter and can contain hierarchical taxonomy for your organization. This parameter is optional. You don't have to include it in the CSV file.\nNewLevel\nSpecifies the user's job level after it was changed. This is a free-text parameter and can contain hierarchical taxonomy for your organization. This parameter is optional. You don't have to include it in the CSV file.\nCSV file for performance review data\nHere's an example of a CSV file for performance review data.\nUserPrincipalName,EffectiveDate,Remarks,Rating\nsarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,Met expectations but bad attitude,2-Below expectation\npillar@contoso.com,2019-04-23T15:18:02.4675041+05:30, Multiple conflicts with the team\nThe following table describes each column in the CSV file for performance review data.\nColumn\nDescription\nUserPrincipalName\nThe Microsoft Entra UserPrincipalName (UPN) used to specify the user's email address.\nEffectiveDate\nSpecifies the date that the user is officially informed about the result of their performance review. This date can be the date when the performance review cycle ends.
Use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nRemarks\nSpecifies any remarks that the evaluator provides to the user for the performance review. This is a text parameter with a limit of 200 characters. This parameter is optional. You don't have to include it in the CSV file.\nRating\nSpecifies the rating provided for the performance review. This is a text parameter and can contain any free-form text that your organization uses to recognize the evaluation. For example, \"3 Met expectations\" or \"2 Below average\". This is a text parameter with a limit of 25 characters. This parameter is optional. You don't have to include it in the CSV file.\nCSV file for performance improvement plan data\nHere's an example of a CSV file for performance improvement plan data.\nUserPrincipalName,EffectiveDate,ImprovementRemarks,PerformanceRating\nsarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,Met expectation but bad attitude,2-Below expectation\npillar@contoso.com,2019-04-23T15:18:02.4675041+05:30, Multiple conflicts with the team\nThe following table describes each column in the CSV file for performance improvement plan data.\nColumn\nDescription\nUserPrincipalName\nThe Microsoft Entra UserPrincipalName (UPN) used to specify the user's email address.\nEffectiveDate\nSpecifies the date when the user is officially informed about their performance improvement plan. You must use the following date format:\nyyyy-mm-ddThh:mm:ss.nnnnnn+ | -hh:mm\n, which is the\nISO 8601 date and time format\n.\nRemarks\nSpecifies any remarks that the evaluator provides about the performance improvement plan. This is a text parameter with a limit of 200 characters. This is an optional parameter. You don't have to include it in the CSV file.\nRating\nSpecifies any rating or other information related to the performance review.
This is a text parameter and can contain any free-form text that your organization uses to recognize the evaluation. For example, \"3 Met expectations\" or \"2 Below average\". This is a text parameter with a limit of 25 characters. This is an optional parameter. You don't have to include it in the CSV file.\nCSV file for employee profile data (preview)\nNote\nThe capability to create an HR connector for employee profile data is in public preview. To create an HR connector that supports employee profile data, go to the\nData connectors\npage in the Microsoft Purview portal, select the\nConnectors\ntab, then select\nAdd a connector\n>\nHR\n. Follow the steps to create a connector in\nStep 3: Create the HR connector\n.\nHere's an example of a CSV file for employee profile data.\nUserPrincipalName,UserName,EmployeeFirstName,EmployeeLastName,EmployeeAddLine1,EmployeeAddLine2,EmployeeCity,EmployeeState,EmployeeZipCode,EmployeeDept,EmployeeType,EmployeeRole\njackq@contoso.com,jackq,jack,qualtz,50 Oakland Ave,#206,City,Florida,32104,Orthopaedic,Regular,Nurse\nThe following table describes each column in the CSV file for employee profile data.\nColumn\nDescription\nUserPrincipalName\n*\nThe Microsoft Entra UserPrincipalName (UPN) used to specify the user's email address.\nEmployeeFirstName\n*\nFirst name of the employee.\nEmployeeLastName\n*\nLast name of the employee.\nEmployeeAddressLine1\n*\nStreet address of the employee.\nEmployeeAddressLine2\nSecondary address information, such as apartment number, for the employee.\nEmployeeCity\nCity of residence for the employee.\nEmployeeState\nState of residence for the employee.\nEmployeeZipCode\n*\nZip code of residence for the employee.\nEmployeeCountry\nCountry of residence for the employee.\nEmployeeDepartment\nEmployee's department in the organization.\nEmployeeType\nEmployment type for the employee, such as Regular, Exempt, or Contractor.\nEmployeeRole\nEmployee's role, designation, or job title in the organization.\nNote\n*\nThis column is
mandatory. If a mandatory column is missing, the CSV file isn't validated and other data in the file isn't imported.\nWe recommend that you create an HR connector that only imports employee profile data. For this connector, refresh the employee profile data frequently, preferably every 15 to 20 days. Employee profile records are deleted if they haven't been updated in the past 30 days.\nDetermining how many CSV files to use for HR data\nIn Step 3, you can choose to create separate connectors for each HR data type or a single connector for all data types. You can use separate CSV files that each contain data for one HR scenario (like the examples of the CSV files described in the previous sections). Alternatively, you can use a single CSV file that contains data for two or more HR scenarios. Here are some guidelines to help you determine how many CSV files to use for HR data.\nIf the insider risk management policy that you want to implement requires multiple HR data types, consider using a single CSV file that contains all the required data types.\nThe method for generating or collecting the HR data might determine the number of CSV files. For example, if the different types of HR data used to configure an HR connector are located in a single HR system in your organization, then you might be able to export the data to a single CSV file. But if data is distributed across different HR systems, then it might be easier to export data to different CSV files. For example, employee resignation data might be located in a different HR system than job level or performance review data. In this case, it might be easier to have separate CSV files rather than having to manually combine the data into a single CSV file. So, how you retrieve or export data from your HR systems might determine how many CSV files you need.\nAs a general rule, the data types in a CSV file determine the number of HR connectors that you need to create.
For example, if a CSV file contains all the data types required to support your insider risk management implementation, then you only need one HR connector. But if you have two separate CSV files that each contain a single data type, then you have to create two HR connectors. An exception to this rule is that if you add an\nHRScenario\ncolumn to a CSV file (see the next section), you can configure a single HR connector that can process different CSV files.\nFor each CSV file, you can ingest up to 500 records at once. To ingest a larger number of records, upload multiple CSV files, each with fewer than 500 records.\nConfiguring a single CSV file for multiple HR data types\nYou can add multiple HR data types to a single CSV file. This configuration is useful if the insider risk management solution you're implementing requires multiple HR data types or if the data types are located in a single HR system in your organization. Fewer CSV files means fewer HR connectors to create and manage.\nHere are the requirements for configuring a CSV file with multiple data types:\nAdd the required columns (and optional columns if you use them) for each data type and the corresponding column name in the header row. If a data type doesn't correspond to a column, leave the value blank.\nAdd an\nHRScenario\ncolumn to the CSV file. The HR connector uses this column to identify which rows in the CSV file contain which type of HR data. The values in this column identify the type of HR data in each row. For example, values that correspond to the HR scenarios could be `Resignation`, `Job level change`, `Performance review`, `Performance improvement plan`, and `Employee profile`.\nIf you have multiple CSV files that contain an\nHRScenario\ncolumn, make sure that each file uses the same column name and the same values that identify the specific HR scenarios.\nThe following example shows a CSV file that contains the\nHRScenario\ncolumn.
The values in the HRScenario column identify the type of data in the corresponding row. The following sample covers four HR scenarios `Resignation`, `Job level change`, `Performance review`, and `Performance improvement plan`.\nHRScenario,EmailAddress,ResignationDate,LastWorkingDate,EffectiveDate,Remarks,Rating,OldLevel,NewLevel\nResignation,sarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,2019-04-29T15:18:02.4675041+05:30,,,,\nResignation,pilarp@contoso.com,2019-04-24T09:15:49Z,2019-04-29T15:18:02.7117540,,,,\nJob level change,sarad@contoso.com,2019-04-23T15:18:02.4675041+05:30,,,,,Level 61 Sr. Manager, Level 60 Manager\nJob level change,pillarp@contoso.com,2019-04-23T15:18:02.4675041+05:30,,,,,Level 62 Director,Level 60 Sr Manager\nPerformance review,sarad@contoso.com,,,2019-04-23T15:18:02.4675041+05:30,Met expectation but bad attitude,2 Below expectations,,\nPerformance review,pillarp@contoso.com,,,2019-04-23T15:18:02.4675041+05:30, Multiple conflicts with the team,,\nPerformance improvement plan,sarad@contoso.com,,,2019-04-23T15:18:02.4675041+05:30,Met expectations but bad attitude,2 Below expectations,,\nPerformance improvement plan,pillarp@contoso.com,,,2019-04-23T15:18:02.4675041+05:30,Multiple conflicts with the team,,\nNote\nYou can use any name for the column that identifies HR data type because you map the name of the column in your CSV file as the column that identifies the HR data type when you set up the connector in Step 3. You also map the values used for the data type column when you set up the connector.\nAdding the HRScenario column to a CSV file that contains a single data type\nBased on your organization's HR systems and how you export HR data to a CSV file, you might need to create multiple CSV files that each contain a single HR data type. In this case, you can still create a single HR connector to import data from different CSV files. To do this, add an HRScenario column to the CSV file and specify the HR data type. 
Then, you can run the script for each CSV file but use the same job ID for the connector. For more information, see\nStep 4\n.\nStep 2: Create an app in Microsoft Entra ID\nNext, create and register a new app in Microsoft Entra ID. The app corresponds to the HR connector that you create in Step 3. When you create this app, Microsoft Entra ID can authenticate the HR connector when it runs and attempts to access your organization. This app also authenticates the script that you run in Step 4 to upload your HR data to the Microsoft cloud. During the creation of this Microsoft Entra app, save the following information. You use these values in Step 3 and Step 4.\nMicrosoft Entra application ID (also called the\napp ID\nor\nclient ID\n)\nMicrosoft Entra application secret (also called the\nclient secret\n)\nTenant ID (also called the\ndirectory ID\n)\nFor step-by-step instructions for creating an app in Microsoft Entra ID, see\nRegister an application with the Microsoft identity platform\n.\nStep 3: Create the HR connector\nThe next step is to create an HR connector in the Microsoft Purview portal. After you run the script in Step 4, the HR connector that you create will ingest the HR data from the CSV file to your Microsoft 365 organization. Before you create a connector, be sure that you have a list of the HR scenarios and the corresponding CSV column names for each one. You have to map the data required for each scenario to the actual column names in your CSV file when configuring the connector. Alternatively, you can upload a sample CSV file when configuring the connector and the workflow helps you map the names of the columns to the required data types.\nAfter you complete this step, be sure to copy the job ID that's generated when you create the connector.
You use the job ID when you run the script.\nSign in to the\nMicrosoft Purview portal\n.\nSelect\nSettings\n>\nData connectors\n.\nSelect\nMy connectors\n, then select\nAdd connector\n.\nFrom the list, choose\nHR (preview)\n.\nOn the\nSetup the connection\npage, do the following and then select\nNext\n:\nType or paste the Microsoft Entra application ID for the Azure app that you created in Step 2.\nType a name for the HR connector.\nOn the HR scenarios page, select one or more HR scenarios that you want to import data for, and then select\nNext\n.\nOn the file mapping method page, select a file type if necessary, select one of the following options, and then select\nNext\n.\nUpload a sample file\n. If you select this option, select\nUpload sample file\nto upload the CSV file that you prepared in Step 1. This option allows you to quickly select column names in your CSV file from a drop-down list to map them to the data types for the HR scenarios that you previously selected.\nOR\nManually provide the mapping details\n. If you select this option, you have to type the names of the columns in your CSV file to map them to the data types for the HR scenarios that you previously selected.\nOn the File mapping details page, do one of the following, depending on whether you uploaded a sample CSV file and whether you're configuring the connector for a single HR scenario or for multiple scenarios. If you uploaded a sample file, you don't have to type the column names. You pick them from a dropdown list.\nIf you selected a single HR scenario in the previous step, then type the column header names (also called\nparameters\n) from the CSV file that you created in Step 1 in each of the appropriate boxes. The column names that you type aren't case-sensitive, but be sure to include spaces if the column names in your CSV file include spaces. As previously explained, the names you type in these boxes must match the parameter names in your CSV file.
For example, the following screenshot shows the parameter names from the sample CSV file for the employee resignation HR scenario shown in Step 1.\nIf you selected multiple data types in the previous step, then enter the name of the identifier column that identifies the HR data type in your CSV file. After entering the identifier column name, type the value that identifies each HR data type, and type the column header names for the selected data types from the CSV file(s) that you created in Step 1 in each of the appropriate boxes. As previously explained, the names that you type in these boxes must match the column names in your CSV file.\nOn the\nReview\npage, review your settings and then select\nFinish\nto create the connector.\nA status page is displayed that confirms the connector was created. This page contains two important things that you need to complete the next step to run the sample script to upload your HR data.\nJob ID.\nYou'll need this job ID to run the script in the next step. You can copy it from this page or from the connector flyout page.\nLink to sample script.\nSelect the\nhere\nlink to go to the GitHub site to access the sample script (the link opens a new window). Keep this window open so that you can copy the script in Step 4. Alternatively, you can bookmark the destination or copy the URL so you can access it again when you run the script. This link is also available on the connector flyout page.\nSelect\nDone\n.\nThe new connector is displayed in the list on the\nConnectors\ntab.\nSelect the HR connector that you just created to display the flyout page, which contains properties and other information about the connector.\nIf you haven't already done so, you can copy the values for the\nAzure App ID\nand\nConnector job ID\n. You'll need these to run the script in the next step.
You can also download the script from the flyout page (or download it using the link in the next step.)\nYou can also select\nEdit\nto change the Azure App ID or the column header names that you defined on the\nFile mapping\npage.\nStep 4: Run the sample script to upload your HR data\nImportant\nYou must add the\nwebhook.ingestion.office.com\ndomain to your firewall allowlist for your organization. If you block this domain, the script doesn't run.\nThe last step in setting up an HR connector is to run a sample script that uploads the HR data in the CSV file (that you created in Step 1) to the Microsoft cloud. Specifically, the script uploads the data to the HR connector. After you run the script, the HR connector that you created in Step 3 imports the HR data to your Microsoft 365 organization where other compliance tools, such as the Insider risk management solution, can access it. After you run the script, consider scheduling a task to run it automatically on a daily basis so the most current employee termination data is uploaded to the Microsoft cloud. For more information, see\nSchedule the script to run automatically\n.\nGo to the window that you left open from the previous step to access the GitHub site with the sample script. Alternatively, open the bookmarked site or use the URL that you copied. You can also access the script\nhere\n.\nSelect the\nRaw\nbutton to display the script in text view.\nCopy all the lines in the sample script and save them to a text file.\nModify the sample script for your organization, if necessary.\nSave the text file as a Windows PowerShell script file by using a filename suffix of\n.ps1\n; for example,\nHRConnector.ps1\n. 
Alternatively, you can use the GitHub filename for the script, which is\nupload_termination_records.ps1\n.\nOpen a command prompt on your local computer, and go to the directory where you saved the script.\nRun the following command to upload the HR data in the CSV file to the Microsoft cloud; for example:\n.\\HRConnector.ps1 -tenantId <tenantId> -appId <appId> -appSecret <appSecret> -jobId <jobId> -filePath '<filePath>'\nThe following table describes the parameters to use with this script and their required values. The information you obtained in the previous steps is used in the values for these parameters.\nParameter\nDescription\ntenantId\nThis is the ID for your Microsoft 365 organization that you obtained in Step 2. You can also obtain the tenant ID for your organization on the\nOverview\nblade in the Microsoft Entra admin center. This value identifies your organization.\nappId\nThis is the Microsoft Entra application ID for the app that you created in Microsoft Entra ID in Step 2. This value is used by Microsoft Entra ID for authentication when the script attempts to access your Microsoft 365 organization.\nappSecret\nThis is the Microsoft Entra application secret for the app that you created in Microsoft Entra ID in Step 2. This value is also used for authentication.\njobId\nThis is the job ID for the HR connector that you created in Step 3. This value associates the HR data that is uploaded to the Microsoft cloud with the HR connector.\nfilePath\nThis is the file path for the file (stored on the same system as the script) that you created in Step 1.
Try to avoid spaces in the file path; otherwise, use single quotation marks.\nHere's an example of the syntax for the HR connector script using actual values for each parameter:\n.\\HRConnector.ps1 -tenantId aaaabbbb-0000-cccc-1111-dddd2222eeee -appId 00001111-aaaa-2222-bbbb-3333cccc4444 -appSecret Aa1Bb~2Cc3.-Dd4Ee5Ff6Gg7Hh8Ii9_Jj0Kk1Ll2 -jobId 00001111-aaaa-2222-bbbb-3333cccc4444 -filePath 'C:\\Users\\contosoadmin\\Desktop\\Data\\employee_termination_data.csv'\nIf the upload is successful, the script displays the\nUpload Successful\nmessage.\nNote\nIf you have problems running the previous command because of execution policies, see\nAbout Execution Policies\nand\nSet-ExecutionPolicy\nfor guidance about setting execution policies.\nStep 5: Monitor the HR connector\nAfter you create the HR connector and run the script to upload your HR data, you can view the connector and upload status in the Microsoft Purview portal. If you schedule the script to run automatically on a regular basis, you can also view the current status after the last time the script ran.\nSign in to the\nMicrosoft Purview portal\n.\nSelect\nSettings\n>\nData connectors\n.\nSelect\nMy connectors\n, then select the HR connector that you created to display the flyout page. This page contains the properties and information about the connector.\nUnder\nProgress\n, select the\nDownload log\nlink to open (or save) the status log for the connector. This log contains information about each time the script runs and uploads the data from the CSV file to the Microsoft cloud.\nThe\nRecordsSaved\nfield indicates the number of rows in the CSV file that you uploaded. For example, if the CSV file contains four rows, the value of the\nRecordsSaved\nfield is 4 if the script successfully uploads all the rows in the CSV file.\nIf you didn't run the script in Step 4, a link to download the script is displayed under\nLast import\n.
You can download the script and then follow the steps to run the script.\n(Optional) Step 6: Schedule the script to run automatically\nTo make sure tools like the insider risk management solution always have access to the latest HR data from your organization, schedule the script to run automatically on a recurring basis, such as once a day. This schedule requires updating the HR data in the CSV file on a similar (if not the same) schedule so that it contains the latest information about employees who leave your organization. The goal is to upload the most current HR data so that the HR connector can make it available to the insider risk management solution.\nUse the Task Scheduler app in Windows to automatically run the script every day.\nOn your local computer, select the Windows\nStart\nbutton and then type\nTask Scheduler\n.\nSelect the\nTask Scheduler\napp to open it.\nIn the\nActions\nsection, select\nCreate Task\n.\nOn the\nGeneral\ntab, type a descriptive name for the scheduled task, such as\nHR Connector Script\n. You can also add an optional description.\nUnder\nSecurity options\n, do the following steps:\nDecide whether to run the script only when you're logged on to the computer or run it when you're logged on or not.\nMake sure that the\nRun with the highest privileges\ncheckbox is selected.\nSelect the\nTriggers\ntab, select\nNew\n, and then do the following steps:\nUnder\nSettings\n, select the\nDaily\noption, then choose a date and time to run the script for the first time. 
The script runs every day at the same specified time.\nUnder\nAdvanced settings\n, make sure the\nEnabled\ncheckbox is selected.\nSelect\nOk\n.\nSelect the\nActions\ntab, select\nNew\n, and then do the following steps:\nIn the\nAction\ndropdown list, make sure that\nStart a program\nis selected.\nIn the\nProgram/script\nbox, select\nBrowse\n, go to the following location, and select it so the path displays in the box:\nC:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe\n.\nIn the\nAdd arguments (optional)\nbox, paste the same script command that you ran in Step 4. For example,\n.\\HRConnector.ps1 -tenantId \"aaaabbbb-0000-cccc-1111-dddd2222eeee\" -appId \"00001111-aaaa-2222-bbbb-3333cccc4444\" -appSecret \"Aa1Bb~2Cc3.-Dd4Ee5Ff6Gg7Hh8Ii9_Jj0Kk1Ll2\" -jobId \"00001111-aaaa-2222-bbbb-3333cccc4444\" -filePath \"C:\\Users\\contosoadmin\\Desktop\\Data\\employee_termination_data.csv\"\nIn the\nStart in (optional)\nbox, paste the folder location of the script that you ran in Step 4. For example,\nC:\\Users\\contosoadmin\\Desktop\\Scripts\n.\nSelect\nOk\nto save the settings for the new action.\nIn the\nCreate Task\nwindow, select\nOk\nto save the scheduled task. You might be prompted to enter your user account credentials.\nThe new task appears in the Task Scheduler Library.\nThe last time the script ran and the next time it's scheduled to run displays. You can double-select the task to edit it.\nYou can also verify the last time the script ran on the flyout page of the corresponding HR connector in the Microsoft Purview portal.\n(Optional) Step 7: Upload data by using Power Automate templates\nYou can upload HR data by using Power Automate templates and define triggers. For example, you can configure a Power Automate template to trigger when new HR connector files are available in SharePoint or OneDrive locations. 
You can also streamline this process by storing confidential information, such as the Microsoft Entra application secret (created in\nStep 2\n), in Azure Key Vault and using it with Power Automate for authentication.\nComplete the following steps to automatically upload HR data when new files become available on OneDrive for Business:\nDownload the\nImportHRDataforIRM.zip\npackage from the\nGitHub site\n.\nIn\nPower Automate\n, go to\nMy flows\n.\nSelect\nImport\nand upload the\nImportHRDataforIRM.zip\npackage.\nAfter the package is uploaded, update the content (name and OneDrive for Business connection), and select\nImport\n.\nSelect\nOpen flow\nand update the parameters. The following table describes the parameters to use in this Power Automate Flow and their required values. Use the information you obtained in the previous steps for these values.\nParameter\nDescription\nApp ID\nThis is the Microsoft Entra application ID for the app that you created in Microsoft Entra ID in\nStep 2\n. Microsoft Entra ID uses this application ID for authentication when the script attempts to access your Microsoft 365 organization.\nApp Secret\nThis is the Microsoft Entra application secret for the app that you created in Microsoft Entra ID in\nStep 2\n. This application secret is used for authentication.\nFile location\nThis is the OneDrive for Business location where Power Automate monitors for 'new file created' activities to trigger this flow.\nJob ID\nIdentifier for the HR connector created in\nStep 3\n. This identifier associates the HR data uploaded to the Microsoft cloud with the HR connector.\nTenant ID\nIdentifier for your Microsoft 365 organization obtained in\nStep 2\n. You can also obtain the tenant ID for your organization on the\nOverview\nblade in the Microsoft Entra admin center.
This identifier is used to identify your organization.\nURI\nVerify that the value for this parameter is\nhttps://webhook.ingestion.office.com/api/signals\nSelect\nSave\n.\nGo to\nFlow overview\nand select\nTurn on\n.\nTest the flow manually by uploading a new file to your OneDrive for Business folder and verify that it ran successfully. This process might take a few minutes after the upload before the flow is triggered.\nYou can now monitor the HR connector as described in\nStep 5\n.\nIf needed, you can update the flow to create triggers based on file availability and modification events on SharePoint and other data sources supported by Power Automate Flows.\nExisting HR connectors\nOn December 13, 2021, we released the employee profile data scenario for HR connectors. If you created an HR connector before this date, we migrate the existing instances of your organization's HR connectors so your HR data continues to be imported to the Microsoft cloud. You don't have to do anything to maintain this functionality. You can keep using these connectors without disruption.\nIf you want to implement the employee profile data scenario, create a new HR connector and configure it as required.
After you create a new HR connector, run the script by using the job ID of the new connector and CSV files with\nemployee profile data\npreviously described in this article.
Effectively managing or governing this information is important because you need to:\nComply proactively with industry regulations and internal policies\nthat require you to retain content for a minimum period of time—for example, the Sarbanes-Oxley Act might require you to retain certain types of content for seven years.\nReduce your risk in the event of litigation or a security breach\nby permanently deleting old content that you're no longer required to keep.\nHelp your organization to share knowledge effectively and be more agile\nby ensuring that your users work only with content that's current and relevant to them.\nRetention settings that you configure can help you achieve these goals. Managing content commonly requires two actions:\nAction\nPurpose\nRetain content\nPrevent permanent deletion and remain available for eDiscovery\nDelete content\nPermanently delete content from your organization\nWith these two retention actions, you can configure retention settings for the following outcomes:\nRetain-only: Retain content forever or for a specified period of time.\nDelete-only: Permanently delete content after a specified period of time.\nRetain and then delete: Retain content for a specified period of time and then permanently delete it.\nThese retention settings work with content in place, which saves you the overhead of creating and configuring separate storage when you need to retain content for compliance reasons. In addition, you don't need to implement customized processes to copy and synchronize this data.\nUse the following sections to learn more about how retention policies and retention labels work, when to use them, and how they supplement each other.
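The three outcomes described above reduce to two dated actions; a small illustrative Python model (names are mine, not a Purview API) showing how each configuration maps to retain and delete behavior:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class RetentionSetting:
    """Illustrative model only: maps the three configurable outcomes
    (retain-only, delete-only, retain and then delete) to the two actions."""
    retain_days: Optional[int]  # None = no retain action
    delete: bool                # True = permanently delete at the end

    def outcome(self, start: date) -> str:
        if self.retain_days is not None:
            end = (start + timedelta(days=self.retain_days)).isoformat()
            if self.delete:
                return f"retain until {end}, then permanently delete"
            return f"retain until {end}"  # retain-only
        if self.delete:
            return "permanently delete after the specified period"  # delete-only
        return "no retention action"

# Retain and then delete: keep for five years (approximated as 5 * 365 days).
result = RetentionSetting(retain_days=5 * 365, delete=True).outcome(date(2026, 1, 1))
```

The model is deliberately simplified: real retention settings also choose whether the period starts from creation, last modification, labeling, or an event.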
But if you're ready to get started and deploy retention settings for some common scenarios, see\nGet started with data lifecycle management\n.\nHow retention settings work with content in place\nWhen content has retention settings assigned to it, that content remains in its original location. Most of the time, people continue to work with their documents or mail as if nothing's changed. But if they edit or delete content that's included in the retention policy, a copy of the content is automatically retained.\nFor SharePoint and OneDrive sites: The copy is retained in the\nPreservation Hold\nlibrary.\nFor Exchange mailboxes: The copy is retained in the\nRecoverable Items\nfolder.\nFor Teams, Viva Engage messages, Copilot and AI apps: The copy is retained in a hidden folder named\nSubstrateHolds\nas a subfolder in the Exchange\nRecoverable Items\nfolder.\nNote\nBecause the Preservation Hold library is included in the site's storage quota, you might need to increase your storage when you use retention settings for SharePoint, OneDrive, and Microsoft 365 groups.\nThese secure locations and the retained content aren't visible to most people. In most cases, people don't even need to know that their content is subject to retention settings.\nFor more detailed information about how retention settings work for different workloads, see the following articles:\nLearn about retention for SharePoint and OneDrive\nLearn about retention for Teams\nLearn about retention for Viva Engage\nLearn about retention for Exchange\nLearn about retention for Copilot\nRetention policies and retention labels\nTo assign your retention settings to content, use\nretention policies\nand\nretention labels with label policies\n. 
You can use just one of these methods, or combine them.\nUse a retention policy to assign the same retention settings for content at a site or mailbox level, and use a retention label to assign retention settings at an item level (folder, document, email).\nFor example, if all documents in a SharePoint site should be retained for five years, it's more efficient to do this with a retention policy than apply the same retention label to all documents in that site. However, if some documents in that site should be retained for five years and others retained for 10 years, a retention policy wouldn't be able to do this. When you need to specify retention settings at the item level, use retention labels.\nUnlike retention policies, retention settings from retention labels travel with the content if it's moved to a different location within your Microsoft 365 tenant. In addition, retention labels have the following capabilities that retention policies don't support:\nOptions to start the retention period from when the content was labeled or based on an event, in addition to the age of the content or when it was last modified.\nUse\ntrainable classifiers\nto identify content to label.\nApply a default label for SharePoint items or Exchange messages.\nSupported actions at the end of the retention period:\nDisposition review\nto review the content before it's permanently deleted.\nAutomatically apply another retention label\nMark the content as a\nrecord\nas part of the label settings, and always have\nproof of disposition\nwhen content is deleted at the end of its retention period.\nRetention policies\nRetention policies can be applied to the following locations:\nExchange mailboxes\nSharePoint classic and communication sites\nOneDrive accounts\nMicrosoft 365 Group mailboxes & sites\nSkype for Business\nExchange public folders\nTeams channel messages (standard channels,\nshared channels\n, and private channels\npost-migration\n)\nTeams chats\nTeams private channel messages
(\npre-migration\nonly)\nMicrosoft Copilot experiences\nEnterprise AI apps\nOther AI apps\nViva Engage community messages\nViva Engage user messages\nNote\nIf you have existing retention policies for Teams chats and Copilot interactions, they continue to be supported, although they can't be edited when your tenant supports the separate locations. At this point, any new retention policies must use the new locations.\nYou can efficiently apply a single policy to multiple locations, or to specific locations or users.\nFor the start of the retention period, you can choose when the content was created or, supported only for files and the SharePoint, OneDrive, and Microsoft 365 Groups locations, when the content was last modified.\nItems inherit the retention settings from their container specified in the retention policy. If they're then moved outside that container while the policy is configured to retain content, a copy of that item is retained in the workload's secured location. However, the retention settings don't travel with the content in its new location. If that's required, use retention labels instead of retention policies.\nRetention labels\nUse retention labels for different types of content that require different retention settings. For example:\nTax forms that need to be retained for a minimum period of time.\nPress materials that need to be permanently deleted when they reach a specific age.\nCompetitive research that needs to be retained for a specific period and then permanently deleted.\nWork visas that must be marked as a record so that they can't be edited or deleted.\nIn all these cases, retention labels let you apply retention settings for governance control at the item level (document or email).\nWith retention labels, you can:\nEnable people in your organization to apply a retention label manually\nto content in Outlook and Outlook on the web, OneDrive, SharePoint, and Microsoft 365 groups.
Users often know best what type of content they're working with, so they can classify it and have the appropriate retention settings applied.\nApply retention labels to content automatically\nif it matches specific conditions, including cloud attachments that are shared in email or Teams, or when the content contains:\nSpecific types of sensitive information.\nSpecific keywords that match a query you create.\nPattern matches for a trainable classifier.\nStart the retention period from when the content was labeled\nfor documents in SharePoint sites and OneDrive accounts, and for email items.\nStart the retention period when an event occurs\n, such as when employees leave the organization or contracts expire.\nApply a default retention label to a document library, folder, or document set\nin SharePoint, so that all documents that are stored in that location inherit the default retention label.\nMark items as a record\nas part of your\nrecords management\nstrategy. When this labeled content remains in Microsoft 365, further restrictions are placed on the content that might be needed for regulatory reasons. For more information, see\nCompare restrictions for what actions are allowed or blocked\n.\nRetention labels, unlike\nsensitivity labels\n, don't persist if the content is moved outside Microsoft 365.\nDynamically mitigate the risk of accidental or malicious deletes\nIn preview, you can use this solution with Insider Risk Management so that retention labels are automatically applied with\nAdaptive Protection\n.\nWhen you enable Adaptive Protection for your tenant, retention labels are automatically applied to unlabeled content if it's deleted by users who have been identified as an\nelevated risk\n. If these users delete content from SharePoint, OneDrive, or Exchange, a retention label is automatically applied to that content to retain it for 120 days.
As a result, that content remains accessible for search and eDiscovery from the\nsecured locations used by the workload\n.\nNote\nIf you enabled and configured Adaptive Protection before this integration with data lifecycle management was released, you'll need to opt in to create this auto-labeling policy. See the instructions at the end of this section.\nWhen these items are retained with Adaptive Protection, the following auditing events are generated and identify the user and item:\nFor SharePoint and OneDrive:\nRetained file proactively\nFor Exchange:\nRetained email item proactively\nAfter the 120 days expire, the items then become eligible for permanent deletion. To learn more about when permanent deletion occurs, see\nHow retention works for SharePoint and OneDrive\nand\nHow retention works for Exchange\n.\nUnlike other labeling scenarios, users don't see the retention label, and you don't need to create or manage the retention label or auto-labeling retention policy. At this time, you can't change the retention period or assign different policies based on the different risk levels, or for different locations. The single retention label and auto-labeling retention policy for your tenant aren't visible in the Microsoft Purview portal.\nIf you're using Adaptive Protection but don't want to automatically retain content in this way, you can turn off the auto-labeling policy without affecting other Adaptive Protection policies.
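The fixed 120-day window described above is simple date arithmetic; an illustrative sketch (my own helper, not part of any Purview API) computing when a proactively retained item becomes eligible for permanent deletion:

```python
from datetime import date, timedelta

# Fixed retention period applied by the Adaptive Protection auto-labeling
# policy, as stated in the documentation; it can't currently be changed.
ADAPTIVE_PROTECTION_RETENTION_DAYS = 120

def deletion_eligible_on(label_applied: date) -> date:
    """Date after which an item retained by Adaptive Protection becomes
    eligible for permanent deletion. Actual deletion timing then follows
    the workload's normal retention processing."""
    return label_applied + timedelta(days=ADAPTIVE_PROTECTION_RETENTION_DAYS)

eligible = deletion_eligible_on(date(2026, 3, 14))
```

Eligibility is not the same as deletion: the item is removed later by the workload's cleanup cycle, as described in the linked "How retention works" articles.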
Use the same control if you need to turn on the auto-labeling retention policy for Adaptive Protection, and confirm the status.\nSign in to the Microsoft Purview portal\n>\nSolutions\n>\nSettings\n>\nSolution settings\n>\nData lifecycle management\n>\nAdaptive protection\n.\nFor\nAdaptive protection in Data Lifecycle Management\n, turn the setting off, confirm your choice, and select\nSave\n.\nAny retention labels that were applied as a result of Adaptive Protection are removed so that the items can then become eligible for permanent deletion.\nYou won't be able to turn on this setting unless Adaptive Protection is turned on for your tenant. If your account has the\nrequired permissions\n, you'll see an option to take you to the insider risk management solution where you can turn on and configure Adaptive Protection.\nOverride holds to reclaim disk space or permanently delete sensitive information\nLike Adaptive Protection, priority cleanup for files or mailbox items applies retention labels under the covers. The labels are automatically configured and applied when you create a priority cleanup policy that identifies items with a query that you specify. This policy can override existing holds for retention and eDiscovery.\nPriority cleanup is specific to data lifecycle management and isn't supported for labeled items that are marked as records. Typical uses depend on the workload:\nFor SharePoint and OneDrive, you can use priority cleanup to periodically delete large files, such as Teams meeting recordings and transcripts. These files are regularly created if you're using the popular Copilot in Teams recap feature, and might be automatically retained with a retention policy that retains all items for many years. However, these specific files typically have little business value after 1-3 months, and you want to regularly delete them to save disk space.
You can also use priority cleanup to delete files in the Preservation Hold library that can prevent OneDrive accounts from being deleted after a user leaves the organization.\nFor Exchange, you can use priority cleanup to delete items when this action is required for security or privacy. For example, after a data spillage incident.\nFor more information and configuration instructions, see the following articles:\nOverride holds to clean up files for Copilot and reclaim storage\nExpedite the permanent deletion of sensitive information from mailboxes\n.\nClassifying content without applying any actions\nAlthough the main purpose of retention labels is to retain or delete content, you can also use retention labels without turning on any retention or other actions. In this case, you can use a retention label simply as a text label, without enforcing any actions.\nFor example, you can create and apply a retention label named \"Review later\" with no actions, and then use that label to find that content later.\nUsing a retention label as a condition in a DLP policy\nYou can specify a retention label as a condition in a Microsoft Purview Data Loss Prevention (DLP) policy for documents in SharePoint. For example, configure a DLP policy to prevent documents from being shared outside the organization if they have a specified retention label applied to them.\nFor more information, see\nCreate and Deploy data loss prevention policies\n.\nRetention labels and policies that apply them\nWhen you publish retention labels, they're included in a\nretention label policy\nthat makes them available for admins and users to apply to content. As the following diagram shows:\nA single retention label can be included in multiple retention label policies.\nRetention label policies specify the locations to publish the retention labels.
A single label retention policy can include multiple locations.\nYou can also create one or more\nauto-apply retention label policies\n, each with a single retention label. With this policy, a retention label is automatically applied when conditions that you specify in the policy are met.\nRetention label policies and locations\nRetention labels can be published to different locations, depending on what the retention label does.\nIf the retention label is...\nThen the label policy can be applied to...\nPublished to admins and end users\nExchange, SharePoint, OneDrive, Microsoft 365 Groups\nAuto-applied based on sensitive information types, keywords or a query, or trainable classifiers\nExchange, SharePoint, OneDrive, Microsoft 365 Groups\nAuto-applied to cloud attachments\nSharePoint, OneDrive, Microsoft 365 Groups\nExchange public folders, Skype, Teams and Viva Engage messages don't support retention labels. To retain and delete content from these locations, use retention policies instead.\nOnly one retention label at a time\nUnlike\nsensitivity labels\n, you can't configure priorities for retention labels. Use the following information to understand label behavior for retention labels.\nAs with sensitivity labels, an item such as an email or document can have only a single retention label applied to it at a time. A retention label can be applied\nmanually\nby an end user or admin, or automatically by using any of the following methods:\nAuto-apply retention label policy\nA Microsoft Syntex model\nDefault retention label for SharePoint or Outlook\nOutlook rules\nPower Automate compliance action\nof\nApply a retention label on the item\nIf there are multiple auto-apply retention label policies that could apply a retention label, and the content meets the conditions of more than one of these policies, you can't control which retention label will be selected. 
However, in some cases, the retention label for the oldest auto-apply retention label policy (by date created) is selected. This happens only when the matching policies don't include multiple instances of the same type of condition (sensitive information types, specific keywords or searchable properties, or trainable classifiers).\nFor standard retention labels (they don't mark items as a\nrecord or regulatory record\n):\nAdmins and end users can manually change or remove an existing retention label that's applied on content.\nWhen items already have a retention label applied, the existing label won't be automatically removed or replaced by another retention label with the following exceptions:\nAt the end of the retention period, the existing label is configured to automatically\napply a different retention label\n, or the existing label is configured to\nrun a Power Automate flow\nwith the compliance action of\nRelabel an item at the end of retention\n.\nYou use the Power Automate compliance action of\nApply a retention label on the item\n. If the item already has a retention label applied, it will be replaced.\nThe existing label was applied as a default label. When you use a default label, there are some scenarios when it can be replaced by another default label, or automatically removed. For more information, see\nDefault labels for SharePoint and Outlook\n.\nFor retention labels that mark items as a record or a regulatory record:\nThese retention labels are never automatically changed during their configured retention period, even if the existing label was applied as a default label, or identified for\npriority cleanup\n.\nOnly admins for the container can manually change or remove retention labels that mark items as a record, but can't manually change or remove retention labels that mark items as a regulatory record. 
For more information, see\nCompare restrictions for what actions are allowed or blocked\n.\nAt the end of the retention period, an existing label can be replaced if it's configured to mark items as a record and automatically\napply a different retention label\nor to\nrun a Power Automate flow\nwith the compliance action of\nRelabel an item at the end of retention\n. You can't use these relabeling methods if the existing label is configured to mark items as a regulatory record.\nWill an existing label be overridden or removed?\nUse the following tables to help you quickly identify whether an existing retention label on items can be overridden by another retention label, or removed so that it's no longer labeled.\nNote\nUnless a labeled item is marked as a record or regulatory record, it can always be overridden by\npriority cleanup\nthat under the covers, applies a retention label to delete the item.\nA standard retention label refers to a retention label that isn't configured to mark items as records or regulatory records.\nWill a label be overridden?\nWill a label be removed?\nNew label application method\nStandard retention label\nMarks items as records\nMarks items as regulatory records\nManually applied\nYes\nYes\n1\nif admin for the container\nNo\nApplied with Power Automate actions\nYes\nYes\n1\nNot applicable\nApplied with the\nChange label\nlabel setting\nYes\nYes\nNot applicable\nApplied with the\nRelabel\ndisposition review action\nYes\nYes\nNo\nApplied with auto-apply retention label policy\nNo\nNo\nNot applicable\nApplied with Microsoft Syntex model\nNo\nNo\nNo\nOutlook rules\nNo\nNo\nNo\nInherited from default label for SharePoint\nYes if originally applied by another default label\n2\nNo\nNo\nInherited from default label for Outlook\nYes if originally applied by another default label\nNo\nNo\nFootnotes:\n1\nThe record must be\nlocked\n.\n2\nAn exception is if you move the item to another location with a different default label, then the original label 
isn't overwritten. Only if you then change the default label for this new location will the original default label be overwritten.\nNew labeling action\nStandard retention label\nMarks items as records\nMarks items as regulatory records\nManually remove\nYes\nYes\n1\nif admin for the container\nNo\nPower Automate relabel action\n- No label specified\nYes\nYes\n1\nNot applicable\nAuto-apply retention label policy\nNo\nNo\nNot applicable\nMicrosoft Syntex model\nNo\nNo\nNo\nOutlook rules\nNo\nNo\nNo\nAfter inherited from default label\n- Item then moved to location with a default label\nSharePoint: No\nOutlook: Yes\n2\nSharePoint: No\nOutlook: No\nSharePoint: No\nOutlook: No\nAfter inherited from default label\n- Default label then removed from the location\nSharePoint: No\nOutlook: Yes\nSharePoint: No\nOutlook: No\nSharePoint: No\nOutlook: No\nDelete label from the Microsoft Purview portal\nYes\n3\nNot applicable\nNot applicable\nFootnotes:\n1\nThe record must be\nlocked\n.\n2\nRolling out the end of June 2023, a\ndefault label isn't removed when it's moved to the\nDeleted Items\nfolder\n.\n3\nApplies only when the label can be deleted because it isn't included in any retention label policy, and isn't configured for event-based retention.\nMonitoring retention labels\nUse the\nMicrosoft Purview portal\nto monitor how retention labels are being used in your tenant, and identify where your labeled items are located:\nSign in to the Microsoft Purview portal\n>\nSolutions\n>\nData Lifecycle Management\n>\nOverview\nYou can then drill down to see details by selecting\nView details\nthat loads\ncontent explorer\nand\nactivity explorer\n.\nFor more information, including important prerequisites, see\nLearn about data classification\n.\nTip\nConsider using some of the other data classification insights, such as trainable classifiers and sensitive info types, to help you identify content that you might need to retain or delete, or manage as records.\nUsing Content Search to 
find all content with a specific retention label\nAfter retention labels are applied to content, either by users or auto-applied, you can use content search to find all items that have a specific retention label applied.\nWhen you create a content search, choose the\nRetention label\ncondition, and then enter the complete retention label name or part of the label name and use a wildcard. For more information, see\nKeyword queries and search conditions for Content Search\n.\nCompare capabilities for retention policies and retention labels\nUse the following table to help you identify whether to use a retention policy or retention label, based on capabilities.\nCapability\nRetention policy\nRetention label\nRetention settings that can retain and then delete, retain-only, or delete-only\nYes\nYes\nWorkloads supported:\n- Exchange\n- SharePoint\n- OneDrive\n- Microsoft 365 groups\n- Skype for Business\n- Teams\n- Copilot and AI apps\n- Viva Engage\nYes\nYes\nYes\nYes\nYes\nYes\nYes\nYes, except public folders\nYes\nYes\nYes\nNo\nNo\nNo\nRetention applied automatically\nYes\nYes\nAutomatically apply different retention settings at the end of the retention period\nNo\nYes\nRetention applied based on conditions\n- sensitive info types, KeyQL queries and keywords, trainable classifiers, cloud attachments\nNo\nYes\nRetention settings applied manually\nNo\nYes\nEnd-user interaction\nNo\nYes\nPersists if the content is moved\nNo\nYes, within your Microsoft 365 tenant\nDeclare item as a record\nNo\nYes\nStart the retention period when labeled or based on an event\nNo\nYes\nRun a Power Automate flow at the end of the retention period\nNo\nYes\nDisposition review\nNo\nYes\nProof of disposition for up to seven years\nNo\nYes, when you use disposition review or item is marked as a record\nAudit admin activities\nYes\nYes\nAudit retention actions\nNo\nYes\n*\nIdentify items subject to retention:\n- Content Search\n- Data classification page, content explorer, activity 
explorer\nNo\nNo\nYes\nYes\nFootnote:\n*\nFor retention labels that don't mark the content as a record or regulatory record, auditing events are limited to when an item in SharePoint or OneDrive has a label applied, changed, or removed. Or, when a retention label is used with\npriority-cleanup\n. For auditing details for retention labels, see the\nAuditing retention actions\nsection on this page.\nCombining retention policies and retention labels\nYou don't have to choose between using retention policies only or retention labels only. Both methods can be used together and, in fact, complement each other for a more comprehensive solution.\nThe following examples are just some of the ways in which you can combine retention policies and retention labels for the same location.\nFor more information about how retention policies and retention labels work together and how to determine their combined outcome, see the section on this page that explains the\nprinciples of retention and what takes precedence\n.\nExample for users to override automatic deletion\nScenario: By default, content in users' OneDrive accounts is automatically deleted after five years but users must have the option to override this for specific documents.\nYou create and configure a retention policy that automatically deletes content five years after it's last modified, and apply the policy to all OneDrive accounts.\nYou create and configure a retention label that keeps content forever and add this to a label policy that you publish to all OneDrive accounts.
You explain to users how to manually apply this label to specific documents that should be excluded from automatic deletion if not modified after five years.\nExample to retain items for longer\nScenario: By default, SharePoint items are automatically retained and then deleted after five years, but documents in specific libraries must be retained for 10 years.\nYou create and configure a retention policy that automatically retains and then deletes content after five years, and apply the policy to all SharePoint and Microsoft 365 Groups instances.\nYou create and configure a retention label that automatically retains content for 10 years. You add this label to a label policy that you publish to all SharePoint and Microsoft 365 Groups instances so that SharePoint admins can then apply it as a default label to be inherited by all items in specific document libraries.\nExample to delete items in a shorter time period\nScenario: By default, emails aren't retained but are automatically deleted after 10 years. However, emails related to a specific project that has a prerelease code name must be automatically deleted after one year.\nYou create and configure a retention policy that automatically deletes content after 10 years, and apply the policy to all Exchange recipients.\nYou create and configure a retention label that automatically deletes content after one year. 
Options for applying this label to relevant emails include:\nYou create an auto-labeling policy that identifies content by using the project code name as the keyword, and apply the policy to all Exchange recipients\nYou publish the label and instruct users involved in the project how to create an automatic rule in Outlook that applies this label\nYou publish the label and instruct users to create a folder in Outlook for all emails related to the project and they apply the published label to the folder, and then create an Outlook rule to move all project-related emails to this folder\nHow long it takes for retention settings to apply\nWhen you submit retention policies for workloads and label policies to automatically apply a retention label, allow up to seven days for the retention settings to be applied to content:\nHow long it takes for retention policies to take effect\nHow long it takes for retention labels to take effect\nSimilarly, allow up to seven days for retention labels to be visible in apps after you publish the labels:\nWhen retention labels become available to apply\nOften, the policies take effect and labels are visible quicker than seven days. But with many potential variables that can impact this process, it's best to plan for the maximum of seven days.\nAdaptive or static policy scopes for retention\nWhen you create a retention policy or retention label policy, you must choose between adaptive and static to define the scope of the policy.\nAn\nadaptive scope\nuses a query that you specify, so the membership isn't static but dynamic by running daily against the attributes or properties that you specify for the selected locations. You can use multiple adaptive scopes with a single policy.\nExample: Emails and OneDrive documents for executives require a longer retention period than standard users. 
You create a retention policy with an adaptive scope that uses the Microsoft Entra attribute job title of \"Executive,\" and then select the Exchange email and OneDrive accounts locations for the policy. There's no need to specify email addresses or OneDrive URLs for these users because the adaptive scope automatically retrieves these values. For new executives, there's no need to reconfigure the retention policy because these new users with their corresponding values for email and OneDrive are automatically picked up.\nA\nstatic scope\ndoesn't use queries and is limited in configuration in that it can apply to all instances for a specified location, or use inclusion and exclusions for specific instances for that location. These three choices are sometimes referred to as \"org-wide,\" \"includes,\" and \"excludes\" respectively.\nExample: Emails and OneDrive documents for executives require a longer retention period than standard users. You create a retention policy with a static scope that selects the Exchange email and OneDrive accounts locations for the policy. For the Exchange email location, you're able to identify a group that contains just the executives, so you specify this group for the retention policy, and the group membership with the respective email addresses is retrieved when the policy is created. For the OneDrive accounts location, you must identify and then specify individual OneDrive URLs for each executive. For new executives, you must reconfigure the retention policy to add the new email addresses and OneDrive URLs. You must also update the OneDrive URLs anytime there is a change in an executive's UPN.\nOneDrive URLs are particularly challenging to reliably specify because by default, these URLs aren't created until the user accesses their OneDrive for the first time. 
And if a user's UPN changes, which you might not know about, their OneDrive URL automatically changes.\nAdvantages of using adaptive scopes over static scopes:\nNo limits on the\nnumber of items per policy\n. Although adaptive policies are still subject to the\nmaximum number of policies per tenant\nlimitations, the more flexible configuration will likely result in far fewer policies.\nYou can apply specific retention settings to just inactive mailboxes. This configuration isn't possible with a static scope because at the time the policy is assigned, static scopes don't support the specific inclusion of recipients with inactive mailboxes.\nFor more advantages of using adaptive scopes, see\nAdaptive scopes\n.\nAdvantages of using static scopes over adaptive scopes:\nSimpler configuration if you want all instances automatically selected for a workload.\nFor \"includes\" and \"excludes,\" this choice can be a simpler configuration initially if the numbers of instances that you have to specify are low and don't change. However, when these number of instances start to increase and you have frequent changes in your organization that require you to reconfigure your policies, adaptive scopes can be simpler to configure and easier to maintain.\nThe\nSkype for Business\nand\nExchange public folders\nlocations don't support adaptive scopes. For those locations, you must use a static scope.\nFor configuration information, see\nConfiguring adaptive scopes\n.\nCurrently, adaptive scopes don't support\nPreservation Lock to restrict changes to retention policies and retention label policies\n.\nPolicy lookup\nYou can configure multiple retention policies for Microsoft 365 locations, as well as multiple retention label policies that you publish or auto-apply. 
To find the policies for retention that are assigned to specific users, sites, and Microsoft 365 groups, use\nPolicy lookup\nfrom the\nData lifecycle management\nor\nRecords management\nsolutions in the Microsoft Purview portal.\nFor example, from the Microsoft Purview portal:\nYou must specify the exact email address for a user, exact URL for a site, or exact email address for a Microsoft 365 group. You can't use wildcards, or partial matches, for example.\nThe option for sites includes OneDrive accounts. For information how to specify the URL for a user's OneDrive account, see\nGet a list of all user OneDrive URLs in your organization\n.\nThe principles of retention, or what takes precedence?\nUnlike retention labels, you can apply more than one retention policy to the same content. Each retention policy can result in a retain action and a delete action. Additionally, that item could also be subject to these actions from a retention label.\nIn this scenario, when items can be subject to multiple retention settings that could conflict with one another, what takes precedence to determine the outcome?\nThe outcome isn't which single retention policy or single retention label wins, but how long an item is retained (if applicable) and when an item is deleted (if applicable). These two actions are calculated independently from each other, from all the retention settings applied to an item.\nFor example, an item might be subject to one retention policy that is configured for a delete-only action, and another retention policy that is configured to retain and then delete. Consequently, this item has just one retain action but two delete actions. The retention and deletion actions could be in conflict with one another and the two deletion actions might have a conflicting date. The principles of retention explain the outcome.\nBy default, retention always takes precedence over permanent deletion, and the longest retention period wins. 
These two simple rules always decide how long an item is retained unless it's subject to\npriority cleanup\nthat might be needed to expedite permanent deletion for exceptional circumstances.\nIf you're not using priority cleanup, there are a few more factors that determine when an item is permanently deleted, which include the delete action from a retention label always takes precedence over the delete action from a retention policy.\nUse the following flow to understand the retention and deletion outcomes for a single item that isn't subject to priority cleanup, where each level acts as a tie-breaker for conflicts, from top to bottom. If the outcome is determined by the first level because there are no further conflicts, there's no need to progress to the next level, and so on.\nImportant\nIf you are using retention labels: Before applying the principles to determine the outcome of multiple retention settings on the same item, make sure you know\nwhich retention label is applied\n.\nBefore explaining each principle in more detail, it's important to understand the difference between the retention period for the item vs. the specified retention period in the retention policy or retention label. That's because although the default configuration is to start the retention period when an item is created, so that the end of the retention period is fixed for the item, files also support the configuration to start the retention period from when the file is last modified. With this alternative configuration, every time the file is modified, the start of the retention period is reset, which extends the end of the retention period for the item. 
Retention labels also support starting the retention period when labeled and at the start of an event.\nTo apply the principles in action with a series of Yes and No questions, you can also use the\nretention flowchart\n.\nExplanation for the four different principles:\nRetention wins over deletion.\nContent won't be permanently deleted when it also has retention settings to retain it. While this principle ensures that content is preserved for compliance reasons, the delete process can still be initiated (user-initiated or system-initiated) and consequently, might remove the content from users' main view. However, permanent deletion is suspended. For more information about how and where content is retained, use the following links for each workload:\nHow retention works for SharePoint and OneDrive\nHow retention works with Microsoft Teams\nHow retention works with Viva Engage\nHow retention works for Exchange\nHow retention works with Copilot & AI apps\nExample for this first principle\n: An email message is subject to a retention policy for Exchange that is configured to delete items three years after they are created, and it also has a retention label applied that is configured to retain items five years after they are created.\nThe email message is retained for five years because this retention action takes precedence over deletion. 
The email message is permanently deleted at the end of the five years because of the delete action that was suspended while the retention action was in effect.\nThe longest retention period wins.\nIf content is subject to multiple retention settings that retain content for different periods of time, the content is retained until the end of the longest retention period for the item.\nNote\nIt's possible for a retention period of five years in a retention policy or label to win over a retention period of seven years in a retention policy or label, because the five-year period is configured to start based on when the file is last modified, and the seven-year period is configured to start from when the file is created.\nExample for this second principle\n: Documents in the Marketing SharePoint site are subject to two retention policies. The first retention policy is configured for all SharePoint sites to retain items for five years after they are created. The second retention policy is configured for specific SharePoint sites to retain items for 10 years after they are created.\nDocuments in this Marketing SharePoint site are retained for 10 years because that's the longest retention period for the item.\nExplicit wins over implicit for deletions.\nWith conflicts now resolved for retention, only conflicts for deletions remain:\nA retention label (however it was applied) provides explicit retention in comparison with retention policies, because the retention settings are applied to an individual item rather than implicitly assigned from a container. 
This means that a delete action from a retention label always takes precedence over a delete action from any retention policy.\nExample 1 for this third principle (label)\n: A document is subject to two retention policies that have a delete action of five years and 10 years respectively, and also a retention label that has a delete action of seven years.\nThe document is permanently deleted after seven years because the delete action from the retention label takes precedence.\nExample 2 for this third principle (label)\n: A document in the Marketing SharePoint site is subject to both a retention policy and a retention label. The retention policy is configured to retain items in all SharePoint sites for 10 years after creation, while the retention label specifies a delete action after seven years.\nIn this case, the document is retained for 10 years, which is the longest retention period. Because the retention label’s delete action takes precedence, the document is permanently deleted after 10 years. The item is not moved to the Preservation Hold Library (PHL).\nExample 3 for this third principle (label)\n: An email message is subject to one retention policy and a retention label. The retention policy is configured for all Exchange mailboxes to retain items for 10 years after they are created, and the retention label specifies a delete action after seven years.\nThe email message is retained for 10 years, which is the longest retention period. After seven years, the retention label causes the item to be moved to the Recoverable Items folder. 
The item remains there until the 10 year retention period expires, after which it is permanently deleted.\nWhen you have retention policies only: If a retention policy for a location uses an adaptive scope or a static scope that includes specific instances (such as specific users for Exchange email) that retention policy takes precedence over a static scope that is configured for all instances for the same location.\nA static scope that is configured for all instances for a location is sometimes referred to as an \"org-wide policy\". For example,\nExchange mailboxes\nand the default setting of\nAll mailboxes\n. Or,\nSharePoint classic and communication sites\nand the default setting of\nAll sites\n. When retention policies aren't org-wide but have been configured with an adaptive scope or a static scope that includes specific instances, they have equal precedence at this level.\nExample 1 for this third principle (policies)\n: An email message is subject to two retention policies. The first retention policy is unscoped and deletes items after 10 years. The second retention policy is scoped to specific mailboxes and deletes items after five years.\nThe email message is permanently deleted after five years because the deletion action from the scoped retention policy takes precedence over the org-wide retention policy.\nExample 2 for this third principle (policies)\n: A document in a user's OneDrive account is subject to two retention policies. The first retention policy is scoped to include this user's OneDrive account and has a delete action after 10 years. 
The second retention policy is scoped to include this user's OneDrive account and has a delete action after seven years.\nWhen this document will be permanently deleted can't be determined at this level because both retention policies are scoped to include specific instances.\nThe shortest deletion period wins.\nApplicable to determine when items will be deleted from retention policies and the outcome couldn't be resolved from the previous level: Content is permanently deleted at the end of the shortest retention period for the item.\nNote\nIt's possible that a retention policy that has a retention period of seven years wins over a retention policy of five years because the first policy is configured to start the retention period based on when the file is created, and the second retention policy from when the file is last modified.\nExample for this fourth principle\n: A document in a user's OneDrive account is subject to two retention policies. The first retention policy is scoped to include this user's OneDrive account and has a delete action of 10 years after the file is created. The second retention policy is scoped to include this user's OneDrive account and has a delete action of seven years after the file is created.\nThis document will be permanently deleted after seven years because that's the shortest retention period for the item from these two scoped retention policies.\nItems subject to eDiscovery hold also fall under the first principle of retention; they cannot be permanently deleted by any retention policy or retention label. When that hold is released, the principles of retention continue to apply to them. For example, they could then be subject to an unexpired retention period or a delete action.\nPrinciples of retention examples that combine retain and delete actions\nThe following examples are more complex to illustrate the principles of retention when different retain and delete actions are combined. 
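Before walking through the combined examples, the precedence rules above can be sketched as a small decision procedure. This is a simplified illustration only: the dictionary fields (`kind`, `scope`, `retain`, `delete`) are assumptions made for the sketch, not Microsoft Purview's data model, and periods are expressed as year offsets from a common start date.

```python
def retention_outcome(settings):
    """Simplified model of the principles of retention for one item.

    Each setting is a dict: kind ("label" or "policy"), scope
    ("org-wide", "scoped", or None for labels), and retain/delete
    periods as year offsets from item creation (or None).
    Returns (retained_until, deleted_at) in the same year offsets.
    """
    # Principles 1 and 2: retention wins over deletion, and the
    # longest retention period wins.
    retains = [s["retain"] for s in settings if s["retain"] is not None]
    retained_until = max(retains, default=None)

    # Principle 3: explicit wins over implicit -- a label's delete
    # action beats any policy's, and a scoped policy's beats an
    # org-wide policy's. Principle 4: within the winning tier, the
    # shortest deletion period wins.
    tiers = [
        [s["delete"] for s in settings
         if s["kind"] == "label" and s["delete"] is not None],
        [s["delete"] for s in settings
         if s["kind"] == "policy" and s["scope"] == "scoped"
         and s["delete"] is not None],
        [s["delete"] for s in settings
         if s["kind"] == "policy" and s["scope"] == "org-wide"
         and s["delete"] is not None],
    ]
    deleted_at = next((min(t) for t in tiers if t), None)

    # Principle 1 again: permanent deletion is suspended until the
    # longest retention period has expired.
    if retained_until is not None and deleted_at is not None:
        deleted_at = max(deleted_at, retained_until)
    return retained_until, deleted_at

# Example 1 for the third principle (label): policies delete at 5 and
# 10 years, a label deletes at 7 years -- the label's delete wins.
assert retention_outcome([
    {"kind": "policy", "scope": "org-wide", "retain": None, "delete": 5},
    {"kind": "policy", "scope": "org-wide", "retain": None, "delete": 10},
    {"kind": "label", "scope": None, "retain": None, "delete": 7},
]) == (None, 7)
```

The model assumes every period shares the item's creation date as its start; as noted earlier, last-modified and event-based starts shift the end dates and must be resolved to concrete dates before comparing.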
To make the examples easier to follow, all retention policies and labels use the default setting of starting the retention period when the item is created so the end of the retention period is the same for the item.\nAn item has the following retention settings applied to it:\nA retention policy for delete-only after five years\nA retention policy that retains for three years and then deletes\nA retention label that retains-only for seven years\nOutcome\n: The item is retained for seven years because retention takes precedence over deletion and seven years is the longest retention period for the item. At the end of this retention period, the item is permanently deleted because of the delete action from the retention policies.\nAlthough the two retention policies have different dates for the delete actions, the earliest that the item can be permanently deleted is at the end of the longest retention period, which is longer than both deletion dates.\nAn item has the following retention settings applied to it:\nAn org-wide retention policy that deletes-only after 10 years\nA retention policy scoped with specific instances that retains for five years and then deletes\nA retention label that retains for three years and then deletes\nOutcome\n: The item is retained for five years because that's the longest retention period for the item. At the end of that retention period, the item is permanently deleted because of the delete action of three years from the retention label. Deletion from retention labels takes precedence over deletion from all retention policies. 
In this example, all conflicts are resolved by the third level.\nUse Preservation Lock to restrict changes to policies\nSome organizations might need to comply with rules defined by regulatory bodies such as the Securities and Exchange Commission (SEC) Rule 17a-4, which requires that after a policy for retention is turned on, it cannot be turned off or made less restrictive.\nPreservation Lock ensures your organization can meet such regulatory requirements because it locks a retention policy or retention label policy so that no one—including an administrator—can turn off the policy, delete the policy, or make it less restrictive.\nYou apply Preservation Lock after the retention policy or retention label policy is created. For more information and instructions, see\nUse Preservation Lock to restrict changes to retention policies and retention label policies\n.\nReleasing a policy for retention\nProviding your policies for retention don't have a Preservation Lock, you can delete your policies at any time, which effectively turns off the retention settings for a retention policy, and retention labels can no longer be applied from retention label policies. Any previously applied retention labels remain with their configured retention settings and for these labels, you can still update the retention period when it's not based on when items were labeled.\nYou can also keep a policy, but change the location status to off, or disable the policy. Another option is to reconfigure the policy so it no longer includes specific users, sites, groups, and so on.\nAdditional information for specific locations:\nSharePoint sites and OneDrive accounts:\nWhen you release a retention policy for SharePoint sites and OneDrive accounts, any content that's subject to retention from the policy continues to be retained for 30 days to prevent inadvertent data loss. 
During this 30-day grace period deleted files are still retained (files continue to be added to the Preservation Hold library), but the timer job that periodically cleans up the Preservation Hold library is suspended for these files so you can restore them if necessary.\nAn exception to this 30-day grace period is when you update the policy to exclude one or more sites for SharePoint or accounts for OneDrive; in this case, the timer job deletes files for these locations in the Preservation Hold library without the 30-day delay.\nFor more information about the Preservation Hold library, see\nHow retention works for SharePoint and OneDrive\n.\nBecause of the behavior during the grace period, if you re-enable the policy or change the location status back to on within 30 days, the policy resumes without any permanent data loss during this time.\nExchange email and Microsoft 365 Groups\nWhen you release a retention policy for mailboxes that are\ninactive\nat the time the policy is released:\nIf the retention policy is explicitly applied to a mailbox, the retention settings no longer apply. With no retention settings applied, an inactive mailbox becomes eligible for automatic deletion in the usual way.\nAn explicit retention policy requires either an adaptive policy scope, or a static policy scope with an include configuration that specified an active mailbox at the time the policy was applied and later became inactive\nIf the retention policy is implicitly applied to a mailbox and the configured retention action is to retain, the retention policy continues to apply and an inactive mailbox never becomes eligible for automatic deletion. 
When the retain action no longer applies because the retention period has expired, the Exchange admin can now\nmanually delete the inactive mailbox\nAn implicit retention policy requires a static policy scope with the\nAll mailboxes\n(for Exchange email) or\nAll groups\n(for Microsoft 365 Groups) configuration.\nFor more information about inactive mailboxes that have retention policies applied, see\nInactive mailboxes and Microsoft 365 retention\n.\nAuditing retention configuration and actions\nWhen\nauditing is enabled\n, auditing events for retention are supported for both administration configuration (retention policies and retention labels) and retention actions (retention labels only).\nAuditing retention configuration\nAdministrator configuration for retention policies and retention labels is logged as auditing events when a retention policy or label is created, reconfigured, or deleted.\nFor the full list of auditing events, see\nRetention policy and retention label activities\n.\nAuditing retention actions\nRetention actions that are logged as auditing events are available only for retention labels and not for retention policies:\nSpecific to\npriority cleanup\n, use the\nCleanup ID\nassigned to the priority cleanup policy as a keyword search term in the auditing log to find\nrelated activities\n.\nSpecific to\nAdaptive Protection when a retention label is applied to an item\n:\nFor SharePoint and OneDrive, from\nRetention policy and retention label activities\n, select\nRetained file proactively\nFor Exchange, from\nRetention policy and retention label activities\n, select\nRetained email item proactively\nWhen a retention label is applied, changed, or removed from an item in SharePoint or OneDrive:\nFrom\nFile and page activities\n, select\nChanged retention label for a file\nWhen a labeled item in SharePoint is marked as a record, and it is unlocked or locked by a user:\nFrom\nFile and page activities\n, select\nChanged record status to 
unlocked\nand\nChanged record status to locked\nWhen a retention label that marks content as a record or regulatory record is applied to an item in Exchange:\nFrom\nExchange mailbox activities\n, select\nLabeled message as a record\nWhen a labeled item in SharePoint, OneDrive, or Exchange is marked as a record or regulatory record, and it is permanently deleted:\nFrom\nFile and page activities\n, select\nDeleted file marked as a record\nWhen a disposition reviewer takes action for an item that's reached the end of its retention period:\nFrom\nDisposition review activities\n, select\nApproved disposal\n,\nExtended retention period\n,\nRelabeled item\n, or\nAdded reviewers\nPowerShell cmdlets for retention policies and retention labels\nUse\nSecurity & Compliance PowerShell\nfor Purview retention cmdlets that support configuration at scale, scripting for automation, or might be necessary for advanced configuration scenarios.\nFor a list of available cmdlets, and to identify which ones are supported for the different locations, see\nPowerShell cmdlets for retention policies and retention labels\n.\nWhen to use retention policies and retention labels or eDiscovery holds\nAlthough retention settings and\nholds that you create with an eDiscovery case\ncan both prevent data from being permanently deleted, they are designed for different scenarios. To help you understand the differences and decide which to use, use the following guidance:\nRetention settings that you specify in retention policies and retention labels are designed for a long-term data lifecycle management strategy to retain or delete data for compliance requirements. The scope is usually broad with the main focus being the location and content rather than individual users. 
The start and end of the retention period is configurable, with the option to automatically delete content without additional administrator intervention.\nHolds for eDiscovery cases are designed for a limited duration to preserve data for a legal investigation. The scope is specific with the focus being content owned by identified users. The start and end of the preservation period isn't configurable but dependent on individual administrator actions, without an option to automatically delete content when the hold is released.\nSummary to compare retention with holds:\nConsideration\nRetention\neDiscovery holds\nBusiness need:\nCompliance\nLegal\nTime scope:\nLong-term\nShort-term\nFocus:\nBroad, content-based\nSpecific, user-based\nStart and end date configurable:\nYes\nNo\nContent deletion:\nYes (optional)\nNo\nAdministrative overheads:\nLow\nHigh\nIf content is subject to both retention settings and an eDiscovery hold, preserving content for the eDiscovery hold always takes precedence. In this way, the\nprinciples of retention\nexpand to eDiscovery holds because they preserve data until an administrator manually releases the hold. However, despite this precedence, don't use eDiscovery holds for long-term data lifecycle management. 
If you are concerned about automatic deletion of data, you can configure retention settings to retain items forever, or use\ndisposition review\nwith retention labels.\nIf you are using older eDiscovery tools to preserve data, see the following resources:\nExchange:\nIn-Place Hold and Litigation Hold\nHow to identify the type of hold placed on an Exchange Online mailbox\nSharePoint and OneDrive:\nAdd content to a case and place sources on hold in the eDiscovery Center\nRetirement of legacy eDiscovery tools\nUse retention policies and retention labels instead of older features\nIf you need to retain or delete content in Microsoft 365 for data lifecycle management, we recommend you use Microsoft 365 retention policies and retention labels instead of the following older features.\nIf you currently use these older features, they'll usually work side by side with Microsoft 365 retention policies and retention labels. Check their specific documentation for any restrictions. However, we recommend that going forward, you use Microsoft 365 retention policies and retention labels to benefit from a single solution to manage both retention and deletion of content across multiple workloads in Microsoft 365.\nTip\nFor SharePoint, see the following resources:\nUse Microsoft Purview risk and compliance solutions instead of the older information management and records management features in SharePoint for Microsoft 365\nMigration strategies for moving to Microsoft Purview risk and compliance solutions\nOlder features from Exchange Online:\nRetention tags and retention policies\n, also known as\nmessaging records management (MRM)\n(deletion only)\nHowever, if you use the following MRM features, be aware that they aren't currently supported by Microsoft 365 retention policies:\nAn archive policy for\narchive mailboxes\nto automatically move emails from a user's primary mailbox to their archive mailbox after a specified period of time. 
An archive policy (with any settings) can be used in conjunction with a Microsoft 365 retention policy that applies to a user's primary and archive mailbox.\nRetention policies applied by an admin to specific folders within a mailbox. A Microsoft 365 retention policy applies to all folders in the mailbox. However, an admin can configure different retention settings by using retention labels that a user can apply to folders in Outlook as a\ndefault retention label\n.\nJournaling\n(retention and archive)\nMight be required to integrate with third-party solutions and copies of email messages and their data communication are stored outside Exchange Online. Because you're moving data outside Microsoft 365, you must take extra precautions to secure it and also resolve any duplications that might result from this solution. It is your responsibility to monitor and follow up on any nondelivery receipts to the journaling mailbox that can occur because of external and dependent services. You don't have these additional administrative overheads when you use Microsoft 365 retention and other Microsoft Purview compliance solutions that also aren't limited to just email messages.\nLitigation hold\n(retention only)\nAlthough Litigation holds are still supported, we recommend you use Microsoft 365 retention or eDiscovery holds,\nas appropriate\n.\nOlder features from SharePoint and OneDrive:\nDocument deletion policies\n(deletion only)\nConfiguring in place records management\n(retention only)\nUse policies for site closure and deletion\n(deletion only)\nInformation management policies\n(deletion only)\nIf you have configured SharePoint sites for content type policies or information management policies to retain content for a list or library, those policies are ignored while a retention policy or retention label policy is in effect.\nRelated information\nSharePoint Online Limits\nLimits and specifications for Microsoft Teams\nResources to help you meet regulatory requirements for 
data lifecycle management and records management\nConfiguration guidance\nSee\nGet started with data lifecycle management\n. This article has information about subscriptions, permissions, and links to end-to-end configuration guidance for retention scenarios.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Data Retention", @@ -836,7 +836,7 @@ "https://learn.microsoft.com/en-us/purview/create-retention-labels-data-lifecycle-management": { "content_hash": "sha256:7caad06f5e9d23ca9f3ce1b3637e52fc72ec32e3f3fc2c9dfb6c2cd0a02bf2d2", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nCreate retention labels for exceptions to your retention policies\nFeedback\nSummarize this article for me\nMicrosoft Purview service description\nAs part of your data governance strategy to retain what you need and delete what you don't, you might need to create a few retention labels for items that need exceptions to your retention policies.\nWhereas retention policies automatically apply to all items at the container level (such as SharePoint sites, user mailboxes, and so on), retention labels apply to individual items, such as a SharePoint document or an email message.\nMake sure you understand the\nprinciples of retention\nbefore you use retention labels to supplement a retention policy for specific SharePoint, OneDrive, or Exchange items. Typically, you'll use retention labels to retain specific items longer than an applied retention policy, but they can also be used to override automatic deletion at the end of the retention period, or apply a different deletion period.\nAs a typical example: The majority of content on your SharePoint sites needs to be retained for three years, which is covered with a retention policy. But you have some contract documents that must be retained for seven years. These exceptions can be addressed with retention labels. After assigning the retention policy to all SharePoint sites, you apply the retention labels to the contract documents. All SharePoint items will be retained for three years, and just the contract documents will be retained for seven years.\nFor more examples of how retention labels can be used as exceptions to retention policies, see\nCombining retention policies and retention labels\n.\nRetention labels also support more capabilities than retention policies. 
For more information, see\nCompare capabilities for retention policies and retention labels\n.\nUse the following information to help you create retention labels to supplement retention policies as part of your data lifecycle management strategy.\nNote\nCreate retention labels from the\nRecords management\nsolution rather than\nData lifecycle management\nif you need to use retention labels to manage high-value items for business, legal, or regulatory record-keeping requirements. For example, you want to use event-based retention or disposition review. For instructions, see\nUse file plan to create and manage retention labels\n.\nBefore you begin\nTo make sure you have permissions to create and edit retention labels and their policies, see\nPermissions for retention policies and retention labels\n.\nHow to create retention labels for data lifecycle management\nSign in to the Microsoft Purview portal\n>\nSolutions\n>\nData Lifecycle Management\n>\nRetention labels\n.\nSelect\nCreate a label\nand follow the prompts to create the retention label. 
Be careful what name you choose, because this can't be changed after the label is saved.\nFor more information about the retention settings, see\nSettings for retaining and deleting content\n.\nAfter you have created the label and you see the options to publish the label, auto-apply the label, or just save the label: Select\nJust save the label for now\n, and then select\nDone\n.\nRepeat these steps to create any more retention labels that you need for different retention settings.\nTo edit an existing label, select it, and then select the\nEdit label\noption to start the Edit retention label configuration that lets you change the label descriptions and any eligible settings.\nMost settings can't be changed after the label is created and saved, which include the label name, and the retention settings except the retention period.\nNext steps\nNow you've created retention labels, they are ready to be added to items by publishing the labels, or automatically applying them:\nPublish retention labels and apply them in apps\nApply a retention label to content automatically\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Retention Labels", @@ -845,7 +845,7 @@ "https://learn.microsoft.com/en-us/purview/retention-policies-sharepoint": { "content_hash": "sha256:6978743bd07c4db082ceb5a92093d2962c606b917204201bd55dd9965808530d", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. 
You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nLearn about retention for SharePoint and OneDrive\nFeedback\nSummarize this article for me\nMicrosoft Purview service description\nThe information in this article supplements\nLearn about retention\nbecause it has information that's specific to SharePoint and OneDrive.\nFor other workloads, see:\nLearn about retention for Microsoft Teams\nLearn about retention for Viva Engage\nLearn about retention for Exchange\nLearn about retention for Copilot\nWhat's included for retention and deletion\nNote\nIn preview,\nMicrosoft Facilitator AI-generated notes in meetings\nare supported by retention policies and retention labels.\nAll files stored in SharePoint (and SharePoint Embedded) or OneDrive sites can be retained by applying a retention policy or retention label. For SharePoint,\narchived sites\nare supported in addition to active sites.\nThe following files can be deleted:\nWhen you use a retention policy: All files in document libraries, which include any automatically created SharePoint document libraries, such as\nSite Assets\n.\nWhen you use retention labels: All files in all document libraries, and all files at the root level that aren't in a folder.\nTip\nWhen you use a\nquery with an auto-apply policy for a retention label\n, you can exclude specific document libraries by using the following entry:\nNOT(DocumentLink:\"\")\nFiles that can be retained and deleted include those used by Microsoft Loop and Copilot Pages. For these apps, content can be stored in SharePoint Embedded containers. Although the containers aren't SharePoint sites, for the purposes of retention and deletion, they behave the same as SharePoint sites. 
For more information about where the files are stored, see\nOverview of Loop storage\n.\nList items aren't supported by retention policies but are supported by retention labels with the exception of items in system lists. These are hidden lists used by SharePoint to manage the system and include the master page catalog, solution catalog, and data sources. When retention labels are applied to supported list items, they will always be retained according to the retention settings, but not deleted if they are hidden from search.\nWhen you apply a retention label to a supported list item that has a document attachment:\nFor a standard retention label (doesn't declare the item to be a record):\nThe document attachment doesn't automatically inherit the retention settings of the label, but can be labeled independently.\nFor a retention label that declares the item a record:\nThe document attachment automatically inherits the retention settings from the label if the document isn't already labeled.\nRetention settings from both retention policies and retention labels don't apply to organizing structures that include libraries, lists, folders, and Loop workspaces.\nFor retention policies and label policies: SharePoint sites must be indexed for the retention settings to be applied. However, if items in SharePoint document libraries are configured to not appear in search results, this configuration doesn't exclude files from the retention settings.\nIf a site is configured with the\nSet-SPOSite\nparameter\nLockState\nthat's set to NoAccess or ReadOnly, items in that site can't be deleted with this site configuration.\nHow retention works for SharePoint and OneDrive\nTo store content that needs to be retained, SharePoint and OneDrive create a Preservation Hold library if one doesn't exist for the site. 
The Preservation Hold library is a hidden system location that isn't designed to be used interactively but instead, automatically stores files when this is needed for compliance reasons. It's not supported to edit, delete, or move these automatically retained files yourself. Or, change or remove retention labels and sensitivity labels for these files. Instead, use compliance tools, such as those supported by\neDiscovery\nto access the content in these files.\nThe Preservation Hold library works in the following way to support retention policies and retention labels:\nWhen a user changes an item that's subject to retention from a retention policy or a retention label that marks items as a record, or deletes any item subject to retention, the original content is copied to the Preservation Hold library. This behavior lets the user change or delete the content in their app, while keeping a copy of the original for compliance reasons.\nRetention item\nUser edit or delete action\nCopy created in the Preservation Hold library\nRetention policy\nEdit item\nDelete item\nYes\n*\nYes\nStandard retention label (doesn’t mark items as records or regulatory records)\nEdit item\nDelete item\nNo\nYes\nRetention label that marks items as records\nEdit unlocked item\nEdit locked item\nDelete item\nYes\nNot applicable –\naction blocked\nNot applicable –\naction blocked\nRetention label that marks items as regulatory records\nEdit item\nDelete item\nNot applicable –\naction blocked\nNot applicable –\naction blocked\nFootnote\n:\n*\nNew content isn't copied to the Preservation Hold library the first time it's edited. To retain all versions of a file,\nversioning\nmust be turned on for the site.\nThis behavior for copying files into the Preservation Hold library applies to content that exists when the retention settings were applied. 
In addition, for retention policies, any new content that's created or added to the site after it was included in the policy will be retained in the Preservation Hold library.\nA timer job periodically runs on the Preservation Hold library. For content that has been in the Preservation Hold library for more than 30 days, this job compares the content to all queries used by the retention settings for that content. Content that is older than its configured retention period and isn't awaiting\ndisposition review\nis then deleted from the Preservation Hold library, and from the original location if it's still there. This timer job runs every seven days, which means that together with the minimum 30 days, it can take up to 37 days for content to be deleted from the Preservation Hold library.\nUsers see an error message if they try to delete a library, list, or site that's subject to retention.\nUsers also see an error message if they try to delete a labeled item in any of the following circumstances. The item isn't copied to the Preservation Hold library but remains in the original location:\nThe records management setting that allows users to delete labeled items is turned off.\nTo check or change this setting, go to the records management settings in the Microsoft Purview portal:\nSettings\n>\nSolution settings\n>\nRecords Management\n>\nRetention Labels\n>\nDeleting content labeled for retention\n. 
There are separate settings for OneDrive and SharePoint.\nAlternatively, if you don't have access to these settings in the portals, you can use\nAllowFilesWithKeepLabelToBeDeletedSPO\nand\nAllowFilesWithKeepLabelToBeDeletedODB\nfrom\nGet-PnPTenant\nand\nSet-PnPTenant\n.\nThe retention label marks items as a record and it's\nlocked\n.\nOnly when the record is unlocked does a copy of the last version get stored in the Preservation Hold library.\nThe retention label marks items as a\nregulatory record\n, which always prevents the item from being edited or deleted.\nAfter retention settings are assigned to content in a OneDrive account, SharePoint site, or SharePoint Embedded container for a Loop workspace, the paths the content takes depend on whether the retention settings are to retain and delete, to retain only, or delete only. In the explanations that follow, modified content is moved to the Preservation Hold library for retention policies, and retention labels that mark items as records (and the content is unlocked). Items that are modified with retention labels that don't mark items as records don't create copies in the Preservation Hold library, but do when items are deleted.\nWhen the retention settings are to retain and delete:\nIf the content is modified or deleted\nduring the retention period, a copy of the original content as it existed when the retention settings were assigned is created in the Preservation Hold library. There, the timer job identifies items whose retention period has expired. Those items are moved to the second-stage Recycle Bin, where they're permanently deleted at the end of 93 days. The second-stage Recycle Bin isn't visible to end users (only the first-stage Recycle Bin is), but site collection admins can view and restore content from there.\nNote\nTo help prevent inadvertent data loss, we no longer permanently delete content from the Preservation Hold library. 
Instead, we permanently delete content only from the Recycle Bin, so all content from the Preservation Hold library now goes through the second-stage Recycle Bin.\nIf the content is not modified or deleted\nduring the retention period, the timer job moves this content to the first-stage Recycle Bin at the end of the retention period. If a user deletes the content from there or empties this Recycle Bin (also known as purging), the document is moved to the second-stage Recycle Bin. A 93-day retention period spans both the first- and second-stage recycle bins. At the end of 93 days, the document is permanently deleted from wherever it resides, in either the first-stage or second-stage Recycle Bin. The Recycle Bin isn't indexed and therefore unavailable for searching. As a result, an eDiscovery search can't find any Recycle Bin content on which to place a hold.\nNote\nBecause of the\nfirst principle of retention\n, permanent deletion is always suspended if the same item must be retained because of another retention policy or retention label, or it's under eDiscovery holds for legal or investigative reasons.\nWhen the retention settings are retain-only, or delete-only, the content paths are variations of retain and delete:\nContent paths for retain-only retention settings\nIf the content is modified or deleted\nduring the retention period: A copy of the original document is created in the Preservation Hold library and retained until the end of the retention period, when the copy in the Preservation Hold library is moved to the second-stage Recycle Bin and is permanently deleted after 93 days.\nIf the content is not modified or deleted\nduring the retention period: Nothing happens before and after the retention period; the document remains in its original location.\nContent paths for delete-only retention settings\nIf the content is deleted\nduring the configured period: The document is moved to the first-stage Recycle Bin. 
If a user deletes the document from there or empties this Recycle Bin, the document is moved to the second-stage Recycle Bin. A 93-day retention period spans both the first-stage and second-stage recycle bins. At the end of 93 days, the document is permanently deleted from wherever it resides, in either the first-stage or second-stage Recycle Bin. If the content is modified during the configured period, it follows the same deletion path after the configured period.\nIf the content is not deleted\nduring the configured period: At the end of the configured period in the retention policy, the document is moved to the first-stage Recycle Bin. If a user deletes the document from there or empties this Recycle Bin (also known as purging), the document is moved to the second-stage Recycle Bin. A 93-day retention period spans both the first-stage and second-stage recycle bins. At the end of 93 days, the document is permanently deleted from wherever it resides, in either the first-stage or second-stage Recycle Bin. The Recycle Bin isn't indexed and therefore unavailable for searching. As a result, an eDiscovery search can't find any Recycle Bin content on which to place a hold.\nHow retention works with cloud attachments\nCloud attachments are embedded links to files that users share, or referenced in interactions for Microsoft 365 Copilot and Microsoft 365 Copilot Chat. They can be retained and deleted when your users share them in Outlook emails and Teams or Viva Engage messages, and they are referenced in interactions with Copilot. When you\nautomatically apply a retention label to cloud attachments\n, the retention label is applied to a copy of the shared file, which is stored in the Preservation Hold library.\nFor this scenario, we recommend you configure the label setting to start the retention period based on when the item is labeled. 
If you do configure the retention period based on when the item is created or last modified, this date is taken from the original file at the time of sharing. If you configure the start of retention to be when last modified, this setting has no effect for this copy in the Preservation Hold library.\nHowever, if the original file is modified and then shared again, a new copy of the file as a new version is saved and labeled in the Preservation Hold library.\nIf the original file is shared again but not modified, the labeled date of the copy in the Preservation Hold library is updated. This action resets the start of the retention period and is why we recommend you configure the start of the retention period to be based on when the item is labeled.\nBecause the retention label isn't applied to the original file, the labeled file is never modified or deleted by a user. The labeled file remains in the Preservation Hold library until the timer job identifies that its retention period has expired. If the retention settings are configured to delete items, the file is then moved to the second-stage Recycle Bin, where it's permanently deleted at the end of 93 days:\nThe copy that's stored in the Preservation Hold library is typically created within an hour from the cloud attachment being shared.\nTo safeguard against the original file being deleted or moved by users before the copy can be created and labeled, files in locations included in the auto-labeling policy are automatically copied into the Preservation Hold library if they are deleted or moved. These files have a temporary retention period of one day and then follow the standard cleanup process described on this page. When the original file has been deleted or moved, the copy for retaining cloud attachments uses this version of the file. 
The automatic and temporary retention of deleted or moved files in the Preservation Hold library is unique to auto-labeling policies for cloud attachments.\nHow retention works with OneNote content\nWhen you apply a retention policy to a location that includes OneNote content, or a retention label to a OneNote folder, the different OneNote sections inherit the retention settings as individual files. Pages from each section are contained within the file and inherit the retention settings from their parent section.\nBecause of this structure, each section will be individually retained and deleted (with all its pages), according to the retention settings you specify.\nOnly sections are impacted by the retention settings that you specify. For example, although you see a\nModified\ndate for each individual notebook, this date isn't used by Microsoft 365 retention.\nHow retention works with document versions\nVersioning is a feature of all document lists and libraries in SharePoint and OneDrive. By default, versioning retains a minimum of 500 major versions, although you can change this limit. For more information, see\nEnable and configure versioning for a list or library\nand\nHow versioning works in lists and libraries\n.\nWhen a document with versions is subject to retention settings to retain that content, and it's not marked as a record, how the versions are stored in the Preservation Hold library changed in July 2022 to improve performance. Now, all versions of that file are retained in a single file in the Preservation Hold library. 
Versions that were copied to the Preservation Hold library as separate files before the change remain as separate files after the change.\nNote\nVersions that are from a record continue to be copied to the Preservation Hold library as separate files, which means that they can expire independently from each other and the current version.\nIf the label doesn't mark the item as a record and retention settings are configured to delete the item at the end of the retention period:\nIf the retention period is based on when the content was created, when labeled, or when an event starts, each version has the same expiration date as the original document. The original document and its versions all expire at the same time.\nIf the retention period is based on when the content was last modified:\nAfter the change where all versions of the file are retained in a single file in the Preservation Hold library\n: Each version has the same expiration date as the last version of the document. The last version of the document and its versions all expire at the same time.\nBefore the change where versions were copied to the Preservation Hold library as separate files\n: Each version has its own expiration date based on when the original document was modified to create that version. The original document and its versions expire independently of each other.\nWhen the retention action is to delete the document, all versions not in the Preservation Hold library are deleted at the same time according to the current version.\nFor items that are subject to a retention policy (or an eDiscovery hold), the versioning limits for the document library are ignored until the retention period of the document is reached (or the eDiscovery hold is released). In this scenario, old versions aren't automatically purged and users are prevented from deleting versions.\nThat's not the case for retention labels when the content isn't subject to a retention policy (or an eDiscovery hold). 
Instead, the versioning limits are honored so that older versions are automatically deleted to accommodate new versions, but users are still prevented from deleting versions.\nHow retention works with Microsoft 365 Archive\nFor administrators, there's very little change to how retention policies and retention labels work and are managed for sites that use\nMicrosoft 365 Archive\n. For example, the default policy configuration of all sites automatically includes archived sites as well as active sites. An active site that's included in a retention policy and then changed to be an archive site will continue to be subject to the configuration settings in the retention policy. The same applies to labeled items in a site that becomes archived. You can create a new retention policy for an archived site, and auto-apply retention labels for archived sites. Disposition review, Power Automate actions, simulation mode, policy lookup, adaptive scopes, and the Microsoft Graph API to programmatically apply and manage retention labels are all supported for archived sites.\nThe one exception is for\ncloud attachments\n, where an item that's currently in an archived site won't be retained with an auto-apply retention label policy. Cloud attachments that were retained from an active site continue to be subject to the configuration settings in the retention label.\nBecause users can't view and interact with items in archived sites, the user actions usually supported for retention labels won't be possible. For example, manually applying or removing retention labels, locking and unlocking records, and editing record properties such as the name and description. 
Similarly, although the Microsoft Purview portal supports disposition review, the contents of an item under disposition review can't be displayed and the URL link to the item won't work.\nWhen a user leaves the organization\nSharePoint\n:\nWhen a user leaves your organization, any content created by that user isn't affected because SharePoint is considered a collaborative environment, unlike a user's mailbox or OneDrive account.\nOneDrive\n:\nIf a user leaves your organization, any files that are subject to a retention policy or have a retention label will remain subject to the retention settings for the duration of the retention period specified in the policy or label. During that time, all sharing access continues to work and the content continues to be discoverable by Content Search and eDiscovery.\nWhen the retention period expires and the retention settings included a delete action, content moves into the Site Collection Recycle Bin and isn't accessible to anyone except the admin.\nConfiguration guidance\nIf you're new to configuring retention in Microsoft 365, see\nGet started with data lifecycle management\n.\nIf you're ready to configure a retention policy or retention label for SharePoint or OneDrive, see the following instructions:\nCreate and configure retention policies\nPublish retention labels and apply them in apps\nApply a retention label to content automatically\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Retention for SharePoint", @@ -854,7 +854,7 @@ "https://learn.microsoft.com/en-us/purview/disposition": { "content_hash": "sha256:40ac5bd3664fe88bad2c47182c998d8367c42996ff94ecf4c0fe74d1aaff193e", 
"normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nDisposition of content\nFeedback\nSummarize this article for me\nMicrosoft Purview service description\nUse the\nDisposition\npage from\nRecords Management\nin the Microsoft Purview portal to manage disposition reviews and view the metadata of\nitems marked as records\nthat have been automatically deleted at the end of their retention period.\nPrerequisites for viewing content dispositions\nTo manage disposition reviews and confirm that items marked as records have been deleted, you must have sufficient permissions and auditing must be enabled. Also be aware of any\nlimitations\nfor disposition.\nPermissions for disposition\nTo successfully access\nDisposition\nin the Microsoft Purview portal, users must have the\nDisposition Management\nrole. This role is included in the\nRecords Management\ndefault role group.\nNote\nBy default, a global admin isn't granted the\nDisposition Management\nrole.\nTo grant users just the permissions they need for disposition reviews without granting them permissions to view and configure other features for retention and records management, create a custom role group (for example, named \"Disposition Reviewers\") and grant this group the\nDisposition Management\nrole.\nFor instructions to add users to the default roles or create your own role groups, see\nPermissions in the Microsoft Purview portal\n.\nAdditionally:\nTo view the contents of items during the disposition process, add users to the\nContent Explorer Content Viewer\nrole group. 
If users don't have the permissions from this role group, they can still select a disposition review action to complete the disposition review, but must do so without being able to view the item's contents from the mini-preview pane in the Microsoft Purview portal.\nBy default, each person who accesses the\nDisposition\npage sees only items that they're assigned to review. For a records management administrator to see all items assigned to all users, and all retention labels that are configured for disposition review: From the Microsoft Purview portal, navigate to\nSettings\n>\nSolution settings\n>\nRecords Management\n>\nDisposition\nto select and then enable a mail-enabled security group that contains the administrator accounts. Permissions are then granted to the group members, but not the group owner.\nMicrosoft 365 groups and security groups that aren't mail-enabled don't support this feature and don't display in the list to select. If you need to create a new mail-enabled security group, use the link to the\nMicrosoft 365 admin center\nto create the new group.\nImportant\nAfter you've enabled the group, you can't change it in the Microsoft Purview portal. See the next section for how to enable a different group by using PowerShell.\nThe Records Management settings are visible only to records management administrators.\nEnabling another security group for disposition\nAfter you've enabled a security group for disposition from the\nRecords Management settings\nin the Microsoft Purview portal, you can't disable this permission for the group or replace the selected group in the portal. However, you can enable another mail-enabled security group by using the\nEnable-ComplianceTagStorage\ncmdlet.\nFor example:\nEnable-ComplianceTagStorage -RecordsManagementSecurityGroupEmail dispositionreviewers@contoso.com\nEnable auditing\nMake sure that auditing is enabled at least one day before the first disposition action. 
For more information, see\nSearch the audit log\n.\nDisposition reviews\nWhen content reaches the end of its retention period, there are several reasons why you might want to review that content and confirm whether it can be permanently deleted (\"disposed\"). For example, instead of deleting the content, you might need to:\nSuspend the deletion of relevant content for litigation or an audit.\nAssign a different retention period to the content, perhaps because the original retention settings were a temporary or provisional solution.\nMove the content from its existing location to an archive location, for example, if that content has research or historical value.\nWhen a disposition review is triggered at the end of the retention period, your chosen reviewers receive an email notification that they have content to review. These reviewers can be individual users or members of a mail-enabled security group. When you use a mail-enabled security group, only group members and not the group owner receive the email notifications.\nYou can customize the notification email that reviewers receive, including instructions in different languages. For multi-language support, you must specify the translations yourself and this custom text is displayed to all reviewers irrespective of their locale.\nUsers receive an initial email notification per label at the end of the item's retention period, with a reminder per label once a week of all disposition reviews that they're assigned. They can click the link in the notification and reminder emails to go directly to the records management\nDisposition\npage in the Microsoft Purview portal to review the content and take an action. Alternately, the reviewers can navigate to this\nDisposition\npage in the Microsoft Purview portal. 
Then:\nReviewers see only the disposition reviews that are assigned to them, whereas administrators who are added to the selected security group for records manager see all disposition reviews.\nReviewers can add new users to the same disposition review. Note that this action does not automatically grant these added users the\nrequired permissions\n.\nFor the disposition review process, a mini-review pane for each item shows a preview of the content if they have permissions to see it. If they don't have permissions, they can select the content link and request permissions. This mini-review pane also has tabs for additional information about the content:\nDetails\nto display indexed properties, where it's located, who created it and when, and who last modified it and when.\nHistory\nthat shows the history of any disposition review actions to date, with reviewer comments if available.\nA disposition review can include content in Exchange mailboxes, SharePoint sites, and OneDrive accounts. Content pending a disposition review in those locations is permanently deleted only after a reviewer for the final stage of disposition chooses to permanently delete the content.\nNote\nA mailbox must have at least 10 MB data to support disposition reviews.\nAdministrators can see an overview of all pending dispositions in the\nOverview\ntab. 
Reviewers see only their items pending disposition.\nSelect\nView all pending dispositions\nto view them on the\nDisposition\npage.\nWorkflow summary for a disposition review\nBasic workflow for a disposition review, single stage:\nThe admin configures a retention label to start a disposition review at the end of the retention period.\nThe admin publishes the label and a user applies the label to one or more items, or the admin automatically applies the retention label to items.\nAt the end of the retention period for each item, the designated disposition reviewers receive an email to review the item for disposition.\nOne of the reviewers selects the link to review the item in the disposition page from the Microsoft Purview portal, and confirms that the item should be permanently deleted.\nAuto-approval for disposition\nYou can optionally specify a time period (7-365 days) for auto-approval. The default period if you select this option is 14 days.\nIf designated reviewers don't take manual action during this time period by using the\nstandard disposition review process\n, the item automatically passes to the next review stage. If the item is in the final review stage, the item is automatically disposed with permanent deletion.\nImportant\nIf you configure this option and items are already pending disposition review, they automatically become auto-approved if they have already exceeded the number of days that you specified for auto-approval. The time period always starts from when the item is ready for disposition review and not from when you configure the option.\nAs with all retention label changes, allow up to 7 days if you turn on, turn off, or change the number of days for this option.\nThere's no new auditing event for auto-approval. 
Instead, use the details in the existing\nApproved disposal\nauditing event to identify whether the item was manually approved or automatically approved by using this option.\nHow to configure a retention label for disposition review\nTriggering a disposition review at the end of the retention period is a configuration option that's available only with a retention label. Disposition review isn't available for a retention policy. For more information about these two retention solutions, see\nLearn about retention policies and retention labels\n.\nFrom the\nChoose what happens after the retention period\npage for a retention label:\nAfter you select the\nStart a disposition review\noption, select\n+ Create stages and assign reviewers\n. On the next page of the configuration, you'll specify how many consecutive stages of disposition you want and the disposition reviewers for each stage:\nOptionally, select whether you want to use\nautomatic-approval\n. If you use this option, specify the number of days reviewers have to take manual action before the item is automatically moved to the next disposition stage or automatically disposed.\nSelect\n+ Add a stage\n, and name your stage for identification purposes. Then specify the reviewers for that stage.\nFor the reviewers, specify up to 10 individual users or mail-enabled security groups. Microsoft 365 groups aren't supported for this option.\nIf you need more than one person to review an item at the end of its retention period, select\nAdd another stage\nand repeat the configuration process for the number of stages that you need, with a maximum of five stages.\nWithin each individual stage of disposition, any of the users you specify for that stage are authorized to take the next action for the item at the end of its retention period. 
These users can also add other users to their disposition review stage.\nNote\nIf you configured retention labels before multi-staged disposition review was available, you can upgrade your labels to support this feature: Edit the label and select\nEdit stages and reviewers\non the\nChoose what happens after the retention period\npage.\nDuring the configuration phase, for each stage specified, you can rename it, reorder it, or remove it by selecting\nEdit stages and reviewers\nthat now displays for the\nStart a disposition review\noption. Then for each stage, you can select the Stage actions option (\n...\n):\nHowever, you can't reorder or remove a stage after you've created the retention label. You'll see only the\nAdd a stage\nand\nRename a stage\noptions available. You can still edit the reviewers.\nAfter you've specified your reviewers, remember to grant them the\nDisposition Management\nrole permission. For more information, see the\nPermissions for disposition\nsection on this page.\nHow to customize email messages for disposition review\nExample default email notification sent to a reviewer:\nYou can customize the email messages that are sent to disposition reviewers for the initial notification and then reminders.\nSign in to the Microsoft Purview portal\n>\nSettings\n>\nSolutions settings\n>\nRecords Management\n>\nDisposition\n.\nFrom the\nEmail notifications for disposition reviews\nsection, select and specify whether you want to use just the default email message, or add your own text to the default message. Your custom text is added to the email instructions after the information about the retention label and before the next steps instructions.\nText for all languages can be added, but formatting and images are unsupported. 
URLs and email addresses can be entered as text and depending on the email client, display as hyperlinks or unformatted text in the customized email.\nExample text to add:\nIf you need additional information, visit the helpdesk website (https://support.contoso.com) or send them an email (helpdesk@contoso.com).\nSelect\nSave\nto save any changes.\nViewing and disposing of content\nWhen a reviewer is notified by email that content is ready to review, they can click a link in the email that takes them directly to the\nDisposition\npage from\nRecords management\nin the Microsoft Purview portal. There, the reviewers can see how many items for each retention label are awaiting disposition with the\nType\ndisplaying\nPending disposition\n. They then select a retention label, and\nOpen in new window\nto see all content with that label:\nFrom the\nPending dispositions\npage, they see all pending dispositions for that label. When one or more items are selected, they can use the mini-preview pane and the\nSource\n,\nDetails\n, and\nHistory\ntabs to inspect the content before taking action on it:\nIf you use the horizontal scroll bar, or close the mini-preview pane, you see more columns that include the expiry date and the name of the disposition review stage.\nAs you can see from the example shown, the actions supported are:\nApprove disposal\n:\nWhen this action is selected for an interim stage of disposition review (you've configured multiple stages): The item moves to the next disposition stage.\nWhen this action is selected for the final stage of disposition review, or there's only one stage of disposition: The item is marked as eligible for permanent deletion, which happens within 15 days.\nRelabel\n:\nWhen this action is selected, the item exits the disposition review process for the original label. 
The item is then subject to the retention settings of the newly selected retention label.\nExtend\n:\nWhen this action is selected, disposition review is effectively suspended until the end of the extended period and then disposition review is triggered again from the first stage.\nAdd reviewers\n:\nWhen this action is selected, the user is prompted to specify and add other users for review.\nNote\nThis action doesn't automatically grant the\nrequired permissions\nto the users who are added. If they don't have these permissions, they can't participate in the disposition review.\nEach action taken has a corresponding audit event in the\nDisposition review activities\nauditing activities group.\nDuring the disposition review process, unless you're using the optional setting of an\nauto-approval timeout period\n, the content never moves from its original location, and it's not marked for permanent deletion until this action is selected by a reviewer for the final or only disposition stage.\nDisposition of records\nFrom the\nRecords management\nmain page >\nDisposition\ntab, you can identify:\nItems deleted as a result of a disposition review.\nItems marked as a record or regulatory record but not marked for disposition review and automatically deleted at the end of their retention period.\nThese items indicate\nRecords Disposed\nin the\nType\ncolumn. For example:\nNote\nThis functionality uses information from the\nunified audit log\nand therefore requires auditing to be\nenabled and searchable\nso the corresponding events are captured.\nFor auditing of deleted items that were marked as records or regulatory records, search for\nDeleted file marked as a record\nin the\nFile and page activities\ncategory. 
This audit event is applicable to documents and emails.\nFilter and export the views\nWhen you select a retention label from the\nDisposition\npage, the\nPending disposition\ntab (if applicable) and the\nDisposed items\ntab let you filter the views to help you more easily find items.\nFor pending dispositions, the time range is based on the expiration date. For disposed items, the time range is based on the deletion date.\nYou can export information about the items in either view as a .csv file that you can then sort and manage using Excel.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Disposition", @@ -863,7 +863,7 @@ "https://learn.microsoft.com/en-us/purview/retention-regulatory-requirements": { "content_hash": "sha256:47f0379e89d341189dd73e672fa86ad4cf9a31e33c5acfd5098fca64c883b8d4", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nRegulatory requirements for data lifecycle management and records management\nFeedback\nSummarize this article for me\nMicrosoft Purview service description\nUse the resources on this page to help you meet specific regulatory requirements for data lifecycle management and records management in Microsoft 365. 
Each section of this document focuses on one or more related regulations and includes any existing guidance or third-party assessment of how to configure Microsoft 365 to help with the requirements outlined.\nThese resources are available to download from the\nData Protection Resources, FAQ, and White Papers\npage of the Service Trust Portal.\nNew Zealand Public Records Act\nSupporting New Zealand's Public Records Act compliance obligations with Microsoft 365\n-\nDownload assessment\nApplicable workloads: SharePoint, OneDrive, Teams, and Exchange\nReleased January 2021, this report has been produced in partnership with Microsoft New Zealand to assess the capabilities of Microsoft 365 services for recording, storing, and managing requirements for electronic records, as specified by:\nNew Zealand Public Records Act 2005, which sets guidelines for preservation of public archives and local authority archives in New Zealand.\nThis report helps you understand how the system aspects of the New Zealand Public Records Act 2005 (PRA) are achievable when using Microsoft 365.\nSEC 17a-4(f), FINRA 4511(c), and CFTC 1.31(c)-(d)\nCohasset Assessment - Microsoft 365 - SEC Rule 17a-4(f) - Immutable Storage for SharePoint, OneDrive, Exchange, Teams, and Viva Engage\n-\nDownload assessment\nApplicable workloads: SharePoint, OneDrive, Teams, Exchange, and Viva Engage\nLatest version released July 2022, this report has been produced in partnership with Cohasset Associates, Inc. 
(Cohasset) to assess the capabilities of Microsoft 365 services for recording, storing, and managing requirements for electronic records, as specified by:\nSecurities and Exchange Commission (SEC) in 17 CFR § 240.17a-4(f), which regulates exchange members, brokers or dealers.\nFinancial Industry Regulatory Authority (FINRA) Rule 4511(c), which defers to the format and media requirements of SEC Rule 17a-4(f).\nThe principles-based electronic records requirements of the Commodity Futures Trading Commission (CFTC) in 17 CFR § 1.31(c)-(d).\nThe opinion from Cohasset is that when compliance features are properly configured and carefully applied and managed as described in their report, the assessed Microsoft 365 services meet the five requirements related to the recording and non-rewriteable, non-erasable storage of electronic records.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "SEC 17a-4 / Preservation Lock", @@ -872,7 +872,7 @@ "https://learn.microsoft.com/en-us/purview/records-management": { "content_hash": "sha256:5878bb9d12dc52ab6be5e5f766a4f3e057bacc0e3b7fdf158f4ff5959f72524d", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nLearn about records management\nFeedback\nSummarize this article for me\nMicrosoft Purview service description\nA records management system, also known as records and information management, is a solution for organizations to manage regulatory, legal, and business-critical records. Records management for Microsoft Purview helps you achieve your organization's legal obligations, provides the ability to demonstrate compliance with regulations, and increases efficiency with regular disposition of items that are no longer required to be retained, no longer of value, or no longer required for business purposes.\nUse the following capabilities to support your records management solution for Microsoft 365 data:\nLabel items as a record\n. Create and configure retention labels to mark items as a\nrecord\nthat can then be applied by users or automatically applied by identifying sensitive information, keywords, or content types.\nMigrate and manage your retention requirements with file plan\n. By using a\nfile plan\n, you can bring in an existing retention plan to Microsoft 365, or build a new one for enhanced management capabilities.\nConfigure retention and deletion settings with retention labels\n. 
Configure\nretention labels\nwith the retention periods and actions based on various factors that include the date last modified or created.\nStart different retention periods when an event occurs\nwith\nevent-based retention\n.\nReview and validate disposition\nwith\ndisposition reviews\nand proof of\nrecords deletion\n.\nExport information about all disposed items\nwith the\nexport option\n.\nSet specific permissions\nfor records manager functions in your organization to\nhave the right access\n.\nUsing these capabilities, you can incorporate your organization's retention schedules and requirements into a records management solution that manages retention, records declaration, and disposition, to support the full lifecycle of your content.\nIn addition to the online documentation, you might find it useful to download a\ndeck with FAQs\nfrom a records management webinar. The recording of the actual webinar is no longer available.\nRecords\nWhen an item is declared a record by using a retention label:\nRestrictions are placed on the item in terms of what\nactions are allowed or blocked\n.\nAdditional activities about the item are logged.\nYou have proof of disposition when the item is deleted at the end of their retention period.\nYou use\nretention labels\nto mark items as a\nrecord\n, or a\nregulatory record\n. The difference between these two are explained in the next section. 
You can either publish those labels so that users and administrators can manually apply them to items, or for labels that mark items as a record, you can auto-apply those labels.\nBy using retention labels to declare records, you can implement a single and consistent strategy for managing records across your Microsoft 365 environment.\nCompare restrictions for what actions are allowed or blocked\nUse the following table to identify what restrictions are placed on items as a result of applying a standard retention label, and retention labels that mark items as a record or regulatory record.\nA standard retention label has retention settings and actions but doesn't mark items as a record or a regulatory record.\nNote\nFor completeness, the table includes columns for a locked and unlocked record, which is applicable to SharePoint and OneDrive, but not Exchange. The ability to lock and unlock a record uses\nrecord versioning\nthat isn't supported for Exchange items. So for all Exchange items that are marked as a record, the behavior maps to the\nRecord - locked\ncolumn, and the\nRecord - unlocked column\nis not relevant.\nAction\nRetention label\nRecord - locked\nRecord - unlocked\nRegulatory record\nEdit contents\nAllowed\nBlocked\nAllowed\nBlocked\nEdit properties, including rename\nAllowed\nAllowed\n1\nAllowed\nBlocked\nDelete\nAllowed\n2\nBlocked\nBlocked\nBlocked\nCopy\nAllowed\nAllowed\nAllowed\nAllowed\nMove within container\n3\nAllowed\nAllowed\nAllowed\nAllowed\nMove across containers\n3\nAllowed\nAllowed if never unlocked\nBlocked\nBlocked\nOpen/Read\nAllowed\nAllowed\nAllowed\nAllowed\nChange label\nAllowed\nAllowed - container admin only\nBlocked\nBlocked\nRemove label\n4\nAllowed\nAllowed - container admin only\nBlocked\nBlocked\nOverride with\npriority cleanup\nAllowed\nBlocked\nBlocked\nBlocked\nFootnotes:\n1\nEditing properties for a locked record is allowed by default but can be blocked by a tenant setting in the\nMicrosoft Purview portal\n:\nSign in 
to the Microsoft Purview portal\n>\nSettings\n>\nSolutions settings\n>\nRecords Management\n>\nRetention Labels\n>\nAllow editing of record properties\n2\nDeleting labeled items in SharePoint and OneDrive can be blocked as a tenant setting in the\nMicrosoft Purview portal\n:\nSign in to the Microsoft Purview portal\n>\nSettings\n>\nSolutions settings\n>\nRecords Management\n>\nRetention Labels\n>\nDeletion of items\n.\nWhen you apply a standard retention label to a list item that has a document attachment, that document doesn't inherit the retention settings and can be deleted from the list item. In comparison, if that retention label marked items as a record or regulatory record, the document attachment would inherit the retention settings and couldn't be deleted.\n3\nContainers include SharePoint sites, OneDrive accounts, and Exchange mailboxes.\n4\nLabels can be removed from items even if the labels are no longer published.\nImportant\nThe most important difference for a regulatory record is that after it is applied to content, nobody, not even a global administrator, can remove the label.\nRetention labels configured for regulatory records also have the following admin restrictions:\nThe retention period can't be made shorter after the label is saved, only extended.\nThese labels aren't supported by auto-labeling policies, and must be applied by using\nretention label policies\n.\nIn addition, a regulatory label can't be applied to a document that's checked out in SharePoint.\nBecause of the restrictions and irreversible actions, make sure you really do need to use regulatory records before you select this option for your retention labels. To help prevent accidental configuration, this option is not available by default but must first be enabled by using PowerShell. 
Instructions are included in\nDeclare records by using retention labels\n.\nValidating migrated records\nIf you're migrating files to SharePoint or OneDrive and your organization needs to manage these items as records, you might need to validate that the files haven't been altered and retain their immutability status. For example, you're using a migration solution and need to meet the chain of custody requirements. Typical file properties and methods often used for this type of validation, such as file size or file hash, might not be sufficient because SharePoint automatically updates the metadata for a file when it's uploaded.\nInstead, to validate your migrated files, you can use the value of the\nvti_writevalidationtoken\nproperty, which is a base64-encoded XOR hash of the file before it is modified by SharePoint. Use the following steps:\nGenerate the XOR hash of the original file by using the QuickXorHash algorithm. For more information, see the\nQuickXorHash Algorithm code snippet\n.\nBase64-encode the XOR hash. For more information, see the\nBase64Encode method documentation\n.\nAfter the file is migrated, retrieve the value of the\nvti_writevalidationtoken\nproperty from the uploaded file.\nCompare the value generated in step 2 with the value retrieved in step 3. These two values should match. If they do, you've validated that the file hasn't changed.\nConfiguration guidance\nSee\nGet started with records management\n. 
This article has information about subscriptions, permissions, and links to end-to-end configuration guidance for records management scenarios.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Records Management", @@ -881,7 +881,7 @@ "https://learn.microsoft.com/en-us/purview/data-lifecycle-management": { "content_hash": "sha256:329d7423ea7fc5f66af3433064c97013e5e966e23b7f68d2444288affe6a4b35", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nLearn about data lifecycle management\nFeedback\nSummarize this article for me\nMicrosoft Purview service description\nMicrosoft Purview Data Lifecycle Management (formerly Microsoft Information Governance) provides you with tools and capabilities to retain the content that you need to keep, and delete the content that you don't.\nRetaining and deleting content is often needed for compliance and regulatory requirement, but deleting content that no longer has business value also helps you manage risk and liability. For example, it reduces your attack surface.\nMicrosoft 365 features\nRetention policies\nare the cornerstone for data lifecycle management. Use these policies for Microsoft 365 workloads that include Exchange, SharePoint, OneDrive, Teams, and Viva Engage. 
Configure whether content for these services needs to be retained indefinitely, or for a specific period if users edit or delete it. Or you can configure the policy to automatically permanently delete the content after a specified period if it's not already deleted. You can also combine these two actions for retain and then delete, which is a very typical configuration. For example, retain email for three years and then delete it.\nWhen you configure a retention policy, you can target all instances in your organization (such as all mailboxes and all SharePoint sites), or individual instances (such as only the mailboxes for specific departments or regions, or just selected SharePoint sites).\nIf you need exceptions for individual emails or documents, such as a longer retention period for legal documents, you do this with\nretention labels\nthat you publish to apps so that users can apply them, or automatically apply them by inspecting the content.\nRetention labels are also used with\nAdaptive Protection\n, if you're using\nthis solution with insider risk management\n. In this case, the retention label and auto-apply policy is automatically created for you. 
For details, see\nDynamically mitigate the risk of accidental or malicious deletes\n.\nUnder the covers, retention labels are also used with\npriority cleanup\nfor scenarios where you need to expedite the permanent deletion of sensitive information from mailboxes, even if they have existing holds for retention or eDiscovery.\nFor more information about retention policies and retention labels, and how retention works in Microsoft 365, see\nLearn about retention policies and retention labels\n.\nNote\nIf you need to manage high-value items for business, legal, or regulatory record-keeping requirements, use retention labels with\nrecords management\nrather than retention labels with data lifecycle management.\nOther data lifecycle management capabilities to help you keep what you need and delete what you don't:\nMailbox archiving\nto provide users with additional mailbox storage space, and auto-expanding archiving for mailboxes that need more than 100 GB storage. A default archiving policy automatically moves email to the archive mailbox, and if required, you can customize this policy. For more information about mailbox archiving, see\nLearn about archive mailboxes\n.\nInactive mailboxes\nthat retain mailbox content after employees leave the organization. For more information about inactive mailboxes, see\nLearn about inactive mailboxes\n.\nImport service for PST files\nby using network upload or drive shipping. For more information, see\nLearn about importing your organization's PST files\n.\nExchange (legacy) features\nRetention policies and retention tags\nfrom messaging records management (MRM), and\njournaling rules\nare older compliance features from Exchange that were originally configurable from the Classic Exchange admin center. 
They haven't been brought forward to the\nnew Exchange admin center\n.\nIf you're not already using these features, or have a specific business requirement to use them instead of the Microsoft 365 features for data lifecycle management, we don't recommend you use these older compliance features. Instead, use the newer Microsoft 365 features that retain data in place and support policies across other Microsoft 365 services.\nFor more information, see\nUse retention policies and retention labels instead of older features\n.\nDeployment guidance\nFor deployment guidance for data lifecycle management that includes a recommended deployment roadmap, licensing information, permissions, a list of supported scenarios, and end-user documentation, see\nGet started with data lifecycle management\n.\nLooking for deployment guidance to protect your data? See\nDeploy an information protection solution with Microsoft Purview\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Data Lifecycle Management", @@ -890,7 +890,7 @@ "https://learn.microsoft.com/en-us/purview/ediscovery": { "content_hash": "sha256:fd46c21e9e2b25a377bc377a02807bd85f6ac1614ec0292536d44f2506cf98d4", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nMicrosoft Purview eDiscovery legacy solutions\nFeedback\nSummarize this article for me\nCaution\nMicrosoft retired all classic eDiscovery experiences on August 31, 2025. This retirement includes classic\nContent Search\n, classic\neDiscovery (Standard)\n, and classic\neDiscovery (Premium)\n.\nThe guidance in this article only applies to organizations hosted in Microsoft 365 operated by 21Vianet (China). If your organization isn't hosted by 21Vianet, use the guidance for the\nnew eDiscovery experience\nin the\nMicrosoft Purview portal\n.\nElectronic discovery, or eDiscovery, is the process of identifying and delivering electronic information that can be used as evidence in legal cases. You can use eDiscovery tools in Microsoft Purview to search for content in Exchange Online, OneDrive for Business, SharePoint Online, Microsoft Teams, Microsoft 365 Groups, and Viva Engage teams. You can search mailboxes and sites in the same eDiscovery search, and then export the search results. You can use Microsoft Purview eDiscovery (Standard) cases to identify, hold, and export content found in mailboxes and sites. If your organization has an Office 365 E5 or Microsoft 365 E5 subscription (or related E5 add-on subscriptions), you can further manage custodians and analyze content by using the feature-rich Microsoft Purview eDiscovery (Premium) solution in Microsoft 365.\neDiscovery solutions\nMicrosoft Purview provides three eDiscovery solutions: Content search, eDiscovery (Standard), and eDiscovery (Premium).\nContent Search\neDiscovery (Standard)\neDiscovery (Premium)\n- Search for content\n- Keyword queries and search conditions\n- Export search results\n- Role-based permissions\n- Search and export\n- Case management\n- Legal hold\n- Custodian management\n- Legal hold notifications\n- Advanced indexing\n- Review set filtering\n- Tagging\n- Analytics\n- Predictive coding models\nAnd more...\nContent search\n. 
Use the Content search tool to search for content across Microsoft 365 data sources and then export the search results to a local computer.\neDiscovery (Standard)\n. eDiscovery (Standard) builds on the basic search and export functionality of Content search by enabling you to create eDiscovery cases and assign eDiscovery managers to specific cases. eDiscovery managers can only access the cases of which they're members. eDiscovery (Standard) also lets you associate searches and exports with a case and lets you place an eDiscovery hold on content locations relevant to the case.\neDiscovery (Premium)\n. The eDiscovery (Premium) tool builds on the existing case management, preservation, search, and export capabilities in eDiscovery (Standard). eDiscovery (Premium) provides an end-to-end workflow to identify, preserve, collect, review, analyze, and export content that's responsive to your organization's internal and external investigations. It lets legal teams manage custodians and the legal hold notification workflow to communicate with custodians involved in a case. It allows you to collect and copy data from the live service into review sets, where you can filter, search, and tag content to cull non-relevant content from further review so your workflow can identify and focus on content that's most relevant. 
eDiscovery (Premium) provides analytics and machine learning-based predictive coding models to further narrow the scope of your investigation to the most relevant content.\nComparison of key capabilities\nThe following table compares the key capabilities available in Content search, eDiscovery (Standard), and eDiscovery (Premium).\nCapability\nContent search\neDiscovery (Standard)\neDiscovery (Premium)\nSearch for content\nKeyword queries and search conditions\nSearch statistics\nExport search results\nRole-based permissions\nCase management\nPlace content locations on legal hold\nCustodian management\nLegal hold notifications\nAdvanced indexing\nError remediation\nReview sets\nSupport for cloud attachments and SharePoint versions\nOptical character recognition\nConversation threading\nCollection statistics and reports\nReview set filtering\nTagging\nAnalytics\nPredictive coding models\nComputed document metadata\nTransparency of long-running jobs\nExport to customer-owned Azure Storage location\nHere's a description of each eDiscovery capability.\nSearch for content\n. Search for content that's stored in Exchange mailboxes, OneDrive for Business accounts, SharePoint sites, Microsoft Teams, Microsoft 365 Groups, and Viva Engage Teams. This includes content generated by other Microsoft 365 apps that store data in mailboxes and sites.\nKeyword queries and search conditions\n. Create Keyword Query Language (KeyQL) search queries to search for content keywords that match query criteria. You can also include conditions to narrow the scope of your search.\nSearch statistics\n. After you run a search, you can view statistics of the estimated search results, such as the number and total size of items matching your search criteria. Other statistics include the top content locations that contain search results and the number of items that match different parts of the search query.\nExport search results\n. 
Export search results to a local computer in your organization in a two-step process. When you export search results, items are copied from their original content location in Microsoft 365 to a Microsoft-provided Azure Storage location. Then you can download those items to a local computer.\nRole-based permissions\n. Use role-based access control (RBAC) permissions to control which eDiscovery-related tasks different users can perform. You can use a built-in eDiscovery-related role group or create custom role groups that assign specific eDiscovery permissions.\nCase management\n. eDiscovery cases in eDiscovery (Standard) and eDiscovery (Premium) let you associate specific searches and exports with a specific investigation. You can also assign members to a case to control who can access the case and view the contents of the case. eDiscovery (Premium) also supports new case creation integration with\nMicrosoft Purview Insider Risk Management\ncases.\nPlace content locations on legal hold\n. Preserve content relevant to your investigation by placing a legal hold on the content locations in a case. This lets you secure electronically stored information from inadvertent (or intentional) deletion during your investigation.\nCustodian management\n. Manage the people that you've identified as people of interest in the case (called\ncustodians\n) and other data sources that may not be associated with a custodian. When you add custodians and non-custodial data sources to a case, you can place a legal hold on these data sources, communicate with custodians by using the legal hold notification process, and search custodian and non-custodial data sources to collect content relevant to the case.\nLegal hold notifications\n. Manage the process of communicating with case custodians. A legal hold notification instructs custodians to preserve content that's relevant to the case. You can track the notices that were received, read, and acknowledged by custodians. 
The communications workflow in eDiscovery (Premium) allows you to create and send initial notifications, reminders, and escalations if custodians fail to acknowledge a hold notification.\nAdvanced indexing\n. When you add custodial and non-custodian data sources to a case, the associated content locations are reindexed in a process called\nAdvanced indexing\n. Advanced indexing ensures any content deemed as partially indexed is reprocessed to make it fully searchable when you collect data for an investigation.\nError remediation\n. Fix processing errors using a process called\nerror remediation\n. Error remediation allows you to rectify data issues that prevent eDiscovery (Premium) from properly processing the content during Advanced indexing. For example, files that are password protected can't be processed since the files are locked or encrypted. Using error remediation, you can download files with errors, remove the password protection, and then upload the remediated files.\nReview sets\n. Add relevant data to a review set. A review set is a secure, Microsoft-provided Azure Storage location in the Microsoft cloud. When you add data to a review set, the collected items are copied from their original content location to the review set. Review sets provide a static, known set of content that you can search, filter, tag, analyze, and predict relevancy using predictive coding models. You can also track and report on what content gets added to the review set.\nSupport for cloud attachments and SharePoint versions\n. When you add content to a review set, you have the option to include cloud attachments or linked files. This means that the target file of a cloud attachment or linked file is added to the review set. You also have the option to add all versions of a SharePoint document to a review set.\nOptical character recognition (OCR)\n. 
When content is added to a review set, OCR functionality extracts text from images, and includes the image text with the content that's added to a review set. This lets you search for image text when you query the content in the review set.\nConversation threading\n. When chat messages from Teams and Viva Engage conversations are added to a review set, you can collect the entire conversation thread. This means that the entire chat conversation that contains items that match the collection criteria is added to the review set. This lets you review chat items in the context of the back-and-forth conversation.\nCollection statistics and reports\n. After you create a collection estimate or commit a collection to a review set, you can view a rich set of statistics on the retrieved items, such as the content locations that contain the most items that matched the search criteria and the number of items returned by the search query. You can also preview a subset of the results.\nReview set filtering\n. After content is added to a review set, you can apply filters to display only the set of items that match your filtering criteria. Then you can save the filter sets as a query, which lets you quickly reapply the saved filters. Review set filtering and saved queries help you quickly select content items that are most relevant to your investigation.\nTagging\n. Tags also help you omit non-relevant content and identify the most relevant content. When experts, attorneys, or other users review content in a review set, their opinions related to the content can be captured by using tags. For example, if the intent is to exclude unnecessary content, a user can tag documents with a tag such as \"non-responsive\". After content has been reviewed and tagged, a review set query can be created to exclude any content tagged as \"non-responsive\". This process eliminates the non-responsive content from subsequent steps in the eDiscovery workflow.\nAnalytics\n. 
eDiscovery (Premium) provides tools to analyze review set documents to help you organize the documents in a coherent manner and reduce the volume of documents to be reviewed.\nNear duplicate detection\ngroups textually similar documents together to help you make your review process more efficient.\nEmail threading\nidentifies specific email messages that give a complete context of the conversation in an email thread.\nThemes\nfunctionality attempts to analyze themes in review set documents and assign a theme to documents so that you can review documents with related themes. These analytics capabilities help make your review process more efficient so that reviewers can review a fraction of collected documents.\nPredictive coding models\n. Use predictive coding models to reduce large volumes of case content to a relevant set of items that you can prioritize for review. This is accomplished by creating and training your own predictive coding models that help you prioritize the review of the most relevant items in a review set. The system uses the training to apply prediction scores to every item in the review set. This lets you filter items based on the prediction score, which allows you to review the most relevant (or non-relevant) items first.\nComputed document metadata\n. Many of the eDiscovery (Premium) features, such as Advanced indexing, conversation threading, analytics, and predictive coding, add metadata properties to review set documents. This metadata contains information related to the function performed by a specific feature. When reviewing documents, you can filter on metadata properties to display documents that match your filter criteria. This metadata can be imported into third-party review applications after review set documents are exported.\nTransparency of long-running jobs\n. 
Jobs in eDiscovery (Premium) are typically long-running processes that are triggered by user actions, such as adding custodians to a case, adding content to a review set, running analytics, and training predictive coding models. You can track the status of these jobs and get support information if you need to escalate issues to Microsoft Support.\nExport to customer-owned Azure Storage location\n. When you export documents from a review set, you have the option to export them to an Azure Storage account managed by your organization. Additionally, eDiscovery (Premium) lets you customize what data is exported. This includes exporting file metadata, native files, text files, tags, and redacted documents saved to a PDF file.\neDiscovery subscription comparison\nBefore you get started, review the\nsubscription requirements\nfor Content search, eDiscovery (Standard), and eDiscovery (Premium). Generally, subscriptions that support eDiscovery (Standard) also support Content search, and subscriptions that support eDiscovery (Premium) also support Content search and eDiscovery (Standard).\nGet started with eDiscovery\nSee the following articles to help you learn more and get started using Microsoft Purview eDiscovery solutions.\nGet started with eDiscovery (Premium)\nOverview of eDiscovery (Premium)\nGet started with eDiscovery (Premium)\nCreate and manage an eDiscovery (Premium) case\nIntegration with Insider Risk Management\nCases in\nMicrosoft Purview Insider Risk Management\ncan be quickly escalated to new cases in Microsoft Purview eDiscovery (Premium) when additional legal review is needed for potentially risky user activity. The tight integration between these solutions can help your risk and legal teams work more efficiently and can help provide a complete end-to-end view of user activities under review. 
Check out how to\nget started with Insider Risk Management\nand how to easily\nescalate an Insider Risk Management case\nto an eDiscovery (Premium) case.\neDiscovery roadmap\nTo see what eDiscovery features have been launched, are rolling out, or in development, see the\nMicrosoft 365 Roadmap\n.\nTraining\nTraining your IT administrators, eDiscovery managers, and compliance investigation teams in the basics for Content search, eDiscovery (Standard), and eDiscovery (Premium) can help your organization get started more quickly using Microsoft Purview eDiscovery tools. To help these users in your organization get started with eDiscovery, see\nDescribe the eDiscovery and audit capabilities of Microsoft Purview\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources",
-        "last_checked": "2026-03-11T12:59:29.613756+00:00",
+        "last_checked": "2026-03-14T06:51:10.083468+00:00",
         "last_status": 200,
         "last_changed": "2026-03-11T12:59:29.613756+00:00",
         "topic": "eDiscovery",
@@ -899,7 +899,7 @@
       "https://learn.microsoft.com/en-us/purview/ediscovery-create-and-manage-cases": {
         "content_hash": "sha256:7bce2072834cda567a5a2fd658e1623117829913ccf734ed29d9b26b76d9e6db",
         "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nCreate and manage an eDiscovery (Premium) case\nFeedback\nSummarize this article for me\nCaution\nMicrosoft retired all classic eDiscovery experiences on August 31, 2025. 
This retirement includes classic\nContent Search\n, classic\neDiscovery (Standard)\n, and classic\neDiscovery (Premium)\n.\nThe guidance in this article only applies to organizations hosted in Microsoft 365 operated by 21Vianet (China). If your organization isn't hosted by 21Vianet, use the guidance for the\nnew eDiscovery experience\nin the\nMicrosoft Purview portal\n.\nAfter setting up Microsoft Purview eDiscovery (Premium) and\nassigning permissions to eDiscovery managers\nin your organization who will manage cases, the next step is to create and manage a case.\nThis article also provides a high-level overview of using cases to manage the eDiscovery (Premium) workflow for a legal case or other types of investigations.\nCreate a case\nComplete the following steps to create a case and configure case settings. The user who creates the case is automatically added as a member. Members of the case can access the case in the Microsoft Purview portal and perform eDiscovery (Premium) tasks.\nGo to the\nMicrosoft Purview portal\nand sign in using the credentials for a user account that has been assigned eDiscovery permissions. Members of the\nOrganization Management\nrole group can also create eDiscovery (Premium) cases.\nIn the left navigation pane of the Microsoft Purview portal, select\nShow all\n, and then select\neDiscovery\n>\nPremium\n, and then select the\nCases\ntab.\nSelect\nCreate a case\n.\nOn the\nName and description\npage, complete the following fields:\nName\n: Give the case a name (required). The case name must be unique in your organization.\nDescription\n: Add an optional description to help others understand this case.\nNumber\n: Enter an optional docket number or other numeric identifier.\nCase format\n: The\nNew (recommended)\noption is automatically selected.\nNote\nThe legacy\nClassic\nformat is no longer available when creating new cases. 
This format is now retired for all new cases.\nSelect\nNext\n.\nOn the\nMembers and settings\npage, complete the following fields as applicable:\nTeam members\n: Select users and groups that should be assigned to the case. Make sure that users and groups assigned here have been\nassigned the appropriate eDiscovery permissions\n.\nSearch and analytics\n: Select the options to configure the case. You can skip this section and configure these settings after the case is created if needed.\nText to ignore\n: Add text or regex expressions to define text to ignore in the case. You can apply this to\nNear-duplicates\n,\nEmail threads\n, or\nThemes\nmodules.\nOptical character recognition (OCR)\n: Configure the option and settings for finding text contained in images during advanced indexing.\nSelect\nNext\n.\nOn the\nSummary\npage, review the settings for the case and edit the settings if needed. Select\nSubmit\nto create the new case and start your investigation.\nMark a case as a favorite\nYou can mark an eDiscovery (Premium) case as a favorite for quicker access to cases you want to prioritize. Cases marked as favorites can be accessed quickly via the eDiscovery (Premium)\nOverview\npage or can be sorted to be shown at the top of the\nCases\ntab for ease of access. You can mark a case as a favorite in the case list on the\nCases\ntab or on the top right of the case\nOverview\ntab for each case. The\nRecent favorite cases\ncard on the\nOverview\ntab displays all the cases marked as favorites in your organization.\nManage the workflow\nTo get you started using eDiscovery (Premium), here's a basic workflow that aligns with\ncommon eDiscovery practices\n. In each of these steps, we'll also highlight some extended eDiscovery (Premium) functionality that you can explore. ediscovery-overview\nAdd custodians\nand\nnon-custodial data sources\nto the case\n. The first step after creating a case is to add custodians. 
A\ncustodian\nis a person having administrative control of a document or electronic file that may be relevant to the case. Additionally, you can add data sources that aren't associated with a specific user but may be relevant to the case. These are\nnon-custodial data sources\n.\nHere are some things that happen (or that you can do) when you add custodians to a case:\nData in the custodian's Exchange mailbox, OneDrive account, and any Microsoft Teams or Viva Engage groups that the custodian is a member of can be \"marked\" as custodial data in the case.\nCustodian (and non-custodial) data is reindexed (by a process called\nAdvanced indexing\n). This helps optimize searching for it in the next step.\nYou can place a hold on custodian and non-custodial data. This hold preserves data that may be relevant to the case during the investigation. To learn more about managing holds, see\nManage holds in eDiscovery (Premium)\n.\nYou can associate other data sources with a custodian (for example, you can associate a SharePoint site or Microsoft 365 Group with a custodian) so this data can be reindexed, placed on hold, and searched, just like the data in the custodian's mailbox or OneDrive account.\nYou can use the\ncommunications workflow\nin eDiscovery (Premium) to send a legal hold notification to custodians.\nCollect relevant content from data sources\n. After you add custodians and non-custodial data sources to a case, use the built-in collections tool to search these data sources for content that may be relevant to the case. You use keywords, properties, and conditions to\nbuild search queries\nthat return search results with the data that's most likely relevant to the case. You can also:\nView\ncollection statistics\nthat may help you refine a collection to narrow the results.\nPreview a sample of the collection to quickly verify whether the relevant data is being found.\nRevise a query and rerun the collection.\nCommit collection to a review set\n. 
Once you've configured and verified that a search returns the desired data, the next step is to add the search results to a review set. When you add data to a review set, items are copied from their original location to a secure Azure Storage location. The data is reindexed again to optimize it for thorough and fast searches when reviewing and analyzing items in the review set. Additionally, you can\nadd non-Office 365 data into a review set\n.\nThere's also a special kind of review set that you can add data to, called a\nconversation review set\n. These types of review sets provide conversation reconstruction capabilities to reconstruct, review, and export threaded conversations like those in Microsoft Teams. For more information, see\nReview conversations in eDiscovery (Premium)\n.\nReview and analyze data in a review set\n. Now that data is in a review set, you can use a wide variety of tools and capabilities to view and analyze the case data with the goal of reducing the data set to what is most relevant to the case you're investigating. Here's a list of some tools and capabilities that you can use during this process.\nGroup and view documents\n. This includes selecting the group options for review sets in your cases, viewing the metadata for each document in a review set, and viewing the document in its native version or text version.\nCreate queries and filters\n. You create search queries using various search criteria (including the ability to search all\nfile metadata properties\n) to further refine and cull the case data to what is most relevant to the case. You can also use review set filters to quickly apply other conditions to the results of a search query to further refine those results.\nCreate and use tags\n. You can apply tags to documents in a review set to identify which are responsive (or non-responsive) to the case and then use those tags when creating search queries to include or exclude the tagged documents. 
You can also use tagging to determine which documents to export.\nAnnotate and redact documents\n. You can use the annotation tool in a review set to annotate documents and redact content in documents as work product. We generate a PDF version of an annotated or redacted document during review to reduce the risk of exporting the unredacted native version of the document.\nAnalyze case data\n. The analytics functionality in eDiscovery (Premium) is powerful. After you run analytics on the data in a review set, we perform analysis such as near duplicate detection, email threading, and themes that can help reduce the volume of documents that you have to review. We also generate an analytics report that summarizes the results of running analytics. As previously explained, running analytics also runs\nthe attorney-client privilege detection model\n.\nExport and download case data\n. A final step after collecting, reviewing, and analyzing case data is to export it out of eDiscovery (Premium) for external review or for review by people outside of the investigation team. Exporting data is a two-step process. The first step is to\nexport\ndata out of the review set and copy it to a different Azure Storage location (one provided by Microsoft or one managed by your organization). Then you use Azure Storage Explorer to\ndownload\nthe data to a local computer. 
In addition to the exported data files, the export package also contains an export report, a summary report, and an error report.\neDiscovery (Premium) architecture\nHere's an architecture diagram that shows the eDiscovery (Premium) end-to-end workflow in a single-geo environment and in a multi-geo environment, and the end-to-end data flow that's aligned with the\nElectronic Discovery Reference Model\n.\nView as an image\nDownload as a PDF file\nDownload as a Visio file\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources",
-        "last_checked": "2026-03-11T12:59:29.613756+00:00",
+        "last_checked": "2026-03-14T06:51:10.083468+00:00",
         "last_status": 200,
         "last_changed": "2026-03-11T12:59:29.613756+00:00",
         "topic": "Create Cases",
@@ -908,7 +908,7 @@
       "https://learn.microsoft.com/en-us/purview/ediscovery-keyword-queries-and-search-conditions": {
         "content_hash": "sha256:7de350e4447c8705571163f76d96fd6fc98edbdc3a61dddc9bf56f0768ffd971",
         "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nKeyword queries and search conditions for eDiscovery\nFeedback\nSummarize this article for me\nCaution\nMicrosoft retired all classic eDiscovery experiences on August 31, 2025. This retirement includes classic\nContent Search\n, classic\neDiscovery (Standard)\n, and classic\neDiscovery (Premium)\n.\nThe guidance in this article only applies to organizations hosted in Microsoft 365 operated by 21Vianet (China). 
If your organization isn't hosted by 21Vianet, use the guidance for the\nnew eDiscovery experience\nin the\nMicrosoft Purview portal\n.\nThis article describes the properties available to help find content across email and chat in Exchange Online and documents and files stored on SharePoint and OneDrive using the eDiscovery search tools in the Microsoft Purview portal.\nThis includes Content search, Microsoft Purview eDiscovery (Standard), and Microsoft Purview eDiscovery (Premium) (eDiscovery searches in eDiscovery (Premium) are called\ncollections\n). You can also use the\n*-ComplianceSearch\ncmdlets in\nSecurity & Compliance PowerShell\nto search for these properties.\nThis article also describes:\nUsing Boolean search operators, search conditions, and other search query techniques to refine your search results.\nSearching for communications of various types related to specific users and projects during a specific time frame.\nSearching for site content that is related to a specific project, users and/or subjects during a specific time period.\nFor step-by-step instructions on how to create different eDiscovery searches, see:\nContent search\nSearch for content in eDiscovery (Standard)\nCreate a collection estimate in eDiscovery (Premium)\nNote\neDiscovery searches in the Microsoft Purview portal and the corresponding\n*-ComplianceSearch\ncmdlets in Security & Compliance PowerShell use the Keyword Query Language (KeyQL). For more detailed information, see\nKeyword Query Language syntax reference\n.\nSearch tips and tricks\nThe time zone for all searches is Coordinated Universal Time (UTC). Changing time zones for your organization isn't currently supported. Time zone display settings in the search view are only applicable for values in the\nData\ncolumn and don't affect time stamps on collected items.\nKeyword searches aren't case-sensitive. 
For example,\ncat\nand\nCAT\nreturn the same results.\nThe Boolean operators\nAND\n,\nOR\n,\nNOT\n, and\nNEAR\nmust be uppercase.\nUsing quotes stops wild cards and any operations inside the quotes.\nA space between two keywords or two\nproperty:value\nexpressions is the same as using\nOR\n. For example,\nfrom:\"Sara Davis\" subject:reorganization\nreturns all messages sent by Sara Davis or messages that contain the word reorganization in the subject line. However, using a mix of spaces and\nOR\nconditionals in a single query might lead to unexpected results. We recommend using either spaces or\nOR\nin a single query.\nUse syntax that matches the\nproperty:value\nformat. Values aren't case-sensitive, and they can't have a space after the operator. If there's a space, your intended value is a full-text search. For example\nto: pilarp\nsearches for \"pilarp\" as a keyword, rather than for messages sent to pilarp.\nWhen searching a recipient property, such as To, From, Cc, or Recipients, you can use an SMTP address, alias, or display name to denote a recipient. For example, you can use pilarp@contoso.com, pilarp, or \"Pilar Pinilla.\"\nYou can use only prefix searches; for example,\ncat*\nor\nset*\n. Suffix searches (\n*cat\n), infix searches (\nc*t\n), and substring searches (\n*cat*\n) aren't supported.\nWhen searching a property, use double quotation marks (\" \") if the search value consists of multiple words or special characters. For example,\nsubject:budget Q1\nreturns messages that contain\nbudget\nin the subject line and that contain\nQ1\nanywhere in the message or in any of the message properties. Using\nsubject:\"budget Q1\"\nreturns all messages that contain\nbudget Q1\nanywhere in the subject line.\nTo exclude content marked with a certain property value from your search results, place a minus sign (-) before the name of the property. For example,\n-from:\"Sara Davis\"\nexcludes any messages sent by Sara Davis.\nYou can export items based on message type. 
For example, to export Skype conversations and chats in Microsoft Teams, use the syntax\nkind:im\n. To return only email messages, you would use\nkind:email\n. To return chats, meetings, and calls in Microsoft Teams, use\nkind:microsoftteams\n.\nWhen searching sites, you have to add the trailing\n/\nto the end of the URL when using the\npath\nproperty to return only items in a specified site. If you don't include the trailing\n/\n, items from a site with a similar path name are also returned. For example, if you use\npath:sites/HelloWorld\nthen items from sites named\nsites/HelloWorld_East\nor\nsites/HelloWorld_West\nwould also be returned. To return items only from the HelloWorld site, you have to use\npath:sites/HelloWorld/\n.\nThe\nQuery language-country/region\nmust be defined in your search query prior to collecting content.\nWhen searching the\nSent\nfolders for emails, using the SMTP address for the sender isn't supported. Items in the\nSent\nfolder contain only display names.\nFinding content in Exchange Online\nAdmins are often charged with finding out who knew what when in the most efficient and effective way possible to respond to requests concerning ongoing or potential litigation, internal investigations, and other scenarios. These requests are often urgent, involve multiple stakeholder teams, and have significant impact if not completed in a timely manner. Knowing how to find the right information is critical for admins to complete searches successfully and help their organizations to manage the risk and cost associated with eDiscovery requirements.\nWhen an eDiscovery request is submitted, often there's only partial information available for the admin to start to collect content that might be related to a particular investigation. The request might include employee names, project titles, rough date ranges when the project was active, and not much more. 
From this information, the admin needs to create queries to find relevant content across Microsoft 365 services to determine the information needed for a particular project or subject. Understanding how information is stored and managed for these services helps admins find what they need quickly and efficiently.\nEmail, chat, meeting, and Microsoft 365 Copilot and Microsoft 365 Copilot Chat activity data (user prompts and Copilot responses) are all stored in Exchange Online. Many communication properties are available for searching items included in Exchange Online. Some properties such as\nFrom\n,\nSent\n,\nSubject\n, and\nTo\nare unique to certain items and aren't relevant when searching for files or documents in SharePoint and OneDrive for Business. Including these types of properties when searching across workloads can sometimes lead to unexpected results.\nFor example, to find content related to specific employees (\nUser 1\nand\nUser 2\n), associated with a project called\nTradewinds\n, and during January 2020 through January 2022, you might use a query with the following properties:\nAdd User 1 and User 2's Exchange Online locations as data sources to the case\nSelect User 1 and User 2's Exchange Online locations as collection locations\nFor\nKeyword\n, use\nTradewinds\nFor\nDate Range\n, use the\nJanuary 1, 2020\nto\nJanuary 31, 2022\nrange\nImportant\nFor emails, when a keyword is used, we search subject, body, and many properties related to the participants. However, due to recipient expansion, search might not return expected results when using the alias or part of the alias. 
Therefore, we recommend using the full UPN.\nSearchable email properties\nThe following table lists the email message properties that can be searched by using the eDiscovery search tools in the Microsoft Purview portal or by using the\nNew-ComplianceSearch\nor the\nSet-ComplianceSearch\ncmdlet.\nImportant\nWhile email messages might have other properties supported in other Microsoft 365 services, only the email properties listed in this table are supported in eDiscovery search tools. Attempting to include other email message properties in searches isn't supported.\nThe table includes an example of the\nproperty:value\nsyntax for each property and a description of the search results returned by the examples. You can enter these\nproperty:value\npairs in the keywords box for an eDiscovery search.\nNote\nWhen searching email properties, it's not possible to search for message headers. Header information isn't indexed for collections. Additionally, items in which the specified property is empty or blank aren't searchable. For example, using the\nproperty:value\npair of\nsubject:\"\"\nto search for email messages with an empty subject line returns zero results. This also applies when searching site and contact properties.\nProperty\nProperty description\nExamples\nSearch results returned by the examples\nAttachmentNames\nThe names of files attached to an email message.\nattachmentnames:annualreport.ppt\nattachmentnames:annual*\nMessages that have an attached file named\nannualreport.ppt\n. In the second example, using the wildcard character ( * ) returns messages with the word\nannual\nin the file name of an attachment.\n1\nBcc\nThe Bcc field of an email message.\n1\nbcc:pilarp@contoso.com\nbcc:pilarp\nbcc:\"Pilar Pinilla\"\nAll examples return messages with\nPilar Pinilla\nincluded in the Bcc field.\n(\nSee Recipient Expansion\n)\nCategory\nThe categories to search. 
Categories can be defined by users by using Outlook or Outlook on the web (formerly known as Outlook Web App). The possible values are:\nblue\ngreen\norange\npurple\nred\nyellow\ncategory:\"Red Category\"\nMessages that have been assigned the\nred\ncategory in the source mailboxes.\nCc\nThe Cc field of an email message.\n1\ncc:pilarp@contoso.com\ncc:\"Pilar Pinilla\"\nIn both examples, messages with\nPilar Pinilla\nspecified in the Cc field.\n(\nSee Recipient Expansion\n)\nFolderid\nThe folder ID (GUID) of a specific mailbox folder in 48-character format. If you use this property, be sure to search the mailbox that the specified folder is located in. Only the specified folder is searched. Any subfolders in the folder won't be searched. To search subfolders, you need to use the\nFolderid\nproperty for the subfolder you want to search.\nfolderid:4D6DD7F943C29041A65787E30F02AD1F00000000013A0000\nfolderid:2370FB455F82FC44BE31397F47B632A70000000001160000 AND participants:garthf@contoso.com\nThe first example returns all items in the specified mailbox folder. The second example returns all items in the specified mailbox folder that were sent or received by\ngarthf@contoso.com\n.\nFrom\nThe sender of an email message.\n1\nfrom:pilarp@contoso.com\nMessages sent by the specified user.\n(\nSee Recipient Expansion\n)\nHasAttachment\nIndicates whether a message has an attachment. Use the values\ntrue\nor\nfalse\n.\nfrom:pilar@contoso.com AND hasattachment:true\nMessages sent by the specified user that have attachments.\nImportance\nThe importance of an email message, which a sender can specify when sending a message. By default, messages are sent with normal importance, unless the sender sets the importance as\nhigh\nor\nlow\n.\nimportance:high\nimportance:medium\nimportance:low\nMessages that are marked as high importance, medium importance, or low importance.\nIsRead\nIndicates whether messages have been read. 
Use the values\ntrue\nor\nfalse\n.\nisread:true\nisread:false\nThe first example returns messages with the IsRead property set to\nTrue\n. The second example returns messages with the IsRead property set to\nFalse\n.\nItemClass\nUse this property to search specific third-party data types that your organization imported to Office 365. Use the following syntax for this property:\nitemclass:ipm.externaldata.*\nitemclass:ipm.externaldata.Facebook* AND subject:contoso\nitemclass:ipm.externaldata.Twitter* AND from:\"Ann Beebe\" AND \"Northwind Traders\"\nThe first example returns Facebook items that contain the word \"contoso\" in the Subject property. The second example returns Twitter items that were posted by Ann Beebe and that contain the keyword phrase \"Northwind Traders\".\nFor a complete list of values to use for third-party data types for the ItemClass property, see\nUse Content search to search third-party data that was imported to Office 365\n.\nKind\nThe type of email message to search for. Possible values:\ncontacts\ndocs\nemail\nexternaldata\nfaxes\nim\njournals\nmeetings\nmicrosoftteams (returns items from chats, meetings, and calls in Microsoft Teams)\nnotes\nposts\nrssfeeds\ntasks\nvoicemail\nkind:email\nkind:email OR kind:im OR kind:voicemail\nkind:externaldata\nThe first example returns email messages that meet the search criteria. The second example returns email messages, instant messaging conversations (including Skype for Business conversations and chats in Microsoft Teams), and voice messages that meet the search criteria. The third example returns items that were imported to mailboxes in Microsoft 365 from third-party data sources, such as Twitter, Facebook, and Cisco Jabber that meet the search criteria. For more information, see\nArchiving third-party data in Office 365\n.\nParticipants\nAll the people fields in an email message. 
These fields are From, To, Cc, and Bcc.\n1\nparticipants:garthf@contoso.com\nparticipants:contoso.com\nMessages sent by or sent to garthf@contoso.com. The second example returns all messages sent by or sent to a user in the contoso.com domain.\n(\nSee Recipient Expansion\n)\nReceived\nThe date that an email message was received by a recipient.\nreceived:2021-04-15\nreceived>=2021-01-01 AND received<=2021-03-31\nMessages that were received on April 15, 2021. The second example returns all messages received between January 1, 2021 and March 31, 2021.\nRecipients\nAll recipient fields in an email message. These fields are To, Cc, and Bcc.\n1\nrecipients:garthf@contoso.com\nrecipients:contoso.com\nMessages sent to garthf@contoso.com. The second example returns messages sent to any recipient in the contoso.com domain.\n(\nSee Recipient Expansion\n)\nSent\nThe date that an email message was sent by the sender.\nsent:2021-07-01\nsent>=2021-06-01 AND sent<=2021-07-01\nMessages that were sent on the specified date or sent within the specified date range.\nSize\nThe size of an item, in bytes.\nsize>26214400\nsize:1..1048567\nMessages larger than 25 MB. The second example returns messages from 1 through 1,048,567 bytes (1 MB) in size.\nSubject\nThe text in the subject line of an email message.\nNote:\nWhen you use the Subject property in a query, the search returns all messages in which the subject line contains the text you're searching for. In other words, the query doesn't return only those messages that have an exact match. For example, if you search for\nsubject:\"Quarterly Financials\"\n, your results include messages with the subject \"Quarterly Financials 2018\".\nsubject:\"Quarterly Financials\"\nsubject:northwind\nMessages that contain the phrase \"Quarterly Financials\" anywhere in the text of the subject line. 
The second example returns all messages that contain the word northwind in the subject line.\nTo\nThe To field of an email message.\n1\nto:annb@contoso.com\nto:annb\nto:\"Ann Beebe\"\nAll examples return messages where Ann Beebe is specified in the To: line.\nNote\n1\nFor the value of a recipient property, you can use email address (also called\nuser principal name\nor UPN), display name, or alias to specify a user. For example, you can use annb@contoso.com, annb, or \"Ann Beebe\" to specify the user Ann Beebe.\nRecipient expansion\nTip\nUse the new\neDiscovery\nexperience and condition builder to use common mailbox and site\nproperties\nlike\nMessageIDs\nand\nChatThreadIds\nwhen searching for specific recipients or messages.\nWhen searching any of the recipient properties (From, To, Cc, Bcc, Participants, and Recipients), Microsoft 365 attempts to expand the identity of each user by looking them up in Microsoft Entra ID. If the user is found in Microsoft Entra ID, the query is expanded to include the user's email address (or UPN), alias, display name, and LegacyExchangeDN. For example, a query such as\nparticipants:ronnie@contoso.com\nexpands to\nparticipants:ronnie@contoso.com OR participants:ronnie OR participants:\"Ronald Nelson\" OR participants:\"\"\n.\nTo prevent recipient expansion, add a wild card character (asterisk) to the end of the email address and use a reduced domain name; for example,\nparticipants:\"ronnie@contoso*\"\nBe sure to surround the email address with double quotation marks.\nHowever, be aware that preventing recipient expansion in the search query might result in relevant items not being returned in the search results. Email messages in Exchange can be saved with different text formats in the recipient fields. Recipient expansion is intended to help mitigate this fact by returning messages that might contain different text formats. 
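The expansion behavior described above can be sketched roughly as follows. The helper function and directory entries are hypothetical; the real lookup happens against Microsoft Entra ID inside the search service:

```python
# Hypothetical sketch of recipient expansion: a recipient value is expanded into
# an OR of the user's known identifiers, unless a trailing "*" suppresses it.
def expand_recipient(value: str, directory: dict) -> str:
    if value.endswith("*"):
        # A quoted trailing wildcard suppresses expansion, as described above.
        return f'participants:"{value}"'
    user = directory.get(value)
    if user is None:
        return f"participants:{value}"
    forms = [user["upn"], user["alias"], f'"{user["display_name"]}"']
    return " OR ".join(f"participants:{form}" for form in forms)

directory = {
    "ronnie@contoso.com": {
        "upn": "ronnie@contoso.com",
        "alias": "ronnie",
        "display_name": "Ronald Nelson",
    }
}

expanded = expand_recipient("ronnie@contoso.com", directory)
suppressed = expand_recipient("ronnie@contoso*", directory)
```

Here `expanded` becomes an OR of the UPN, alias, and display name, while `suppressed` is left as the literal quoted value.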
So preventing recipient expansion might result in the search query not returning all items that might be relevant to your investigation.\nNote\nIf you need to review or reduce the items returned by a search query due to recipient expansion, consider using eDiscovery (Premium). You can search for messages (taking advantage of recipient expansion), add them to a review set, and then use review set queries or filters to review or narrow the results. For more information, see\nCollect data for a case\nand\nQuery the data in a review set\n.\nFinding content in SharePoint and OneDrive\nTip\nUse the new\neDiscovery\nexperience and search\nexport options\nto download all list attachments for SharePoint sites.\nWhen searching for documents and files located in SharePoint or OneDrive for Business, it might make sense to adjust the query approach based on the metadata for the documents and files of interest. Files and documents have relevant properties like\nAuthor\n,\nCreated\n,\nCreatedBy\n,\nFileName\n,\nLastModifiedTime\n, and\nTitle\n. Most of these properties aren't relevant when searching for communications content in Exchange Online, and using these properties might lead to unexpected results if used across both documents and communications. Additionally,\nFileName\nand\nTitle\nof a document might not be the same, and using one or the other to try to find a file with specific content might lead to different or inaccurate results. 
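A tiny sketch makes the FileName/Title divergence concrete. The sample documents below are hypothetical; the point is only that the two properties are independent, so filtering on one or the other selects different items:

```python
# Hypothetical sample documents: FileName and Title are separate properties,
# so a match on one does not imply a match on the other.
docs = [
    {"filename": "Q1-2021-draft.docx", "title": "Quarterly Financials"},
    {"filename": "Financials.docx", "title": "Untitled"},
]

# Filtering on the file name vs. the Title metadata selects different documents.
by_filename = [d for d in docs if "financials" in d["filename"].lower()]
by_title = [d for d in docs if "financials" in d["title"].lower()]
```

Each filter matches exactly one document, and they are different documents, which is why choosing the wrong property can miss the file you want.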
Keep these properties in mind when searching for specific document and file content in SharePoint and OneDrive for Business.\nFor example, to find content related to documents created by User 1, for a project called\nTradewinds\n, for specific files named\nFinancials\n, and from January 2020 to January 2022, you might use a query with the following properties:\nAdd User 1's OneDrive for Business site as a data source to the case\nSelect User 1's OneDrive for Business site as a collection location\nAdd additional SharePoint site locations related to the project as collection locations\nFor\nFileName\n, use\nFinancials\nFor\nKeyword\n, use\nTradewinds\nFor\nDate Range\n, use the\nJanuary 1, 2020\nto\nJanuary 31, 2022\nrange\nSearchable site properties\nThe following table lists the SharePoint and OneDrive for Business properties that can be searched by using the eDiscovery search tools in the Microsoft Purview portal or by using the\nNew-ComplianceSearch\nor the\nSet-ComplianceSearch\ncmdlet.\nImportant\nWhile documents and files stored on SharePoint and OneDrive for Business might have other properties supported in other Microsoft 365 services, only the document and file properties listed in this table are supported in eDiscovery search tools. Attempting to include other document or file properties in searches isn't supported.\nThe table includes an example of the\nproperty:value\nsyntax for each property and a description of the search results returned by the examples.\nProperty\nProperty description\nExample\nSearch results returned by the examples\nAuthor\nThe author field from Office documents, which persists if a document is copied. For example, if a user creates a document and then emails it to someone else who then uploads it to SharePoint, the document will still retain the original author. 
Be sure to use the user's display name for this property.\nauthor:\"Garth Fort\"\nAll documents that are authored by Garth Fort.\nContentType\nThe SharePoint content type of an item, such as Item, Document, or Video.\ncontenttype:document\nAll documents would be returned.\nCreated\nThe date that an item is created.\ncreated>=2021-06-01\nAll items created on or after June 1, 2021.\nCreatedBy\nThe person that created or uploaded an item. Be sure to use the user's display name for this property.\ncreatedby:\"Garth Fort\"\nAll items created or uploaded by Garth Fort.\nDetectedLanguage\nThe language of an item.\ndetectedlanguage:english\nAll items in English.\nDocumentLink\nThe path (URL) of a specific folder on a SharePoint or OneDrive for Business site. If you use this property, be sure to search the site that the specified folder is located in. We recommend using this property instead of the\nSite\nand\nPath\nproperties.\nTo return items located in subfolders of the folder that you specify for the documentlink property, you have to add /* to the URL of the specified folder; for example,\ndocumentlink: \"https://contoso.sharepoint.com/Shared Documents/*\"\ndocumentlink:\"https://contoso-my.sharepoint.com/personal/garthf_contoso_com/Documents/Private\"\ndocumentlink:\"https://contoso-my.sharepoint.com/personal/garthf_contoso_com/Documents/Shared with Everyone/*\" AND filename:confidential\nThe first example returns all items in the specified OneDrive for Business folder. The second example returns documents in the specified site folder (and all subfolders) that contain the word \"confidential\" in the file name.\nFileExtension\nThe extension of a file; for example, docx, one, pptx, or xlsx.\nfileextension:xlsx\nAll Excel files (Excel 2007 and later)\nFileName\nThe name of a file.\nfilename:\"marketing plan\"\nfilename:estimate\nThe first example returns files with the exact phrase \"marketing plan\" in the title. 
The second example returns files with the word \"estimate\" in the file name.\nLastModifiedTime\nThe date that an item was last changed.\nlastmodifiedtime>=2021-05-01\nlastmodifiedtime>=2021-05-01 AND lastmodifiedtime<=2021-06-01\nThe first example returns items that were changed on or after May 1, 2021. The second example returns items changed between May 1, 2021 and June 1, 2021.\nModifiedBy\nThe person who last changed an item. Be sure to use the user's display name for this property.\nmodifiedby:\"Garth Fort\"\nAll items that were last changed by Garth Fort.\nSharedWithUsersOWSUser\nDocuments that have been shared with the specified user and displayed on the\nShared with me\npage in the user's OneDrive for Business site. These are documents that have been explicitly shared with the specified user by other people in your organization. When you export documents that match a search query that uses the SharedWithUsersOWSUser property, the documents are exported from the original content location of the person who shared the document with the specified user. For more information, see\nSearching for site content shared within your organization\n.\nsharedwithusersowsuser:garthf\nsharedwithusersowsuser:\"garthf@contoso.com\"\nBoth examples return all internal documents that have been explicitly shared with Garth Fort and that appear on the\nShared with me\npage in Garth Fort's OneDrive for Business account.\nSize\nThe size of an item, in bytes.\nsize>=1\nsize:1..10000\nThe first example returns items larger than 1 byte. The second example returns items from 1 through 10,000 bytes in size.\nTitle\nThe title of the document. The Title property is metadata that's specified in Microsoft Office documents. 
It's different from the file name of the document.\ntitle:\"communication plan\"\nAny document that contains the phrase \"communication plan\" in the Title metadata property of an Office document.\nSearchable contact properties\nThe following table lists the contact properties that are indexed and that you can search for using eDiscovery search tools. These are the properties that are available for users to configure for the contacts (also called personal contacts) that are located in the personal address book of a user's mailbox. To search for contacts, you can select the mailboxes to search and then use one or more contact properties in the keyword query.\nTip\nTo search for values that contain spaces or special characters, use double quotation marks (\" \") to contain the phrase; for example,\nbusinessaddress:\"123 Main Street\"\n.\nProperty\nProperty description\nBusinessAddress\nThe address in the\nBusiness Address\nproperty. The property is also called the\nWork\naddress on the contact properties page.\nBusinessPhone\nThe phone number in any of the\nBusiness Phone\nnumber properties.\nCompanyName\nThe name in the\nCompany\nproperty.\nDepartment\nThe name in the\nDepartment\nproperty.\nDisplayName\nThe display name of the contact. This is the name in the\nFull Name\nproperty of the contact.\nEmailAddress\nThe address for any email address property for the contact. Users can add multiple email addresses for a contact. Using this property would return contacts that match any of the contact's email addresses.\nFileAs\nThe\nFile as\nproperty. This property is used to specify how the contact is listed in the user's contact list. 
For example, a contact could be listed as\nFirstName,LastName\nor\nLastName,FirstName\n.\nGivenName\nThe name in the\nFirst Name\nproperty.\nHomeAddress\nThe address in any of the\nHome\naddress properties.\nHomePhone\nThe phone number in any of the\nHome\nphone number properties.\nIMAddress\nThe IM address property, which is typically an email address used for instant messaging.\nMiddleName\nThe name in the\nMiddle\nname property.\nMobilePhone\nThe phone number in the\nMobile\nphone number property.\nNickname\nThe name in the\nNickname\nproperty.\nOfficeLocation\nThe value in\nOffice\nor\nOffice location\nproperty.\nOtherAddress\nThe value for the\nOther\naddress property.\nSurname\nThe name in the\nLast\nname property.\nTitle\nThe title in the\nJob title\nproperty.\nSearch operators\nBoolean search operators, such as\nAND\n,\nOR\n, and\nNOT\n, help you define more-precise searches by including or excluding specific words in the search query. Other techniques, such as using property operators (such as\n>=\nor\n..\n), quotation marks, parentheses, and wildcards, help you refine a search query. The following table lists the operators that you can use to narrow or broaden search results.\nOperator\nUsage\nDescription\nAND\nkeyword1 AND keyword2\nReturns items that include all of the specified keywords or\nproperty:value\nexpressions. For example,\nfrom:\"Ann Beebe\" AND subject:northwind\nwould return all messages sent by Ann Beebe that contained the word northwind in the subject line.\n2\n+\nkeyword1 + keyword2 + keyword3\nReturns items that contain\neither\nkeyword2\nor\nkeyword3\nand\nthat also contain\nkeyword1\n. Therefore, this example is equivalent to the query\n(keyword2 OR keyword3) AND keyword1\n.\nThe query\nkeyword1 + keyword2\n(with a space after the\n+\nsymbol) isn't the same as using the\nAND\noperator. 
This query would be equivalent to\n\"keyword1 + keyword2\"\nand return items with the exact phrase\n\"keyword1 + keyword2\"\n.\nOR\nkeyword1 OR keyword2\nReturns items that include one or more of the specified keywords or\nproperty:value\nexpressions.\n2\nNOT\nkeyword1 NOT keyword2\nNOT from:\"Ann Beebe\"\nNOT kind:im\nExcludes items specified by a keyword or a\nproperty:value\nexpression. The second example excludes messages sent by Ann Beebe. The third example excludes any instant messaging conversations, such as Skype for Business conversations that are saved to the Conversation History mailbox folder.\n2\nNEAR\nkeyword1 NEAR(n) keyword2\nReturns items with words that are near each other. In the\nkeyword1 NEAR(n) keyword2\nsyntax,\nn\nequals the number of words exclusive of\nkeyword1\nand\nkeyword2\n. For example, to identify instances where the term\nbest\nis within 3 words of\nworst\n(example sentence, 'Best is opposite of worst.'), you would use\nbest NEAR(3) worst\n. This returns any items where there are 3 or fewer words between\nbest\n(keyword1) and\nworst\n(keyword2). If no number is specified, the default\nn\nvalue is 8.\n2\n:\nproperty:value\nThe colon (:) in the\nproperty:value\nsyntax specifies that the value of the property being searched for contains the specified value. 
For example,\nrecipients:garthf@contoso.com\nreturns any message sent to garthf@contoso.com.\n=\nproperty=value\nThe same as the\n:\noperator.\n<\nproperty<value\nDenotes that the property being searched is less than the specified value.\n1\n>\nproperty>value\nDenotes that the property being searched is greater than the specified value.\n1\n<=\nproperty<=value\nDenotes that the property being searched is less than or equal to a specific value.\n1\n>=\nproperty>=value\nDenotes that the property being searched is greater than or equal to a specific value.\n1\n..\nproperty:value1..value2\nDenotes that the property being searched is greater than or equal to value1 and less than or equal to value2.\n1\n\" \"\n\"fair value\"\nsubject:\"Quarterly Financials\"\nIn a keyword query (where you type the\nproperty:value\npair in the\nKeyword\nbox), use double quotation marks (\" \") to search for an exact phrase or term. However, if you use the\nSubject\nor\nSubject/Title\nsearch condition, don't add double quotation marks to the value because quotation marks are automatically added when using these search conditions. If you do add quotation marks to the value, two pairs of double quotations are added to the condition value, and the search query will return an error.\n*\ncat*\nsubject:set*\nPrefix searches (also called\nprefix matching\n) where a wildcard character ( * ) is placed at the end of a word in keywords or\nproperty:value\nqueries. In prefix searches, the search returns results with terms that contain the word followed by zero or more characters. For example,\ntitle:set*\nreturns documents that contain the word \"set\", \"setup\", and \"setting\" (and other words that start with \"set\") in the document title.\nNote:\nYou can use only prefix searches; for example,\ncat*\nor\nset*\n. Suffix searches (\n*cat\n), infix searches (\nc*t\n), and substring searches (\n*cat*\n) aren't supported.\nAlso, adding a period ( . ) to a prefix search changes the results that are returned. That's because a period is treated as a stop word. 
For example, searching for\ncat*\nand searching for\ncat.*\nwill return different results. We recommend not using a period in a prefix search.\n( )\n(fair OR free) AND (from:contoso.com)\n(IPO OR initial) AND (stock OR shares)\n(quarterly financials)\nParentheses group together Boolean phrases,\nproperty:value\nitems, and keywords. For example,\n(quarterly financials)\nreturns items that contain the words quarterly and financials.\nNote\n1\nUse this operator for properties that have date or numeric values.\n2\nBoolean search operators must be uppercase; for example,\nAND\n. If you use a lowercase operator, such as\nand\n, it is treated as a keyword in the search query.\nSearch conditions\nYou can add conditions to a search query to narrow a search and return a more refined set of results. Each condition adds a clause to the KeyQL search query that is created and run when you start the search.\nConditions for common properties\nConditions for mail properties\nConditions for document properties\nOperators used with conditions\nGuidelines for using conditions\nExamples of using conditions in search queries\nConditions for common properties\nCreate a condition using common properties when searching mailboxes and sites in the same search. The following table lists the available properties to use when adding a condition.\nCondition\nDescription\nDate\nFor email, the date a message was created or imported from a PST file. For documents, the date a document was last modified.\nIf you're searching for email messages for a specific time period, you should use the message\nReceived\nand\nSent\nconditions if you're unsure if the email messages might have been imported instead of natively created in Exchange.\nSender/Author\nFor email, the person who sent a message. For documents, the person cited in the author field from Office documents. You can type more than one name, separated by commas. 
Two or more values are logically connected by the\nOR\noperator.\n(\nSee Recipient Expansion\n)\nSize (in bytes)\nFor both email and documents, the size of the item (in bytes).\nSubject/Title\nFor email, the text in the subject line of a message. For documents, the title of the document. As previously explained, the Title property is metadata specified in Microsoft Office documents. You can type more than one subject/title value, separated by commas. Two or more values are logically connected by the\nOR\noperator.\nNote\n: Don't add double quotation marks to the values for this condition because quotation marks are automatically added when using this search condition. If you add quotation marks to the value, two pairs of double quotations are added to the condition value, and the search query will return an error.\nRetention label\nFor both email and documents, retention labels that can be automatically or manually applied to messages and documents. Retention labels can be used to declare records and help you manage the data lifecycle of content by enforcing retention and deletion rules specified by the label. You can type part of the retention label name and use a wildcard or type the complete label name. For more information about retention labels, see\nLearn about retention policies and retention labels\n.\nConditions for mail properties\nCreate a condition using mail properties when searching mailboxes or public folders in Exchange Online. The following table lists the email properties that you can use for a condition. These properties are a subset of the email properties that were previously described. These descriptions are repeated for your convenience.\nCondition\nDescription\nMessage kind\nThe message type to search. This is the same property as the Kind email property. 
Possible values:\ncontacts\ndocs\nemail\nexternaldata\nfax\nim\njournals\nmeetings\nmicrosoftteams\nnotes\nposts\nrssfeeds\ntasks\nvoicemail\nParticipants\nAll the people fields in an email message. These fields are From, To, Cc, and Bcc. (\nSee Recipient Expansion\n)\nType\nThe message class property for an email item. This is the same property as the ItemClass email property. It's also a multi-value condition. So to select multiple message classes, hold the\nCTRL\nkey and then select two or more message classes in the drop-down list that you want to add to the condition. Each message class that you select in the list is logically connected by the\nOR\noperator in the corresponding search query.\nFor a list of the message classes (and their corresponding message class ID) that are used by Exchange and that you can select in the\nMessage class\nlist, see\nItem Types and Message Classes\n.\nReceived\nThe date that an email message was received by a recipient. This is the same property as the Received email property.\nRecipients\nAll recipient fields in an email message. These fields are To, Cc, and Bcc. (\nSee Recipient Expansion\n)\nSent\nThe date that an email message was sent by the sender. This is the same property as the Sent email property.\nSubject\nThe text in the subject line of an email message.\nNote\n: Don't add double quotation marks to the values for this condition because quotation marks are automatically added when using this search condition. If you add quotation marks to the value, two pairs of double quotations are added to the condition value, and the search query will return an error.\nTo\nThe recipient of an email message in the To field.\nConditions for document properties\nCreate a condition using document properties when searching for documents on SharePoint and OneDrive for Business sites. The following table lists the document properties that you can use for a condition. 
These properties are a subset of the site properties that were previously described. These descriptions are repeated for your convenience.\nCondition\nDescription\nAuthor\nThe author field from Office documents, which persists if a document is copied. For example, if a user creates a document and then emails it to someone else who then uploads it to SharePoint, the document will still retain the original author.\nTitle\nThe title of the document. The Title property is metadata that's specified in Office documents. It's different from the file name of the document.\nCreated\nThe date that a document is created.\nLast modified\nThe date that a document was last changed.\nFile type\nThe extension of a file; for example, docx, one, pptx, or xlsx. This is the same property as the FileExtension site property.\nNote:\nIf you include a File type condition using the\nEquals\nor\nEquals any of\noperator in a search query, you can't use a prefix search (by including the wildcard character ( * ) at the end of the file type) to return all versions of a file type. If you do, the wildcard is ignored. For example, if you include the condition\nEquals any of doc*\n, only files with an extension of\n.doc\nare returned. Files with an extension of\n.docx\nwon't be returned. To return all versions of a file type, use the\nproperty:value\npair in a keyword query; for example,\nfiletype:doc*\n.\nOperators used with conditions\nWhen you add a condition, you can select an operator that is relevant to the type of property for the condition. The following table describes the operators that are used with conditions and lists the equivalent that is used in the search query.\nOperator\nQuery equivalent\nDescription\nAfter\nproperty>date\nUsed with date conditions. 
Returns items that were sent, received, or modified after the specified date.\nBefore\nproperty<date\nUsed with date conditions. Returns items that were sent, received, or modified before the specified date.\nGreater\nsize>value\nReturns items where the specified property is greater than the specified value.\n1\nGreater or equal\nsize>=value\nReturns items where the specified property is greater than or equal to the specified value.\n1\nLess\nsize<value\nReturns items where the specified property is less than the specified value.\n1\nLess or equal\nsize<=value\nReturns items where the specified property is less than or equal to the specified value.\n1\nNot equal\nsize<>value\nReturns items that don't equal the specified size.\n1\nNote\n1\nThis operator is available only for conditions that use the Size property.\nGuidelines for using conditions\nKeep the following in mind when using search conditions.\nA condition is logically connected to the keyword query (specified in the keyword box) by the\nAND\noperator. That means that items have to satisfy both the keyword query and the condition to be included in the results. This is how conditions help to narrow your results.\nIf you add two or more unique conditions to a search query (conditions that specify different properties), those conditions are logically connected by the\nAND\noperator. That means only items that satisfy all the conditions (in addition to any keyword query) are returned.\nIf you add more than one condition for the same property, those conditions are logically connected by the\nOR\noperator. That means items that satisfy the keyword query and any one of the conditions are returned. So, groups of the same conditions are connected to each other by the\nOR\noperator and then sets of unique conditions are connected by the\nAND\noperator.\nIf you add multiple values (separated by commas or semi-colons) to a single condition, those values are connected by the\nOR\noperator. That means items are returned if they contain any of the specified values for the property in the condition.\nAny condition that uses an operator with\nContains\nand\nEquals\nlogic will return similar search results for simple string searches. A simple string search is a string in the condition that doesn't include a wildcard. 
For example, a condition that uses\nEquals any of\nwill return the same items as a condition that uses\nContains any of\n.\nThe search query that is created by using the keywords box and conditions is displayed on the\nSearch\npage, in the details pane for the selected search. In a query, everything to the right of the notation\n(c:c)\nindicates conditions that are added to the query.\n(c:c)\nshouldn't be used in manually entered queries and isn't equal to\nAND\nor\nOR\n.\nConditions only add properties to the search query; they don't add operators. This is why the query displayed in the detail pane doesn't show operators to the right of the\n(c:c)\nnotation. KeyQL adds the logical operators (according to the previously explained rules) when executing the query.\nYou can use the drag and drop control to resequence the order of conditions. Select the control for a condition and move it up or down.\nAs previously explained, some condition properties allow you to type multiple values (separated by semi-colons). Each value is logically connected by the\nOR\noperator, and results in the query\n(filetype=docx) OR (filetype=pptx) OR (filetype=xlsx)\n. The following illustration shows an example of a condition with multiple values.\nNote\nYou can't add multiple conditions (by selecting\nAdd condition\nfor the same property). 
Instead, you have to provide multiple values for the condition (separated by semi-colons), as shown in the previous example.\nExamples of using conditions in search queries\nThe following examples show the GUI-based version of a search query with conditions, the search query syntax that is displayed in the details pane of the selected search (which is also returned by the\nGet-ComplianceSearch\ncmdlet), and the logic of the corresponding KeyQL query.\nExample 1\nThis example returns email items or documents that contain the keyword \"report\", that were sent or created before April 1, 2021, and that contain the word \"northwind\" in the subject field of email messages or in the title property of documents. The query excludes Web pages that meet the other search criteria.\nGUI\n:\nSearch query syntax\n:\nreport(c:c)(date<2021-04-01)(subjecttitle:\"northwind\")(-filetype:aspx)\nSearch query logic\n:\nreport AND (date<2021-04-01) AND (subjecttitle:\"northwind\") NOT (filetype:aspx)\nExample 2\nThis example returns email messages or calendar meetings that were sent between December 1, 2019 and November 30, 2020 and that contain words that start with \"phone\" or \"smartphone\".\nGUI\n:\nSearch query syntax\n:\nphone* OR smartphone*(c:c)(sent=2019-12-01..2020-11-30)(kind=\"email\")(kind=\"meetings\")\nSearch query logic\n:\nphone* OR smartphone* AND (sent=2019-12-01..2020-11-30) AND ((kind=\"email\") OR (kind=\"meetings\"))\nSpecial characters\nSome special characters aren't included in the search index and therefore aren't searchable. This also includes the special characters that represent search operators in the search query. Here's a list of special characters that are either replaced by a blank space in the actual search query or cause a search error.\n+ - = : ! @ # % ^ & ; _ / ? 
( ) [ ] { }\nSearchable sensitive data types\nYou can use eDiscovery search tools in the Microsoft Purview portal to search for sensitive data, such as credit card numbers or social security numbers, that is stored in documents on SharePoint and OneDrive for Business sites. You can do this by using the\nSensitiveType\nproperty and the name (or ID) of a sensitive information type in a keyword query. For example, the query\nSensitiveType:\"Credit Card Number\"\nreturns documents that contain a credit card number. The query\nSensitiveType:\"U.S. Social Security Number (SSN)\"\nreturns documents that contain a U.S. social security number.\nTo see a list of the sensitive information types that you can search for, go to\nData classifications\n>\nSensitive info types\nin the Microsoft Purview portal. Or you can use the\nGet-DlpSensitiveInformationType\ncmdlet in Security & Compliance PowerShell to display a list of sensitive information types.\nLimitations for searching sensitive data types\nTo search for custom sensitive information types, you have to specify the ID of the sensitive information type in the\nSensitiveType\nproperty. Using the name of a custom sensitive information type (as shown in the example for built-in sensitive information types in the previous section) will return no results. Use the\nPublisher\ncolumn on the\nSensitive info types\npage in the Microsoft Purview portal (or the\nPublisher\nproperty in PowerShell) to differentiate between built-in and custom sensitive information types. 
Built-in sensitive data types have a value of\nMicrosoft Corporation\nfor the\nPublisher\nproperty.\nTo display the name and ID for the custom sensitive data types in your organization, run the following command in Security & Compliance PowerShell:\nGet-DlpSensitiveInformationType | Where-Object {$_.Publisher -ne \"Microsoft Corporation\"} | FT Name,Id\nThen you can use the ID in the\nSensitiveType\nsearch property to return documents that contain the custom sensitive data type; for example,\nSensitiveType:7e13277e-6b04-3b68-94ed-1aeb9d47de37\nYou can't use sensitive information types and the\nSensitiveType\nsearch property to search for sensitive data at-rest in Exchange Online mailboxes. This includes 1:1 chat messages, 1:N group chat messages, and team channel conversations in Microsoft Teams because all of this content is stored in mailboxes. However, you can use data loss prevention (DLP) policies to protect sensitive email data in transit. For more information, see\nLearn about data loss prevention\nand\nSearch for and find personal data\n.\nSearching for site content shared with external users\nYou can also use eDiscovery search tools in the Microsoft Purview portal to search for documents stored on SharePoint and OneDrive for Business sites that have been shared with people outside of your organization. This can help you identify sensitive or proprietary information that's being shared outside your organization. You can do this by using the\nViewableByExternalUsers\nproperty in a keyword query. 
This property returns documents or sites that have been shared with external users by using one of the following sharing methods:\nA sharing invitation that requires users to sign in to your organization as an authenticated user.\nAn anonymous guest link, which allows anyone with this link to access the resource without having to be authenticated.\nHere are some examples:\nThe query\nViewableByExternalUsers:true AND SensitiveType:\"Credit Card Number\"\nreturns all items that have been shared with people outside your organization and contain a credit card number.\nThe query\nViewableByExternalUsers:true AND ContentType:document AND site:\"https://contoso.sharepoint.com/Sites/Teams\"\nreturns a list of documents on all team sites in the organization that have been shared with external users.\nTip\nA search query such as\nViewableByExternalUsers:true AND ContentType:document\nmight return numerous .aspx files in the search results. To eliminate these (or other types of files), you can use the\nFileExtension\nproperty to exclude specific file types; for example\nViewableByExternalUsers:true AND ContentType:document NOT FileExtension:aspx\n.\nWhat is considered content that is shared with people outside your organization? Documents in your organization's SharePoint and OneDrive for Business sites that are shared by sending a sharing invitation or that are shared in public locations. For example, the following user activities result in content that is viewable by external users:\nA user shares a file or folder with a person outside your organization.\nA user creates and sends a link to a shared file to a person outside your organization. 
This link allows the external user to view (or edit) the file.\nA user sends a sharing invitation or a guest link to a person outside your organization to view (or edit) a shared file.\nIssues using the ViewableByExternalUsers property\nWhile the\nViewableByExternalUsers\nproperty represents the status of whether a document or site is shared with external users, there are some caveats to what this property does and doesn't reflect. In the following scenarios, the value of the\nViewableByExternalUsers\nproperty won't be updated, and the results of a search query that uses this property might be inaccurate.\nChanges to sharing policy, such as turning off external sharing for a site or for the organization. The property will still show previously shared documents as being externally accessible even though external access might have been revoked.\nChanges to group membership, such as adding or removing external users to Microsoft 365 Groups or Microsoft 365 security groups. The property won't automatically be updated for items the group has access to.\nSending sharing invitations to external users where the recipient hasn't accepted the invitation, and therefore doesn't yet have access to the content.\nIn these scenarios, the\nViewableByExternalUsers\nproperty won't reflect the current sharing status until the site or document library is recrawled and reindexed.\nSearching for site content shared within your organization\nAs previously explained, you can use the\nSharedWithUsersOWSUser\nproperty to search for documents that have been shared between people in your organization. When a person shares a file (or folder) with another user inside your organization, a link to the shared file appears on the\nShared with me\npage in the OneDrive for Business account of the person who the file was shared with. For example, to search for the documents that have been shared with Sara Davis, you can use the query\nSharedWithUsersOWSUser:\"sarad@contoso.com\"\n. 
If you export the results of this search, the original documents (located in the content location of the person who shared the documents with Sara) are downloaded.\nDocuments must be explicitly shared with a specific user to be returned in search results when using the\nSharedWithUsersOWSUser\nproperty. For example, when a person shares a document in their OneDrive account, they have the option to share it with anyone (inside or outside the organization), share it only with people inside the organization, or share it with a specific person. Here's a screenshot of the\nShare\nwindow in OneDrive that shows the three sharing options.\nOnly documents that are shared by using the third option (shared with\nSpecific people\n) are returned by a search query that uses the\nSharedWithUsersOWSUser\nproperty.\nSearching for Skype for Business conversations\nYou can use the following keyword query to specifically search for content in Skype for Business conversations:\nkind:im\nThe previous search query also returns chats from Microsoft Teams. 
To prevent this, you can narrow the search results to include only Skype for Business conversations by using the following keyword query:\nkind:im AND subject:conversation\nThe previous keyword query excludes chats in Microsoft Teams because Skype for Business conversations are saved as email messages with a Subject line that starts with the word \"Conversation\".\nTo search for Skype for Business conversations that occurred within a specific date range, use the following keyword query:\nkind:im AND subject:conversation AND (received=startdate..enddate)\nCharacter limits for searches\nThere's a 4,000 character limit for search queries when searching for content in SharePoint sites and OneDrive accounts.\nHere's how the total number of characters in the search query is calculated:\nThe characters in the keyword search query (including both user and filter fields) count against this limit.\nThe characters in any location property (such as the URLs for all the SharePoint sites or OneDrive locations being searched) count against this limit.\nThe characters in all the search permissions filters that are applied to the user running the search count against the limit.\nNote\nThe 4,000 character limit applies to Content search, eDiscovery (Standard), and eDiscovery (Premium).", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "KeyQL Reference", @@ -917,7 +917,7 @@ "https://learn.microsoft.com/en-us/purview/ediscovery-create-holds": { "content_hash": "sha256:92f130696c77a7e577ea65dce06422b3a465b584441aa3239cc475ebc9864749", "normalized_content": "Create eDiscovery holds in an eDiscovery case\nCaution\nMicrosoft retired all classic eDiscovery experiences on August 31, 2025. This retirement includes classic\nContent Search\n, classic\neDiscovery (Standard)\n, and classic\neDiscovery (Premium)\n.\nThe guidance in this article only applies to organizations hosted in Microsoft 365 operated by 21Vianet (China). If your organization isn't hosted by 21Vianet, use the guidance for the\nnew eDiscovery experience\nin the\nMicrosoft Purview portal\n.\nYou can use a Microsoft Purview eDiscovery (Premium) or (Standard) case to create holds to preserve content that might be relevant to the case. You can place a hold on the Exchange mailboxes and OneDrive for Business accounts of people you're investigating in the case. You can also place a hold on the mailboxes and sites that are associated with Microsoft Teams, Microsoft 365 groups, and Viva Engage Groups. When you place content locations on hold, content is preserved until you remove the content location from the hold or until you delete the hold.\nImportant\nFor long term data retention not related to eDiscovery investigations, it is strongly advised to use retention policies and retention labels. For more information, see\nLearn about retention policies and retention labels\n.\nAfter you create an eDiscovery hold, it may take up to 24 hours for the hold to take effect.\nWhen you create a hold, you have the following options to scope the content that's preserved in the specified content locations:\nCreate an infinite hold where all content in the specified locations is placed on hold. 
Alternatively, you can create a query-based hold where only the content in the specified locations that matches a search query is placed on hold.\nSpecify a date range to preserve only the content that was sent, received, or created within that date range. Alternatively, you can hold all content in specified locations regardless of when sent, received, or created.\nHow to create an eDiscovery hold\nTo create an eDiscovery hold that's associated with an eDiscovery (Premium) or (Standard) case:\nGo to\nMicrosoft Purview portal\nand sign in using the credentials for a user account with the appropriate eDiscovery permissions.\nIn the left navigation pane, select\nShow all\n, and then select\neDiscovery > Premium\nor\neDiscovery > Standard\n.\nOn the\neDiscovery > Premium\nor\neDiscovery (Standard)\npage, select the name of the case that you want to create the hold in.\nOn the\nHome\npage for the case, select the\nHold\ntab.\nOn the\nHold\npage, select\nCreate\n.\nOn the\nName your hold\nworkflow page, give the hold a name and add an optional description, and then select\nNext\n. The name of the hold must be unique in your organization.\nOn the\nChoose locations\nworkflow page, choose the content locations that you want to place on hold. You can place mailboxes, sites, and public folders on hold.\nExchange mailboxes\n: Set the toggle to\nOn\nand then select\nChoose users, groups, or teams\nto specify the mailboxes to place on hold. Use the search box to find user mailboxes and distribution groups (to place a hold on the mailboxes of group members) to place on hold. You can also place a hold on the associated mailbox for a Microsoft Team, Microsoft 365 group, and Viva Engage Group. 
For more information about the application data that is preserved when a mailbox is placed on hold, see\nContent stored in mailboxes for eDiscovery\n.\nImportant\nWhen you select a distribution list to be placed on hold, the hold is placed on each of the member mailboxes in the distribution list when the policy is created. Subsequent changes in the distribution list do not change or update the holds or the policy.\nSharePoint sites\n: Set the toggle to\nOn\nand then select\nChoose sites\nto specify SharePoint sites and OneDrive accounts to place on hold. Type the URL for each site that you want to place on hold. You can also add the URL for the SharePoint site for a Microsoft Team, Microsoft 365 group, or a Viva Engage Group.\nImportant\nTo create a hold for a subsite related to a SharePoint Online site, you must use the\nPath\nproperty in a query filter to select a specific subsite.\nExchange public folders\n: Set the toggle to\nOn\nto put all public folders in your Exchange Online organization on hold. You can't choose specific public folders to put on hold. Leave the toggle switch off if you don't want to put a hold on public folders.\nImportant\nWhen adding Exchange mailboxes or SharePoint sites to a hold, you must explicitly add at least one content location to the hold. In other words, if you set the toggle to\nOn\nfor mailboxes or sites, you must select specific mailboxes or sites to add to the hold. Otherwise, the eDiscovery hold will be created but no mailboxes or sites will be added to the hold.\nWhen finished adding locations to the hold, select\nNext\n.\nTo create a query-based hold using keywords or conditions, complete the following steps. To preserve all content in the specified content locations, select\nNext\n.\nIn the box under\nKeywords\n, type a query to preserve only the content that matches the query criteria. You can specify keywords, email message properties, or site properties, such as file names. 
You can also use more complex queries that use a Boolean operator, such as\nAND\n,\nOR\n, or\nNOT\n.\nSelect\nAdd condition\nto add one or more conditions to narrow the query for the hold. Each condition adds a clause to the KeyQL search query that is created and run when you create the hold. For example, you can specify a date range so that email or site documents that were created within the date range are preserved. A condition is logically connected to the keyword query (specified in the\nKeywords\nbox) and other conditions by the\nAND\noperator. That means items have to satisfy both the keyword query and the condition to be preserved.\nFor more information about creating a search query and using conditions, see\nKeyword queries and search conditions for eDiscovery\n.\nAfter configuring a query-based hold, select\nNext\n.\nReview your settings (and edit them if necessary), and then select\nSubmit\n.\nAfter creating a hold, check that the hold is applied successfully by navigating to the\nHold\ntab in the case and selecting the hold policy. For more information about troubleshooting holds with errors, see\nResolve eDiscovery hold errors\n.\nNote\nWhen you create a query-based hold, all content from selected locations is initially placed on hold. Subsequently, any content that doesn't match the specified query is cleared from the hold every seven to 14 days. However, a query-based hold won't clear content if more than five holds of any type are applied to a content location, or if any item has indexing issues.\nQuery-based holds placed on sites\nKeep the following things in mind when you place a query-based eDiscovery hold on documents located in SharePoint sites:\nA query-based hold initially preserves all documents in a site for a short period of time after they're deleted. That means when a document is deleted, it is moved to the Preservation Hold library even if it doesn't match the criteria of the query-based hold. 
However, deleted documents that don't match a query-based hold will be removed by a timer job that processes the Preservation Hold library. The timer job runs periodically and compares all documents in the Preservation Hold library to your query-based eDiscovery holds (and other types of holds and retention policies). The timer job deletes the documents that don't match a query-based hold and preserves the documents that do.\nQuery-based holds shouldn't be used to perform targeted preservation, like preserving documents in a specific folder or site or by using other location-based hold criteria. Doing so may have unintended results. We recommend using non-location based hold criteria such as keywords, date ranges, or other document properties to preserve site documents.\nSearch locations on eDiscovery hold\nWhen you\nsearch for content\nin an eDiscovery (Standard) case, you can quickly configure the search to only search the content locations that have been placed on a hold associated with the case.\nSelect the\nLocations on hold\noption to search all the content locations that have been placed on hold. If the case contains multiple eDiscovery holds, the content locations from all holds are searched when you select this option. Additionally, if a content location was placed on a query-based hold, only the items that match the hold query are searched when you run the search. In other words, only the content that matches both the hold criteria and the search criteria is returned with the search results. For example, if a user was placed on a query-based case hold that preserves items that were sent or created before a specific date, only those items would be searched. 
This is accomplished by connecting the case hold query and the search query by an\nAND\noperator.\nHere are some other things to keep in mind when searching locations on eDiscovery hold:\nIf a content location is part of multiple holds within the same case, the hold queries are combined by\nOR\noperators when you search that content location using the all case content option. Similarly, if a content location is part of two different holds, where one is query-based and the other is an infinite hold (where all content is placed on hold), then all content is searched because of the infinite hold.\nIf a search is configured to search locations on hold and then you change an eDiscovery hold in the case (by adding or removing a location or changing a hold query), the search configuration is updated with those changes. However, you have to rerun the search after the hold is changed to update the search results.\nIf multiple eDiscovery holds are placed on a single location in an eDiscovery case and you select to search locations on hold, the maximum number of keywords for that search query is 500. That's because the search combines all the query-based holds by using the\nOR\noperator. If there are more than 500 keywords in the combined hold queries and the search query, then all content in the mailbox is searched, not just the content that matches the query-based case holds.\nIf an eDiscovery hold has a status of\nOn (Pending)\n, you can still search the locations on hold while the hold is being turned on.\nPreserve content in Microsoft Teams\nConversations that are part of a Microsoft Teams channel are stored in the mailbox that's associated with the Microsoft Team. Similarly, files that team members share in a channel are stored on the team's SharePoint site. 
Therefore, you have to place the Team mailbox and SharePoint site on eDiscovery hold to preserve conversations and files in a channel.\nAlternatively, conversations that are part of the Chat list in Teams (called\n1:1 chats\nor\n1:N group chats\n) are stored in the mailboxes of the users who participate in the chat. And files that users share in chat conversations are stored in the OneDrive account of the user who shares the file. Therefore, you have to add the individual user mailboxes and OneDrive accounts to an eDiscovery hold to preserve conversations and files in the chat list. It's a good idea to place a hold on the mailboxes of members of a Microsoft Team in addition to placing the team mailbox and site on hold.\nNote\nIf your organization has an Exchange hybrid deployment (or your organization synchronizes an on-premises Exchange organization with Office 365) and has enabled Microsoft Teams, on-premises users can use the Teams chat application and participate in 1:1 chats and 1:N group chats. These conversations are stored in cloud-based storage that's associated with an on-premises user. If an on-premises user is placed on an eDiscovery hold, the Teams chat content in the cloud-based storage will be preserved. For more information, see\nSearch for Teams chat data for on-premises users\n.\nFor more information about preserving Teams content, see\nPlace a Microsoft Teams user or team on legal hold\n.\nPreserve card content\nSimilarly, card content generated by apps in Teams channels, 1:1 chats, and 1:N group chats is stored in mailboxes and is preserved when a mailbox is placed on an eDiscovery hold. A\ncard\nis a UI container for short pieces of content. Cards can have multiple properties and attachments, and can include items that trigger card actions. For more information, see\nCards\n. Like other Teams content, where card content is stored is based on where the card was used. 
Content for cards used in a Teams channel is stored in the Teams group mailbox. Card content for 1:1 and 1:N chats is stored in the mailboxes of the chat participants.\nPreserve meeting and call information\nSummary information for meetings and calls in a Teams channel is also stored in the mailboxes of users who dialed into the meeting or call. This content is also preserved when an eDiscovery hold is placed on user mailboxes.\nPreserve content in private channels\nStarting in February 2020, we also turned on the ability to preserve content in private channels. Because private channel chats are stored in the mailboxes of the chat participants, placing a user mailbox on eDiscovery hold preserves private channel chats. Also, if a user mailbox was placed on an eDiscovery hold prior to February 2020, the hold will now automatically apply to private channel messages stored in that mailbox. Preserving files shared in private channels is also supported.\nPreserve wiki content\nEvery Team or team channel also contains a Wiki for note taking and collaboration. The Wiki content is automatically saved to a file with a .mht format. This file is stored in the Teams Wiki Data document library on the team's SharePoint site. You can preserve the wiki content by adding the team's SharePoint site to an eDiscovery hold.\nNote\nThe capability to preserve Wiki content for a Team or team channel (when you place the team's SharePoint site on hold) was released on June 22, 2017. If a team site is on hold, the Wiki content will be retained starting on that date. However, if a team site is on hold and the Wiki content was deleted before June 22, 2017, the Wiki content was not preserved.\nMicrosoft 365 groups\nTeams is built on Microsoft 365 groups. 
Therefore, placing Microsoft 365 groups on eDiscovery hold is similar to placing Teams content on hold.\nKeep the following things in mind when placing both Teams and Microsoft 365 groups on an eDiscovery hold:\nAs previously explained, to place content located in Teams and Microsoft 365 groups on hold, you have to specify the mailbox and SharePoint site that's associated with a group or team.\nRun the\nGet-UnifiedGroup\ncmdlet in\nExchange Online PowerShell\nto view properties for Teams and Microsoft 365 groups. This is a good way to get the URL for the site that's associated with a Team or Microsoft 365 group. For example, the following command displays selected properties for a Microsoft 365 group named Senior Leadership Team:\nGet-UnifiedGroup \"Senior Leadership Team\" | FL DisplayName,Alias,PrimarySmtpAddress,SharePointSiteUrl\n\nDisplayName : Senior Leadership Team\nAlias : seniorleadershipteam\nPrimarySmtpAddress : seniorleadershipteam@contoso.onmicrosoft.com\nSharePointSiteUrl : https://contoso.sharepoint.com/sites/seniorleadershipteam\nNote\nTo run the\nGet-UnifiedGroup\ncmdlet, you have to be assigned the View-Only Recipients role in Exchange Online or be a member of a role group that's assigned the View-Only Recipients role.\nWhen a user's mailbox is searched, any Team or Microsoft 365 group that the user is a member of won't be searched. Similarly, when you place a Team or Microsoft 365 group on eDiscovery hold, only the group mailbox and group site are placed on hold. The mailboxes and OneDrive for Business sites of group members aren't placed on hold unless you explicitly add them to the eDiscovery hold. So if you have to place a Team or Microsoft 365 group on hold for a legal reason, consider adding the mailboxes and OneDrive accounts of team or group members on the same hold.\nTo get a list of the members of a Team or Microsoft 365 group, you can view the properties on the\nGroups\npage in the Microsoft 365 admin center. 
Alternatively, you can run the following command in Exchange Online PowerShell:\nGet-UnifiedGroupLinks -Identity <group name> -LinkType Members | FL DisplayName,PrimarySmtpAddress\nNote\nTo run the\nGet-UnifiedGroupLinks\ncmdlet, you have to be assigned the View-Only Recipients role in Exchange Online or be a member of a role group that's assigned the View-Only Recipients role.\nPreserve content in OneDrive accounts\nImportant\nThe retention period for deleted OneDrive accounts is different from the retention period for mailboxes. For more information about the retention period for deleted OneDrive accounts, see\nOneDrive retention and deletion\n.\nTo collect a list of the URLs for the OneDrive for Business sites in your organization so you can add them to a hold or search associated with an eDiscovery case, see\nCreate a list of all OneDrive locations in your organization\n. The script in this article creates a text file that contains a list of all OneDrive sites in your organization. To run this script, you have to install and use the SharePoint Online Management Shell. Be sure to append the URL for your organization's MySite domain to each OneDrive site that you want to search. This is the domain that contains all your OneDrive sites; for example,\nhttps://contoso-my.sharepoint.com\n. Here's an example of a URL for a user's OneDrive site:\nhttps://contoso-my.sharepoint.com/personal/sarad_contoso_onmicrosoft.com\n.\nImportant\nThe URL for a user's OneDrive account includes their user principal name (UPN) (for example,\nhttps://alpinehouse-my.sharepoint.com/personal/sarad_alpinehouse_onmicrosoft_com\n). In the rare case that a person's UPN is changed, their OneDrive URL will also change to incorporate the new UPN. If a user's OneDrive account is part of an eDiscovery hold, and their UPN is changed, you need to update the hold by adding the user's new OneDrive URL and removing the old one. 
If the URL for the OneDrive site changes, previously placed holds on the site remain effective and content is preserved. For more information, see\nHow UPN changes affect the OneDrive URL\n.\nRemoving content locations from an eDiscovery hold\nAfter a mailbox, SharePoint site, or OneDrive account is removed from an eDiscovery hold, a\ndelay hold\nis applied. This means that the actual removal of the hold is delayed for 30 days to prevent data from being permanently deleted (purged) from a content location. This gives admins an opportunity to search for or recover content that will be purged after an eDiscovery hold is removed. The details of how the delay hold works for mailboxes and sites are different.\nMailboxes:\nA delay hold is placed on a mailbox the next time the Managed Folder Assistant processes the mailbox and detects that an eDiscovery hold was removed. Specifically, a delay hold is applied to a mailbox when the Managed Folder Assistant sets one of the following mailbox properties to\nTrue\n:\nDelayHoldApplied:\nThis property applies to email-related content (generated by people using Outlook and Outlook on the web) that's stored in a user's mailbox.\nDelayReleaseHoldApplied:\nThis property applies to cloud-based content (generated by non-Outlook apps such as Microsoft Teams, Microsoft Forms, and Microsoft Yammer) that's stored in a user's mailbox. Cloud data generated by a Microsoft app is typically stored in a hidden folder in a user's mailbox.\nWhen a delay hold is placed on the mailbox (when either of the previous properties is set to\nTrue\n), the mailbox is still considered to be on hold for an unlimited hold duration, as if the mailbox was on Litigation Hold. After 30 days, the delay hold expires, and Microsoft 365 will automatically attempt to remove the delay hold (by setting the DelayHoldApplied or DelayReleaseHoldApplied property to\nFalse\n) so that the hold is removed. 
After either of these properties is set to\nFalse\n, the corresponding items that are marked for removal are purged the next time the mailbox is processed by the Managed Folder Assistant.\nFor more information, see\nManaging mailboxes on delay hold\n.\nSharePoint and OneDrive sites:\nAny SharePoint or OneDrive content that's being retained in the Preservation Hold library isn't deleted during the 30-day delay hold period after a site is removed from an eDiscovery hold. This is similar to what happens when a site is released from a retention policy. Additionally, you can't manually delete this content in the Preservation Hold library during the 30-day delay hold period. To release a site from the 30-day delay hold/grace period hold, see the\nCan't delete a site because of an invalid retention policy or eDiscovery hold\ntroubleshooting article.\nFor more information, see\nReleasing a policy for retention\n.\nA delay hold is also applied to content locations on hold when you close an eDiscovery (Standard) case because holds are turned off when a case is closed. For more information about closing a case, see\nClose, reopen, and delete an eDiscovery (Standard) case\n.\neDiscovery hold limits\nThe following table lists the limits for eDiscovery cases and case holds.\nDescription of limit\nLimit\nMaximum number of cases for an organization.\nNo limit\nMaximum number of eDiscovery hold policies for an organization. This limit includes the combined total of hold policies in eDiscovery (Standard) and eDiscovery (Premium) cases.\n10,000\n1\nMaximum number of mailboxes in a single eDiscovery hold. This limit includes the combined total of user mailboxes, and the mailboxes associated with Microsoft 365 groups, Microsoft Teams, and Viva Engage Groups.\n1,000\nMaximum number of sites in a single eDiscovery hold. 
This limit includes the combined total of OneDrive for Business sites, SharePoint sites, and the sites associated with Microsoft 365 groups, Microsoft Teams, and Viva Engage Groups.
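The mailbox delay-hold mechanics described above (the `DelayHoldApplied` and `DelayReleaseHoldApplied` properties and the 30-day grace period) can be sketched as a minimal model. The `Mailbox` class and `release_ediscovery_hold` helper are hypothetical illustrations, not Exchange Online APIs:

```python
from dataclasses import dataclass

@dataclass
class Mailbox:
    # Properties set to True by the Managed Folder Assistant when it detects
    # that an eDiscovery hold was removed (per the description above).
    delay_hold_applied: bool = False          # Outlook-generated email content
    delay_release_hold_applied: bool = False  # cloud content (Teams, Forms, ...)
    ediscovery_holds: int = 0

def is_on_hold(mbx: Mailbox) -> bool:
    """The mailbox is still treated as on hold while any eDiscovery hold
    remains, or while either delay-hold property is True (the ~30-day
    grace period before Microsoft 365 clears the properties)."""
    return (mbx.ediscovery_holds > 0
            or mbx.delay_hold_applied
            or mbx.delay_release_hold_applied)

def release_ediscovery_hold(mbx: Mailbox) -> None:
    """Hypothetical helper: removing the last hold triggers the delay hold
    on the next Managed Folder Assistant pass."""
    mbx.ediscovery_holds = max(0, mbx.ediscovery_holds - 1)
    if mbx.ediscovery_holds == 0:
        mbx.delay_hold_applied = True
        mbx.delay_release_hold_applied = True
```

Only after both properties revert to False are the items marked for removal actually purged on the next Managed Folder Assistant pass.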
\nEndpoint settings\n).\nSupported\nSupported\nSupported\nAuditable and restrictable\nCopy or move using RDP\nDetects when a user attempts to copy an item to a remote desktop session.\nSupported\nSupported\nNot supported\nAuditable and restrictable\nCreate an item\nDetects the creation of an item.\nSupported\nSupported\nSupported\nAuditable\nRename an item\nDetects the renaming of an item.\nSupported\nSupported\nSupported\nAuditable\nAccess by restricted apps\nDetects when an application that is on the restricted apps list (as defined in\nrestricted apps and app groups\n) attempts to access protected files on an endpoint device.\nSupported\nSupported\nSupported\nCreate Windows Recall snapshots (in preview)\nDetects when there is an item or Teams message that contains a sensitive information type or has a sensitivity label applied that would be included in a\nWindows Recall\nsnapshot.\nSupported\nNot supported\nNot supported\nAuditable and restrictable\nCopy to clipboard behavior\nWhen you configure a rule to\nBlock\nor\nBlock with override\nwhen a user attempts the Copy to clipboard activity on content from a file that matches the policy, end users see this behavior with these configurations:\nWord file 123 contains sensitive information that matches the copy to clipboard Block rule.\nExcel file 123 contains sensitive information that matches the copy to clipboard Block rule.\nPowerPoint file 123 contains sensitive information that matches the copy to clipboard Block rule.\nWord file 789 doesn't contain sensitive information.\nExcel file 789 doesn't contain sensitive information.\nPowerPoint file 789 doesn't contain sensitive information.\nNotepad (or any non Microsoft Office based app or process) file XYZ contains sensitive information that matches the copy to clipboard Block rule.\nNotepad (or any non Microsoft Office based app or process) file ABC doesn't contain sensitive information.\nSource\nDestination\nBehavior\nWord file 123/Excel file 123/PowerPoint file 
123\nWord file 123/Excel file 123/PowerPoint file 123\ncopy and paste are allowed, in other words intra file copy and paste is allowed.\nWord file 123/Excel File 123/PowerPoint file 123\nWord file 789/Excel file 789/PowerPoint file 789\ncopy and paste are blocked, in other words inter file copy and paste is blocked.\nWord file 789/Excel file 789/PowerPoint file 789\nWord file 123/Excel File 123/PowerPoint file 123\ncopy and paste are allowed\nWord file 123/Excel file 123/PowerPoint file 123\nNotepad file ABC\ncopy and paste are blocked\nNotepad file XYZ\nany\ncopy is blocked\nNotepad file ABC\nany\ncopy and paste are allowed\nNote\nWhen a DLP rule that blocks copying is applied to an open file, copying from any other file within the same application (even files with no DLP rules applied) is restricted while the DLP blocked file is open.\nBest practice for endpoint DLP policies\nSay you want to block all items that contain credit card numbers from leaving endpoints of Finance department users. We recommend:\nCreate a policy and scope it to endpoints and to that group of users.\nCreate a rule in the policy that detects the type of information that you want to protect. In this case, set\ncontent contains\nto\nSensitive information type\n*, and select\nCredit Card\n.\nSet the actions for each activity to\nBlock\n.\nFor more information on designing your DLP policies, see\nDesign a data loss prevention policy\n.\nNote\nIn Microsoft Purview, DLP policy evaluation of sensitive items occurs centrally, so there's no time lag for policies and policy updates to be distributed to individual devices. When a policy is updated in Microsoft Purview portal, it generally takes about an hour for those updates to be synchronized across the service. Once policy updates are synchronized, items on targeted devices are automatically reevaluated the next time they're accessed or modified. 
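The copy-to-clipboard decision table above can be condensed into a small predicate. This is a simplified model of the table's rows, assuming the rule only looks at whether the *source* matches the Block rule, whether the source is an Office file, and whether source and destination are the same file; `clipboard_allowed` is an illustrative name, not part of any DLP API:

```python
def clipboard_allowed(src_sensitive: bool, src_is_office: bool,
                      same_file: bool) -> bool:
    """Sketch of the copy-to-clipboard Block rule matrix.

    - Non-sensitive sources (files 789 / ABC) are never blocked.
    - A sensitive Office source (file 123) allows intra-file copy/paste only.
    - A sensitive non-Office source (Notepad file XYZ) is blocked to any
      destination.
    """
    if not src_sensitive:
        return True          # rows for files 789 and Notepad file ABC
    if src_is_office and same_file:
        return True          # intra-file copy within file 123 is allowed
    return False             # inter-file copy, or non-Office source XYZ
```

Note that this sketch ignores the open-file side effect described in the table's note (copying from *other* files in the same application is also restricted while a DLP-blocked file is open).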
In preview, you can also use\non-demand classification\nto identify and classify sensitive content in historical data stored in SharePoint and OneDrive. For Authorized Groups changes, the policy needs 24 hours to sync.\nMonitored files\nFiles monitored via policy\nEndpoint DLP monitors these file types via policy in Windows 10 and 11:\nFile Type\nFormat\nMonitored file extensions\nWord processing\nWord\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb, .obd, .obt\nSpreadsheet\nExcel, CSV, TSV\n.xls, .xlsx, .xlt, .xlm, .xlsm, .xlsb, .xlb, .xlc, .csv, .tsv\nPresentation\nPowerPoint\n.ppt, .pptx, .pptm, .pps, .ppsx, .ppsm, .pot, .potx, .potm, .ppam, .pos\nArchive\nZip, RAR, 7z, TAR\n.zip, .rar, .7z, .tar, .gz, .arj, .bz2, .cab, .chm, .lzh, .lzma, .mhtml, .xar, .xz\nAdobe PDF\nPDF\n.pdf\nText\nText\n.asm, .bat, .c, .cmd, .cpp, .cs, .csv, .cxx, .def, .dic, .h, .hpp, .hxx, .idl, .inc, .inx, .java, .m3u, .mpx, .php, .pl, .pos, .txt, .vcf, .vcs\nHTML\nHTML\n.ascx, .asp, .aspx, .hta, .htm, .htw, .htx, .jhtml, .html\nJSON\nJSON\n.json\nMail\nMail\n.eml, .msg, .nws\nXML\nXML\n.jsp, .mspx\nProtected Files\nPFile\n.pbmp, .pgif, .pjfif, .pjpe, .pjpeg, .pjpg, .ppng, .ptif, .ptiff, .ptxt, .pxla, .pxlam, .pxml\nOther\nOther\n.dfx, .dxf, .fluid, .mime, .pointpub, .rtf, .vtt\nEndpoint DLP monitors these file types via policy in the latest three major releases of macOS:\nFile Type\nFormat\nMonitored file extensions\nWord processing\nWord\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb\nSpreadsheet\nExcel, CSV, TSV\n.xls, .xlsx, .xlt, .xltx, .xltm, .xlw, .xlm, .xlsm, .xlsb, .xlb, .xlc, .csv, .tsv\nPresentation\nPowerPoint\n.ppt, .pptx, .pptm, .pps, .ppsx, .ppsm, .pot, .potx, .potm, .ppam\nArchive\nZip, RAR, 7z, TAR\n.zip, .rar, .7z, .tar\nAdobe PDF\nPDF\n.pdf\nText\nText\n.asm, .bat, .c, .cmd, .cpp, .cs, .csv, .cxx, .def, .dic, .h, .hpp, .hxx, .idl, .inc, .inx, .java, .m3u, .mpx, .php, .pl, .pos, .txt, .vcf, .vcs\nHTML\nHTML\n.ascx, .asp, .aspx, .hta, .htm, .htw, .htx, .jhtml, .html, 
.mht\nJSON\nJSON\n.json\nMail\nMail\n.eml, .msg, .nws\nXML\nXML\n.jsp, .mspx\nOther\nOther\n.pointpub, .rtf, .vtt\nMicrosoft Office xml\nMicrosoft Office xml\n.powerpointml\nThese file types can be monitored through policy settings in Windows 10, 11 and macOS devices, if\nOCR\nis enabled:\n.jpg, .png, .tif, .tiff, .bmp, .jpeg\nEndpoint DLP scans files with the specified extensions listed above. If you need to protect or monitor files not listed, use the 'Apply restrictions to only unsupported file extensions' feature. The key differences between this feature and the\nFile extension is\ncondition are:\nEndpoint DLP scans content for the\nFile extension is\ncondition, allowing you to see Sensitive info type values in events or alerts. In contrast, the\nApply restrictions to only unsupported file extensions\nfeature does not scan file content.\nThe\nFile extension is\ncondition triggers content scanning, which may consume higher machine resources like CPU and memory, potentially causing performance issues for some file types.\nFor more details on the\nApply restrictions to only unsupported file extensions\nfeature, refer to the following resources:\nHelp protect files that Endpoint Data Loss Prevention doesn't scan\nHelp protect against sharing of a defined set of unsupported files\nFiles audited regardless of policy match\nActivities can be audited on these file types in Windows 10, 11, and in the latest three major releases of macOS, even if no policy match exists:\nWindows 10, 11\nmacOS\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb, .xls, .xlsx, .xlt, .xlm, .xlsm, .xltx, .xltm, .xlsb, .xlw, .ppt, .pptx, .pos, .pps, .pptm, .potx, .potm, .ppam, .ppsx, .pbix, .pdf, .csv, .tsv, .zip, .rar, .7z, .tar, .war, .gz, .dlp\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb, .xls, .xlsx, .xlt, .xlm, .xlsm, .xltx, .xltm, .xlsb, .xlw, .ppt, .pptx, .pos, .pps, .pptm, .potx, .potm, .ppam, .ppsx, .pbix, .pdf, .csv, .tsv\nNote\nThese file types can be audited, regardless of a policy 
match, in Windows 10, 11 and macOS devices, so long as\nOCR\nis enabled:\n.jpg, .png, .tif, .tiff, .bmp, .jpeg\nImportant\nFor information about the Adobe requirements for using Microsoft Purview Data Loss Prevention (DLP) features with PDF files, see this article from Adobe:\nMicrosoft Purview Information Protection Support in Acrobat\n.\nIf you only want to monitor data from policy matches, you can turn off the\nAlways audit file activity for devices\nin the\nData loss prevention settings\n>\nEndpoint settings\n.\nIf the\nAlways audit file activity for devices\nsetting is on, activities on any Word, PowerPoint, Excel, PDF, and .csv files are always audited, even if the device isn't targeted by any policy.\nTo ensure activities are audited for all supported file types, create a\ncustom DLP policy\n.\nEndpoint DLP monitors activity based on MIME type, so activities are captured, even if the file extension is changed, for these file types:\nAfter the extension is changed to any other file extension:\n.doc\n.docx\n.xls\n.xlsx\n.ppt\n.pptx\n.pdf\nIf the extension is changed only to supported file extensions:\n.txt\n.msg\n.rtf\n.c\n.cpp\n.h\n.cs\n.java\n.tsv\nUnsupported File types\nEndpoint DLP does not support monitoring for the following file types:\n.exe\n.dll\n.sys\n.lib\n.obj\n.mui\n.spl\n.drv\n.pf\n.crdownload\n.ini\nWhat's different in Endpoint DLP\nThere are a few extra concepts that you need to be aware of before you dig into Endpoint DLP.\nEnabling Device management\nDevice management is the functionality that enables the collection of telemetry from devices and brings it into Microsoft Purview solutions like Endpoint DLP and\ninsider risk management\n. You need to onboard all the devices you want to use as locations in your DLP policies.\nOnboarding and offboarding are handled via scripts that you download from the device management center. 
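The MIME-type-based monitoring described above (activities are still captured for .doc/.docx/.xls/.xlsx/.ppt/.pptx/.pdf even after the extension is changed) works because these formats are identifiable by their leading bytes rather than their name. A minimal sketch of that idea, using well-known file signatures (the `sniff` helper is illustrative, not the actual Endpoint DLP detector):

```python
from typing import Optional

# Well-known magic-byte prefixes for the formats listed above.
SIGNATURES = {
    b"PK\x03\x04": "Office Open XML (.docx/.xlsx/.pptx)",   # ZIP container
    b"\xd0\xcf\x11\xe0": "Legacy Office (.doc/.xls/.ppt)",  # OLE2 compound file
    b"%PDF": "PDF (.pdf)",
}

def sniff(header: bytes) -> Optional[str]:
    """Return a format name if the header matches a known signature,
    regardless of what extension the file currently carries."""
    for magic, name in SIGNATURES.items():
        if header.startswith(magic):
            return name
    return None
```

Renaming `report.docx` to `report.txt` leaves the leading `PK\x03\x04` bytes intact, so content-based detection still recognizes it as an Office Open XML container.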
The device management center has custom scripts for each of the following deployment methods:\nLocal script (up to 10 machines)\nGroup policy\nSystem Center Configuration Manager (version 1610 or later)\nMobile Device Management/Microsoft Intune\nVDI onboarding scripts for non-persistent machines\nUse the procedures in\nGetting started with Microsoft 365 Endpoint DLP\nto onboard devices.\nOnboarding devices to Defender also onboards them to DLP. So, if you have onboarded devices through\nMicrosoft Defender for Endpoint\n, those devices show up automatically in the list of devices. You need only\nTurn on device monitoring\nto use endpoint DLP.\nViewing Endpoint DLP data\nYou can view alerts related to DLP policies enforced on endpoint devices by going to the\nDLP Alerts Management Dashboard\nand\nInvestigate data loss incidents with Microsoft Defender XDR\n.\nYou can also view details of the associated event, with rich metadata, in the same dashboard\nOnce a device is onboarded, information about audited activities flows into Activity explorer even before you configure and deploy any DLP policies that have devices as a location.\nEndpoint DLP collects extensive information on audited activity.\nFor example, if a file is copied to removable USB media, you'd see these attributes in the activity details:\nactivity type\nclient IP\ntarget file path\nhappened timestamp\nfile name\nuser\nfile extension\nfile size\nsensitive information type (if applicable)\nsha1 value\nsha256 value\nprevious file name\nlocation\nparent\nfilepath\nsource location type\nplatform\ndevice name\ndestination location type\napplication that performed the copy\nMicrosoft Defender for Endpoint device ID (if applicable)\nremovable media device manufacturer\nremovable media device model\nremovable media device serial number\nWhy are all endpoint policies on all onboarded devices?\nA device can support multiple user accounts. 
Because endpoint policies are scoped to users, different users of the device may have different policies scoped to them. Therefore, the device must have access to all endpoint DLP policies in order to evaluate items. All devices that are onboarded into Purview are scanned regardless if the user is in scope.\nEndpoint DLP and offline devices\nWhen a Windows endpoint device is offline, existing policies continue to be enforced on existing files. Additionally, with just-in-time protection enabled and in \"block\" mode, when a new\nfile\nis created on an offline device, the file is still prevented from being shared until the device connects to the data classification service and evaluation completes. If a new\npolicy\nis created on the server, or an existing policy is modified, those changes are updated on the device once it reconnects to the internet.\nImportant\nThis functionality isn't supported on macOS endpoint devices.\nConsider the following use cases.\nPolicies that have been pushed to a device will continue to be applied to files already classified as sensitive even after the device goes offline.\nPolicies that are updated while a device is offline won't be pushed to that device. Similarly, such policies won't be enforced on that device, until the device is back online. However, the outdated policy that exists on the offline device will still be enforced.\nJust-in-time protection\nIf notifications are configured to display, they'll always display when DLP policies are triggered, regardless of whether or not the device is online.\nNote\nWhile policies that have already been pushed to an offline device are enforced, the enforcement events don't appear in activity explorer until the device is back online.\nDLP policies regularly sync to endpoint devices. If a device is offline, the policies can't be synchronized. 
In this case, the\nDevices list\nreflects that the device is out of sync with the policies on the server.\nJust-in-time protection\nJust-in-time protection blocks egress activities on these monitored files until policy evaluation completes successfully:\nItems that have never been evaluated.\nItems on which the evaluation has gone stale. These are previously evaluated items that haven't been reevaluated by the current, updated cloud versions of the policies.\nBefore you can deploy just-in-time protection, you must first deploy Antimalware Client version 4.18.23080 or later.\nNote\nFor machines with an outdated version of the Antimalware Client, we recommend disabling just-in-time protection by installing one of the following KBs:\nWindows 10 -\nKB5032278\nWindows 11 -\nKB5032288\nTo enable Just-in-time protection in the Microsoft Purview Portal, select\nSettings\nin the left navigation pane, choose\nJust-in-time protection\n, and configure your desired settings.\nChoose which locations to monitor:\nSelect\nDevices\n.\nChoose\nEdit\n.\nIn the flyout pane, select the scope of accounts and distribution groups you want to apply just-in-time protection to. Keep in mind that, while policy evaluation is processing, Endpoint DLP blocks all egress activities for each user whose account is in the selected scope. Endpoint DLP audits the egress activities for all user accounts that are excluded (via the Exclude setting) or are otherwise not in scope.\nJust-in-time protection is supported on macOS devices running the three latest major versions.\nFallback action in case of failure\n: This configuration specifies the enforcement mode that DLP should apply when the policy evaluation doesn't complete. 
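The just-in-time gating just described can be sketched as a small decision function: egress is held for never-evaluated or stale items until evaluation completes, and the configured fallback action applies when evaluation fails. This is a hypothetical model for illustration; the mode strings and `egress_decision` helper are not part of the actual Endpoint DLP engine:

```python
from enum import Enum

class Fallback(Enum):
    """The 'Fallback action in case of failure' setting described above."""
    AUDIT = "audit"
    BLOCK = "block"

def egress_decision(evaluated: bool, evaluation_stale: bool,
                    evaluation_failed: bool, fallback: Fallback) -> str:
    """Sketch of just-in-time protection's gating for one monitored item."""
    if evaluation_failed:
        # Policy evaluation didn't complete: apply the configured fallback.
        return fallback.value
    if not evaluated or evaluation_stale:
        # Never evaluated, or evaluated against outdated cloud policies:
        # egress stays blocked until evaluation completes successfully.
        return "block-pending"
    return "apply-policy-result"
```

Either way, the relevant telemetry still surfaces in activity explorer, which is why the text recommends deploying the DLP policies before enabling just-in-time protection.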
No matter which value you select, the relevant telemetry shows in activity explorer.\nTip\nTips for maximizing user productivity:\nConfigure and deploy your Endpoint DLP policies to your devices before enabling just-in-time protection to prevent unnecessarily blocking user activity during policy evaluation.\nMake sure to carefully configure your settings for egress activities. Just-in-time protection blocks an egress activity only when that activity has one or more\nBlock\nor\nBlock with override\npolicies. This means that egress activities that aren't blocked will only be audited, even for users included in the scope of the applicable policies.\nFor more information, see\nGet started with Just-In-Time protection.\n.\nNext steps\nNow that you've learned about Endpoint DLP, your next steps are:\nOnboard Windows 10 or Windows 11 devices into Microsoft Purview overview\nOnboard macOS devices into Microsoft Purview overview\nConfigure endpoint data loss prevention settings\nUsing Endpoint data loss prevention\nSee also\nGetting started with Microsoft Endpoint data loss prevention\nUsing Microsoft Endpoint data loss prevention\nLearn about data loss prevention\nCreate and Deploy data loss prevention policies\nGet started with Activity explorer\nMicrosoft Defender for Endpoint\nInsider risk management\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:b0fff63cdf18b7969bb68ec1335bc53845abbb3e7a261181869417dd6aa12be9", + "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. 
You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nLearn about Endpoint data loss prevention\nFeedback\nSummarize this article for me\nYou can use Microsoft Purview Data Loss Prevention (DLP) to monitor the actions that are being taken on items you've determined to be sensitive and to help prevent the unintentional sharing of those items.\nEndpoint data loss prevention\n(Endpoint DLP) extends the activity monitoring and protection capabilities of DLP to Windows 10/11, macOS (the three latest released major versions) devices, and certain Windows Server versions. Once devices are onboarded into the Microsoft Purview solutions, the information about what users are doing with sensitive items is made visible in\nactivity explorer\n. You can then enforce protective actions on those items via\nDLP policies\n.\nWhen are files classified by Endpoint DLP?\nEndpoint DLP classifies files under the following conditions:\nCreation or Modification:\nEvery time a file is created or modified, it is fully scanned for sensitive information types (SITs) and labels. The file is then evaluated for matches against DLP policies and rules.\nReading an Already Classified File:\nWhen an already classified file is read, Endpoint DLP checks for any changes to the policies, rules, or SITs. 
If there is a change in the DLP configuration, it re-evaluates the policies and rules, but it will not re-extract the text.\nTip\nIf you're looking for device control for removable storage, see\nMicrosoft Defender for Endpoint Device Control Removable Storage Access Control\n.\nNote\nEndpoint DLP cannot detect the sensitivity label from another tenant on a document.\nEndpoint DLP Windows 10/11 and macOS support\nEndpoint DLP allows you to onboard devices running the following versions of Windows Server:\nWindows Server 2019 (\nNovember 14, 2023—KB5032196 (OS Build 17763.5122) - Microsoft Support\n)\nWindows Server 2022 (\nNovember 14, 2023 Security update (KB5032198) - Microsoft Support\n)\nNote\nInstalling the supported Windows Server KBs disables the\nClassification\nfeature on the server. This means that Endpoint DLP won't classify files on the server. However, Endpoint DLP will still protect those files on the server that were classified before those KBs were installed on the server. To ensure this protection, install Microsoft Defender version 4.18.23100 (October 2023) or later.\nBy default, Endpoint DLP isn't enabled for Windows servers when they're initially onboarded. 
Before you can see Endpoint DLP events for your servers in Activity Explorer, you must first\nEnable Endpoint DLP for Windows Servers\n.\nOnce properly configured, the same data loss protection policies can be automatically applied to both Windows PCs and Windows servers.\nSetting\nSubsetting\nWindows 10, 1809 and later, Windows 11, Windows Server 2019, Windows Server 2022 (21H2 onwards) for Endpoints (X64)\nmacOS (three latest released versions)\nNotes\nAdvanced classification scanning and protection\nAllocated bandwidth limits\nSupported\nSupported\nAdvanced classification enables these features for macOS: -\nDocument Fingerprinting\n-\nExact data match based sensitive information types\n-\nTrainable classifiers\n-\nLearn about named entities\nFile path exclusions for Windows\nn/a\nSupported\nn/a\nFile path exclusions for Mac\nn/a\nn/a\nSupported\nmacOS includes a recommended list of exclusions that is on by default\nSetup evidence collection for file activities on devices\nSet evidence cache on device\nSupported\nPreview\nNetwork share coverage and exclusions\nn/a\nSupported\nSupported\nRestricted apps and app groups\nRestricted app groups\nSupported\nSupported\nRestricted apps and app groups\nRestricted apps\nSupported\nSupported\nRestricted apps and app groups\nAuto-quarantine settings\nSupported\nSupported\nUnallowed Bluetooth apps\nn/a\nSupported\nSupported\nBrowser and domain restrictions to sensitive data\nUnallowed browsers\nSupported\nSupported\nBrowser and domain restrictions to sensitive data\nService domains\nSupported\nSupported\nBrowser and domain restrictions to sensitive data\nSensitive service domain groups\nSupported\nSupported\nAdditional settings for Endpoint DLP\nBusiness justification in policy tips\nSupported\nSupported\nAlways audit file activity for devices\nn/a\nSupported\nSupported\nPrinter groups\nn/a\nSupported\nSupported\nRemovable USB device groups\nn/a\nSupported\nSupported\nNetwork share groups\nn/a\nSupported\nSupported\nVPN 
settings\nn/a\nSupported\nNot supported\nScoping DLP policies (preview)\nIn preview, DLP policies for endpoints are scoped by users and devices. For an endpoint policy to be applied, both the user and the device must be in the policy scope. This means that if a user is in the policy scope, but the device isn't, the policy won't be applied. Similarly, if a device is in the policy scope, but the user isn't, the policy won't be applied.\nFor more information on scoping endpoint DLP policies, see\nDevice scoping\n.\nOther settings\nSetting\nWindows 10/11, Windows 10, 1809 and later, Windows 11\nWindows Server 2019, Windows Server 2022 (21H2 onwards) for Endpoints (X64)\nmacOS (three latest released versions)\nArchive file\nSupported\nSupported\nSupported\nFile type and File extension\nSupported\nSupported\nSupported\nEnable Endpoint DLP for Windows Servers\nNot supported\nSupported\nNot supported\nNote\nEndpoint DLP is not supported on Windows Servers configured as Domain Controllers or Windows Servers with the Core Server option selected during installation.\nEndpoint activities you can monitor and take action on\nEndpoint DLP enables you to audit and manage the following types of activities users take on sensitive items that are physically stored on Windows 10, Windows 11, or macOS devices.\nActivity\nDescription\nWindows 10 (21H2, 22H2), Windows 11 (21H2, 22H2), Windows Server 2019, Server 2022 (21H2 onwards) for Endpoints (X64)\nWindows 11 (21H2, 22H2) for Endpoints (ARM64)\nmacOS three latest released versions\nAuditable/\nRestrictable\nUpload to a restricted cloud service domain or access from an unallowed browser\nDetects when a user attempts to upload an item to a restricted service domain or access an item through a browser. If they're using an unallowed browser, the upload activity is blocked and the user is redirected to use Microsoft Edge. Microsoft Edge then either allows or blocks the upload or access based on the DLP policy configuration. 
You can block, warn, or audit when protected files can be uploaded or prevented from being uploaded to cloud services based on the allow/unallowed domains list in\nData loss prevention settings\n. When the configured action is set to warn or block, other browsers (defined on the unallowed browsers list under\nData loss prevention settings\n) are blocked from accessing the file.\nSupported\nSupported\nSupported\nAuditable and restrictable\nPaste to supported browsers\nDetects when a user attempts to paste content to a restricted service domain. Evaluation is performed on the content that is being pasted. This evaluation is independent of how the source item that the content came from is classified.\nSupported\nSupported\nPreview\nAuditable and restrictable\nCopy to clipboard\nWhen a user attempts to copy content from a protected file, you can block, block with override, or audit the copying of protected files to a clipboard on an endpoint device. If the rule is configured to\nBlock\nor\nBlock with override\ncopying is blocked when the source content is sensitive except when the destination is within the same Microsoft 365 Office app. 
This activity also applies to redirected clipboards when using Azure Virtual Desktop with Windows 365.\nSupported\nSupported\nSupported\nAuditable and restrictable\nCopy to USB removable device\nWhen this activity is detected, you can block, warn, or audit the copying or moving of protected files from an endpoint device to USB removable media.\nSupported\nSupported\nSupported\nAuditable and restrictable\nCopy to a network share\nWhen this activity is detected, you can block, warn, or audit the copying or moving of protected files from an endpoint device to any network share, including redirected USB devices that are displayed as network shares on an Azure Virtual Desktop with Windows 365.\nSupported\nSupported\nSupported\nAuditable and restrictable\nPrint\nWhen this activity is detected, you can block, warn, or audit the printing of protected files from an endpoint device. This activity also applies to redirected printers when using Azure Virtual Desktop together with Windows 365.\nSupported\nSupported\nSupported\nAuditable and restrictable\nCopy or move using unallowed Bluetooth app\nDetects when a user attempts to copy an item to an unallowed Bluetooth app (as defined in the list of unallowed Bluetooth apps in\nData loss prevention settings\n>\nEndpoint settings\n).\nSupported\nSupported\nSupported\nAuditable and restrictable\nCopy or move using RDP\nDetects when a user attempts to copy an item to a remote desktop session.\nSupported\nSupported\nNot supported\nAuditable and restrictable\nCreate an item\nDetects the creation of an item.\nSupported\nSupported\nSupported\nAuditable\nRename an item\nDetects the renaming of an item.\nSupported\nSupported\nSupported\nAuditable\nAccess by restricted apps\nDetects when an application that is on the restricted apps list (as defined in\nrestricted apps and app groups\n) attempts to access protected files on an endpoint device.\nSupported\nSupported\nSupported\nCreate Windows Recall snapshots (in preview)\nDetects when 
there is an item or Teams message that contains a sensitive information type or has a sensitivity label applied that would be included in a\nWindows Recall\nsnapshot.\nSupported\nNot supported\nNot supported\nAuditable and restrictable\nCopy to clipboard behavior\nWhen you configure a rule to\nBlock\nor\nBlock with override\nwhen a user attempts the Copy to clipboard activity on content from a file that matches the policy, end users see this behavior with these configurations:\nWord file 123 contains sensitive information that matches the copy to clipboard Block rule.\nExcel file 123 contains sensitive information that matches the copy to clipboard Block rule.\nPowerPoint file 123 contains sensitive information that matches the copy to clipboard Block rule.\nWord file 789 doesn't contain sensitive information.\nExcel file 789 doesn't contain sensitive information.\nPowerPoint file 789 doesn't contain sensitive information.\nNotepad (or any non Microsoft Office based app or process) file XYZ contains sensitive information that matches the copy to clipboard Block rule.\nNotepad (or any non Microsoft Office based app or process) file ABC doesn't contain sensitive information.\nSource\nDestination\nBehavior\nWord file 123/Excel file 123/PowerPoint file 123\nWord file 123/Excel file 123/PowerPoint file 123\ncopy and paste are allowed, in other words intra file copy and paste is allowed.\nWord file 123/Excel File 123/PowerPoint file 123\nWord file 789/Excel file 789/PowerPoint file 789\ncopy and paste are blocked, in other words inter file copy and paste is blocked.\nWord file 789/Excel file 789/PowerPoint file 789\nWord file 123/Excel File 123/PowerPoint file 123\ncopy and paste are allowed\nWord file 123/Excel file 123/PowerPoint file 123\nNotepad file ABC\ncopy and paste are blocked\nNotepad file XYZ\nany\ncopy is blocked\nNotepad file ABC\nany\ncopy and paste are allowed\nNote\nWhen a DLP rule that blocks copying is applied to an open file, copying from any other file 
within the same application (even files with no DLP rules applied) is restricted while the DLP blocked file is open.\nBest practice for endpoint DLP policies\nSay you want to block all items that contain credit card numbers from leaving endpoints of Finance department users. We recommend:\nCreate a policy and scope it to endpoints and to that group of users.\nCreate a rule in the policy that detects the type of information that you want to protect. In this case, set\ncontent contains\nto\nSensitive information type\n*, and select\nCredit Card\n.\nSet the actions for each activity to\nBlock\n.\nFor more information on designing your DLP policies, see\nDesign a data loss prevention policy\n.\nNote\nIn Microsoft Purview, DLP policy evaluation of sensitive items occurs centrally, so there's no time lag for policies and policy updates to be distributed to individual devices. When a policy is updated in Microsoft Purview portal, it generally takes about an hour for those updates to be synchronized across the service. Once policy updates are synchronized, items on targeted devices are automatically reevaluated the next time they're accessed or modified. In preview, you can also use\non-demand classification\nto identify and classify sensitive content in historical data stored in SharePoint and OneDrive. 
For Authorized Groups changes, the policy needs 24 hours to sync.\nMonitored files\nFiles monitored via policy\nEndpoint DLP monitors these file types via policy in Windows 10 and 11:\nFile Type\nFormat\nMonitored file extensions\nWord processing\nWord\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb, .obd, .obt\nSpreadsheet\nExcel, CSV, TSV\n.xls, .xlsx, .xlt, .xlm, .xlsm, .xlsb, .xlb, .xlc, .csv, .tsv\nPresentation\nPowerPoint\n.ppt, .pptx, .pptm, .pps, .ppsx, .ppsm, .pot, .potx, .potm, .ppam, .pos\nArchive\nZip, RAR, 7z, TAR\n.zip, .rar, .7z, .tar, .gz, .arj, .bz2, .cab, .chm, .lzh, .lzma, .mhtml, .xar, .xz\nAdobe PDF\nPDF\n.pdf\nText\nText\n.asm, .bat, .c, .cmd, .cpp, .cs, .csv, .cxx, .def, .dic, .h, .hpp, .hxx, .idl, .inc, .inx, .java, .m3u, .mpx, .php, .pl, .pos, .txt, .vcf, .vcs\nHTML\nHTML\n.ascx, .asp, .aspx, .hta, .htm, .htw, .htx, .jhtml, .html\nJSON\nJSON\n.json, .adaptivecard, .messagecard\nMail\nMail\n.eml, .msg, .nws\nXML\nXML\n.infopathml, .jsp, .mspx\nMicrosoft Office xml\nMicrosoft Office xml\n.excelml, .powerpointml, .wordml\nProtected Files\nPFile\n.pbmp, .pgif, .pjfif, .pjpe, .pjpeg, .pjpg, .ppng, .ptif, .ptiff, .ptxt, .pxla, .pxlam, .pxml\nOther\nOther\n.dfx, .dxf, .fluid, .mime, .pointpub, .rtf, .vtt\nEndpoint DLP monitors these file types via policy in the latest three major releases of macOS:\nFile Type\nFormat\nMonitored file extensions\nWord processing\nWord\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb\nSpreadsheet\nExcel, CSV, TSV\n.xls, .xlsx, .xlt, .xltx, .xltm, .xlw, .xlm, .xlsm, .xlsb, .xlb, .xlc, .csv, .tsv\nPresentation\nPowerPoint\n.ppt, .pptx, .pptm, .pps, .ppsx, .ppsm, .pot, .potx, .potm, .ppam, .pos\nArchive\nZip, RAR, 7z, TAR\n.zip, .zipx, .rar, .7z, .tar\nAdobe PDF\nPDF\n.pdf\nText\nText\n.asm, .bat, .c, .cmd, .cpp, .cs, .csv, .cxx, .dic, .h, .hpp, .hxx, .idl, .inc, .inf, .inx, .java, .m3u, .mpx, .php, .pl, .pos, .txt, .vcf, .vcs\nHTML\nHTML\n.ascx, .asp, .aspx, .css, .hta, .htm, .htw, .htx, .jhtml, .html, .mht\nJSON\nJSON\n.json, 
.adaptivecard, .messagecard\nMail\nMail\n.eml, .msg, .nws\nXML\nXML\n.infopathml, .jsp, .mspx\nMicrosoft Office xml\nMicrosoft Office xml\n.excelml, .powerpointml, .wordml\nOther\nOther\n.pointpub, .rtf, .vtt\nThese file types can be monitored through policy settings in Windows 10, 11 and macOS devices, if\nOCR\nis enabled:\n.jpg, .png, .tif, .tiff, .bmp, .jpeg\nEndpoint DLP scans files with the specified extensions listed above. If you need to protect or monitor files not listed, use the 'Apply restrictions to only unsupported file extensions' feature. The key differences between this feature and the\nFile extension is\ncondition are:\nEndpoint DLP scans content for the\nFile extension is\ncondition, allowing you to see Sensitive info type values in events or alerts. In contrast, the\nApply restrictions to only unsupported file extensions\nfeature does not scan file content.\nThe\nFile extension is\ncondition triggers content scanning, which may consume higher machine resources like CPU and memory, potentially causing performance issues for some file types.\nFor more details on the\nApply restrictions to only unsupported file extensions\nfeature, refer to the following resources:\nHelp protect files that Endpoint Data Loss Prevention doesn't scan\nHelp protect against sharing of a defined set of unsupported files\nFiles audited regardless of policy match\nActivities can be audited on these file types in Windows 10, 11, and in the latest three major releases of macOS, even if no policy match exists:\nWindows 10, 11\nmacOS\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb, .xls, .xlsx, .xlt, .xlm, .xlsm, .xltx, .xltm, .xlsb, .xlw, .ppt, .pptx, .pos, .pps, .pptm, .potx, .potm, .ppam, .ppsx, .pbix, .pdf, .csv, .tsv, .zip, .rar, .7z, .tar, .war, .gz, .dlp\n.doc, .docx, .docm, .dot, .dotx, .dotm, .docb, .xls, .xlsx, .xlt, .xlm, .xlsm, .xltx, .xltm, .xlsb, .xlw, .ppt, .pptx, .pos, .pps, .pptm, .potx, .potm, .ppam, .ppsx, .pbix, .pdf, .csv, .tsv\nNote\nThese file types can 
be audited, regardless of a policy match, in Windows 10, 11 and macOS devices, so long as\nOCR\nis enabled:\n.jpg, .png, .tif, .tiff, .bmp, .jpeg\nImportant\nFor information about the Adobe requirements for using Microsoft Purview Data Loss Prevention (DLP) features with PDF files, see this article from Adobe:\nMicrosoft Purview Information Protection Support in Acrobat\n.\nIf you only want to monitor data from policy matches, you can turn off the\nAlways audit file activity for devices\nsetting in the\nData loss prevention settings\n>\nEndpoint settings\n.\nIf the\nAlways audit file activity for devices\nsetting is on, activities on any Word, PowerPoint, Excel, PDF, and .csv files are always audited, even if the device isn't targeted by any policy.\nTo ensure activities are audited for all supported file types, create a\ncustom DLP policy\n.\nEndpoint DLP monitors activity based on MIME type, so activities are captured, even if the file extension is changed, for these file types:\nAfter the extension is changed to any other file extension:\n.doc\n.docx\n.xls\n.xlsx\n.ppt\n.pptx\n.pdf\nIf the extension is changed only to supported file extensions:\n.txt\n.msg\n.rtf\n.c\n.cpp\n.h\n.cs\n.java\n.tsv\nUnsupported file types\nEndpoint DLP does not support monitoring for the following file types:\n.exe\n.dll\n.sys\n.lib\n.obj\n.mui\n.spl\n.drv\n.pf\n.crdownload\n.ini\nWhat's different in Endpoint DLP\nThere are a few extra concepts that you need to be aware of before you dig into Endpoint DLP.\nEnabling Device management\nDevice management is the functionality that enables the collection of telemetry from devices and brings it into Microsoft Purview solutions like Endpoint DLP and\ninsider risk management\n. You need to onboard all the devices you want to use as locations in your DLP policies.\nOnboarding and offboarding are handled via scripts that you download from the device management center. 
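The MIME-type monitoring described earlier works because a file's real format can be recognized from its content rather than its name. Below is a minimal sketch of that general technique (magic-byte sniffing); the `MAGIC` table and `sniff` helper are illustrative assumptions, not Endpoint DLP's actual implementation:

```python
# Illustrative magic-byte sniffing: classify a file by its leading bytes,
# so renaming the extension doesn't hide its real type.
# Not Endpoint DLP's actual implementation.

MAGIC = {
    b"PK\x03\x04": "Office Open XML (.docx/.xlsx/.pptx)",  # also plain ZIP
    b"%PDF": "PDF (.pdf)",
    b"\xd0\xcf\x11\xe0": "Legacy Office (.doc/.xls/.ppt)",
}

def sniff(first_bytes: bytes) -> str:
    """Return a format name based on the file's leading bytes."""
    for magic, kind in MAGIC.items():
        if first_bytes.startswith(magic):
            return kind
    return "unknown"

# A .docx renamed to report.txt still starts with the ZIP signature,
# so it is still recognized:
print(sniff(b"PK\x03\x04\x14\x00"))  # Office Open XML (.docx/.xlsx/.pptx)
```

In the same spirit, the documentation notes that .doc/.docx/.xls/.xlsx/.ppt/.pptx/.pdf items remain monitored after any rename, because their content, not their extension, identifies them.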
The device management center has custom scripts for each of the following deployment methods:\nLocal script (up to 10 machines)\nGroup policy\nSystem Center Configuration Manager (version 1610 or later)\nMobile Device Management/Microsoft Intune\nVDI onboarding scripts for non-persistent machines\nUse the procedures in\nGetting started with Microsoft 365 Endpoint DLP\nto onboard devices.\nOnboarding devices to Defender also onboards them to DLP. So, if you have onboarded devices through\nMicrosoft Defender for Endpoint\n, those devices show up automatically in the list of devices. You need only\nTurn on device monitoring\nto use endpoint DLP.\nViewing Endpoint DLP data\nYou can view alerts related to DLP policies enforced on endpoint devices by going to the\nDLP Alerts Management Dashboard\nand\nInvestigate data loss incidents with Microsoft Defender XDR\n.\nYou can also view details of the associated event, with rich metadata, in the same dashboard.\nOnce a device is onboarded, information about audited activities flows into Activity explorer even before you configure and deploy any DLP policies that have devices as a location.\nEndpoint DLP collects extensive information on audited activity.\nFor example, if a file is copied to removable USB media, you'd see these attributes in the activity details:\nactivity type\nclient IP\ntarget file path\nhappened timestamp\nfile name\nuser\nfile extension\nfile size\nsensitive information type (if applicable)\nsha1 value\nsha256 value\nprevious file name\nlocation\nparent\nfilepath\nsource location type\nplatform\ndevice name\ndestination location type\napplication that performed the copy\nMicrosoft Defender for Endpoint device ID (if applicable)\nremovable media device manufacturer\nremovable media device model\nremovable media device serial number\nWhy are all endpoint policies on all onboarded devices?\nA device can support multiple user accounts. 
Because endpoint policies are scoped to users, different users of the device may have different policies scoped to them. Therefore, the device must have access to all endpoint DLP policies in order to evaluate items. All devices that are onboarded into Purview are scanned regardless of whether the user is in scope.\nEndpoint DLP and offline devices\nWhen a Windows endpoint device is offline, existing policies continue to be enforced on existing files. Additionally, with just-in-time protection enabled and in \"block\" mode, when a new\nfile\nis created on an offline device, the file is still prevented from being shared until the device connects to the data classification service and evaluation completes. If a new\npolicy\nis created on the server, or an existing policy is modified, those changes are updated on the device once it reconnects to the internet.\nImportant\nThis functionality isn't supported on macOS endpoint devices.\nConsider the following use cases.\nPolicies that have been pushed to a device will continue to be applied to files already classified as sensitive even after the device goes offline.\nPolicies that are updated while a device is offline won't be pushed to that device. Similarly, such policies won't be enforced on that device until the device is back online. However, the outdated policy that exists on the offline device will still be enforced.\nJust-in-time protection\nIf notifications are configured to display, they'll always display when DLP policies are triggered, regardless of whether or not the device is online.\nNote\nWhile policies that have already been pushed to an offline device are enforced, the enforcement events don't appear in activity explorer until the device is back online.\nDLP policies regularly sync to endpoint devices. If a device is offline, the policies can't be synchronized. 
In this case, the\nDevices list\nreflects that the device is out of sync with the policies on the server.\nJust-in-time protection\nJust-in-time protection blocks egress activities on these monitored files until policy evaluation completes successfully:\nItems that have never been evaluated.\nItems on which the evaluation has gone stale. These are previously evaluated items that haven't been reevaluated by the current, updated cloud versions of the policies.\nBefore you can deploy just-in-time protection, you must first deploy Antimalware Client version 4.18.23080 or later.\nNote\nFor machines with an outdated version of the Antimalware Client, we recommend disabling just-in-time protection by installing one of the following KBs:\nWindows 10 -\nKB5032278\nWindows 11 -\nKB5032288\nTo enable Just-in-time protection in the Microsoft Purview portal, select\nSettings\nin the left navigation pane, choose\nJust-in-time protection\n, and configure your desired settings.\nChoose which locations to monitor:\nSelect\nDevices\n.\nChoose\nEdit\n.\nIn the flyout pane, select the scope of accounts and distribution groups you want to apply just-in-time protection to. Keep in mind that, while policy evaluation is processing, Endpoint DLP blocks all egress activities for each user whose account is in the selected scope. Endpoint DLP audits the egress activities for all user accounts that are excluded (via the Exclude setting) or are otherwise not in scope.\nJust-in-time protection is supported on macOS devices running the three latest major versions.\nFallback action in case of failure\n: This configuration specifies the enforcement mode that DLP should apply when the policy evaluation doesn't complete. 
No matter which value you select, the relevant telemetry shows in activity explorer.\nTip\nTips for maximizing user productivity:\nConfigure and deploy your Endpoint DLP policies to your devices before enabling just-in-time protection to prevent unnecessarily blocking user activity during policy evaluation.\nMake sure to carefully configure your settings for egress activities. Just-in-time protection blocks an egress activity only when that activity has one or more\nBlock\nor\nBlock with override\npolicies. This means that egress activities that aren't blocked will only be audited, even for users included in the scope of the applicable policies.\nFor more information, see\nGet started with Just-In-Time protection.\n.\nNext steps\nNow that you've learned about Endpoint DLP, your next steps are:\nOnboard Windows 10 or Windows 11 devices into Microsoft Purview overview\nOnboard macOS devices into Microsoft Purview overview\nConfigure endpoint data loss prevention settings\nUsing Endpoint data loss prevention\nSee also\nGetting started with Microsoft Endpoint data loss prevention\nUsing Microsoft Endpoint data loss prevention\nLearn about data loss prevention\nCreate and Deploy data loss prevention policies\nGet started with Activity explorer\nMicrosoft Defender for Endpoint\nInsider risk management\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Endpoint DLP", "section": "Microsoft Purview" }, "https://learn.microsoft.com/en-us/purview/endpoint-dlp-getting-started": { "content_hash": "sha256:904bb3e0bfa70ce4ba343e540367cb07fba6d3d2dd3cffd2c2d34718cf6ccca1", "normalized_content": "Table of contents\nExit editor mode\nAsk 
Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nGet started with endpoint data loss prevention\nFeedback\nSummarize this article for me\nEndpoint data loss prevention (Endpoint DLP) is part of the Microsoft Purview Data Loss Prevention (DLP) suite of features you can use to discover and protect sensitive items across Microsoft 365 services. For more information about all of Microsoft's DLP offerings, see\nLearn about data loss prevention\n. To learn more about Endpoint DLP, see\nLearn about Endpoint data loss prevention\n.\nMicrosoft Endpoint DLP allows you to monitor\nonboarded Windows 10 and Windows 11\nand\nonboarded macOS devices\nrunning any of the three latest released versions. Once a device is onboarded, DLP detects when sensitive items are used and shared. This gives you the visibility and control you need to ensure that they're used and protected properly, and to help prevent risky behavior that might compromise them.\nTip\nGet started with Microsoft Security Copilot to explore new ways to work smarter and faster using the power of AI. Learn more about\nMicrosoft Security Copilot in Microsoft Purview\n.\nBefore you begin\nSKU/subscriptions licensing\nFor information on licensing, see\nMicrosoft 365 Enterprise Plans\nand\nMicrosoft 365 Service Descriptions\n.\nConfigure proxy on the Windows 10 or Windows 11 device\nIf you're onboarding Windows 10 or Windows 11 devices, check to make sure that the device can communicate with the cloud DLP service. 
For more information, see\nConfigure device proxy and internet connection settings for Information Protection\n.\nWindows 10 and Windows 11 Onboarding procedures\nFor a general introduction to onboarding Windows devices, see:\nOnboard Windows devices into Microsoft 365 overview\nFor specific guidance on onboarding Windows devices, see:\nArticle\nDescription\nOnboard Windows 10 or 11 devices using Group Policy\nUse Group Policy to deploy the configuration package on devices.\nOnboard Windows 10 or 11 devices using Microsoft Endpoint Configuration Manager\nYou can use either Microsoft Endpoint Configuration Manager (current branch) version 1606 or Microsoft Endpoint Configuration Manager (current branch) version 1602 or earlier to deploy the configuration package on devices.\nOnboard Windows 10 or 11 devices using Microsoft Intune\nUse Microsoft Intune to deploy the configuration package on devices.\nOnboard Windows 10 or 11 devices using a local script\nLearn how to use the local script to deploy the configuration package on endpoints.\nOnboard non-persistent virtual desktop infrastructure (VDI) devices\nLearn how to use the configuration package to configure VDI devices.\nEndpoint DLP support for virtualized environments\nYou can onboard virtual machines as monitored devices in the Microsoft Purview portal. 
There's no change to the onboarding procedures listed above.\nThe table that follows lists the virtual operating systems that are supported by virtualization environments.\nVirtualization\nplatform\nWindows 10\nWindows 11\nWindows Server 2019\nWindows Server 2022\n21H2, 22H2, Data Center\nAzure virtual desktop (AVD)\nSingle session supported for 21H2, 22H2\nMulti session supported for 21H2, 22H2\nSingle session supported for 21H2, 22H2\nMulti session supported for 21H2, 22H2\nSingle session and Multi session supported.\nSupported\nWindows 365\nSupported for 21H2, 22H2\nSupported for 21H2, 22H2\nNot applicable\nNot applicable\nCitrix Virtual Apps and Desktops 7 (2209 and higher)\nSingle session supported for 21H2, 22H2\nMulti session supported for 21H2, 22H2\nSingle session supported for 21H2, 22H2\nMulti session supported for 21H2, 22H2\nSupported\nSupported\nAmazon workspaces\nSingle session supported for 21H2, 22H2\nNot applicable\nWindows 10 powered by Windows Server 2019\nNot applicable\nHyper-V\nSingle session supported for 21H2, 22H2\nMulti session with Hybrid AD join supported for 21H2, 22H2\nSingle session supported for 21H2, 22H2\nMulti session with Hybrid AD join supported for 21H2, 22H2\nSupported with Hybrid AD join\nSupported with Hybrid AD join\nKnown issues\nYou can't monitor\nCopy to Clipboard\nand\nEnforcing Endpoint DLP\non Azure Virtual Desktop environments via browsers. However, the same egress operation will be monitored by\nEndpoint DLP for actions via Remote Desktop Session (RDP)\n.\nCitrix XenApp doesn't support access by restricted app monitoring.\nLimitations\nHandling of USBs in virtualized environments: USB storage devices are treated as network shares. You need to include the\nCopy to network share\nactivity to monitor\nCopy to a USB device\n. 
All activity explorer events for virtual devices and incident alerts show the\nCopy to a network share\nactivity for all copy to USB events.\nmacOS onboarding procedures\nFor a general introduction to onboarding macOS devices, see:\nOnboard macOS devices into Microsoft Purview\nFor specific guidance on onboarding macOS devices, see:\nArticle\nDescription\nIntune\nFor macOS devices that are managed through Intune\nIntune for Microsoft Defender for Endpoint customers\nFor macOS devices that are managed through Intune and that have Microsoft Defender for Endpoint (MDE) deployed to them\nJAMF Pro\nFor macOS devices that are managed through JAMF Pro\nJAMF Pro for Microsoft Defender for Endpoint customers\nFor macOS devices that are managed through JAMF Pro and that have Microsoft Defender for Endpoint (MDE) deployed to them\nOnce a device is onboarded, it should be visible in the devices list, and should start reporting audit activity to Activity explorer.\nSee also\nLearn about Endpoint data loss prevention\nUsing Endpoint data loss prevention\nLearn about data loss prevention\nCreate and Deploy data loss prevention policies\nGet started with Activity explorer\nMicrosoft Defender for Endpoint\nOnboarding tools and methods for Windows machines\nMicrosoft 365 subscription\nMicrosoft Entra joined devices\nDownload the new Microsoft Edge based on Chromium\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Onboard Devices", @@ -953,18 +953,18 @@ "https://learn.microsoft.com/en-us/purview/dlp-configure-endpoint-settings": { "content_hash": "sha256:f5b4c56fa7975d0508182f489eb4c3fb9e79b3ede3876cc71bf3bdc9d848cdaf", 
"normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nConfigure endpoint data loss prevention settings\nFeedback\nSummarize this article for me\nMany aspects of endpoint data loss prevention (DLP) behavior are controlled by centrally configured settings that are applied to all DLP policies for devices. Use these settings to control the following behaviors:\nCloud egress restrictions\nVarious types of restrictive actions on user activities per application\nFile path exclusions for Windows and macOS devices\nBrowser and domain restrictions\nAppearance of business justifications for overriding policies in policy tips\nWhether actions performed on Office, PDF, and CSV files are automatically audited\nTo access these settings, from the Microsoft Purview portal, navigate to\nData loss prevention\n>\nOverview\n>\nData loss prevention settings\n>\nEndpoint settings\n.\nTip\nGet started with Microsoft Security Copilot to explore new ways to work smarter and faster using the power of AI. Learn more about\nMicrosoft Security Copilot in Microsoft Purview\n.\nImportant\nFor information about the Adobe requirements for using Microsoft Purview Data Loss Prevention (DLP) features with PDF files, see this article from Adobe:\nMicrosoft Purview Information Protection Support in Acrobat\n.\nAdvanced classification scanning and protection\nAdvanced classification scanning and protection allow the Microsoft Purview cloud-based data classification service to scan items, classify them, and return the results to the local machine. 
Therefore, you can take advantage of classification techniques such as\nexact data match\nclassification,\ntrainable classifiers\n,\ncredential classifiers\n, and\nnamed entities\nin your DLP policies.\nNote\nThe\nPaste to browser\naction doesn't support advanced classification.\nWhen advanced classification is turned on, content is sent from the local device to the cloud services for scanning and classification. If bandwidth usage is a concern, you can set a limit on how much bandwidth can be used in a rolling 24-hour period. The limit is configured in\nEndpoint DLP settings\nand is applied per device. If you set a bandwidth usage limit and that usage limit is exceeded, DLP stops sending the user content to the cloud. At that point, data classification continues locally on the device but classification using exact data match, named entities, trainable classifiers, and credential classifiers aren't available. When the cumulative bandwidth usage drops below the rolling 24-hour limit, communication with the cloud services resumes.\nIf bandwidth usage isn't a concern, select\nDo not limit bandwidth. Unlimited\nto allow unlimited bandwidth use.\nAdvanced classification file scanning size limits\nEven with\nDo not limit bandwidth. Unlimited\nenabled for advanced classification, there are still limits on the size of individual files that can be scanned.\nThere is a 64 MB limit on text files.\nThere is a 50 MB limit on image files when Optical Character Recognition (OCR) is enabled.\nAdvanced classification will not work for text files larger than 64 MB, even if the bandwidth limit is set to\nDo not limit bandwidth. 
Unlimited\n.\nThe following Windows versions (and later) support advanced classification scanning and protection.\nall Windows 11 versions\nWindows 10 versions 20H1/21H1 or higher (KB 5006738)\nWindows 10 RS5 (KB 5006744)\nNote\nSupport for advanced classification is available for Office (Word, Excel, PowerPoint) and PDF file types.\nDLP policy evaluation always occurs in the cloud, even if user content is not being sent.\nTip\nTo use advanced classification for Windows 10 devices, you must install KB5016688. To use advanced classification for Windows 11 devices, KB5016691 must be installed on those Windows 11 devices. Additionally, you must enable advanced classification before\nActivity explorer\nwill display contextual text for DLP rule-matched events. To learn more about contextual text, see\nContextual summary\n.\nAdvanced label-based protection for all files on devices\nTurning this feature on allows users to work on files, including files other than Office and PDF files, that have sensitivity labels that apply access control settings in an unencrypted state, on their devices. Endpoint DLP will continue to monitor and enforce access control and label-based protections on these files even in an unencrypted state and automatically encrypt them before they're transferred outside of a user's device. For more information on this feature, see\nLearn about Advanced Label Based Protection\n.\nNote\nThis feature is supported only on onboarded Windows devices.\nFile path exclusions\nIf you want to exclude certain paths from DLP monitoring, DLP alerts, and DLP policy enforcement on your devices, you can turn off those configuration settings by setting up file path exclusions. Files in excluded locations aren't audited and any files that are created or modified in those locations aren't subject to DLP policy enforcement. 
To configure path exclusions in DLP settings, navigate to\nMicrosoft Purview portal\n>\nData loss prevention\n>\nOverview\n>\nData loss prevention settings\n>\nEndpoint settings\n>\nFile path exclusions for Windows\n.\nFile path exclusions for Windows\nYou can use the following logic to construct your exclusion paths for Windows 10/11 devices:\nA valid file path that ends with\n\\\nmeans only files directly under the specified folder are excluded.\nExample:\nC:\\Temp\\\nA valid file path that ends with\n\\*\nmeans only files within subfolders of the specified folder are excluded. Files directly under the specified folder itself aren't excluded.\nExample:\nC:\\Temp\\*\nA valid file path that ends without\n\\\nor\n\\*\nmeans all files directly under the specified folder and all of its subfolders are excluded.\nExample:\nC:\\Temp\nA path with a wildcard between\n\\\ncharacters on each side.\nExample:\nC:\\Users\\*\\Desktop\\\nA path with a wildcard between\n\\\ncharacters on each side and with\n(number)\nto specify the exact number of subfolders to be excluded.\nExample:\nC:\\Users\\*(1)\\Downloads\\\nA path with SYSTEM environment variables.\nExample:\n%SystemDrive%\\Test\\*\nA mix of all the patterns described here.\nExample:\n%SystemDrive%\\Users\\*\\Documents\\*(2)\\Sub\\\nWindows file paths excluded by default\n%SystemDrive%\\\\Users\\\\*(1)\\\\AppData\\\\Roaming\n%SystemDrive%\\\\Users\\\\*(1)\\\\AppData\\\\Local\\\\Temp\n%SystemDrive%\\\\Users\\\\*(1)\\\\AppData\\\\Local\\\\Microsoft\\\\Windows\\\\INetCache\nFile path exclusions for Mac\nYou can also add your own exclusions for macOS devices.\nFile path definitions are case insensitive, so\nUser\nis the same as\nuser\n.\nWildcard values are supported. 
So a path definition can contain an asterisk (\n*\n) in the middle of the path or at the end of the path.\nExample:\n/Users/*/Library/Application Support/Microsoft/Teams/*\nmacOS file paths excluded by default\n/System\nRecommended file path exclusions for macOS\nFor performance reasons, Endpoint DLP includes a list of recommended file path exclusions for macOS devices. If the\nInclude recommended file path exclusions for Mac\ntoggle is set to\nOn\n, the following paths are also excluded:\n/Applications\n/usr\n/Library\n/private\n/opt\n/Users/*/Library/Logs\n/Users/*/Library/Containers\n/Users/*/Library/Application Support\n/Users/*/Library/Group Containers\n/Users/*/Library/Caches\n/Users/*/Library/Developer\nWe recommend leaving this toggle set to\nOn\n. However, you can stop excluding these paths by setting the toggle to\nOff\n.\nSet up evidence collection for file activities on devices\nWhen it identifies items that match policies on devices, DLP can copy them to an\nAzure storage account\n. This feature is useful for auditing policy activity and troubleshooting specific matches. Use this section to add the name and URL of the storage account.\nNote\nBefore you enable this feature, you must create an Azure storage account and a container in that storage account. You must also configure permissions for the account. As you set up your Azure storage account, keep in mind that you'll probably want to use a storage account that's in the same Azure region/geopolitical boundary as your tenant. 
You should also consider configuring\nAzure storage account access tiers\nand\nAzure storage account pricing\n.\nFor more information on this feature, see\nLearn about collecting files that match data loss prevention policies from devices\n.\nFor more information on how to configure this feature, see\nGet started with collecting files that match data loss prevention policies from devices\n.\nNetwork share coverage and exclusions\nNetwork share coverage and exclusions\nextends endpoint DLP policies and actions to new and edited files on network shares and mapped network drives. If\njust in time protection\nis also enabled, just in time protection coverage and exclusions are extended to network shares and mapped drives. If you want to exclude a specific network path for all monitored devices, add the path value in\nExclude these network share paths\n.\nImportant\nTo use\nNetwork share coverage and exclusions\n, devices must have the following updates applied:\nWindows 10 -\nMarch 21, 2023—KB5023773 (OS Builds 19042.2788, 19044.2788, and 19045.2788) Preview\n,\nMarch 28, 2023—KB5023774 (OS Build 22000.1761) Preview\nWindows 11 -\nMarch 28, 2023—KB5023778 (OS Build 22621.1485) Preview\nMicrosoft Defender\nApril-2023 (Platform: 4.18.2304.8 | Engine: 1.1.20300.3)\nmacOS - the latest three OS versions are supported; minimum supported Defender app version: 101.24122.0005\nThis table shows the default settings for network share coverage and exclusions.\nNetwork share coverage and exclusions\nJust in time protection\nResultant behavior\nEnabled\nDisabled\n- DLP policies scoped to Devices are applied to all network shares and mapped drives that the device is connected to.\nSupported actions: Devices\nDisabled\nEnabled\n- Just-in-time protection is applied only to the files on storage devices that are local to the endpoint.\nEnabled\nEnabled\n- DLP policies scoped to Devices are applied to all network shares and mapped drives that the device is connected to.\nSupported actions: Devices\n- 
Just-in-time protection is applied to all network shares and mapped drives that the device is connected to.\nNetwork share coverage and exclusions\ncomplements\nDLP On-premises repository actions\n. This table shows the exclusion settings and the resulting behavior depending on whether DLP is enabled or disabled for on-premises repositories.\nNetwork share coverage and exclusions\nDLP on-premises repositories\nResultant behavior\nEnabled\nDisabled\n- DLP policies scoped to Devices are applied to all network shares and mapped drives that the device is connected to.\nSupported actions: Devices\nDisabled\nEnabled\n- Policies that are scoped to On-premises repositories can enforce protective actions on on-premises data-at-rest in file shares and SharePoint document libraries and folders.\nDLP On-premises repository actions\nEnabled\nEnabled\n- DLP policies scoped to Devices are applied to all network shares and mapped drives that the device is connected to.\nSupported actions: Devices\n- Policies that are scoped to On-premises repositories can enforce protective actions on on-premises data-at-rest in file shares and SharePoint document libraries and folders.\nDLP On-premises repository actions\nRestricted apps and app groups\nRestricted apps\nThe\nRestricted apps\nlist is a custom list of applications that you create. You configure what actions DLP takes when someone uses an app on the list to\naccess\na DLP-protected file on a device. The\nRestricted apps\nlist is available for Windows 10/11 and macOS devices running any of the three latest macOS releases.\nSome apps have a web-based interface in addition to a locally installed version of the application. 
In preview, when you add an app that can be accessed both locally and via a web-based interface to a\nRestricted app group\nor as a\nRestricted app\n, any DLP policies applicable to accessing a protected file are enforced via Edge for the browser app interface and on the device for the application-based interface.\nImportant\nDo not include the path to the executable for Windows devices. Include only the executable name (such as browser.exe).\nThe action (\naudit\n,\nblock with override\n, or\nblock\n) defined for apps that are on the restricted apps list only applies when a user attempts to\naccess\na protected item.\nWhen\nAccess by restricted apps\nis selected in a policy and a user uses an app that is on the restricted apps list to access a protected file, the activity is\naudited\n,\nblocked\n, or\nblocked with override\n, depending on how you configured the\nRestricted apps\nlist. EXCEPTION: If an app on the\nRestricted apps\nlist is also a member of a\nRestricted app group\n, the actions configured for activities in the\nRestricted app group\noverride the actions configured for the\nRestricted apps\nlist. All activity is audited and available for review in activity explorer.\nRestricted app groups\nRestricted app groups are collections of apps that you create in DLP settings and then add to a rule in a policy. 
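The EXCEPTION above (settings for a restricted app group override the restricted apps list when an app appears in both) can be sketched as a small resolver. This is an illustrative sketch only; the function and dictionaries are hypothetical, not a Purview API:

```python
from typing import Optional

# Illustrative action labels matching the document's terminology.
AUDIT, BLOCK_WITH_OVERRIDE, BLOCK = "Audit only", "Block with override", "Block"

def resolve_action(app: str,
                   group_actions: dict[str, str],
                   list_actions: dict[str, str]) -> Optional[str]:
    """Within the same rule, an app's restricted app *group* setting
    overrides its restricted apps *list* setting."""
    if app in group_actions:
        return group_actions[app]   # group wins when the app is in both
    return list_actions.get(app)    # else fall back to the list, if present

# notepad.exe is on the list (Block) and in a group (Audit only):
print(resolve_action("notepad.exe",
                     {"notepad.exe": AUDIT},
                     {"notepad.exe": BLOCK}))
# prints: Audit only
```

The same precedence is restated in the Important note under the restricted app group options: group settings win over list settings within one rule.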
When you add a restricted app group to a policy, you can take the actions defined in the following table.\nRestricted App group option\nWhat it allows you to do\nDon't restrict file activity\nTells DLP to allow users to access DLP protected items using apps in the app group without taking any action when the user attempts to\nCopy to clipboard\n,\nCopy to a USB removable drive\n,\nCopy to a network drive\n, or\nPrint\nfrom the app.\nApply a restriction to all activity\nTells DLP to\nAudit only\n,\nBlock with override\n, or\nBlock\nwhen a user attempts to access a DLP-protected item using an app that's in the relevant app group\nApply restrictions to a specific activity\nThis setting allows a user to access a DLP-protected item using an app that is in the app group. It also allows you to select a default action (\nAudit only\n,\nBlock\n, or\nBlock with override\n) for DLP to take when a user attempts to\nCopy to clipboard\n,\nCopy to a USB removable drive\n,\nCopy to a network drive\n, and\nPrint\n.\nYou can add a maximum of 50 apps into a single group and you can create a maximum of 10 groups. This gives a maximum of 500 apps that the policy actions can be assigned to.\nImportant\nSettings in a restricted app\ngroup\noverride any restrictions set in the restricted apps\nlist\nwhen they are in the same rule. So, if an app is on the restricted apps list and is also a member of a restricted apps group, the settings of the restricted apps group are applied.\nBlock all apps except for a list of allowed apps\nYou can create a list of allowed applications and block all others. This way, you don't need to create and manage a comprehensive list of untrusted applications. 
This feature helps simplify policy management and enhances your control over app-based file activities.\nNote\nCommon background applications such as\nteamsupdate.exe\nor\nsvchost.exe\nare preconfigured to bypass enforcement to prevent unintentional interference with essential operations.\nIn this procedure, we apply the restriction level of\nAllow\nto explicitly allow activity for a defined app group, and then block any apps that are not on this list. Therefore, apps that have no restriction level defined are effectively blocked, and apps that have a restriction level defined as\nAllow\nare explicitly allowed. In short, we define a restricted app group and explicitly allow it, so that any app without defined restrictions is blocked.\nNavigate to endpoint DLP settings.\nDefine allowed or sanctioned apps in the\nRestricted Apps and app groups\nlist.\nIn your existing or new endpoint DLP policy, locate the\nFile activities for apps in restricted app groups\nsetting.\nAdd the desired restricted app group.\nSelect\nApply restriction to all/specific activity\n, and select\nAllow\n.\nFor all other apps, set the\nAccess by apps that aren’t on the 'unallowed apps' list\nsetting to\nBlock\n.\nHow DLP applies restrictions to activities\nInteractions between\nFile activities for apps in restricted app groups\n,\nFile activities for all apps\n, and the\nRestricted app activities\nlist are scoped to the same rule.\nRestricted app groups overrides\nConfigurations defined in\nFile activities for apps in restricted app groups\noverride the configurations in the\nRestricted app activities\nlist and\nFile activities for all apps\nin the same rule.\nRestricted app activities and File activities for all apps\nThe configurations of\nRestricted app activities\nand\nFile activities for all apps\nwork in concert if the action defined for\nRestricted app activities\nis either\nAudit only\n, or\nBlock with override\nin the same rule. Why? 
Actions defined for\nRestricted app activities\nonly apply when a user accesses a file using an app that's on the list. Once the user has access, the actions defined for activities in\nFile activities for all apps\napply.\nFor example, you add Notepad.exe to\nRestricted apps\nand configure\nFile activities for all apps\nto\nApply restrictions to specific activity\n. You configure both as shown in the following table:\nSetting in policy\nApp name\nUser activity\nDLP action to take\nRestricted app activities\nNotepad\nAccess a DLP protected item\nAudit only\nFile activities for all apps\nAll apps\nCopy to clipboard\nAudit only\nFile activities for all apps\nAll apps\nCopy to a USB removable device\nBlock\nFile activities for all apps\nAll apps\nCopy to a network share\nAudit only\nFile activities for all apps\nAll apps\nPrint\nBlock\nFile activities for all apps\nAll apps\nCopy or move using unallowed Bluetooth app\nBlock\nFile activities for all apps\nAll apps\nRemote desktop services\nBlock with override\nWhen User A opens a DLP-protected file using Notepad, DLP allows the access and audits the activity. While still in Notepad, User A then tries to copy content from the protected item to the clipboard. This action is successful, and DLP audits the activity. 
User A then tries to print the protected item from Notepad and the activity is blocked.\nNote\nWhen the DLP action to take in\nRestricted app activities\nis set to\nblock\n, all access is blocked and the user cannot perform any activities on the file.\nFile activities for all apps only\nIf an app\nisn't\nin the\nFile activities for apps in restricted app groups\nor the\nRestricted app activities\nlist, or\nis\nin the\nRestricted app activities\nlist, with an action of either\nAudit only\n, or\nBlock with override\n, any restrictions defined in the\nFile activities for all apps\nare applied in the same rule.\nmacOS devices\nYou can also prevent macOS apps from accessing sensitive data by defining them in the\nRestricted app activities\nlist.\nNote\nCross-platform apps must be entered with their unique paths respective to the OS they are running.\nTo find the full path of Mac apps:\nOn the macOS device, open\nActivity Monitor\n. Find and double-click the process you want to restrict.\nSelect the\nOpen Files and Ports\ntab.\nMake a note of the full path name, including the name of the app. For example,\n/System/Applications/TextEdit.app/Contents/MacOS/TextEdit\nAuto-quarantine\nTo prevent sensitive items from being synced to the cloud by cloud sync apps such as\nonedrive.exe\n, add the cloud sync app to the\nRestricted apps\nlist with\nAuto-quarantine\n.\nWhen enabled, auto-quarantine is triggered when a restricted app attempts to access a DLP-protected sensitive item. Auto-quarantine moves the sensitive item to an admin-configured folder. If configured to do so, auto-quarantine can leave a placeholder (\n.txt\n) file in place of the original. You can configure the text in the placeholder file to tell users the new location of the item, and other pertinent information.\nUse the auto-quarantine feature when an unallowed cloud-sync app tries to access an item that is protected by a blocking DLP policy. DLP might generate repeated notifications. 
You can avoid these repeated notifications by enabling\nAuto-quarantine\n.\nYou can also use auto-quarantine to prevent an endless chain of DLP notifications for the user and admins. For more information, see\nScenario 4: Avoid looping DLP notifications from cloud synchronization apps with auto-quarantine\n.\nUnsupported file extension exclusions\nYou can use the\nDocument could not be scanned\ncondition together with\nApply restrictions to only unsupported file extensions\nin your DLP policies to restrict activities involving files with extensions that aren’t supported by endpoint DLP. Because this can potentially include many unsupported file extensions, you can refine detection by adding unsupported extensions to exclude. For more information, see\nHelp protect files that Endpoint Data Loss Prevention doesn't scan\n,\nHelp protect against sharing of a defined set of unsupported files\n.\nNote\nDo not add a leading ‘.’ when you add an extension, and use the latest Antimalware client version.\nImportant\nThe\nApply restrictions to only unsupported file extensions\noption appears only when you select\nActions\n>\nAudit or restrict activities on devices\nin the rule configuration.\nThe\nApply restrictions to only unsupported file extensions\nconfiguration option does not support scoping by\nDevice and device groups\nin the policy location setting.\nBlocking specific file extensions in DLP policies can lead to unexpected behavior if an application that is marked as unallowed needs to access files with those extensions as part of its normal operation. For example, certain apps may read or temporarily open files, like\n.dll\n,\n.json\n, or\n.tmp\n, during routine processes such as rendering, caching, or validating content. If these extensions are blocked, the app may fail to function properly, causing errors, incomplete workflows, or enforcement pop-ups unrelated to user intent. 
Before implementing extension-based restrictions, make sure you know which apps interact with these file types during standard operations, and whether alternative controls, such as app restrictions or contextual rules, can achieve the security goal without disrupting functionality.\nUnallowed Bluetooth apps\nTo prevent people from transferring files protected by your policies via specific Bluetooth apps, add those apps to the\nUnallowed Bluetooth apps\nlist in Endpoint DLP settings.\nBrowser and domain restrictions to sensitive data\nRestrict sensitive files that match your policies from being shared with unrestricted cloud service domains.\nUnallowed browsers\nFor Windows devices you can restrict the use of specified web browsers, identified by their executable names. The specified browsers are blocked from accessing files that match the conditions of an enforced DLP policy where the upload-to-cloud services restriction is set to\nblock\nor\nblock with override\n. When these browsers are blocked from accessing a file, end users see a toast notification asking them to open the file through Microsoft Edge.\nFor macOS devices, you must add the full file path. To find the full path of Mac apps:\nOn the macOS device, open\nActivity Monitor\n. Find and double-click the process you want to restrict.\nSelect the\nOpen Files and Ports\ntab.\nMake a note of the full path name, including the name of the app.\nService domains\nThe\nService domains\nhere work together with the\nAudit or restrict activities on devices\nsetting found in the workflow for creating a rule within a DLP policy.\nWhen you create a rule, you use actions to protect your content when certain conditions are met. 
When creating rules for endpoint devices, you need to choose the\nAudit or restrict activities on devices\noption, and select one of these options:\nAudit only\nBlock with override\nBlock\nTo control whether sensitive files that are protected by your policies can be uploaded to specific service domains, you next need to navigate to\nEndpoint DLP Settings\n>\nBrowser and domain restrictions to sensitive data\nand choose whether to\nblock\nor\nallow\nService domains\nby default.\nNote\nThe\nService domains\nsetting only applies to files uploaded using Microsoft Edge, or using instances of Google Chrome or Mozilla Firefox that have the\nMicrosoft Purview Chrome Extension\ninstalled.\nBlock\nWhen the\nService domains\nlist is set to\nBlock\n, you use the\nAdd cloud service domain\nto specify domains that should be blocked. All other service domains are allowed. In this case, DLP policy restrictions are only applied when a user attempts to upload a sensitive file to any of the domains on the list.\nFor example, consider the following configurations:\nA DLP policy is configured to detect sensitive items that contain physical addresses and the\nAudit or restrict activities on devices\noption is set to\nAudit only\n.\nThe\nService domains\nsetting is set to\nBlock\n.\ncontoso.com IS NOT ON the list.\nwingtiptoys.com IS ON the list.\nIn this case, if a user attempts to upload a sensitive file with physical addresses to contoso.com, the upload is allowed to complete and an audit event is generated but no alert is triggered.\nIn contrast, if a user attempts to upload a sensitive file with physical addresses to wingtiptoys.com, the user activity--the upload--is also allowed to complete and both an audit event and an alert are generated.\nAs another example, consider the following configuration:\nA DLP policy is configured to detect sensitive items that contain physical addresses and the\nAudit or restrict activities on devices\noption is set to\nBlock\n.\nThe\nService 
domains\nsetting is set to\nBlock\n.\ncontoso.com IS NOT ON the list.\nwingtiptoys.com IS ON the list.\nIn this case, if a user attempts to upload a sensitive file with physical addresses to contoso.com, the upload is allowed to complete, an audit event is generated, but no alert is triggered.\nIn contrast, if a user attempts to upload a sensitive file with physical addresses to wingtiptoys.com, the user activity--the upload--is blocked and both an audit event and an alert are generated.\nAllow\nWhen the\nService domains\nlist is set to\nAllow\n, you use the\nAdd cloud service domain\nto specify domains that are allowed. All other service domains will have DLP Policy restrictions enforced. In this case, DLP policy restrictions are only applied when a user attempts to upload a sensitive file to a domain that isn't on the list.\nFor example, here are two starting configurations:\nA DLP policy is configured to detect sensitive items that contain credit card numbers and the\nAudit or restrict activities on devices\noption is set to\nBlock with override\n.\nThe\nService domains\nsetting is set to\nAllow\n.\ncontoso.com IS NOT ON the\nAllow\nlist.\nwingtiptoys.com IS ON the\nAllow\nlist.\nIn this case, if a user attempts to upload a sensitive file with credit card numbers to contoso.com, the upload is blocked, a warning displays, giving the user the option to override the block. If the user chooses to override the block, an audit event is generated and an alert is triggered.\nHowever, if a user attempts to upload a sensitive file with credit card numbers to wingtiptoys.com, the policy restriction\nisn't\napplied. 
The upload is allowed to complete, and an audit event is generated but no alert is triggered.\nA DLP policy is configured to detect sensitive items that contain physical addresses and the Audit or restrict activities on devices option is set to Audit only.\nThe\nService domains\nsetting is set to Allow.\ncontoso.com is NOT on the list.\nwingtiptoys.com IS on the list.\nIn this case, if a user attempts to upload a sensitive file with physical addresses to contoso.com, the upload is allowed to complete and both an audit event and an alert are generated.\nIn contrast, if a user attempts to upload a sensitive file with physical addresses to wingtiptoys.com, the user activity--the upload--is also allowed to complete, an audit event is generated but no alert is triggered.\nImportant\nWhen the service restriction mode is set to\nAllow\n, you must have at least one service domain configured before restrictions are enforced.\nSummary table: Allow/Block behavior\nThe following table shows how the system behaves depending on the settings listed.\nEndpoint DLP Service domain setting\nDLP policy rule Audit or restrict activities on devices setting\nUser goes to a listed site\nUser goes to a site NOT listed\nAllow\nAudit only\n- User activity is audited\n- No alert is generated\n- No DLP policies are applied\n- User activity is audited\n- An alert is generated\n- DLP policies are applied in Audit mode\nAllow\nBlock with override\n- User activity is audited\n- No alert is generated\n- No DLP policies are applied\n- User activity is audited\n- An alert is generated\n- DLP policies are applied in Block with override mode\nAllow\nBlock\n- User activity is audited\n- No alert is generated\n- No DLP policies are applied\n- User activity is audited\n- An alert is generated\n- DLP policies are applied in Block mode\nBlock\nAudit only\n- User activity is audited\n- An alert is generated\n- DLP policies are applied in Audit mode\n- User activity is audited\n- No alert is generated\n- No 
DLP policies are applied\nBlock\nBlock with override\n- User activity is audited\n- An alert is generated\n- DLP policies are applied in Block with override mode\n- User activity is audited\n- No alert is generated\n- No DLP policies are applied\nBlock\nBlock\n- User activity is audited\n- An alert is generated\n- DLP policies are applied in Block mode\n- User activity is audited\n- No alert is generated\n- No DLP policies are applied\nWhen adding a domain to the list, use the FQDN format of the service domain without the ending period (\n.\n). Use\n*.\nas a wildcard to specify all domains or subdomains. Exclude the protocol (anything before\n//\n) from the domain. Only include the host name, without any subsites.\nFor example:\nInput\nURL matching behavior\ncontoso.com\nMatches the specified domain name, and any subsite\n:\n://contoso.com\n://contoso.com/\n://contoso.com/anysubsite1\n://contoso.com/anysubsite1/anysubsite2 (etc.)\nDoes not match sub-domains or unspecified domains\n:\n://anysubdomain.contoso.com\n://anysubdomain.contoso.com.AU\n*\n.contoso.com\nMatches the specified domain name, any subdomain, and any site\n:\n://contoso.com\n://contoso.com/anysubsite\n://contoso.com/anysubsite1/anysubsite2\n://anysubdomain.contoso.com/\n://anysubdomain.contoso.com/anysubsite/\n://anysubdomain1.anysubdomain2.contoso.com/anysubsite/\n://anysubdomain1.anysubdomain2.contoso.com/anysubsite1/anysubsite2 (etc.)\nDoes not match unspecified domains\n://anysubdomain.contoso.com.AU/\nwww.contoso.com\nMatches the specified domain name\n:\nwww.contoso.com\nDoes not match unspecified domains or subdomains\n:\n://anysubdomain.contoso.com/ (in this case, you must add the FQDN itself, for example\nwww.contoso.com\n, to the list)\nYou can configure up to 50 domains under\nSensitive Service domains\n.\nNote\nThe Service domains list setting only applies to file uploads to websites. 
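The FQDN matching rules in the table above can be sketched as a short host matcher. This is a minimal sketch, assuming host-only comparison as described (entries carry no protocol and no subsites); it is not Purview's implementation:

```python
from urllib.parse import urlsplit

def host_matches(entry: str, url: str) -> bool:
    """An entry matches the URL's host exactly; a leading '*.' also
    matches any subdomain. Paths (subsites) are ignored, because list
    entries contain only the host name."""
    host = urlsplit(url).hostname or ""
    if entry.startswith("*."):
        base = entry[2:]
        return host == base or host.endswith("." + base)
    return host == entry

print(host_matches("contoso.com", "https://contoso.com/sub1/sub2"))    # True
print(host_matches("contoso.com", "https://sub.contoso.com/"))         # False
print(host_matches("*.contoso.com", "https://a.b.contoso.com/site"))   # True
print(host_matches("*.contoso.com", "https://sub.contoso.com.au/"))    # False
```

Note how the `*.contoso.com` branch still rejects `contoso.com.au` hosts, matching the "Does not match unspecified domains" rows of the table.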
Actions like pasting into a browser do not follow the Service Domain list.\nSensitive service domain groups\nWhen you list a website in\nSensitive service domains\n, you can\naudit\n,\nblock with override\n, or fully\nblock\nuser activity when users attempt to take any of the following actions:\nprint from a website\ncopy data from a website\nsave a website as local files\nupload or drag/drop a sensitive file to an excluded website\npaste sensitive data to an excluded website\nThe following table shows which browsers support these features:\nBrowser\nSupported Feature\nMicrosoft Edge\n- Print the site\n- Copy data from the site\n- Save the site as local files (save-as)\n- Paste to supported browsers\n- Upload to a restricted cloud service domain\nGoogle Chrome (with the Microsoft Purview extension)\n- Paste to supported browsers\n- Upload to a restricted cloud service domain\nMozilla Firefox (with the Microsoft Purview extension)\n- Upload to a restricted cloud service\n- Paste to supported browsers\nFor the\nPaste to supported browsers\naction, there may be a brief time lag between when the user attempts to paste text into a web page and when the system finishes classifying it and responds. If this classification latency happens, you may see both policy-evaluation and check-complete notifications in Edge, or a policy-evaluation toast in Chrome and Firefox. Here are some tips for minimizing the number of notifications:\nNotifications are triggered when a policy for the target website is configured to\nBlock\nor\nBlock with override\nthe\nPaste to supported browsers\naction for that user. You can configure the overall action to\nAudit\nand then use exceptions to\nBlock\nthe target websites. 
Alternatively, you can set the overall action to\nBlock\nand then use exceptions to\nAudit\nthe secure websites.\nUse the latest Antimalware client version.\nEnsure your version of Microsoft Edge is 120 or higher.\nInstall these Windows KBs:\nWindows 10:\nKB5032278\n,\nKB5023773\nWindows 11 21H2:\nKB5023774\nWindows 11 22H2:\nKB5032288\n,\nKB5023778\nOn macOS, ensure your Antimalware client version is 101.25022.0003 or later\nThe\nPaste to supported browsers\naction does not follow the behavior defined in the Service Domain list. However, if Sensitive Service Domain Groups are configured on the rule for Paste To Browser, those are honored.\nNote\nThe\nService domains\nsetting only applies to files uploaded using Microsoft Edge or an instance of Google Chrome or Mozilla Firefox that has the\nMicrosoft Purview Chrome Extension\ninstalled.\nThe Generative AI Websites group contains these\nsupported sites\n. The group is used for default policies within Data Security Posture Management for AI and cannot be edited or deleted.\nFor devices, you must configure the\nSensitive service domains\nlist to use the\nUpload to a restricted cloud service domain\naction in a DLP policy. You can also define website groups that you want to assign policy actions to that are different from the global website group actions. You can add a maximum of 100 websites into a single group and you can create a maximum of 150 groups. This gives a maximum of 15,000 websites that the policy actions can be assigned to. For more information, see\nScenario 6: Monitor or restrict user activities on sensitive service domains\n.\nImportant\nFor the\nPaste to supported browser\naction: if 'Collect original file as evidence for all selected file activities on Endpoint' is enabled on the rule for this feature, garbage characters might appear in the source text if the user's\nWindows device doesn't have Antimalware Client Version 4.18.23110 or newer installed. 
Select\nActions\n>\nDownload\nto view the actual content.\nFor more information, see\nScenario 7: Restrict pasting sensitive content into a browser\n.\nSupported syntax for designating websites in a website group\nIf you use URLs to identify websites, don't include the networking protocol as part of the URL (for instance,\nhttps://\nor\nfile://\n). Instead, use a flexible syntax to include and exclude domains, subdomains, websites, and subsites in your website groups. For example,\nUse\n*.\nas a wildcard to specify all domains or all subdomains.\nUse\n/\nas a terminator at the end of a URL to scope to that specific site only.\nWhen you add a URL without a terminating slash mark (\n/\n), that URL is scoped to that site and all subsites. You can only add\n*.\nto the beginning of a domain. The\n/\nterminator is only supported at the end of a domain.\nThis syntax applies to all http/https websites. Here are some examples:\nURL added to the website group\nURL will match\nURL won't match\ncontoso.com\nhttp://\ncontoso.com\nhttps://\ncontoso.com\nhttps://\ncontoso.com/\nhttps://\ncontoso.com/allsubsites1\nhttps://\ncontoso.com/allsubsites1/allsubsites2\nhttps://\nallsubdomains.contoso.com\nhttps://\nallsubdomains.contoso.com.au\nhttps://\nwww.contoso.com\ncontoso.com/\nhttp://\ncontoso.com\nhttps://\ncontoso.com\nhttps://\ncontoso.com/\nhttps://\ncontoso.com/allsubsites1\nhttps://\ncontoso.com/allsubsites1/allsubsites2\nhttps://\nallsubdomains.contoso.com\nhttps://\nallsubdomains.contoso.com/au\nhttps://\nwww.contoso.com\n*.contoso.com\nhttp://\ncontoso.com\nhttps://\ncontoso.com\nhttps://\nwww.contoso.com\nhttps://\nwww.contoso.com/allsubsites\nhttps://\ncontoso.com/allsubsites1/allsubsites2\nhttps://\nallsubdomains.contoso.com\nhttps://\nallsubdomains.contoso.com/allsubsites\nhttps://\nallsubdomains1.allsubdomains2.contoso.com/allsubsites1/allsubsites2\nhttps://\nallsubdomains.contoso.com.au\n*.contoso.com/xyz\nhttp://\ncontoso.com/xyz/\nhttps://\ncontoso.com/xyz/\nhttps
://\ncontoso.com/xyz/allsubsites/\nhttps://\nallsubdomains.contoso.com/xyz/\nhttps://\nallsubdomains.contoso.com/xyz/allsubsites\nhttps://\nallsubdomains1.allsubdomains2.contoso.com/xyz/allsubsites\nhttps://\nallsubdomains1.allsubdomains2.contoso.com/xyz/allsubsites1/allsubsites2\nhttps://\ncontoso.com/xyz\nhttps://\nallsubdomains.contoso.com/xyz\n*.contoso.com/xyz/\nhttp://\ncontoso.com/xyz\nhttps://\ncontoso.com/xyz\nhttps://\ncontoso.com/xyz/\nhttps://\nallsubdomains.contoso.com/xyz\nhttps://\nallsubdomains.contoso.com/xyz/\nhttps://\ncontoso.com\nhttps://\ncontoso.com/xyz/allsubsites/\nhttps://\nallsubdomains.contoso.com/xyz/allsubsites/\nhttps://\nallsubdomains1.allsubdomains2.contoso.com/xyz/allsubsites/\nhttps://\nallsubdomains1.allsubdomains2.contoso.com/xyz/allsubsites1/allsubsites2\nSupported syntax for designating IP ranges or IP addresses (preview)\nIn preview, DLP supports using IP addresses and address ranges to identify websites. Set the Match type to\nIP address\nor\nIP address range\nand then enter a specific IP address or an IP range in the\nSensitive service domain\nfield, and select\nAdd site\nto add the selection to the Sensitive service domain group.\nExamples of supported syntax:\n1.1.1.1\n1.1.1.1-2.2.2.2\n2001:0db8:85a3:0000:0000:8a2e:0370:7334\n2001:0db8:85a3:0000:0000:8a2e:0370:7320-2001:0db8:85a3:0000:0000:8a2e:0370:7334\nImportant\nURLs support these actions:\nPrint the site\nCopy data from the site\nSave the site as local files (save-as)\nPaste to supported browsers\nUpload to a restricted cloud service domain\nIP address and IP address range support these actions:\nPrint the site\nCopy data from the site\nSave the site as local files (save-as)\nUpload to a restricted cloud service domain\n(Windows only)\nSensitive service domain groups contain a preconfigured group for\nGenerative AI websites\n. 
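Membership in the single-address and dash-range syntax shown above can be checked with Python's standard ipaddress module. This is an illustrative sketch of the check, not the product's matching logic:

```python
import ipaddress

def ip_in_entry(entry: str, ip: str) -> bool:
    """Match an address against either a single IP entry or a
    'low-high' dash range (works for IPv4 and IPv6; IPv6 addresses
    use ':' so a '-' can only be the range separator)."""
    addr = ipaddress.ip_address(ip)
    if "-" in entry:
        lo, hi = (ipaddress.ip_address(p) for p in entry.split("-"))
        return lo <= addr <= hi
    return addr == ipaddress.ip_address(entry)

print(ip_in_entry("1.1.1.1-2.2.2.2", "1.200.3.4"))   # True
print(ip_in_entry("1.1.1.1", "1.1.1.2"))             # False
```

Both ends of a range must be the same IP version, mirroring the IPv4 and IPv6 example pairs above.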
For a list of all the websites in this group, see\nList of AI sites supported by Microsoft Purview Data Security Posture Management for AI\nAdditional settings for Endpoint DLP\nBusiness justification in policy tips\nYou can control how users interact with the business justification option in\nOptions for configuring policy tips\n. This option appears when users perform an activity that's protected by the\nBlock with override\nsetting in a DLP policy. This is a global setting. You can choose from one of the following options:\nShow default options and custom text box\n: By default, users can select either a built-in justification, or enter their own text.\nOnly show default options\n: Users are limited to selecting from a list of built-in justifications.\nOnly show custom text box\n: Users are limited to entering a custom justification. The text box appears in the end-user policy tip notification, without a list of options.\nCustomizing the options in the drop-down menu\nYou can create up to five customized options that appear when users interact with the policy notification tip by selecting the\nCustomize the options drop-down menu\n.\nOption\nDefault text\noption 1\nThis is part of an established business workflow\nor you can enter customized text\noption 2\nMy manager has approved this action\nor you can enter customized text\noption 3\nUrgent access required; I'll notify my manager separately\nor you can enter customized text\nShow false positive option\nThe information in these files is not sensitive\nor you can enter customized text\noption 5\nOther\nor you can enter customized text\nTurn on automatic diagnostic logging for endpoint DLP\nThe Microsoft Purview Always-on diagnostics feature automatically records comprehensive trace logs, saving you time and enabling faster troubleshooting. 
For more information, see\nAlways-on diagnostics for endpoint DLP\n.\nEnable Endpoint DLP for Windows Servers\nEndpoint DLP supports the following versions of Windows Server:\nWindows Server 2019 (\nNovember 14, 2023—KB5032196 (OS Build 17763.5122) - Microsoft Support\n)\nWindows Server 2022 (\nNovember 14, 2023 Security update (KB5032198) - Microsoft Support\n)\nOnce you\nonboard a Windows Server\n, you must turn on Endpoint DLP support before endpoint protection is applied.\nTo work with the DLP alert management dashboard:\nIn the Microsoft Purview portal, navigate to\nData loss prevention\n>\nOverview\n.\nChoose\nSettings\nin the top right corner.\nOn the\nSettings\npage, select\nEndpoint settings\nand expand\nEndpoint DLP support for onboarded servers\n.\nSet the toggle to\nOn\n.\nAlways audit file activity for devices\nBy default, when devices are onboarded, activity for Office, PDF, and CSV files is automatically audited and available for review in activity explorer. Turn off this feature if you want this activity to be audited only when onboarded devices are included in an active policy. The Always audit file activity for devices setting enables the auditing of file activities for documents where a DLP Rule did not match: File Created, File Modified, File Renamed, File created on removable media, and File created on network share.\nFile activity is always audited for onboarded devices, regardless of whether they're included in an active policy.\nPrinter groups\nUse this setting to define groups of printers that you want to assign policy actions to that are different from the global printing actions.\nThe most common use case for creating printer groups is to use them for limiting the printing of contracts to only those printers in an organization's Legal department. After you define a printer group here, you can use it in all of your policies that are scoped to\nDevices\n. 
For more information on configuring policy actions to use authorization groups, see\nScenario 8: Authorization groups\n.\nYou can create a maximum of 20 printer groups. Each group can contain a maximum of 50 printers.\nNote\nThis feature is available for devices running any of the following Windows versions:\nWindows 10 and later (21H1, 21H2, and later) -\nKB5020030\nWindows 11 21H2 -\nKB5019157\nWindows 11 22H2 -\nKB5020044\nWindows Server 2022 -\nKB5020032\nLet's look at an example. Say you want your DLP policy to block printing of contracts to all printers except for those that are in the legal department.\nUse the following parameters to assign printers in each group.\nFriendly printer\nname - the Friendly printer name is the value shown in the UI when you select the printer.\nUSB printer\n- A printer connected through a computer's USB port. Select this option if you want to enforce any USB printer while leaving the USB product ID and USB vendor ID unselected. You can also assign a specific USB printer by specifying its USB product ID and USB vendor ID.\nUSB product ID\n- Get the\nDevice Instance\npath value from the printer device property details in device manager. Convert that value to the Product ID and Vendor ID format. For more information, see\nStandard USB identifiers\n.\nUSB vendor ID\n- Get the\nDevice Instance\npath value from the printer device property details in device manager. Convert that value to the Product ID and Vendor ID format. For more information, see\nStandard USB identifiers\n.\nIP range\nPrint to file\n- Microsoft Print to PDF or Microsoft XPS Document Writer. If you only want to enforce Microsoft Print to PDF, you should use Friendly printer name with 'Microsoft Print to PDF'.\nUniversal print deployed on a printer\n- For more information on universal printers, see\nSet up Universal Print\n.\nCorporate printer\n- A print queue shared through an on-premises Windows print server in your domain. 
Its path might look like this: \\print-server\\contoso.com\\legal_printer_001.\nPrint to local\n- Any printer connected through a Microsoft print port but not any of the above types. For example: printing through remote desktop or a redirected printer.\nNote\nYou should not combine more than one of the\nUSB printer\n,\nIP range\n,\nPrint to file\n,\nUniversal print deployed on a printer\n,\nCorporate printer\n, and\nPrint to local\nparameters.\nAssign each printer in the group a\nDisplay name\n. These names appear only in the Microsoft Purview console.\nCreate a printer group\nnamed\nLegal printers\nand add individual printers (with an alias) by their friendly name; for instance:\nlegal_printer_001\n,\nlegal_printer_002\n, and\nlegal_color_printer\n.\n(You can select multiple parameters at once to help you unambiguously identify a specific printer.)\nAssign the policy actions to the group in a DLP policy:\nAllow\n(audit with no user notifications or alerts)\nAudit only\n(you can add notifications and alerts)\nBlock with override\n(blocks the action, but the user can override)\n(Preview) A fix has been implemented to address the unnecessary resubmission of the print job to the queue after an initial override.\nBlock\n(blocks no matter what)\nCreate a Printer group\nOpen\nMicrosoft Purview portal\nand navigate to\nData Loss Prevention\n>\nOverview\n> settings gear icon in the upper right corner >\nData Loss Prevention\n>\nEndpoint DLP settings\n>\nPrinter groups\n.\nSelect\n+ Create printer group\n.\nGive the group a name.\nSelect\nAdd printer\n.\nGive the printer a\nFriendly name\n. Ensure the name matches the value from the printer's device property details in Device Manager.\nSelect the parameters and provide the values to unambiguously identify the specific printer.\nSelect\nAdd\n.\nAdd other printers as needed.\nSelect\nSave\nand then\nClose\n.\nFile extension groups\nUse this setting to define groups of file extensions that you want to assign policy actions to. 
For example, only apply a\nFile could not be scanned\npolicy to file extensions in the created groups.\nNote\nDo not include the ‘.’ when you add an extension.\nDisable classification\nUse this setting to exclude specific file extensions from Endpoint DLP classification.\nFor files that are on the\nMonitored files\nlist, you can disable classification through this setting. Once you put a file extension in this setting, Endpoint DLP will not scan content in files with this extension. As a result, Endpoint DLP will not perform policy evaluation based on the content of those files. You will not be able to see content information for the purposes of conducting investigations.\nNote\nDo not include the ‘.’ when you add an extension.\nRemovable USB device groups\nUse this setting to define groups of removable storage devices, such as USB thumb drives, that you want to assign policy actions to that are different from the global removable storage actions. For example, say you want your DLP policy to block items with engineering specifications from being copied to removable storage devices, except for designated USB-connected hard drives that are used to back up data for offsite storage.\nYou can create a maximum of 20 groups, with a maximum of 50 removable storage devices in each group.\nNote\nThis feature is available for devices running any of the following Windows versions:\nWindows 10 and later (21H1, 21H2) with KB 5018482\nWin 11 21H2, 22H2 with KB 5018483\nWindows 10 RS5 (KB 5006744) and Windows Server 2022\nUse the following parameters to define your removable storage devices.\nStorage device friendly name\n- Get the Friendly name value from the storage device property details in device manager. Wildcard values (*) are supported.\nUSB product ID\n- Get the Device Instance path value from the USB device property details in device manager. Convert it to Product ID and Vendor ID format. 
For more information, see\nStandard USB identifiers\n.\nUSB vendor ID\n- Get the Device Instance path value from the USB device property details in device manager. Convert it to Product ID and Vendor ID format. For more information, see\nStandard USB identifiers\n.\nSerial number ID\n- Get the serial number ID value from the storage device property details in device manager. Wildcard values (*) are supported.\nDevice ID\n- Get the device ID value from the storage device property details in device manager. Wildcard values (*) are supported.\nInstance path ID\n- Get the device ID value from the storage device property details in device manager. Wildcard values (*) are supported.\nHardware ID\n- Get the hardware ID value from the storage device property details in device manager. Wildcard values (*) are supported.\nYou assign each removable storage device in the group an\nAlias\n. The alias is a friendly name that only appears in the Microsoft Purview console. So, continuing with the example, you would create a removable storage device group named\nBackup\nand add individual devices (with an alias) by their friendly name, like\nbackup_drive_001\nand\nbackup_drive_002\n.\nYou can select multiple parameters at once; the device group then includes all devices that satisfy those parameters.\nYou can assign these policy actions to the group in a DLP policy:\nAllow\n(audit with no user notifications or alerts)\nAudit only\n(you can add notifications and alerts)\nBlock with override\n(blocks the action, but the user can override)\nBlock\n(blocks no matter what)\nCreate a removable USB device group\nOpen\nMicrosoft Purview portal\nand navigate to\nData Loss Prevention\n>\nOverview\n> settings gear icon in the upper right corner >\nData Loss Prevention\n>\nEndpoint DLP settings\n>\nRemovable USB device groups\n.\nSelect\n+ Create removable storage device group\n.\nProvide a\nGroup name\n.\nSelect\nAdd removable storage device\n.\nProvide an\nAlias\n.\nSelect the parameters and 
provide the values to unambiguously identify the specific device.\nSelect\nAdd\n.\nAdd other devices to the group as needed.\nSelect\nSave\nand then\nClose\n.\nThe most common use case for creating removable storage groups is to use them to specify which removable storage devices users can copy files to. Generally, copying is only allowed for devices in a designated\nBackup\ngroup.\nAfter you define a removable storage device group, you can use it in all of your policies that are scoped to\nDevices\n. See\nScenario 8: Authorization groups\nfor more information on configuring policy actions to use authorization groups.\nNetwork share groups\nUse this setting to define groups of network share paths that you want to assign policy actions to that are different from the global network share path actions. For example, say you want your DLP policy to prevent users from saving or copying protected files to network shares except the network shares in a particular group.\nNote\nThis feature is available for devices running any of the following Windows versions:\nWindows 10 and later (21H1, 21H2) with KB 5018482\nWin 11 21H2, 22H2 with KB 5018483\nWindows 10 RS5 (KB 5006744) and Windows Server 2022\nTo include network share paths in a group, define the prefix that all the shares start with. For example:\n'\\Library' will match:\n\\Library folder and all its subfolders.\nYou can use wildcards; for example, '\\Users*\\Desktop' will match:\n'\\Users\\user1\\Desktop'\n'\\Users\\user1\\user2\\Desktop'\n'\\Users*\\Desktop'\nYou can also use environment variables, for example:\n%AppData%\\app123\nWildcard values are supported. 
So a path definition can contain an asterisk (\n*\n) in the middle of the path or at the end of the path.\nExample:\n\\\\Lib*\nwill cover\n\\\\Library\nYou can assign the following policy actions to the group in a DLP policy:\nAllow\n(audit with no user notifications or alerts)\nAudit only\n(you can add notifications and alerts)\nBlock with override\n(blocks the action, but the user can override)\nBlock\n(blocks no matter what)\nOnce you define a network share group, you can use it in all of your DLP policies that are scoped to\nDevices\n. For more information about configuring policy actions to use authorization groups, see\nScenario 8 Authorization groups\n.\nCreate a Network Share group\nOpen\nMicrosoft Purview portal\nand navigate to\nData Loss Prevention\n>\nOverview\n> settings gear icon in the upper right corner >\nData Loss Prevention\n>\nEndpoint DLP settings\n>\nNetwork share groups\n.\nSelect\n+ Create network share group\n.\nProvide a\nGroup name\n.\nAdd the file path to the share.\nSelect\nAdd\n.\nAdd other share paths to the group as needed.\nSelect\nSave\nand then\nClose\n.\nVPN settings\nUse the VPN list to control only those actions that are being carried out over that VPN.\nNote\nThis feature is available for devices running any of these versions of Windows:\nWindows 10 and later (21H1, 21H2) with KB 5018482\nWin 11 21H2, 22H2 with KB 5018483\nWindows 10 RS5 (KB 5006744)\nWhen you list a VPN in\nVPN Settings\n, you can assign the following policy actions to it:\nAllow\n(audit with no user notifications or alerts)\nAudit only\n(you can add notifications and alerts)\nBlock with override\n(blocks the action, but the user can override)\nBlock\n(blocks no matter what)\nThese actions can be applied individually or collectively to the following user activities:\nCopy to clipboard\nCopy to a USB removable device\nCopy to a network share\nPrint\nCopy or move using an unallowed (restricted) Bluetooth app\nCopy or move using RDP\nWhen configuring a DLP policy 
to restrict activity on devices, you can control what happens to each activity performed when users are connected to your organization through any of the VPNs listed.\nUse the\nServer address\nor\nNetwork address\nparameters to define the VPN allowed.\nGet the Server address or Network address\nOn a DLP monitored Windows device, open a\nWindows PowerShell\nwindow as an administrator.\nRun the following cmdlet, which returns multiple fields and values.\nGet-VpnConnection\nAmong the results of the cmdlet, find the\nServerAddress\nfield and record that value. You use the\nServerAddress\nwhen you create a VPN entry in the VPN list.\nFind the\nName\nfield and record that value. The\nName\nfield maps to the\nNetwork address\nfield when you create a VPN entry in the VPN list.\nAdd a VPN\nOpen\nMicrosoft Purview portal\nand navigate to\nData Loss Prevention\n>\nOverview\n> settings gear icon in the upper right corner >\nData Loss Prevention\n>\nEndpoint DLP settings\n>\nVPN settings\n.\nSelect\nAdd or edit VPN addresses\n.\nProvide either the\nServer address\nor\nNetwork address\nthat you recorded after running\nGet-VpnConnection\n.\nSelect\nSave\n.\nClose the item.\nImportant\nUnder the\nNetwork restrictions\nsetting, you will also see\nCorporate network\nas an option.\nCorporate network\nconnections are all connections to your organization's resources. You can see if a device is using a\nCorporate network\nby running the\nGet-NetConnectionProfile\ncmdlet as an administrator. If the\nNetworkCategory\nin the output is\nDomainAuthenticated\n, it means the machine is connected to the Corporate network. If the output is anything else, the machine is not.\nIn some cases, a machine can be both VPN connected and Corporate network connected. If both are selected under the\nNetwork restrictions\n, Endpoint DLP will apply the action based on the order. 
If you want the action for VPN to be the one that's applied, move the VPN entry above\nCorporate network to\nhave higher priority than the action for\nCorporate network\n.\nSee\nScenario 9: Network exceptions\nfor more information on configuring policy actions to use network exceptions.\nSee also\nLearn about Endpoint data loss prevention\nGet started with Endpoint data loss prevention\nLearn about data loss prevention\nGet started with Activity explorer\nMicrosoft Defender for Endpoint\nOnboard Windows 10 and Windows 11 devices into Microsoft Purview overview\nMicrosoft 365 subscription\nMicrosoft Entra joined\nDownload the new Microsoft Edge based on Chromium\nCreate and Deploy data loss prevention policies\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Configure Settings", "section": "Microsoft Purview" }, "https://learn.microsoft.com/en-us/purview/information-barriers": { - "content_hash": "sha256:1437ec61fdc141f09036aceef697a30ef84676632f131a43ff057fa4d57c5c3b", - "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nLearn about Information Barriers\nFeedback\nSummarize this article for me\nMicrosoft Purview Information Barriers (IB) is a compliance solution that restricts two-way communication and collaboration between groups and users in Microsoft Teams, SharePoint, and OneDrive. Often used in highly regulated industries, IB helps avoid conflicts of interest and safeguards internal information between users and organizational areas.\nWhen you create IB policies, users who can't communicate or share files with other specific users can't find, select, chat, or call those users. IB policies automatically put checks in place to detect and prevent unauthorized communication and collaboration among defined groups and users. IB policies are independent from\ncompliance boundaries\nfor eDiscovery investigations that control user content locations that eDiscovery managers can search.\nIB policies can allow or prevent communication and collaboration between groups and users for the following example scenarios:\nUsers in the\nDay Trader\ngroup can't communicate or share files with the\nMarketing Team\n.\nInstructors in one school can't communicate or share files with students in another school in the same school district.\nFinance personnel working on confidential company information can't communicate or share files with certain groups within their organization.\nAn internal team with trade secret material can't call or chat online with users in certain groups within their organization.\nA research team can only call or chat online with a product development team.\nA SharePoint site for\nDay Trader\ngroup can't be shared or accessed by anyone outside of the\nDay Trader\ngroup.\nA lawyer's data obtained from one client can't be accessed by a lawyer at the same firm who represents a different client.\nGovernment information access and control are limited across departments and groups.\nA group of people in a company can only chat with a client or a 
specific customer via guest access during a customer engagement.\nImportant\nInformation Barriers\nonly supports\ntwo-way communication and collaboration restrictions. For example, a scenario where Marketing can communicate and collaborate with Day Traders, but Day Traders can't communicate and collaborate with Marketing\nisn't supported\n.\nInformation Barriers and Microsoft Teams\nIn Microsoft Teams, IB policies determine and prevent the following kinds of unauthorized communication and collaboration:\nSearching for a user\nAdding a member to a team\nStarting a chat session with someone\nStarting a group chat\nInviting someone to join a meeting\nSharing a screen\nPlacing a call\nSharing a file with another user\nAccessing a file through a sharing link\nIf the users conducting these activities in Microsoft Teams are included in an IB policy to prevent the activity, they can't proceed. In addition, everyone included in an IB policy can be potentially blocked from communicating with other users in Microsoft Teams. When users affected by IB policies are part of the same team or group chat, they might be removed from those chat sessions and further communication with the group might not be allowed.\nFor more information, see\nInformation Barriers in Microsoft Teams\n.\nInformation Barriers and SharePoint and OneDrive\nIn SharePoint and OneDrive, IB policies detect and prevent the following kinds of unauthorized collaboration:\nAdding a member to a site\nAccessing site or content by a user\nSharing site or content with another user\nSearching a site\nFor more information, see\nInformation Barriers in SharePoint\nand\nInformation Barriers in OneDrive\n.\nInformation Barriers and Microsoft Planner\nAs a work management tool,\nMicrosoft Planner\nenables users to collaborate on plans and tasks. 
If your compliance admin configures IB policies to restrict communication and collaboration between user segments, Microsoft Planner supports these restrictions.\nIB policies allow administrators to enable or disable search restrictions in the people picker. By using IB support in Planner, when a user searches for others in the people picker to share a plan or to assign a task, they don't see users from segments they're restricted from communicating with. This restriction prevents users from one segment from sharing plans or assigning tasks to users in another segment.\nSupported Planner applications\nYou can use IB support in Microsoft Planner for basic plans in the following applications:\nPlanner Web\nPlanner in Teams web\nPlanner in Teams desktop\nPlanner in Teams mobile app\nPolicy behavior\nWhen IB policy administrators create a new policy or modify an existing policy, users can still access existing plans shared with them or already assigned tasks. For any subsequent plan sharing or task assignment, an IB policy check is triggered and collaboration is permitted or restricted as defined by the policy.\nInformation Barriers and Exchange Online\nInformation barrier (IB) policies can't restrict communication and collaboration between groups and users in email messages. Only Exchange Online deployments currently support IB policies. 
If your organization needs to define and control email communications, consider using\nExchange mail flow rules\n.\nThe following table summarizes the key differences between IB modes for Exchange Online Address Book Policies (ABPs):\nFeature\nSingle and multi-segment modes\nLegacy mode\nABP management\nAutomatic; no reliance on existing ABPs\nAutomatic; existing ABPs must be removed first\nABP for unsegmented users\nSystem creates ABP with empty address lists\nNot applicable\nCustom ABP changes\nAllowed; keep consistent with IB segments\nNot supported; IB controls all ABPs\nHierarchical address book\nAvailable for users not in an IB segment\nAvailable for users not in an IB segment\nKey prerequisite\nNone for ABPs\nRemove all existing ABPs before enabling IB\nInformation Barriers and Exchange for single and multi-segment modes\nIf your organization uses\nsingle\nor\nmultisegment\nmode\n, Information Barriers no longer relies on Exchange Online Address Book Policies (ABPs). Enabling Information Barriers doesn't affect organizations that use ABPs. If users don't have an ABP defined with associated IB segments and policies, the system automatically creates an ABP with empty address lists for these users. You can change these ABPs as needed. Keep your ABPs consistent with the segments you configure in Information Barriers. Avoid user visibility differences between your existing ABPs and your new Information Barriers configuration.\nInformation Barriers and Exchange for legacy mode\nIf your organization uses\nlegacy\nmode\n, IB policies rely on\nExchange Online Address Book Policies (ABPs)\n. ABPs let organizations virtually assign users into specific groups to provide customized views of the organization's global address list (GAL). When you create IB policies, the system automatically creates ABPs for the policies. 
As you add IB policies in your organization, the structure and behavior of your GAL changes to comply with IB policies.\nBefore you define and apply IB policies, remove all existing Exchange address book policies in your organization. IB policies rely on address book policies, and existing ABPs aren't compatible with the ABPs that IB creates. To remove your existing address book policies, see\nRemove an address book policy in Exchange Online\n. When you enable IB policies and enable hierarchical address book, all users not included in an IB segment see the\nhierarchical address book\nin Exchange Online.\nReady to get started?\nGet started with Information Barriers\nManage IB policies\nUse multi-segment support in Information Barriers\nSee the attributes that can be used for IB policies\nInformation Barriers in Microsoft Teams\nInformation Barriers in SharePoint\nInformation Barriers in OneDrive\nSharePoint & OneDrive insights report\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:0b0e5cba7ed447a86856982da6edbdcbbc23038e28159b854d02afdce066e00f", + "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nLearn about Information Barriers\nFeedback\nSummarize this article for me\nMicrosoft Purview Information Barriers (IB) is a compliance solution that restricts two-way communication and collaboration between groups and users in Microsoft Teams, SharePoint, and OneDrive. 
Often used in highly regulated industries, IB helps avoid conflicts of interest and safeguards internal information between users and organizational areas.\nWhen you create IB policies, users who can't communicate or share files with other specific users can't find, select, chat, or call those users. IB policies automatically put checks in place to detect and prevent unauthorized communication and collaboration among defined groups and users. IB policies are independent from\ncompliance boundaries\nfor eDiscovery investigations that control user content locations that eDiscovery managers can search.\nIB policies can allow or prevent communication and collaboration between groups and users for the following example scenarios:\nUsers in the\nDay Trader\ngroup can't communicate or share files with the\nMarketing Team\n.\nInstructors in one school can't communicate or share files with students in another school in the same school district.\nFinance personnel working on confidential company information can't communicate or share files with certain groups within their organization.\nAn internal team with trade secret material can't call or chat online with users in certain groups within their organization.\nA research team can only call or chat online with a product development team.\nA SharePoint site for\nDay Trader\ngroup can't be shared or accessed by anyone outside of the\nDay Trader\ngroup.\nA lawyer's data obtained from one client can't be accessed by a lawyer at the same firm who represents a different client.\nGovernment information access and control are limited across departments and groups.\nA group of people in a company can only chat with a client or a specific customer via guest access during a customer engagement.\nImportant\nInformation Barriers\nonly supports\ntwo-way communication and collaboration restrictions. 
For example, a scenario where Marketing can communicate and collaborate with Day Traders, but Day Traders can't communicate and collaborate with Marketing\nisn't supported\n.\nInformation Barriers and Microsoft Teams\nIn Microsoft Teams, IB policies determine and prevent the following kinds of unauthorized communication and collaboration:\nSearching for a user\nAdding a member to a team\nStarting a chat session with someone\nStarting a group chat\nInviting someone to join a meeting\nSharing a screen\nPlacing a call\nSharing a file with another user\nAccessing a file through a sharing link\nIf the users conducting these activities in Microsoft Teams are included in an IB policy to prevent the activity, they can't proceed. In addition, everyone included in an IB policy can be potentially blocked from communicating with other users in Microsoft Teams. When users affected by IB policies are part of the same team or group chat, they might be removed from those chat sessions and further communication with the group might not be allowed.\nFor more information, see\nInformation Barriers in Microsoft Teams\n.\nInformation Barriers and SharePoint and OneDrive\nIn SharePoint and OneDrive, IB policies detect and prevent the following kinds of unauthorized collaboration:\nAdding a member to a site\nAccessing site or content by a user\nSharing site or content with another user\nSearching a site\nFor more information, see\nInformation Barriers in SharePoint\nand\nInformation Barriers in OneDrive\n.\nInformation Barriers and Microsoft Planner\nAs a work management tool,\nMicrosoft Planner\nenables users to collaborate on plans and tasks. If your compliance admin configures IB policies to restrict communication and collaboration between user segments, Microsoft Planner supports these restrictions.\nIB policies allow administrators to enable or disable search restrictions in the people picker. 
By using IB support in Planner, when a user searches for others in the people picker to share a plan or to assign a task, they don't see users from segments they're restricted from communicating with. This restriction prevents users from one segment from sharing plans or assigning tasks to users in another segment.\nIB support in Microsoft Planner is available only for basic plans in the following applications:\nPlanner Web\nPlanner in Teams web\nPlanner in Teams desktop\nPlanner in Teams mobile app\nPolicy behavior\nWhen IB policy administrators create a new policy or modify an existing policy, users can still access existing plans shared with them or already assigned tasks. For any subsequent plan sharing or task assignment, an IB policy check is triggered and collaboration is permitted or restricted as defined by the policy.\nInformation Barriers and Exchange Online\nInformation barrier (IB) policies can't restrict communication and collaboration between groups and users in email messages. Only Exchange Online deployments currently support IB policies. 
If your organization needs to define and control email communications, consider using\nExchange mail flow rules\n.\nThe following table summarizes the key differences between IB modes for Exchange Online Address Book Policies (ABPs):\nFeature\nSingle and multi-segment modes\nLegacy mode\nABP management\nAutomatic; no reliance on existing ABPs\nAutomatic; existing ABPs must be removed first\nABP for unsegmented users\nSystem creates ABP with empty address lists\nNot applicable\nCustom ABP changes\nAllowed; keep consistent with IB segments\nNot supported; IB controls all ABPs\nHierarchical address book\nAvailable for users not in an IB segment\nAvailable for users not in an IB segment\nKey prerequisite\nNone for ABPs\nRemove all existing ABPs before enabling IB\nInformation Barriers and Exchange for single and multi-segment modes\nIf your organization uses\nsingle\nor\nmultisegment\nmode\n, Information Barriers no longer relies on Exchange Online Address Book Policies (ABPs). Enabling Information Barriers doesn't affect organizations that use ABPs. If users don't have an ABP defined with associated IB segments and policies, the system automatically creates an ABP with empty address lists for these users. You can change these ABPs as needed. Keep your ABPs consistent with the segments you configure in Information Barriers. Avoid user visibility differences between your existing ABPs and your new Information Barriers configuration.\nInformation Barriers and Exchange for legacy mode\nIf your organization uses\nlegacy\nmode\n, IB policies rely on\nExchange Online Address Book Policies (ABPs)\n. ABPs let organizations virtually assign users into specific groups to provide customized views of the organization's global address list (GAL). When you create IB policies, the system automatically creates ABPs for the policies. 
As you add IB policies in your organization, the structure and behavior of your GAL changes to comply with IB policies.\nBefore you define and apply IB policies, remove all existing Exchange address book policies in your organization. IB policies rely on address book policies, and existing ABPs aren't compatible with the ABPs that IB creates. To remove your existing address book policies, see\nRemove an address book policy in Exchange Online\n. When you enable IB policies and enable hierarchical address book, all users not included in an IB segment see the\nhierarchical address book\nin Exchange Online.\nReady to get started?\nGet started with Information Barriers\nManage IB policies\nUse multi-segment support in Information Barriers\nSee the attributes that can be used for IB policies\nInformation Barriers in Microsoft Teams\nInformation Barriers in SharePoint\nInformation Barriers in OneDrive\nSharePoint & OneDrive insights report\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Information Barriers", "section": "Microsoft Purview" }, @@ -989,7 +989,7 @@ "https://learn.microsoft.com/en-us/purview/data-classification-activity-explorer": { "content_hash": "sha256:b6704c3ead8e4457526c0dbc9ca1e514510d62ce03db79dd935589af4bb100fc", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nGet started with activity explorer\nFeedback\nSummarize this article for me\nActivity explorer\nlets you monitor what's being done with your labeled content. Activity explorer provides a historical view of activities on your labeled content. The activity information comes from the Microsoft 365 unified audit logs. It's transformed and then made available in the activity explorer UI. Activity explorer reports on up to 30 days' worth of data.\nActivity explorer gives you multiple ways to sort and view the data.\nFilters\nFilters are the building blocks of activity explorer. Each filter focuses on a different dimension of the collected data. You can use about 50 different individual filters, including:\nDate range\nActivity type\nLocation\nSensitivity label\nUser\nClient IP\nDevice name\nIs protected\nTo see all the filters, open the filter pane in activity explorer and look at the dropdown list.\nNote\nFilter options are generated based on the first 500 records to ensure optimal performance. This limitation might cause some values to not appear in the filter dropdown.\nFor endpoint events, only the most restrictive DLP rule appears. Filters you apply in activity explorer also operate based on this most restrictive rule.\nFilter sets\nActivity explorer comes with predefined sets of filters to help save time when you want to focus on a specific activity. Filter sets quickly give you a view of higher-level activities than individual filters do. Some of the predefined filter sets are:\nEndpoint DLP activities\nSensitivity labels applied, changed, or removed\nEgress activities\nDLP policies that detected activities\nNetwork DLP activities\nProtected Browser\nYou can also create and save your own filter sets by combining individual filters.\nMicrosoft Security Copilot in activity explorer (preview)\nIn\npreview\n,\nMicrosoft Security Copilot in Microsoft Purview\nis embedded in activity explorer. 
It can help efficiently drill down into Activity data and help you identify activities, files with sensitive info, users, and other details that are relevant to an investigation.\nImportant\nBe sure to check the responses from Security Copilot for accuracy and completeness before taking any action based on the information provided. You can provide feedback to help improve the accuracy of the responses.\nData hunting\nSecurity Copilot skills use all the data available to Microsoft Purview, filters, and filter sets available in activity explorer and use machine learning to provide you with insights into the activity (sometimes referred to as\ndata hunting\n) on your data that is most important to you.\nShow me the top 5 activities from the past week\nFilter and investigate activities\nFind files used in specific activities\nSelecting a prompt automatically opens the Security Copilot side card and shows you the results of the query. You can then further refine the query.\nNatural language to filter set generation\nUse the prompt box to enter complex natural language queries to generate filter sets. For example, you can enter:\nFilter and investigate files copied to cloud with sensitive info type credit card number for past 30 days.\nSecurity Copilot generates a filter set for your query. Review the filter to make sure it fits your needs, then apply it to the data.\nPrerequisites\nSKU/subscriptions licensing\nFor information on licensing, see\nMicrosoft 365 Enterprise Plans\nMicrosoft 365 Service Descriptions\nPermissions\nAn account must be explicitly assigned membership in any one of these role groups, or must be explicitly granted the role.\nRoles and role groups\nUse roles and role groups to fine-tune your access controls. 
For more information, see\nPermissions in the Microsoft Purview portal\n.\nMicrosoft Purview roles\nInformation Protection Admin\nInformation Protection Analyst\nInformation Protection Investigator\nMicrosoft Purview role groups\nInformation Protection\nInformation Protection Admins\nInformation Protection Investigators\nInformation Protection Analysts\nMicrosoft 365 roles\nCompliance Admins\nSecurity Admins\nCompliance Data Admins\nMicrosoft 365 role groups\nCompliance Administrator\nSecurity Administrator\nSecurity Reader\nActivity types\nActivity explorer gathers information from the audit logs of multiple sources of activities.\nSome examples of the\nSensitivity label activities\nand\nRetention labeling activities\nfrom applications native to Microsoft Office, the Microsoft Purview Information Protection client and scanner, SharePoint, Exchange (sensitivity labels only), and OneDrive include:\nLabel applied\nLabel changed (upgraded, downgraded, or removed)\nAuto-labeling simulation\nFile read\nFor the current list of activities listed in Activity explorer, go into Activity explorer and open the activity filter. The list of activities is available in the dropdown list.\nLabeling activity specific to the Microsoft Purview Information Protection client and scanner that comes into Activity explorer includes:\nProtection applied\nProtection changed\nProtection removed\nFiles discovered\nFor more detailed information on what labeling activity makes it into Activity explorer, see\nLabeling events available in Activity explorer\n.\nAdditionally, Activity Explorer gathers DLP policy match events from Microsoft 365 workloads such as Exchange, SharePoint, OneDrive, Teams chat and channels, and on-premises SharePoint folders, libraries, and file shares. 
When you enable Endpoint data loss prevention (DLP), Activity Explorer also includes device-level activities from onboarded Windows 10, Windows 11, and the three most recent major macOS versions.\nSome example events gathered from devices include the following actions taken on files:\nDeletion\nCreation\nCopy to clipboard\nModify\nRead\nPrint\nRename\nCopy to network share\nAccess by an unallowed app\nUnderstanding the actions that are taken on content with sensitivity labels helps you determine whether the controls that you have in place, such as\nMicrosoft Purview Data Loss Prevention\npolicies, are effective. If not, or if you discover something unexpected (such as a large number of items labeled\nhighly confidential\nthat are downgraded to\ngeneral\n), you can manage your policies and take new actions to restrict the undesired behavior.\nNote\nActivity explorer doesn't currently monitor retention activities for Exchange.\nNote\nIf a user reports the Teams DLP verdict as a false positive, the activity shows as\nDLP info\nin the list on Activity explorer. The entry doesn't have any rule and policy match details but shows synthetic values. There's also no incident report generated for false positive reporting.\nActivity type events and alerts\nThis table shows the events that Activity Explorer triggers for three sample policy configurations. 
The events depend on whether a policy match is detected.\nPolicy configuration\nActivity Explorer event triggered for this action type\nActivity Explorer event triggered when a DLP rule is matched\nActivity Explorer alert triggered\nPolicy contains a single rule allowing the activity without auditing it.\nYes\nNo\nNo\nPolicy contains two rules: Matches for Rule #1 are allowed; policy matches for Rule #2 are audited.\nYes\n(Rule #2 only)\nYes\n(Rule #2 only)\nYes\n(Rule #2 only)\nPolicy contains two rules: Matches for both rules are allowed and not audited.\nYes\nNo\nNo\nSee also\nLearn about sensitivity labels\nLearn about retention policies and retention labels\nLearn about sensitive information types\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Activity Explorer", @@ -998,7 +998,7 @@ "https://learn.microsoft.com/purview/compliance-manager": { "content_hash": "sha256:c635b9fdb91d5536a8fffeffe0d1651634bd66fd9832cc4533d5915187934a84", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nMicrosoft Purview Compliance Manager\nFeedback\nSummarize this article for me\nWhat is Compliance Manager?\nMicrosoft Purview Compliance Manager is a solution that helps you automatically assess and manage compliance across your multicloud environment. 
Compliance Manager can help you throughout your compliance journey, from taking inventory of your data protection risks to managing the complexities of implementing controls, staying current with regulations and certifications, and reporting to auditors.\nWatch the video below to learn how Compliance Manager can help simplify how your organization manages compliance:\nCompliance Manager helps simplify compliance and reduce risk by providing:\nPre-built assessments for common industry and regional standards and regulations, or custom assessments to meet your unique compliance needs. Available assessments depend on your licensing agreement;\nlearn more\n.\nWorkflow capabilities to help you efficiently complete your risk assessments through a single tool.\nDetailed step-by-step guidance on suggested improvement actions to help you comply with the standards and regulations that are most relevant for your organization. For actions that are managed by Microsoft, you’ll see implementation details and audit results.\nA risk-based compliance score to help you understand your compliance posture by measuring your progress in completing improvement actions.\nThe Compliance Manager overview page shows your current compliance score, helps you see what needs attention, and guides you to key improvement actions.\nUnderstanding your compliance score\nCompliance Manager awards you points for completing improvement actions taken to comply with a regulation, standard, or policy, and combines those points into an overall compliance score. Each action has a different impact on your score depending on the potential risks involved. Your compliance score can help prioritize which action to focus on to improve your overall compliance posture. Compliance Manager gives you an initial score based on the Microsoft 365 data protection baseline. 
This baseline is a set of controls that includes key regulations and standards for data protection and general data governance.\nLearn more\nUnderstand scoring in Compliance Manager\n.\nLearn how to work with improvement actions\n.\nKey elements: controls, assessments, regulations, improvement actions\nCompliance Manager uses several data elements to help you manage your compliance activities. As you use Compliance Manager to assign, test, and monitor compliance activities, it’s helpful to have a basic understanding of the key elements: controls, assessments, regulations, and improvement actions.\nBe sure to check out the\nCompliance Manager glossary of terms\n.\nControls\nA control is a requirement of a regulation, standard, or policy. It defines how you assess and manage system configuration, organizational process, and people responsible for meeting a specific requirement of a regulation, standard, or policy. Compliance Manager tracks the following types of controls:\nMicrosoft managed controls\n: controls for Microsoft cloud services, which Microsoft is responsible for implementing\nYour controls\n: sometimes referred to as customer managed controls, these are controls implemented and managed by your organization\nShared controls\n: these are controls that both your organization and Microsoft share responsibility for implementing\nLearn more about\nmonitoring control progress\n.\nAssessments\nAn assessment is a grouping of controls from a specific regulation, standard, or policy. Completing the actions within an assessment helps you meet the requirements of a standard, regulation, or law. For example, you may have an assessment that, when you complete all actions within it, helps to bring your Microsoft 365 settings in line with ISO 27001 requirements. 
Assessments have several components:\nIn-scope services\n: the specific set of Microsoft services applicable to the assessment\nMicrosoft managed controls\n: controls for Microsoft cloud services, which Microsoft implements on your behalf\nYour controls\n: sometimes referred to as customer managed controls, these are controls implemented and managed by your organization\nShared controls\n: these are controls that both your organization and Microsoft share responsibility for implementing\nAssessment score\n: shows your progress in achieving total possible points from actions within the assessment that are managed by your organization and by Microsoft\nLearn more about\ncreating and managing assessments\n.\nRegulations\nCompliance Manager provides over 360 regulatory templates to help you quickly create assessments. For organizations with unique compliance needs, you can also create custom regulation templates. Learn more about working with\nregulations in Compliance Manager\nand view the full\nlist of regulations\n. Learn more about\ncreating regulation templates\n.\nImprovement actions\nImprovement actions help centralize your compliance activities. Each improvement action provides recommended guidance that’s intended to help you align with data protection regulations and standards. Improvement actions can be assigned to users in your organization to perform implementation and testing work. You can also store evidence, notes, and record status updates within the improvement action. 
Learn more about\nworking with improvement actions\n.\nSupported languages\nCompliance Manager is available in the following languages:\nEnglish\nBahasa Indonesian\nBahasa Malay\nChinese (Simplified)\nChinese (Traditional)\nCzech\nDanish\nDutch\nFinnish\nFrench\nGerman\nHebrew\nHungarian\nItalian\nJapanese\nKorean\nNorwegian\nPolish\nPortuguese (Brazilian)\nRussian\nSpanish\nSwedish\nThai\nTurkish\nNext steps\nSign in, assign permissions and roles, configure settings, and personalize your dashboard view\n.\nLearn about and set up multicloud support\n.\nCreate assessments to help you comply with industry standards that matter most to your organization\n.\nGet detailed scenario-based guidance on using Compliance Manager and other Purview solutions to help you manage data privacy and data protection\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Compliance Manager", @@ -1007,7 +1007,7 @@ "https://learn.microsoft.com/purview/compliance-manager-assessments": { "content_hash": "sha256:b3f8ba6b047fd07a37d7a7cb85ff27268f05adc2e1632f216b04a5459c84b704", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nBuild and manage assessments in Compliance Manager\nFeedback\nSummarize this article for me\nCompliance Manager assessments help your organization evaluate its compliance with industry and regional regulations. Setting up the most relevant assessments for your organization can help you implement policies and operational procedures to limit your compliance risk. Ready-to-use regulatory templates for over 360 regulations contain the necessary controls and improvement actions for completing the assessment.\nTip\nGet a comprehensive compliance overview before you deploy Microsoft services in your organization. Learn more about\npredeployment compliance with Compliance Manager (preview)\n.\nAssessments page\nAll of your assessments are listed on the\nAssessments\npage in Compliance Manager. You can create one assessment that covers multiple services. For example, you can create a single EU GDPR assessment that covers Microsoft 365, Microsoft Azure, Amazon Web Services (AWS), and Google Cloud Platform (GCP). The assessment details page shows a breakdown of control progress by service to help you evaluate how you’re doing across all your services. Learn more about\nmonitoring assessment progress from the assessment details page\n.\nImportant\nThe regulations that are available for your organization's use by default depend on your licensing agreement.\nReview licensing details\n.\nThe\nFree regulation licenses used/Purchased regulation licenses used\ncounter near the top of the page shows the number of regulations currently in use out of the total number available for your organization to use. 
Learn more about\nregulation availability\n.\nAssessment status and details\nThe assessments page summarizes key information about each assessment:\nAssessment\n: Name of the assessment.\nStatus\n: See status types below.\nComplete\n: All controls have a status of “Passed,” or at least one is passed and the rest are “Out of scope.”\nIncomplete\n: At least one control has a status of “Failed.\" Review the failed controls and, within those controls, review both your improvement actions and Microsoft actions to see which have a \"Failed\" status.\nNone\n: Not all controls have been tested.\nIn progress\n: Improvement actions have a status of “In progress,” “Partial credit,” or “Undetected.\"\nProgress\n: The percentage of the work done toward completion, as measured by the number of controls successfully tested.\nYour improvement actions\n: The number of completed actions to satisfy implementation of your controls.\nMicrosoft actions\n: The number of completed actions to satisfy implementation of Microsoft controls.\nGroup\n: The name of the group to which the assessment belongs.\nService\n: The services covered by the assessment, such as Microsoft 365, Microsoft Azure, or other cloud services.\nRegulation\n: The regulatory template serving as the basis for the assessment.\nTo filter your view of assessments:\nSelect\nFilter\nat the top-left corner of your assessments list.\nOn the\nFilters\nflyout pane, check your desired criteria.\nSelect the\nApply\nbutton. The filter pane closes and you see your filtered view.\nYou can also modify your view to see assessments by group, product, or regulation by selecting the type of grouping from the\nGroup\ndrop-down menu above your assessments list.\nData protection baseline default assessment\nTo get you started, Microsoft provides a default\nData Protection Baseline\nassessment that's included at all subscription levels. 
This baseline assessment has a set of controls for key regulations and standards for data protection and general data governance. This baseline draws elements primarily from NIST CSF (National Institute of Standards and Technology Cybersecurity Framework) and ISO (International Organization for Standardization), as well as from FedRAMP (Federal Risk and Authorization Management Program) and GDPR (General Data Protection Regulation of the European Union).\nThis assessment is used to calculate your initial compliance score the first time you come to Compliance Manager, before you configure any other assessments. Compliance Manager collects initial signals from your Microsoft 365 solutions. You see at a glance how your organization is performing relative to key data protection standards and regulations, and see suggested improvement actions to take. Compliance Manager becomes more helpful as you build and manage your own assessments to meet your organization's particular needs.\nAssessments for AI regulations\nCompliance Manager provides four premium\nregulatory templates\nto help your organization assess, implement, and strengthen its compliance against AI regulations. These templates are applicable to\nall generative AI apps that Microsoft Purview supports for AI interactions\n, such as Microsoft 365 Copilot, Security Copilot, ChatGPT Enterprise, Microsoft Foundry, Gemini and DeepSeek.\nThe AI regulations listed below align with compliance requirements such as monitoring AI interactions and preventing data loss in AI applications:\nEU Artificial Intelligence Act\nISO/IEC 23894:2023\nISO/IEC 42001:2023\nNIST AI Risk Management Framework (RMF) 1.0\nWhere to find them\nOn the\nRegulations\npage in Compliance Manager, the AI regulations are listed under the\nPremium AI templates\nheader. All other premium regulations are listed under the\nPremium templates\nheader.\nUsing an AI regulation counts toward your purchased premium licenses. 
Learn more about\nregulation availability and licensing\n.\nHow to use them\nThe\nRecommendations\nsection from\nData Security Posture Management for AI\nprovides guided assistance on working with AI regulations. This solution displays recent interactions with sensitive data and recommends actions to take to help you stay compliant with AI regulations.\nAutomatic Assessments for AI Apps and Agents in Compliance Manager\nCompliance Manager integrates with Azure AI Foundry to automate compliance evaluations for AI models and agents. This integration syncs evaluation results directly from AI Foundry into Compliance Manager, reducing manual effort and improving regulatory alignment. Organizations benefit from built-in assessments for key AI regulations—including the EU AI Act, NIST AI RMF, and ISO/IEC standards—provided free for six months with Copilot or Agent licenses. This capability enables quick, automated compliance checks without additional purchases, streamlining governance for AI deployments.\nPrerequisites\nThe admin account must have either the Azure AI Project Manager or Azure AI User RBAC role assigned to the relevant AI Foundry accounts.\nThis access is required to add Agents during assessment creation for the AI Foundry service in Compliance Manager.\nKey Features\nScoped Assessments\n– Users define the assessment scope directly in Compliance Manager.\nComprehensive Action Set\n– AI Foundry provides 75 actions, including 15 automated evaluation actions for metrics like reliability, BLEU score, coherence, and fluency.\nAutomated Sync\n– Compliance Manager regularly syncs these 15 evaluation actions from AI Foundry, displaying real-time pass/fail status and detailed metrics.\nFlexible Action Management\n– Users can convert automated actions to manual actions through Compliance Manager settings.\nManual Update Support\n– Allows manual updates for organizations without AI Foundry access.\nEffort Reduction\n– Eliminates repetitive tasks by automatically 
syncing evaluation results.\nDetailed Insights\n– Provides granular evaluation metrics alongside compliance status for transparency.\nBaseline Assessment for Microsoft 365 Copilot and Copilot Chat\nMicrosoft Purview Compliance Manager provides a baseline assessment for Microsoft Copilot for Microsoft 365 and Copilot Chat. This assessment is derived from the global AI regulations and includes recommended controls to help organizations maintain compliance and reduce risk.\nWhen a Microsoft Copilot for Microsoft 365 or Copilot Chat license is purchased, the baseline assessment is automatically provisioned in the Microsoft 365 admin center. Administrators can view completed actions and pending tasks within the portal to track compliance progress and implement required controls.\nInitial steps before creating assessments\nListed below are details about steps and information that will help you prepare for creating an assessment:\nPlan a\ngrouping strategy\nfor your assessments.\nUnderstand\nregulatory templates\n, which contain the controls and action recommendations for assessments.\nSet up\nconnectors\nif you're assessing non-Microsoft services.\nGroups for assessments\nWhen you create an assessment, you must assign it to a group. Groups are containers that allow you to organize assessments in a way that is logical to you, such as by year or regulation, or based on your organization's divisions or geographies. This is why we recommend planning a grouping strategy before you create assessments. Below are examples of two groups and their underlying assessments:\nFFIEC IS assessment 2020\nFFIEC IS\nData security and privacy assessments\nISO 27001:2013\nISO 27018:2014\nDifferent assessments within a group or groups can share improvement actions. 
Improvement actions can be changes you make within technical solutions mapped to your tenant, like turning on two-factor authentication, or to nontechnical actions you perform outside the system, like instituting a new workplace policy. Any updates in details or status that you make to a technical improvement action will be picked up by assessments across all groups. Nontechnical improvement action updates will be recognized by assessments within the group where you apply them. This allows you to implement one improvement action and meet several requirements simultaneously.\nWhat to know when working with groups\nYou can create a group during the process of creating an assessment.\nGroups can't be standalone entities. A group must contain at least one assessment.\nGroup names must be unique within your organization.\nGroups don't have security properties. All permissions are associated with assessments.\nOnce you add an assessment to a group, the grouping can't be changed.\nIf you add a new assessment to an existing group, common information from assessments in that group are copied to the new assessment.\nRelated assessment controls in different assessments within the same group automatically update when completed.\nGroups can contain assessments for the same certification or regulation, but each group can only contain one assessment for a specific product-certification pair. For example, a group can't contain two assessments for Office 365 and NIST CSF. A group can contain multiple assessments for the same product only if the corresponding certification or regulation for each one is different.\nDeleting an assessment breaks the relationship between that assessment and the group.\nGroups can't be deleted.\nSet up connectors\nCompliance Manager has an integrated set of connectors to build assessments that cover non-Microsoft services like Salesforce and Zoom. 
Visit\nWorking with connectors\nto learn more and start the setup process.\nCreate assessments\nTo create and modify an assessment, a user must hold a role of Compliance Manager Administration, Compliance Manager Assessor, or Global Administrator. Learn more about\nroles and permissions\n.\nImportant\nMicrosoft recommends that you use roles with the fewest permissions. Minimizing the number of users with the Global Administrator role helps improve security for your organization. Learn more about Microsoft Purview\nroles and permissions\n.\nBefore starting to create an assessment, be sure you know which group you'll assign it to, or be prepared to create a new group for this assessment. Read details about\ngroups and assessments\n. To create an assessment, you use a guided process to select a regulation and designate services.\nBuild a custom assessment\nYou can modify a regulatory template by adding controls and improvement actions to create a customized assessment. Visit\nBuild custom assessments (preview)\nfor instructions.\nCreate an assessment using a guided process\nFrom your\nAssessments\npage, select\nAdd assessment\nto begin the assessment creation wizard.\nOn the\nBase your assessment on a regulation\npage, select\nSelect regulation\nto choose the regulatory template for the assessment. The\nSelect regulation\nflyout page opens.\nUse the search box to find your desired regulation, then select the check bubble to the left of the regulation name. Select\nSave\n, confirm your selection, then select\nNext\n.\nOn the\nAdd name and group\npage, enter values in the following fields:\nAssessment name\n: Assessment names must be unique. If the name matches another assessment in any group, you receive an error asking you to create a different name.\nAssessment group\n: Assign your assessment to a group in one of two ways:\nUse existing group\nto assign it to a group you created; or\nCreate new group\nto which you'll assign the assessment. 
Enter a name for this group. You can also\nCopy data from an existing group\n, such as implementation and testing details and documents, by selecting the appropriate boxes.\nWhen finished, select\nNext\n.\nOn the\nSelect services\npage, designate which services this assessment applies to (learn more about\nmulticloud support\n) using the\nSelect services\ncommand. The flyout pane shows which services are available for your chosen regulation. Place a check next to your desired services, then select\nAdd\n. Then select\nNext\n.\nIf your desired service isn't listed, you can add it as a new service. When you add a new service, the\nuniversal version of the underlying regulation\nis used, and you perform manual implementation and testing work. To add a new service:\nOn the\nSelect services\npage, select\nAdd new service\n.\nEnter a name and description for the service.\nSelect\nAdd\n. The service is listed on the\nService\nsection\nof the assessment's details page.\nIf you selected a service that has more than one subscription covered by Microsoft Defender for Cloud, you arrive at a substep for\nSelect service subscriptions\n. Select\nManage subscriptions\n. On the flyout pane, a tab for each service displays a list of all subscriptions within that service. All subscriptions are selected by default, but you can remove any by selecting the\nX\nnext to the name. On the\nSelect services\npage, select\nNext\n.\nReview and finish:\nReview all your selections and make any necessary edits. When you're satisfied with the settings, select\nCreate assessment\n.\nThe next screen confirms the assessment was created. When you select\nDone\n, you're taken to your new assessment's details page. If you see an\nAssessment failed\nscreen after selecting\nCreate assessment\n, select\nTry again\nto re-create your assessment.\nEdit an assessment\nAfter creating an assessment, you can edit it to update its name and add or remove services and subscriptions. 
To update an assessment:\nFrom the assessment details page, select the ellipses in the upper right corner and select\nEdit assessment\n. The assessment update wizard opens.\nYou can update the assessment name on the\nUpdate assessment name\npage, or leave it as-is, then select\nNext\n.\nOn the\nSelect services\npage, add or remove services, then select\nNext\n.\nOn the\nSelect service subscriptions\npage, select\nManage subscriptions\nto make any changes to your subscriptions. Then select\nNext\n.\nReview your updates, then select\nModify assessment\nto save your changes.\nMonitor assessment progress and controls\nEach assessment has a details page that gives an at-a-glance view of your progress in completing the assessment. The page shows how your services are performing, and the status of controls and improvement actions. Expand the\nOverview\nsection at the left side of the page to see basic details about the assessment, including its group, regulation, associated services, completion status, and a description.\nThe\nProgress\ntab shows the percentage of progress toward assessment completion. The progress bar displays a breakdown showing the number of points achieved within each service covered by the assessment. Get details on each service by\nviewing service details\n. See all controls within the assessment and their current status on the\nControls tab\n. Quickly access the status of all your improvement actions for the assessment on the\nYour improvement actions tab\n. The actions handled by Microsoft for the assessment are listed on the\nMicrosoft actions tab\n.\nAssessment progress by service\nThe\nService\nsection on the assessment’s\nProgress\ntab helps you understand how you’re doing with respect to a regulation with each of your services individually, even at the subscription level, and collectively across your organization. The assessment gets its data on available subscriptions and improvement action status from Microsoft Defender for Cloud. 
Any errors associated with subscription accessibility should be addressed in your Defender for Cloud. See\nConfigure cloud settings\nfor more information.\nSelect the\nView service details\ncommand, located next to or under the\nAssessment progress\nbar graph or in the upper-right command bar, to view a flyout pane with more details. The\nView service details\nflyout pane lists each service and its progress toward completing the assessment. Selecting\nView\nnext to a service name displays another pane that lists each subscription within the service and its status.\nOn a service's details panel, you see the list of subscriptions within the service that are covered by the assessment. The\nService progress\ncounter indicates the number of points achieved so far by improvement actions pertaining to the service for the assessment out of the total number of achievable points.\nYou can add more subscriptions to the service that you want the assessment to cover by\nediting the assessment\n.\nControls tab\nThe\nControls\ntab displays detailed information for each control in the assessment. The\nControl status breakdown\nchart shows the status of controls by family (for example, Configuration Management and Incident Response) so you can see at a glance which groupings of controls need attention. The table underneath the breakdown chart lists all controls. You can filter the list by control family, status, and service. 
The table shows the following details about each control:\nControl title\nStatus\n: The test status of the improvement actions within the control:\nPassed\n: All improvement actions have a test status of \"passed,\" or at least one is passed and the rest are \"out of scope.\"\nFailed\n: At least one improvement action has a test status of \"failed.\"\nNone\n: None of the improvement actions have been tested.\nOut of scope\n: All improvement actions are out of scope for this assessment.\nIn progress\n: Improvement actions have a status other than the ones listed above, which could include \"in progress,\" \"partial credit,\" or \"undetected.\"\nControl ID\n: The control's identification number, assigned by its corresponding regulation, standard, or policy.\nPoints achieved\n: The number of points earned by completing actions, out of the total number achievable.\nYour improvement actions\n: The number of your actions completed out of the total number to be done.\nMicrosoft actions\n: The number of actions completed by Microsoft.\nSelect a control from the list to view its details page. A graph indicates the test status of the improvement actions within the control. A table below the graph lists the improvement actions for that control. Select an improvement action from the list to drill into the improvement action's details page, from where you can manage implementation and testing. Get details about\nworking with improvement actions\n.\nYour improvement actions tab\nThe\nImprovement actions\ntab on the assessment details page lists all your improvement actions for the assessment. The status bar chart details the aggregated test status of your improvement actions in the assessment so you can quickly gauge what has been tested and what still needs to be done.
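The control status rules listed under the Controls tab amount to a small decision procedure. A minimal sketch, assuming simplified lowercase status strings rather than Compliance Manager's actual values:

```python
# Sketch of the control-status rules from the Controls tab description.
# Status strings here are simplified stand-ins for Compliance Manager's values.
def control_status(action_statuses):
    s = set(action_statuses)
    if "failed" in s:
        return "Failed"                  # at least one action failed
    if "passed" in s and s <= {"passed", "out of scope"}:
        return "Passed"                  # all passed, or passed plus out-of-scope
    if s == {"out of scope"}:
        return "Out of scope"            # every action is out of scope
    if s == {"not tested"}:
        return "None"                    # nothing tested yet
    return "In progress"                 # anything else, e.g. partial credit
```

Note the ordering: a single failed action outweighs everything else, mirroring the rule that "Failed" applies whenever at least one action has a failed test status.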
Hover over or select a test status label to highlight only that status on the bar.\nBeneath the bar, a table lists all the actions and key details, including: service, test status, the number of potential and earned points, associated regulations and standards, applicable solution, action type, and control family.\nFilter by\nService\nto view actions related to a service and their progress. From the table, select an improvement action to go to its details page, from where you can manage implementation and testing.\nGet details about\nworking with improvement actions\n.\nMicrosoft actions tab\nThe Microsoft actions tab appears for assessments based on templates that support Microsoft products. It lists all the actions in the assessment that are managed by Microsoft. The list shows key action details, including: service, test status, points that contribute to your overall compliance score, associated regulations and standards, applicable solution, action type, and control family. Select an improvement action to view its details page.\nGrant user access to individual assessments\nWhen you assign users a Compliance Manager role in the Microsoft Purview portal, they can view or edit data within all assessments by default (review the\nCompliance Manager role types\n). You can restrict user access to only certain assessments by managing user roles from within an assessment. Restricting access in this way can help ensure that users who play a role in overseeing compliance with particular regulations or standards have access only to the data and information they need to perform their duties. (You can also set\nuser access for regulations\n, which allows users to access all assessments created for that regulation.)\nExternal users who need access for auditing or other purposes can also be assigned a role for viewing assessments and editing test data. You provide access to external individuals by assigning them a Microsoft Entra role.
Learn more about\nassigning roles\n.\nSteps for granting access\nFollow the steps to grant user access to an assessment.\nFrom your\nAssessments\npage, find the assessment you want to grant access to. Select it to open its details page.\nIn the upper-right corner, select\nManage user access\n.\nA\nManage user access\nflyout pane appears. It has three tabs, one for each role of Readers, Assessors, and Contributors. Navigate to the tab for the role you want your user to hold for this assessment. Users who currently have access to the assessment will have a blue box with a check mark to the left of their name.\nSelect the\n+ Add\ncommand for the role tab you're on:\nAdd reader\n,\nAdd assessor\n, or\nAdd contributor\n.\nAnother flyout pane appears which lists all the users in your organization. You can select the checkbox next to the username you want to add, or you can enter their name in the search bar and select the user from there. You can select multiple users at once.\nAfter making all your selections, select\nAdd\n.\nNote\nIf you assign a role to someone who already has an existing role, the new role assignment you choose will override their existing role. In this case, you'll see a confirmation box asking you to confirm the change in role.\nThe flyout pane closes and you arrive back at your assessment details page. A confirmation message at the top confirms the new role assignment for that assessment.\nSteps for removing access\nYou can remove a user's access to individual assessments by following the steps below:\nOn the assessment's details page, select\nManage user access\n.\nOn the\nManage user access\nflyout pane, go to the tab corresponding to the role you want to remove.\nFind the user whose role you want to remove. Check the circle to the left of their name, then select the\nRemove\ncommand just below the role tab.
To remove all users at once, select the\nRemove all\ncommand without checking the circle next to every user's name.\nA\nRemove access?\ndialog appears, asking you to confirm the removal. Select\nRemove access\nto confirm the role removal.\nSelect\nSave\non the flyout pane. The users' roles will now be removed from the assessment.\nLearn how to get a broad\nview of all users with access to assessments\n.\nNote about multiple roles\nA user can have one role that applies to an assessment, while also holding another role that applies broadly to overall Compliance Manager access.\nFor example, if you assigned a user a\nCompliance Manager Reader\nrole, you can also assign that user a\nCompliance Manager Assessor\nrole for a specific assessment. In effect, the user holds the two roles at the same time, but their ability to edit data is limited to the assessment to which they've been assigned the\nAssessor\nrole.\nRemoving an assessment-based role won't remove the user's overall Compliance Manager role if they have one. If you want to change a user's overall role, you have to change it in the Microsoft Purview portal. Learn more about\nassigning roles and permissions\n.\nFor an individual assessment, one user can only hold one assessment-based role at a time.\nFor example, if a user holds a reader role for a GDPR assessment and you want to change them to a contributor role, you'll first need to remove their reader role, and then assign them the contributor role.\nNote\nAdmins whose permissions for Compliance Manager were set in Microsoft Entra ID won't appear on the\nManage user access\nflyout pane. This means that if a user has access to one or more assessments, and their role is Global Administrator, Compliance Administrator, Compliance Data Administrator, or Security Administrator, they won't appear on this pane.
Learn more about\nsetting Compliance Manager permissions and roles\n.\nAccept updates to assessments\nWhen an update is available for an assessment, you see a notification and have the option to accept the update or defer it for a later time. Updates are available for assessments based on the regulatory templates provided in Compliance Manager. If your organization is using universal templates for assessing other products, inheritance might not be supported.\nWhat causes an update\nAn assessment update occurs when there are underlying template changes that impact scoring. Changes might involve adjusting control mapping or other guidance based on regulatory changes or product changes. Assessment updates can originate from your organization and from Microsoft.\nIf Microsoft updates a Compliance Manager template that you extended, your assessment inherits those updates once you accept them. Your assessment retains the other attributes you applied to the assessment when you extended it.\nCustom assessments that you create don't receive any template updates from Microsoft. Custom assessments can receive improvement action updates, but any Microsoft updates to control mapping between assessments and improvement actions don't apply to custom templates.\nNote\nUpdates to assessments apply only at the group level. If you have two assessments built from the same template that exist in two different groups, each assessment will have a pending update notification, and you'll need to accept the update to each assessment in its respective group individually.\nWhere you see assessment update notifications\nThe\nAssessments\npage shows a\nPending update\nlabel next to any assessment with an update. Select that assessment to get to its details page.\nA message near the top of the assessment details page shows that an update is available for that assessment.
Select the\nReview update\nbutton in the banner to review the specific changes and accept or defer the update.\nThe assessment details page might also list improvement actions that have a\nPending update\nlabel next to them. Those updates are for specific changes to the improvement actions themselves and need to be accepted separately. Visit\nAccepting updates to improvement actions\nto learn more.\nReview update to accept or defer\nWhen you select\nReview update\nfrom the assessment details page, a flyout pane appears on the right side of your screen. The flyout pane provides the key details below about the pending update:\nThe template title\nSource of the update (Microsoft, your organization, or a specific user)\nThe date the update was created\nAn overview explaining the update\nSpecific details about the changes, including the impact to your compliance score, the amount of progress toward completion of the assessment, and the specific number of changes to improvement actions and controls.\nSelecting the\nUpdated template\ncommand downloads an Excel file containing control data for the version of the template with the pending updates. Selecting the\nCurrent template\ncommand downloads a file of the existing template without the updates.\nTo accept the update and make the changes to your assessment, select\nAccept update\n. Accepted changes are permanent.\nIf you select\nCancel\n, the update won't be applied to the assessment. 
However, you continue to see the\nPending update\nnotification until you accept the update.\nWhy we recommend accepting updates\n: Accepting updates helps ensure you have the most updated guidance on using solutions and taking appropriate improvement actions to help you meet the requirements of the certification at hand.\nWhy you might want to defer an update\n: If you're in the middle of completing an assessment, you might want to ensure you've finished work on it before you accept an update to the assessment that could disrupt control mapping. You can defer the update for a later time by selecting\nCancel\non the review update flyout pane.\nExport an assessment report\nYou can export an assessment to an Excel file for compliance stakeholders in your organization or for external auditors and regulators. On the assessment details page, select the\nExport actions\nbutton in the top right corner of the page, which creates an Excel file you can save and share. The report is a snapshot of the assessment as of the date and time of the export. It contains the details for controls managed by both you and Microsoft, including implementation status, test date, and test results.\nDelete an assessment\nDeleting an assessment removes it from the list on your assessments page. Note these important points about deleting assessments:\nDeleting an assessment is permanent; you cannot get it back.\nIf you want to use the same assessment again, you need to re-create it.\nIf the improvement actions in the assessment don't appear in any other assessment, they're deleted when the assessment is deleted.\nWe recommend\nexporting a report\nof the assessment before you permanently delete it.\nTo delete an assessment, follow the steps below:\nFrom the\nAssessments\npage, select the assessment you wish to delete.\nOn the assessment's details page, select\nDelete assessment\nin the upper-right corner of your screen. If you don't see this option, select the ellipsis (...)
in the upper-right corner, then select\nDelete assessment\nfrom the list.\nA window appears asking you to confirm that you want to permanently delete the assessment. Select\nDelete assessment\nto close the window. You get a confirmation window that your assessment was deleted from Compliance Manager.\nNote\nYou can't delete all of your assessments. Organizations need at least one assessment for Compliance Manager to function properly. If the assessment you want to delete is the only one, add another assessment before deleting this one.",
-        "last_checked": "2026-03-11T12:59:29.613756+00:00",
+        "last_checked": "2026-03-14T06:51:10.083468+00:00",
         "last_status": 200,
         "last_changed": "2026-03-11T12:59:29.613756+00:00",
         "topic": "Assessments",
@@ -1016,7 +1016,7 @@
       "https://learn.microsoft.com/en-us/entra/identity/conditional-access/overview": {
         "content_hash": "sha256:553b2293d3e1381c0e3eede2e268c3c0d37d17288edc6ebff039424a9adfa37c",
         "normalized_content": "What is Conditional Access?\nModern security extends beyond an organization's network perimeter to include user and device identity. Organizations now use identity-driven signals as part of their access control decisions. Microsoft Entra Conditional Access brings signals together to make decisions and enforce organizational policies.
Conditional Access is Microsoft's\nZero Trust policy engine\ntaking signals from various sources into account when enforcing policy decisions.\nConditional Access policies at their simplest are if-then statements:\nif\na user wants to access a resource,\nthen\nthey must complete an action. For example: If a user wants to access an application or service like Microsoft 365, then they must perform multifactor authentication to gain access.\nAdmins are faced with two primary goals:\nEmpower users to be productive wherever and whenever\nProtect the organization's assets\nUse Conditional Access policies to apply the right access controls when needed to keep your organization secure without interfering with productivity.\nImportant\nConditional Access policies are enforced after first-factor authentication is completed. Conditional Access isn't intended to be an organization's frontline defense for scenarios like denial-of-service (DoS) attacks, but it can use signals from these events to determine access.\nCommon signals\nConditional Access uses signals from various sources to make access decisions.\nSome of these signals include:\nUser, group, or agent\nPolicies can be targeted to specific users, groups, and agents (Preview), giving admins fine-grained control over access.\nSupport for agent identities and agent users extends Zero Trust principles to AI workloads.\nIP location information\nOrganizations can create IP address ranges that can be used when making policy decisions.\nAdmins can specify IP ranges for entire countries/regions to block or allow traffic from.\nDevice\nUsers with devices of specific platforms or marked with a specific state can be used when enforcing Conditional Access policies.\nUse filters for devices to target policies to specific devices like privileged access workstations.\nApplication\nTrigger different Conditional Access policies when users attempt to access specific applications.\nApply policies to traditional cloud apps, on-premises
applications, and agent resources.\nReal-time and calculated risk detection\nIntegrates signals from\nMicrosoft Entra ID Protection\nto identify and remediate risky users, sign-in behavior, and agent activities.\nMicrosoft Defender for Cloud Apps\nMonitors and controls user application access and sessions in real time. This integration improves visibility and control over access and activities in your cloud environment.\nCommon decisions\nBlock access is the most restrictive decision.\nGrant access\nA less restrictive decision that might require one or more of the following options:\nRequire multifactor authentication\nRequire authentication strength\nRequire the device to be marked as compliant\nRequire a Microsoft Entra hybrid joined device\nRequire an approved client app\nRequire an app protection policy\nRequire a password change\nRequire terms of use\nCommonly applied policies\nMany organizations have\ncommon access concerns that Conditional Access policies can help with\n, such as:\nRequiring multifactor authentication for users with administrative roles\nRequiring multifactor authentication for Azure management tasks\nBlocking sign-ins for users who try to use legacy authentication protocols\nRequiring trusted locations for security information registration\nBlocking or granting access from specific locations\nBlocking risky sign-in behaviors\nRequiring organization-managed devices for specific applications\nAdmins can create policies from scratch or start with a template policy in the portal or by using the Microsoft Graph API.\nAdmin experience\nAdmins with at least the\nSecurity Reader\nrole can find Conditional Access in the\nMicrosoft Entra admin center\nunder\nEntra ID\n>\nConditional Access\n.\nThe\nOverview\npage shows a summary of recent activity that relates to Conditional Access policies. 
Here you can see how many policies are enabled vs report-only, agent and user activity, applications, devices, and general security alerts with suggestions.\nThe\nCoverage\ntab shows a summary of applications with and without Conditional Access policy coverage over the past seven days.\nThe\nPolicies\npage lists all of the policies in your tenant, including report-only policies and policies created by the Conditional Access Optimization Agent (if applicable). Options to filter, view \"What if\" scenarios, and create new policies are available here.\nConditional Access Optimization Agent\nThe\nConditional Access Optimization Agent\nwith Microsoft Security Copilot suggests new policies and changes to existing ones based on Zero Trust principles and Microsoft best practices. With one click, apply the suggestion to automatically update or create a Conditional Access policy. The agent needs at least the Microsoft Entra ID P1 license and\nsecurity compute units (SCU)\n.\nLicense requirements\nUsing this feature requires Microsoft Entra ID P1 licenses. To find the right license for your requirements, see\nCompare generally available features of Microsoft Entra ID\n.\nCustomers with\nMicrosoft 365 Business Premium licenses\ncan also use Conditional Access features.\nOther products and features that interact with Conditional Access policies require appropriate licensing for those products and features, including Microsoft Entra Workload ID, Microsoft Entra ID Protection, and Microsoft Purview.\nWhen the licenses required for Conditional Access expire, policies aren't automatically disabled or deleted. This graceful state lets customers migrate away from Conditional Access policies without a sudden change in their security posture.
You can view and delete remaining policies, but you can't update them.\nSecurity defaults\nhelp protect against identity-related attacks and are available for all customers.\nZero Trust\nThis feature helps organizations to align their\nidentities\nwith the three guiding principles of a Zero Trust architecture:\nVerify explicitly\nUse least privilege\nAssume breach\nTo find out more about Zero Trust and other ways to align your organization to the guiding principles, see the\nZero Trust Guidance Center\n.\nNext steps\nPlan your Conditional Access deployment",
-        "last_checked": "2026-03-11T12:59:29.613756+00:00",
+        "last_checked": "2026-03-14T06:51:10.083468+00:00",
         "last_status": 200,
         "last_changed": "2026-03-11T12:59:29.613756+00:00",
         "topic": "Conditional Access",
@@ -1025,7 +1025,7 @@
       "https://learn.microsoft.com/en-us/entra/identity/conditional-access/concept-conditional-access-policies": {
         "content_hash": "sha256:fed4a07b9a14f3bc3dec8cdda7530595643aa00a37472fc4d2708c935d71c7ce",
         "normalized_content": "Building a Conditional Access policy\nAs explained in the article\nWhat is Conditional Access\n, a Conditional Access policy is an if-then statement of\nAssignments\nand\nAccess controls\n. A Conditional Access policy combines signals to make decisions and enforce organizational policies.\nHow does an organization create these policies?
What is required? How are they applied?\nMultiple Conditional Access policies can apply to an individual user at any time. In this case, all applicable policies must be satisfied. For example, if one policy requires multifactor authentication and another requires a compliant device, you must complete MFA and use a compliant device. All assignments are logically combined using\nAND\n. If you have more than one assignment configured, all assignments must be satisfied to trigger a policy.\nIf a policy with \"Require one of the selected controls\" is selected, prompts appear in the defined order. Once the policy requirements are satisfied, access is granted.\nAll policies are enforced in two phases:\nPhase 1\n: Collect session details\nGather session details, like network location and device identity, necessary for policy evaluation.\nPhase 1 of policy evaluation occurs for enabled policies and policies in\nreport-only mode\n.\nPhase 2\n: Enforcement\nUse the session details gathered in phase 1 to identify any requirements that aren't met.\nIf there's a policy that is configured with the\nblock\ngrant control, enforcement stops here and the user is blocked.\nThe user is prompted to complete more grant control requirements that weren't satisfied during phase 1 in the following order, until the policy is satisfied:\nMultifactor authentication\nDevice to be marked as compliant\nMicrosoft Entra hybrid joined device\nApproved client app\nApp protection policy\nPassword change\nTerms of use\nCustom controls\nOnce all grant controls are satisfied, session controls are applied (App Enforced, Microsoft Defender for Cloud Apps, and token lifetime).\nPhase 2 of policy evaluation occurs for all enabled policies.\nAssignments\nThe assignments section defines who, what, and where for the Conditional Access policy.\nUsers and groups\nUsers and groups\nassign who the policy includes or excludes when applied.
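The combination rules above — every applicable policy must be satisfied (logical AND across policies), and a block grant control ends evaluation — can be sketched as a toy evaluator. The policy shape here is hypothetical, not the Graph schema:

```python
# Toy evaluator for the enforcement rules described above: a "block" grant
# control stops everything; otherwise EVERY applicable policy's controls
# must be satisfied (logical AND across policies). Shapes are hypothetical.
def evaluate(applicable_policies, satisfied_controls):
    if any(p.get("grant") == "block" for p in applicable_policies):
        return "blocked"
    required = {c for p in applicable_policies
                for c in p.get("required_controls", [])}
    unmet = required - set(satisfied_controls)
    return "granted" if not unmet else "prompt for: " + ", ".join(sorted(unmet))

policies = [
    {"name": "Require MFA", "required_controls": ["mfa"]},
    {"name": "Require compliant device", "required_controls": ["compliantDevice"]},
]
```

With both example policies applicable, satisfying only MFA is not enough; the user is prompted for the remaining control, matching the AND semantics described above.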
This assignment can include all users, specific groups of users, directory roles, or external guest users. Organizations with Microsoft Entra Workload ID licenses might target\nworkload identities\nas well.\nPolicies targeting roles or groups are evaluated only when a token is issued. This means:\nUsers newly added to a role or group aren't subject to the policy until they get a new token.\nIf a user already has a valid token before being added to the role or group, the policy doesn't apply retroactively.\nThe best practice is to trigger Conditional Access evaluation during role activation or group membership activation using\nMicrosoft Entra Privileged Identity Management\n.\nTarget resources\nTarget resources\ncan include or exclude cloud applications, user actions, or authentication contexts that are subject to the policy.\nNetwork\nNetwork\nbrings IP addresses, geographies, and\nGlobal Secure Access' compliant network\ninto Conditional Access policy decisions. Admins can define locations and mark some as trusted, such as their organization's primary network locations.\nConditions\nA policy can contain multiple\nconditions\n.\nSign-in risk\nFor organizations with\nMicrosoft Entra ID Protection\n, the risk detections generated there can influence your Conditional Access policies.\nDevice platforms\nOrganizations with multiple device operating system platforms might enforce specific policies on different platforms.\nThe information used to determine the device platform comes from unverified sources, like user agent strings that can be changed.\nClient apps\nThe software the user is employing to access the cloud app. For example, 'Browser' and 'Mobile apps and desktop clients'.
By default, all newly created Conditional Access policies apply to all client app types even if the client apps condition isn't configured.\nFilter for devices\nThis control allows targeting specific devices based on their attributes in a policy.\nAccess controls\nThe access controls portion of the Conditional Access policy controls how a policy is enforced.\nGrant\nGrant\nprovides administrators with a means of policy enforcement where they can block or grant access.\nBlock access\nBlock access blocks access under the specified assignments. This control is powerful and requires appropriate knowledge to use effectively.\nGrant access\nThe grant control triggers enforcement of one or more controls.\nRequire multifactor authentication\nRequire authentication strength\nRequire device to be marked as compliant (Intune)\nRequire Microsoft Entra hybrid joined device\nRequire approved client app\nRequire app protection policy\nRequire password change\nRequire terms of use\nAdministrators choose to require one of the previous controls or all selected controls using the following options. 
By default, multiple controls require all.\nRequire all the selected controls (control and control)\nRequire one of the selected controls (control or control)\nSession\nSession controls\ncan limit the experience of users.\nUse app enforced restrictions:\nWorks only with Exchange Online and SharePoint Online.\nPasses device information to control the experience, granting full or limited access.\nUse Conditional Access App Control:\nUses signals from Microsoft Defender for Cloud Apps to do things like:\nBlock download, cut, copy, and print of sensitive documents.\nMonitor risky session behavior.\nRequire labeling of sensitive files.\nSign-in frequency:\nAbility to change the default sign-in frequency for modern authentication.\nPersistent browser session:\nAllows users to remain signed in after closing and reopening their browser window.\nCustomize continuous access evaluation.\nDisable resilience defaults.\nSimple policies\nA Conditional Access policy must include at least the following to be enforced:\nName\nof the policy\nAssignments\nUsers and/or groups\nto apply the policy to\nTarget resources\nto apply the policy to\nAccess controls\nGrant\nor\nBlock\ncontrols\nThe article\nCommon Conditional Access policies\nincludes policies that might be useful to most organizations.\nRelated content\nCommon Conditional Access policies\nManaging device compliance with Intune\nMicrosoft Defender for Cloud Apps and Conditional Access",
-        "last_checked": "2026-03-11T12:59:29.613756+00:00",
+        "last_checked": "2026-03-14T06:51:10.083468+00:00",
         "last_status": 200,
         "last_changed": "2026-03-11T12:59:29.613756+00:00",
         "topic": "Conditional Access Policies",
@@ -1034,7 +1034,7 @@
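The minimal ingredients listed under "Simple policies" (name, user assignment, target resources, a grant or block control) map onto the Microsoft Graph conditionalAccessPolicy resource, which is how the Graph API mentioned earlier creates policies. A sketch of such a request body; the helper function name is ours, and starting in report-only state is an illustrative precaution, not a requirement:

```python
# Sketch: build a minimal Microsoft Graph conditionalAccessPolicy body covering
# the required pieces named above. Field names follow the Graph schema; the
# helper name is invented for this example.
def minimal_policy(name, user_ids, app_ids, controls, operator="OR"):
    return {
        "displayName": name,
        "state": "enabledForReportingButNotEnforced",  # report-only to start
        "conditions": {
            "users": {"includeUsers": user_ids},
            "applications": {"includeApplications": app_ids},
        },
        "grantControls": {
            # "AND" = require all the selected controls, "OR" = require one
            "operator": operator,
            "builtInControls": controls,
        },
    }

policy = minimal_policy("Require MFA for all users", ["All"], ["All"], ["mfa"])
```

An authenticated client with the Policy.ReadWrite.ConditionalAccess scope could then POST this body to https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies, switching `state` to "enabled" once report-only results look right.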
"https://learn.microsoft.com/en-us/entra/identity/conditional-access/concept-conditional-access-cloud-apps#authentication-context": {
         "content_hash": "sha256:f913fe9fc34ea13f2b0cf044577965dca1ecaebffeea584b943d3f55a1b230c9",
         "normalized_content": "Conditional Access: Target resources\nTarget resources (formerly cloud apps, actions, and authentication context) are key signals in a Conditional Access policy. Conditional Access policies let admins assign controls to specific applications, services, actions, or authentication context.\nAdmins can choose from the list of applications or services that include built-in Microsoft applications and any\nMicrosoft Entra integrated applications\n, including gallery, non-gallery, and applications published through\nApplication Proxy\n.\nAdmins might define a policy based on a\nuser action\nlike\nRegister security information\nor\nRegister or join devices\n, letting Conditional Access enforce controls around those actions.\nAdmins can target\ntraffic forwarding profiles\nfrom Global Secure Access for enhanced functionality.\nAdmins can use\nauthentication context\nto provide an extra layer of security in applications.\nMicrosoft cloud applications\nAdmins can assign a Conditional Access policy to Microsoft cloud apps if the service principal appears in their tenant, except for Microsoft Graph. Microsoft Graph functions as an umbrella resource. Use\nAudience Reporting\nto see the underlying services and target those services in your policies.
Some apps like\nMicrosoft 365/Office 365\nand\nWindows Azure Service Management API\ninclude multiple related child apps or services. When new Microsoft cloud applications are created, they appear in the app picker list as soon as the service principal is created in the tenant.\nOffice 365\nMicrosoft 365 offers cloud-based productivity and collaboration services like Exchange, SharePoint, and Microsoft Teams. In Conditional Access, the Microsoft 365 suite of applications appears under 'Office 365'. Microsoft 365 cloud services are deeply integrated to ensure smooth and collaborative experiences. This integration might cause confusion when creating policies because some apps, like Microsoft Teams, depend on others, like SharePoint or Exchange.\nThe Office 365 app grouping in Conditional Access makes it possible to target these services all at once. We recommend using the Microsoft 365 grouping instead of targeting individual cloud apps, to avoid issues with\nservice dependencies\n.\nTargeting this group of applications helps to avoid issues that might arise because of inconsistent policies and dependencies. For example: The Exchange Online app is tied to traditional Exchange Online data like mail, calendar, and contact information. Related metadata might be exposed through different resources like search. To ensure that all metadata is protected as intended, admins should assign policies to the Microsoft 365 app.\nAdmins can exclude the entire Microsoft 365 suite or specific Microsoft 365 cloud apps from Conditional Access policies.\nA complete list of all services included can be found in the article\nApps included in Conditional Access Microsoft 365 app suite\n.\nWindows Azure Service Management API\nWhen you target the Windows Azure Service Management API application, policy is enforced for tokens issued to a set of services closely bound to the portal.
This grouping includes the application IDs of:\nAzure Resource Manager\nAzure portal, which also covers the Microsoft Entra admin center and the Microsoft Engage Center\nAzure Data Lake\nApplication Insights API\nLog Analytics API\nBecause the policy is applied to the Azure management portal and API, any services or clients that depend on the Azure API can be indirectly affected. For example:\nAzure CLI\nAzure Data Factory portal\nAzure Event Hubs\nAzure PowerShell\nAzure Service Bus\nAzure SQL Database\nAzure Synapse\nClassic deployment model APIs\nMicrosoft 365 admin center\nMicrosoft IoT Central\nMicrosoft Defender Multitenant management\nSQL Managed Instance\nVisual Studio subscriptions administrator portal\nCaution\nConditional Access policies associated with the Windows Azure Service Management API\nno longer cover Azure DevOps\n.\nNote\nThe Windows Azure Service Management API application applies to\nAzure PowerShell\n, which calls the\nAzure Resource Manager API\n. It doesn't apply to\nMicrosoft Graph PowerShell\n, which calls the\nMicrosoft Graph API\n.\nFor Azure Government, you should target the Azure Government Cloud Management API application.\nMicrosoft Admin Portals\nWhen a Conditional Access policy targets the Microsoft Admin Portals cloud app, the policy is enforced for tokens issued to specific underlying resource application IDs associated with Microsoft admin portals. The app grouping doesn't include the backend services that those portals might call or depend on. 
To identify service dependencies of the admin portals, use the\nConditional Access audience reporting in sign-in logs\n.\nThe following applications comprise the Microsoft Admin Portals:\nExchange Admin Center app ID: 497effe9-df71-4043-a8bb-14cf78c4b63b\nAzure portal app ID: c44b4083-3bb0-49c1-b47d-974e53cbdf3c\nMicrosoft Office 365 Portal app ID: 00000006-0000-0ff1-ce00-000000000000\nMicrosoft 365 Security And Compliance Center (Protection Center) app ID: 80ccca67-54bd-44ab-8625-4b79c4dc7775\nThe Admin Portal grouping is primarily intended for include scenarios, providing a simplified way to target one or more admin portals with Conditional Access policies (for example, enforcing MFA). This grouping is leveraged in our\nMFA for admins Microsoft-managed policy\nto streamline policy creation.\nThis option is not intended to function as a bulk exclusion mechanism for all backend services associated with the underlying application IDs.\nNote\nBlock policies that target the Microsoft Admin Portals will block end users from accessing the Microsoft 365 self-install page, as this page is currently located in the Microsoft 365 admin center. For information on alternative deployment options, see\nPlan your enterprise deployment of Microsoft 365 Apps\n.\nOther applications\nAdmins can add any Microsoft Entra registered application to Conditional Access policies. These applications might include:\nApplications published through\nMicrosoft Entra application proxy\nApplications added from the gallery\nCustom applications not in the gallery\nLegacy applications published through app delivery controllers and networks\nApplications that use\npassword based single sign-on\nNote\nBecause Conditional Access policy sets the requirements for accessing a service, you aren't able to apply it to a client (public/native) application. In other words, the policy isn't set directly on a client (public/native) application, but is applied when a client calls a service. 
For example, a policy set on SharePoint service applies to all clients calling SharePoint. A policy set on Exchange applies to the attempt to access the email using Outlook client. That is why client (public/native) applications aren't available for selection in the app picker and Conditional Access option isn't available in the application settings for the client (public/native) application registered in your tenant.\nSome applications don't appear in the picker at all. The only way to include these applications in a Conditional Access policy is to include \nAll resources (formerly 'All cloud apps')\nor add the missing service principal using the\nNew-MgServicePrincipal\nPowerShell cmdlet or by using the\nMicrosoft Graph API\n.\nConditional Access for different client types\nConditional Access applies to resources not clients, except when the client is a confidential client requesting an ID token.\nPublic client\nPublic clients are those that run locally on devices like Microsoft Outlook on the desktop or mobile apps like Microsoft Teams.\nConditional Access policies don't apply to public clients themselves but are based on the resources they request.\nConfidential client\nConditional Access applies to the resources requested by the client and the confidential client itself if it requests an ID token.\nFor example: If Outlook Web requests a token for scopes\nMail.Read\nand\nFiles.Read\n, Conditional Access applies policies for Exchange and SharePoint. 
Additionally, if Outlook Web requests an ID token, Conditional Access also applies the policies for Outlook Web.\nTo view\nsign-in logs\nfor these client types from the Microsoft Entra admin center:\nSign in to the\nMicrosoft Entra admin center\nas at least a\nReports Reader\n.\nBrowse to\nEntra ID\n>\nMonitoring & health\n>\nSign-in logs\n.\nAdd a filter for\nClient credential type\n.\nAdjust the filter to view a specific set of logs based on the client credential used in the sign-in.\nFor more information, see the article\nPublic client and confidential client applications\n.\nConditional Access for ALL resources\nApplying a Conditional Access policy to\nAll resources (formerly 'All cloud apps')\nwithout any resource exclusions enforces the policy for all token requests from websites and services, including\nGlobal Secure Access traffic forwarding profiles\n. This option includes applications that aren't individually targetable in Conditional Access policy, such as\nWindows Azure Active Directory\n(00000002-0000-0000-c000-000000000000).\nImportant\nMicrosoft recommends creating a baseline multifactor authentication policy targeting all users and all resources (without any resource exclusions), like the one explained in\nRequire multifactor authentication for all users\n.\nLegacy Conditional Access behavior when an ALL resources policy has a resource exclusion\nWarning\nThe following Conditional Access behavior is changing\n. Those low privileged scopes that were previously excluded from policy enforcement will\nno longer be excluded\n. This change means that users who were previously able to access the application without any Conditional Access enforcement might now receive Conditional Access challenges. The change is rolling out in phases starting in March, 2026.\nIf any app is excluded from the policy, in order to not inadvertently block user access, certain low privilege scopes were\npreviously\nexcluded from policy enforcement. 
These scopes allowed calls to the underlying Graph APIs, like\nWindows Azure Active Directory\n(00000002-0000-0000-c000-000000000000) and\nMicrosoft Graph\n(00000003-0000-0000-c000-000000000000), to access user profile and group membership information commonly used by applications as part of authentication. For example: when Outlook requests a token for Exchange, it also asks for the\nUser.Read\nscope to be able to display the basic account information of the current user.\nMost apps have a similar dependency, which is why these low privilege scopes were automatically excluded in\nAll resources\npolicies. The\npreviously\nexcluded scopes are listed as follows, consent is still required for apps to use these permissions.\nNative clients and Single page applications (SPAs) have access to the following low privilege scopes:\nAzure AD Graph:\nemail\n,\noffline_access\n,\nopenid\n,\nprofile\n,\nUser.Read\nMicrosoft Graph:\nemail\n,\noffline_access\n,\nopenid\n,\nprofile\n,\nUser.Read\n,\nPeople.Read\nConfidential clients have access to the following low privilege scopes, if they're excluded from an\nAll resources\npolicy:\nAzure AD Graph:\nemail\n,\noffline_access\n,\nopenid\n,\nprofile\n,\nUser.Read\n,\nUser.Read.All\n,\nUser.ReadBasic.All\nMicrosoft Graph:\nemail\n,\noffline_access\n,\nopenid\n,\nprofile\n,\nUser.Read\n,\nUser.Read.All\n,\nUser.ReadBasic.All\n,\nPeople.Read\n,\nPeople.Read.All\n,\nGroupMember.Read.All\n,\nMember.Read.Hidden\nNew Conditional Access behavior when an ALL resources policy has a resource exclusion\nThe scopes listed in the previous section are now evaluated as directory access and mapped to Azure AD Graph (resource: Windows Azure Active Directory, ID: 00000002-0000-0000-c000-000000000000) for Conditional Access evaluation purposes.\nConditional Access policies that target All resources with one or more resource exclusions, or policies that explicitly target Azure AD Graph, are enforced in user sign-in flows where the client application 
requests only these scopes. There is no change in behavior when an application requests any additional scope beyond those listed above.\nNote\nThe\nAzure AD Graph retirement\ndoes not affect the Azure AD Graph (Windows Azure Active Directory) resource registered in your tenant.\nUser experience\nIn user sign-in flows where client applications request only the scopes listed above, users might now receive Conditional Access challenges (such as MFA or device compliance). The exact challenge depends on the access controls configured in your policies that target All resources (with or without resource exclusions) or policies that explicitly target Azure AD Graph.\nIn the following example, the tenant has a Conditional Access policy with the following details:\nTargeting All users and All resources\nResource exclusions for a confidential client application and Exchange Online\nMFA is configured as the grant control\nExample scenarios\nExample scenario\nUser impact (before → after)\nConditional Access evaluation\nA user signs into Visual Studio Code desktop client, which requests openid and profile scopes.\nBefore\n: User not prompted for MFA\nAfter\n: User is prompted for MFA\nConditional Access is now evaluated using Windows Azure Active Directory as the enforcement audience.\nA user signs in using Azure CLI, which requests only\nUser.Read\n.\nBefore\n: User not prompted for MFA\nAfter\n: User is prompted for MFA\nConditional Access is now evaluated using Windows Azure Active Directory as the enforcement audience.\nA user signs in through a confidential client application (excluded from the policy) that requests only\nUser.Read\nand\nPeople.Read\n.\nBefore\n: User not prompted for MFA\nAfter\n: User is prompted for MFA\nConditional Access is now evaluated using Windows Azure Active Directory as the enforcement audience.\nThere is no change in behavior when a client application requests a scope beyond those listed previously, as illustrated in the following 
examples.\nExample scenarios\nExample scenario\nUser impact\nConditional Access evaluation\nA user signs in to a confidential client application (excluded from the policy) that requests offline_access and SharePoint access (\nFiles.Read\n).\nNo change in behavior\nConditional Access continues to be enforced based on the SharePoint resource.\nA user signs in to the OneDrive desktop sync client. OneDrive requests offline_access and Exchange Online access (\nMail.Read\n).\nNo change in behavior\nConditional Access is not enforced because Exchange Online is excluded from the policy.\nMost applications request scopes beyond the previously listed scopes and are already subject to Conditional Access enforcement, unless the application is explicitly excluded from the policy. In such cases, there is no change in behavior.\nCustom applications that are intentionally designed to request only the previously listed scopes and are not designed to handle Conditional Access challenges might need to be updated so that they can handle Conditional Access challenges. Refer to the\nMicrosoft Conditional Access developer guidance\nfor implementation details.\nHow to identify applications affected by the low-privilege scope enforcement change\nApplications can be pre-authorized to request only one or more of the previously listed scopes. Use the following options to identify affected applications.\nPowerShell\nUsage and Insights report\nSign-in logs\nUse the following PowerShell script to list all applications in your tenant that are pre-authorized to request only one or more of the scopes that are affected by this change.\nNote\nApplications can request additional scopes dynamically (with admin consent). 
This script will not identify such applications.\n# ==============================\n# Inventory of apps whose delegated consent grants include ONLY\n# the OIDC scopes + specific directory scopes listed below.\n#\n# Enhancements incorporated:\n# - Supported both PowerShell 5.1 and 7.x\n# - Add user sign-in count (last 7 days) per app\n#\n# Output:\n# - ServicePrincipalObjectId (oauth2PermissionGrants.clientId = SP object id)\n# - AppId\n# - AppDisplayName\n# - AppOwnerOrganizationId (for classification)\n# - Scopes (union of delegated scopes granted)\n# - UserSigninsLast7Days (Successful + Failed)\n# ==============================\n\n$TenantId = Read-Host \"Enter your Microsoft Entra tenant ID (GUID)\"\n\n$BaselineScopes = @(\n \"openid\", \"profile\", \"email\", \"offline_access\",\n \"User.Read\", \"User.Read.All\", \"User.ReadBasic.All\",\n \"People.Read\", \"People.Read.All\",\n \"GroupMember.Read.All\", \"Member.Read.Hidden\"\n)\n\nDisconnect-MgGraph -ErrorAction SilentlyContinue\n\nConnect-MgGraph -TenantId $TenantId -Scopes @(\n \"DelegatedPermissionGrant.Read.All\",\n \"Directory.Read.All\",\n \"Reports.Read.All\"\n)\n\n# ------------------------------\n# Pull oauth2PermissionGrants (paging)\n# ------------------------------\n$uri = \"https://graph.microsoft.com/beta/oauth2PermissionGrants?`$select=clientId,scope,consentType\"\n$grants = @()\nwhile ($uri) {\n $resp = Invoke-MgGraphRequest -Method GET -Uri $uri\n $grants += $resp.value\n $uri = $resp.'@odata.nextLink'\n}\n\n# ------------------------------\n# Build baseline-only candidate set (Jun: HashSet per clientId)\n# ------------------------------\n$scopesByClient = @{} # key: clientId (SP objectId), value: HashSet[string] (case-insensitive)\n\nforeach ($g in $grants) {\n $cid = $g.clientId.ToString().Trim()\n if (-not $cid) { continue }\n\n if (-not $scopesByClient.ContainsKey($cid)) {\n $scopesByClient[$cid] = [System.Collections.Generic.HashSet[string]]::new(\n 
[System.StringComparer]::OrdinalIgnoreCase\n )\n }\n\n foreach ($s in ($g.scope -split '\\s+')) {\n if ($s -and $s.Trim().Length -gt 0) {\n [void]$scopesByClient[$cid].Add($s.Trim())\n }\n }\n}\n\n$candidates = foreach ($cid in $scopesByClient.Keys) {\n $scopes = $scopesByClient[$cid]\n if ($scopes.Count -le 0) { continue }\n\n $outside = $scopes | Where-Object { $_ -notin $BaselineScopes }\n if ($outside.Count -eq 0) {\n [PSCustomObject]@{\n ServicePrincipalObjectId = $cid\n Scopes = ($scopes -join ' ')\n }\n }\n}\n\n# ------------------------------\n# Pull per-app sign-in summary for last 7 days (Graph REST via Invoke-MgGraphRequest)\n# Endpoint: GET /beta/reports/getAzureADApplicationSignInSummary(period='D7')\n# In this API output, 'id' corresponds to the appId (clientId)\n# ------------------------------\n$signInSummary = @()\n$signInUri = \"https://graph.microsoft.com/beta/reports/getAzureADApplicationSignInSummary(period='D7')\"\n\nwhile ($signInUri) {\n $resp = Invoke-MgGraphRequest -Method GET -Uri $signInUri\n\n if ($resp -and $resp.value) {\n $signInSummary += $resp.value\n }\n\n # Paging (if present)\n $signInUri = $resp.'@odata.nextLink'\n}\n\n# appId -> total sign-ins (7d)\n$signInCountByAppId = @{}\nforeach ($s in $signInSummary) {\n $appId = $s.id\n if (-not $appId) { continue }\n\n # PS5.1-safe null handling\n $success = 0\n $failed = 0\n if ($null -ne $s.successfulSignInCount) { $success = [int]$s.successfulSignInCount }\n if ($null -ne $s.failedSignInCount) { $failed = [int]$s.failedSignInCount }\n\n $signInCountByAppId[$appId] = $success + $failed\n}\n\n$resultsTenantOwned = @()\n$resultsNotTenantOwned = @()\n\n# ------------------------------\n# Filter to tenant-owned or external apps; enrich with appId/displayName + sign-in counts\n# ------------------------------\nforeach ($c in $candidates) {\n try {\n $spUri = 
\"https://graph.microsoft.com/beta/servicePrincipals/$($c.ServicePrincipalObjectId)?`$select=id,appId,displayName,appOwnerOrganizationId\"\n $sp = Invoke-MgGraphRequest -Method GET -Uri $spUri\n\n $signinCount7d = 0\n if ($sp.appId -and $signInCountByAppId.ContainsKey($sp.appId)) {\n $signinCount7d = $signInCountByAppId[$sp.appId]\n }\n\n $row = [PSCustomObject]@{\n ServicePrincipalObjectId = $c.ServicePrincipalObjectId\n AppId = $sp.appId\n AppDisplayName = $sp.displayName\n AppOwnerOrganizationId = $sp.appOwnerOrganizationId\n Scopes = $c.Scopes\n UserSigninsLast7Days = $signinCount7d\n }\n\n if ($sp.appOwnerOrganizationId -eq $TenantId) {\n $resultsTenantOwned += $row\n }\n else {\n $resultsNotTenantOwned += $row\n }\n }\n catch {\n # Ignore non-enumerable / missing service principals\n }\n}\n\n# ------------------------------\n# Output\n# ------------------------------\n'=== Tenant-owned apps whose delegated consent grants include ONLY baseline scopes + user sign-ins (last 7 days) ==='\n$resultsTenantOwned |\n Sort-Object UserSigninsLast7Days -Descending |\n Format-Table -AutoSize\n\n'=== External apps whose delegated consent grants include ONLY baseline scopes + user sign-ins (last 7 days) ==='\n$resultsNotTenantOwned |\n Sort-Object UserSigninsLast7Days -Descending |\n Format-Table -AutoSize\nThe\nUsage and Insights report in Microsoft Entra\ncan help monitor application sign-in activity for specific applications.\nSign in to the \nMicrosoft Entra admin center\n as at least a \nReports Reader\n.\nBrowse to \nEntra ID\n > \nMonitoring & health\n > \nUsage & insights\n.\nSelect an application > select\nUsage & insights\n.\nMicrosoft Entra ID sign-in logs can also provide a detailed list of sign-ins for specific applications.\nSign in to the \nMicrosoft Entra admin center\n as at least a \nReports Reader\n.\nBrowse to \nEntra ID\n > \nMonitoring & health\n > \nSign-in logs\n.\nSet the filter\nApplication\nand enter the application name or application ID as the 
value.\nProtecting directory information\nNote\nThe following section applies until the rollout of the low-privilege scope enforcement change is complete.\nIf the\nrecommended baseline MFA policy without resource exclusions\ncan't be configured because of business reasons, and your organization's security policy must include directory-related low privilege scopes (\nUser.Read\n,\nUser.Read.All\n,\nUser.ReadBasic.All\n,\nPeople.Read\n,\nPeople.Read.All\n,\nGroupMember.Read.All\n,\nMember.Read.Hidden\n), create a separate Conditional Access policy targeting\nWindows Azure Active Directory\n(00000002-0000-0000-c000-000000000000). Windows Azure Active Directory (also called Azure AD Graph) is a resource representing data stored in the directory such as users, groups, and applications. The Windows Azure Active Directory resource is included in\nAll resources\nbut can be individually targeted in Conditional Access policies by using the following steps:\nSign in to the\nMicrosoft Entra admin center\nas an\nAttribute Definition Administrator\nand\nAttribute Assignment Administrator\n.\nBrowse to\nEntra ID\n>\nCustom security attributes\n.\nCreate a new attribute set and attribute definition. 
For more information, see\nAdd or deactivate custom security attribute definitions in Microsoft Entra ID\n.\nBrowse to\nEntra ID\n>\nEnterprise apps\n.\nRemove the\nApplication type\nfilter and search for\nApplication ID\nthat starts with 00000002-0000-0000-c000-000000000000.\nSelect\nWindows Azure Active Directory\n>\nCustom security attributes\n>\nAdd assignment\n.\nSelect the attribute set and attribute value that you plan to use in the policy.\nBrowse to\nEntra ID\n>\nConditional Access\n>\nPolicies\n.\nCreate or modify an existing policy.\nUnder\nTarget resources\n>\nResources (formerly cloud apps)\n>\nInclude\n, select >\nSelect resources\n>\nEdit filter\n.\nAdjust the filter to include your attribute set and definition from earlier.\nUnder\nAccess controls\n>\nGrant\n, select\nGrant access\n,\nRequire authentication strength\n, select\nMultifactor authentication\n, then select\nSelect\n.\nConfirm your settings and set\nEnable policy\nto\nReport-only\n.\nSelect\nCreate\nto enable your policy.\nNote\nConfigure this policy as described in the guidance above. Any deviations in creating the policy as described (such as defining resource exclusions) may result in low privilege scopes being excluded and the policy not applying as intended.\nAll internet resources with Global Secure Access\nThe\nAll internet resources with Global Secure Access\noption allows admins to target the\ninternet access traffic forwarding profile\nfrom\nMicrosoft Entra Internet Access\n.\nThese profiles in Global Secure Access enable admins to define and control how traffic is routed through Microsoft Entra Internet Access and Microsoft Entra Private Access. Traffic forwarding profiles can be assigned to devices and remote networks. 
For an example of how to apply a Conditional Access policy to these traffic profiles, see the article\nHow to apply Conditional Access policies to the Microsoft 365 traffic profile\n.\nFor more information about these profiles, see the article\nGlobal Secure Access traffic forwarding profiles\n.\nAll agent resources (Preview)\nApplying a Conditional Access policy to All agent resources enforces the policy for all token requests to agent identity blueprint principals and agent identities.\nUser actions\nUser actions are tasks that a user performs. Conditional Access supports two user actions:\nRegister security information\n: This user action lets Conditional Access policies enforce rules when users try to register their security information. For more information, see\nCombined security information registration\n.\nNote\nIf admins apply a policy targeting user actions for registering security information and the user account is a guest from a\nMicrosoft personal account (MSA)\n, the 'Require multifactor authentication' control requires the MSA user to register security information with the organization. If the guest user is from another provider such as\nGoogle\n, access is blocked.\nRegister or join devices\n: This user action enables admins to enforce Conditional Access policy when users\nregister\nor\njoin\ndevices to Microsoft Entra ID. It lets admins configure multifactor authentication for registering or joining devices with more granularity than a tenant-wide policy. There are three key considerations with this user action:\nRequire multifactor authentication\nand\nRequire auth strength\nare the only access controls available with this user action and all others are disabled. 
This restriction prevents conflicts with access controls that are either dependent on Microsoft Entra device registration or not applicable to Microsoft Entra device registration.\nWindows Hello for Business and device-bound passkeys aren't supported because those scenarios require the device to be already registered.\nClient apps\n,\nFilters for devices\n, and\nDevice state\nconditions aren't available with this user action because they're dependent on Microsoft Entra device registration to enforce Conditional Access policies.\nWarning\nIf a Conditional Access policy is configured with the\nRegister or join devices\nuser action, set\nEntra ID\n>\nDevices\n>\nOverview\n>\nDevice Settings\n-\nRequire Multifactor Authentication to register or join devices with Microsoft Entra\nto\nNo\n. Otherwise, Conditional Access policies with this user action aren't properly enforced. Learn more about this device setting in\nConfigure device settings\n.\nAuthentication context\nAuthentication context secures data and actions in applications. These applications include custom applications, line-of-business (LOB) applications, SharePoint, or applications protected by Microsoft Defender for Cloud Apps. It can also be used with Microsoft Entra Privileged Identity Management (PIM) to enforce Conditional Access policies during role activation.\nFor example, an organization might store files in SharePoint sites like a lunch menu or a secret BBQ sauce recipe. Everyone might access the lunch menu site, but users accessing the secret BBQ sauce recipe site might need to use a managed device and agree to specific terms of use. 
Similarly, an administrator activating a privileged role through PIM might be required to perform multifactor authentication or use a compliant device.\nAuthentication context works with users or\nworkload identities\n, but not in the same Conditional Access policy.\nConfigure authentication contexts\nManage authentication contexts by going to\nEntra ID\n>\nConditional Access\n>\nAuthentication context\n.\nSelect\nNew authentication context\nto create an authentication context definition. Organizations can create up to 99 authentication context definitions (\nc1-c99\n). Configure these attributes:\nDisplay name\nis the name that is used to identify the authentication context in Microsoft Entra ID and across applications that consume authentication contexts. We recommend names that can be used across resources, like\ntrusted devices\n, to reduce the number of authentication contexts needed. Having a reduced set limits the number of redirects and provides a better end-to-end user experience.\nDescription\nprovides more information about the policies. This information is used by admins and those applying authentication contexts to resources.\nPublish to apps\ncheckbox, when selected, advertises the authentication context to apps and makes it available to be assigned. If not selected, the authentication context is unavailable to downstream resources.\nID\nis read-only and used in tokens and apps for request-specific authentication context definitions. Listed here for troubleshooting and development use cases.\nAdd to Conditional Access policy\nAdmins can select published authentication contexts in Conditional Access policies by going to\nAssignments\n>\nCloud apps or actions\nand selecting\nAuthentication context\nfrom the\nSelect what this policy applies to\nmenu.\nDelete an authentication context\nBefore deleting an authentication context, ensure no applications use it. Otherwise, access to app data isn't protected. 
Confirm this by checking sign-in logs for cases where authentication context Conditional Access policies are applied.\nTo delete an authentication context, ensure it has no assigned Conditional Access policies and isn't published to apps. This prevents accidental deletion of an authentication context still in use.\nTag resources with authentication contexts\nTo learn more about using authentication contexts, see the following articles.\nUse sensitivity labels to protect content in Microsoft Teams, Microsoft 365 groups, and SharePoint sites\nMicrosoft Defender for Cloud Apps\nCustom applications\nPrivileged Identity Management - On activation, require Microsoft Entra Conditional Access authentication context\nRelated content\nConditional Access: Conditions\n– Learn how to configure conditions to refine your policies.\nConditional Access common policies\n– Explore common policy templates to get started quickly.\nClient application dependencies\n– Understand how dependencies impact Conditional Access policies.", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Authentication Contexts", @@ -1043,7 +1043,7 @@ "https://learn.microsoft.com/en-us/entra/identity/conditional-access/howto-conditional-access-session-lifetime": { "content_hash": "sha256:ea88b780628f373ed9328c0bb5f212975a93c6bfdf89795374ac293267a13bf9", "normalized_content": "
Configure adaptive session lifetime policies\nWarning\nIf you're using the\nconfigurable token lifetime\nfeature currently in public preview, we don't support creating two different policies for the same user or app combination: one with this feature and another with the configurable token lifetime feature. Microsoft retired the configurable token lifetime feature for refresh and session token lifetimes on January 30, 2021, and replaced it with the Conditional Access authentication session management feature.\nBefore enabling Sign-in Frequency, make sure other reauthentication settings are disabled in your tenant. If \"Remember MFA on trusted devices\" is enabled, disable it before using Sign-in Frequency, as using these two settings together might prompt users unexpectedly. To learn more about reauthentication prompts and session lifetime, see the article,\nOptimize reauthentication prompts and understand session lifetime for Microsoft Entra multifactor authentication\n.\nPolicy deployment\nTo ensure your policy works as expected, test it before rolling it out into production. Use a test tenant to verify that your new policy works as intended. For more information, see the article\nPlan a Conditional Access deployment\n.\nPolicy 1: Sign-in frequency control\nSign in to the\nMicrosoft Entra admin center\nas at least a\nConditional Access Administrator\n.\nBrowse to\nEntra ID\n>\nConditional Access\n>\nPolicies\n.\nSelect\nNew policy\n.\nGive your policy a name. 
Create a meaningful standard for naming policies.\nChoose all required conditions for the customer’s environment, including the target cloud apps.\nNote\nIt's recommended to set an equal authentication prompt frequency for key Microsoft 365 apps such as Exchange Online and SharePoint Online for the best user experience.\nUnder\nAccess controls\n>\nSession\n.\nSelect\nSign-in frequency\n.\nChoose\nPeriodic reauthentication\nand enter a value of hours or days or select\nEvery time\n.\nSave your policy.\nPolicy 2: Persistent browser session\nSign in to the\nMicrosoft Entra admin center\nas at least a\nConditional Access Administrator\n.\nBrowse to\nEntra ID\n>\nConditional Access\n>\nPolicies\n.\nSelect\nNew policy\n.\nGive your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.\nChoose all required conditions.\nNote\nThis control requires selecting \"All Cloud Apps\" as a condition. Browser session persistence is controlled by the authentication session token. All tabs in a browser session share a single session token and therefore they all must share persistence state.\nUnder\nAccess controls\n>\nSession\n.\nSelect\nPersistent browser session\n.\nNote\nPersistent browser session configuration in Microsoft Entra Conditional Access overrides the \"Stay signed in?\" setting in the company branding pane for the same user if both policies are configured.\nSelect a value from the dropdown.\nSave your policy.\nNote\nSession lifetime settings, including sign-in frequency and persistent browser sessions, determine how often users must reauthenticate and whether sessions persist across browser restarts. 
Shorter lifetimes enhance security for high-risk apps, while longer ones improve convenience for trusted or managed devices.\nPolicy 3: Sign-in frequency control every time risky user\nSign in to the\nMicrosoft Entra admin center\nas at least a\nConditional Access Administrator\n.\nBrowse to\nEntra ID\n>\nConditional Access\n.\nSelect\nNew policy\n.\nGive your policy a name. We recommend that organizations create a meaningful standard for the names of their policies.\nUnder\nAssignments\n, select\nUsers or workload identities\n.\nUnder\nInclude\n, select\nAll users\n.\nUnder\nExclude\n, select\nUsers and groups\nand choose your organization's emergency access or break-glass accounts.\nSelect\nDone\n.\nUnder\nTarget resources\n>\nInclude\n, select\nAll resources (formerly 'All cloud apps')\n.\nUnder\nConditions\n>\nUser risk\n, set\nConfigure\nto\nYes\n.\nUnder\nConfigure user risk levels needed for policy to be enforced\n, select\nHigh\n.\nThis guidance is based on Microsoft recommendations and might be different for each organization\nSelect\nDone\n.\nUnder\nAccess controls\n>\nGrant\n, select\nGrant access\n.\nSelect\nRequire authentication strength\n, then select the built-in\nMultifactor authentication\nauthentication strength from the list.\nSelect\nRequire password change\n.\nSelect\nSelect\n.\nUnder\nSession\n.\nSelect\nSign-in frequency\n.\nEnsure\nEvery time\nis selected.\nSelect\nSelect\n.\nConfirm your settings and set\nEnable policy\nto\nReport-only\n.\nSelect\nCreate\nto enable your policy.\nAfter confirming your settings using\nreport-only mode\n, move the\nEnable policy\ntoggle from\nReport-only\nto\nOn\n.\nValidation\nUse the\nWhat If tool\nto simulate a sign-in to the target application and other conditions based on your policy configuration. 
The authentication session management controls show up in the result of the tool.\nPrompt tolerance\nWe account for five minutes of clock skew when\nevery time\nis selected in policy, so we don’t prompt users more often than once every five minutes. If the user completes MFA in the last 5 minutes and encounters another Conditional Access policy that requires reauthentication, we don't prompt the user. Prompting users too often for reauthentication can affect their productivity and increase the risk of users approving MFA requests they didn’t initiate. Use \"Sign-in frequency – every time\" only when there are specific business needs.\nKnown issues\nIf you configure sign-in frequency for mobile devices: Authentication after each sign-in frequency interval might be slow and can take 30 seconds on average. This issue might also occur across various apps simultaneously.\nOn iOS devices: If an app configures certificates as the first authentication factor and has both sign-in frequency and\nIntune mobile application management policies\napplied, users are blocked from signing in to the app when the policy triggers.\nMicrosoft Entra Private Access doesn't support setting sign-in frequency to every time.\nNext steps\nReady to configure Conditional Access policies for your environment? 
See\nPlan a Conditional Access deployment\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Session Controls", @@ -1052,7 +1052,7 @@ "https://learn.microsoft.com/en-us/entra/identity/authentication/concept-authentication-strengths": { "content_hash": "sha256:cf5dd4dcfe3eb4824cc9f8e3113ac946e1ea6e412f82a081216b1319c26dacae", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nConditional Access authentication strengths\nFeedback\nSummarize this article for me\nAn authentication strength is a Microsoft Entra Conditional Access control that specifies which combinations of authentication methods users can use to access a resource. Users can satisfy the strength requirements by authenticating with any of the allowed combinations.\nFor example, an authentication strength can require users to use only phishing-resistant authentication methods to access a sensitive resource. To access a nonsensitive resource, administrators can create another authentication strength that allows less secure multifactor authentication (MFA) combinations, such as a password and a text message.\nAn authentication strength is based on the\npolicy for authentication methods\n. 
That is, administrators can scope authentication methods for specific users and groups to be used across Microsoft Entra ID federated applications. An authentication strength allows further control over the usage of these methods, based on specific scenarios such as sensitive resource access, user risk, and location.\nPrerequisites\nTo use Conditional Access, your tenant needs to have Microsoft Entra ID P1 license. If you don't have this license, you can start a\nfree trial\n.\nScenarios for authentication strengths\nAuthentication strengths can help customers address these scenarios:\nRequire specific authentication methods to access a sensitive resource.\nRequire a specific authentication method when a user takes a sensitive action within an application (in combination with Conditional Access authentication context).\nRequire users to use a specific authentication method when they access sensitive applications outside the corporate network.\nRequire more secure authentication methods for users at high risk.\nRequire specific authentication methods from guest users who access a resource tenant (in combination with cross-tenant settings).\nBuilt-in and custom authentication strengths\nAdministrators can specify an authentication strength to access a resource by creating a Conditional Access policy with the\nRequire authentication strength\ncontrol. They can choose from three built-in authentication strengths:\nMultifactor authentication strength\n,\nPasswordless MFA strength\n, and\nPhishing-resistant MFA strength\n. They can also create a custom authentication strength based on the authentication method combinations that they want to allow.\nBuilt-in authentication strengths\nBuilt-in authentication strengths are combinations of authentication methods that Microsoft predefines. Built-in authentication strengths are always available and can't be modified. 
Microsoft updates built-in authentication strengths when new methods become available.\nFor example, the built-in\nPhishing-resistant MFA strength\nauthentication strength allows combinations of:\nWindows Hello for Business or platform credential\nFIDO2 security key\nMicrosoft Entra certificate-based authentication (multifactor)\nThe following table lists combinations of authentication methods for each built-in authentication strength. These combinations include methods that users need to register and that admins need to enable in the policy for authentication methods or the policy for legacy MFA settings:\nMFA strength\n: The same set of combinations that can be used to satisfy the\nRequire multifactor authentication\nsetting.\nPasswordless MFA strength\n: Includes authentication methods that satisfy MFA but don't require a password.\nPhishing-resistant MFA strength\n: Includes methods that require an interaction between the authentication method and the sign-in surface.\nAuthentication method combination\nMFA strength\nPasswordless MFA strength\nPhishing-resistant MFA strength\nFIDO2 security key\n✓\n✓\n✓\nWindows Hello for Business or platform credential\n✓\n✓\n✓\nCertificate-based authentication (multifactor)\n✓\n✓\n✓\nMicrosoft Authenticator (phone sign-in)\n✓\n✓\nTemporary Access Pass (one-time use and multiple use)\n✓\nPassword plus something the user has\n1\n✓\nFederated single-factor plus something the user has\n1\n✓\nFederated multifactor\n✓\nCertificate-based authentication (single-factor)\nSMS sign-in\nPassword\nFederated single-factor\n1\nSomething the user has\nrefers to one of the following methods: text message, voice, push notification, software OATH token, or hardware OATH token.\nYou can use the following API call to list definitions of all the built-in authentication strengths:\nGET https://graph.microsoft.com/beta/identity/conditionalAccess/authenticationStrength/policies?$filter=policyType eq 'builtIn'\nCustom authentication 
strengths\nConditional Access administrators can also create custom authentication strengths to exactly suit their access requirements. For more information, see\nCreate and manage custom Conditional Access authentication strengths\n.\nLimitations\nEffect of an authentication strength on authentication\n: Conditional Access policies are evaluated only after the initial authentication. As a result, an authentication strength doesn't restrict a user's initial authentication.\nSuppose you're using the built-in\nPhishing-resistant MFA strength\nauthentication strength. A user can still enter a password but must sign in by using a phishing-resistant method, such as a FIDO2 security key, before they can continue.\nUnsupported combination of grant controls\n: You can't use the\nRequire multifactor authentication\nand\nRequire authentication strength\ngrant controls together in the same Conditional Access policy. The reason is that the built-in\nMultifactor authentication\nauthentication strength is equivalent to the\nRequire multifactor authentication\ngrant control.\nUnsupported authentication method\n: The\nEmail one-time pass (Guest)\nauthentication method isn't currently supported in the available combinations.\nWindows Hello for Business\n: If the user signs in with Windows Hello for Business as the primary authentication method, it can be used to satisfy an authentication strength requirement that includes Windows Hello for Business. But if the user signs in with another method (like a password) as the primary authentication method, and the authentication strength requires Windows Hello for Business, the user isn't prompted to sign in with Windows Hello for Business. 
The user needs to restart the session, select\nSign-in options\n, and select a method that the authentication strength requires.\nKnown issues\nAuthentication strength and sign-in frequency\n: When a resource requires an authentication strength and a sign-in frequency, users can satisfy both requirements at two different times.\nFor example, let's say a resource requires a passkey (FIDO2) for the authentication strength, along with a 1-hour sign-in frequency. A user signed in with a passkey (FIDO2) to access the resource 24 hours ago.\nWhen the user unlocks their Windows device by using Windows Hello for Business, they can access the resource again. Yesterday's sign-in satisfies the authentication strength requirement, and today's device unlock satisfies the sign-in frequency requirement.\nFAQ\nShould I use an authentication strength or the policy for authentication methods?\nAn authentication strength is based on the\nAuthentication methods\npolicy. The\nAuthentication methods\npolicy helps to scope and configure authentication methods that users and groups can use across Microsoft Entra ID. An authentication strength allows another restriction of methods for specific scenarios, such as sensitive resource access, user risk, and location.\nFor example, assume that the administrator of an organization named Contoso wants to allow users to use Microsoft Authenticator with either push notifications or passwordless authentication mode. The administrator goes to the Authenticator settings in the\nAuthentication methods\npolicy, scopes the policy for the relevant users, and sets\nAuthentication mode\nto\nAny\n.\nFor Contoso's most sensitive resource, the administrator wants to restrict the access to only passwordless authentication methods. 
The administrator creates a new Conditional Access policy by using the built-in\nPasswordless MFA strength\nauthentication strength.\nAs a result, users in Contoso can access most of the resources in the tenant by using a password and a push notification from Authenticator, or by using only Authenticator (phone sign-in). However, when the users in the tenant access the sensitive application, they must use Authenticator (phone sign-in).\nRelated content\nCustom Conditional Access authentication strengths\nHow authentication strengths work for external users\nTroubleshoot authentication strengths\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Phishing-Resistant MFA", @@ -1079,7 +1079,7 @@ "https://learn.microsoft.com/en-us/entra/identity/role-based-access-control/custom-overview": { "content_hash": "sha256:0d0d49f627033b986d28e84a901a1162a81d96d0b74480d837a2475b6e4d661a", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nOverview of role-based access control in Microsoft Entra ID\nFeedback\nSummarize this article for me\nThis article describes how to understand Microsoft Entra role-based access control. Microsoft Entra roles allow you to grant granular permissions to your admins, abiding by the principle of least privilege. 
Microsoft Entra built-in and custom roles operate on concepts similar to those you find in\nthe role-based access control system for Azure resources\n(Azure roles). The\ndifference between these two role-based access control systems\nis:\nMicrosoft Entra roles control access to Microsoft Entra resources such as users, groups, and applications using the Microsoft Graph API\nAzure roles control access to Azure resources such as virtual machines or storage using Azure Resource Management\nBoth systems contain similarly used role definitions and role assignments. However, Microsoft Entra role permissions can't be used in Azure custom roles and vice versa.\nUnderstand Microsoft Entra role-based access control\nMicrosoft Entra ID supports two types of roles definitions:\nBuilt-in roles\nCustom roles\nBuilt-in roles are out of box roles that have a fixed set of permissions. These role definitions cannot be modified. There are many\nbuilt-in roles\nthat Microsoft Entra ID supports, and the list is growing. To round off the edges and meet your sophisticated requirements, Microsoft Entra ID also supports\ncustom roles\n. Granting permission using custom Microsoft Entra roles is a two-step process that involves creating a custom role definition and then assigning it using a role assignment. A custom role definition is a collection of permissions that you add from a preset list. These permissions are the same permissions used in the built-in roles.\nOnce you’ve created your custom role definition (or using a built-in role), you can assign it to a user by creating a role assignment. A role assignment grants the user the permissions in a role definition at a specified scope. This two-step process allows you to create a single role definition and assign it many times at different scopes. A scope defines the set of Microsoft Entra resources the role member has access to. The most common scope is organization-wide (org-wide) scope. 
A custom role can be assigned at org-wide scope, meaning the role member has the role permissions over all resources in the organization. A custom role can also be assigned at an object scope. An example of an object scope would be a single application. The same role can be assigned to one user over all applications in the organization and then to another user with a scope of only the Contoso Expense Reports app.\nHow Microsoft Entra ID determines if a user has access to a resource\nThe following are the high-level steps that Microsoft Entra ID uses to determine if you have access to a management resource. Use this information to troubleshoot access issues.\nA user (or service principal) acquires a token to the Microsoft Graph endpoint.\nThe user makes an API call to Microsoft Entra ID via Microsoft Graph using the issued token.\nDepending on the circumstance, Microsoft Entra ID takes one of the following actions:\nEvaluates the user’s role memberships based on the\nwids claim\nin the user’s access token.\nRetrieves all the role assignments that apply for the user, either directly or via group membership, to the resource on which the action is being taken.\nMicrosoft Entra ID determines if the action in the API call is included in the roles the user has for this resource.\nIf the user doesn't have a role with the action at the requested scope, access is not granted. Otherwise access is granted.\nRole assignment\nA role assignment is a Microsoft Entra resource that attaches a\nrole definition\nto a\nsecurity principal\nat a particular\nscope\nto grant access to Microsoft Entra resources. Access is granted by creating a role assignment, and access is revoked by removing a role assignment. At its core, a role assignment consists of three elements:\nSecurity principal - An identity that gets the permissions. 
It could be a user, group, or a service principal.\nRole definition - A collection of permissions.\nScope - A way to constrain where those permissions are applicable.\nYou can\ncreate role assignments\nand\nlist the role assignments\nusing the Microsoft Entra admin center,\nMicrosoft Graph PowerShell\n, or Microsoft Graph API. Azure CLI is not supported for Microsoft Entra role assignments.\nThe following diagram shows an example of a role assignment. In this example, Chris has been assigned the App Registration Administrator custom role at the scope of the Contoso Widget Builder app registration. The assignment grants Chris the permissions of the App Registration Administrator role for only this specific app registration.\nSecurity principal\nA security principal represents a user, group, or service principal that is assigned access to Microsoft Entra resources. A user is an individual who has a user profile in Microsoft Entra ID. A group is a new Microsoft 365 or security group that has been set as a\nrole-assignable group\n. A service principal is an identity created for use with applications, hosted services, and automated tools to access Microsoft Entra resources.\nRole definition\nA role definition, or role, is a collection of permissions. A role definition lists the operations that can be performed on Microsoft Entra resources, such as create, read, update, and delete. There are two types of roles in Microsoft Entra ID:\nBuilt-in roles created by Microsoft that can't be changed.\nCustom roles created and managed by your organization.\nScope\nA scope is a way to limit the permitted actions to a particular set of resources as part of a role assignment. 
For example, if you want to assign a custom role to a developer, but only to manage a specific application registration, you can include the specific application registration as a scope in the role assignment.\nWhen you assign a role, you specify one of the following types of scope:\nTenant\nAdministrative unit\nMicrosoft Entra resource\nIf you specify a Microsoft Entra resource as a scope, it can be one of the following:\nMicrosoft Entra groups\nEnterprise applications\nApplication registrations\nWhen a role is assigned over a container scope, such as the Tenant or an Administrative Unit, it grants permissions over the objects they contain but not on the container itself. On the contrary, when a role is assigned over a resource scope, it grants permissions over the resource itself but it does not extend beyond (in particular, it does not extend to the members of a Microsoft Entra group).\nFor more information, see\nAssign Microsoft Entra roles\n.\nRole assignment options\nMicrosoft Entra ID provides multiple options for assigning roles:\nYou can assign roles to users directly, which is the default way to assign roles. Both built-in and custom Microsoft Entra roles can be assigned to users, based on access requirements. For more information, see\nAssign Microsoft Entra roles\n.\nWith Microsoft Entra ID P1, you can create role-assignable groups and assign roles to these groups. Assigning roles to a group instead of individuals allows for easy addition or removal of users from a role and creates consistent permissions for all members of the group. For more information, see\nAssign Microsoft Entra roles\n.\nWith Microsoft Entra ID P2, you can use Microsoft Entra Privileged Identity Management (Microsoft Entra PIM) to provide just-in-time access to roles. This feature allows you to grant time-limited access to a role to users who require it, rather than granting permanent access. It also provides detailed reporting and auditing capabilities. 
For more information, see\nAssign Microsoft Entra roles in Privileged Identity Management\n.\nLicense requirements\nUsing built-in roles in Microsoft Entra ID is free. Using custom roles require a Microsoft Entra ID P1 license for every user with a custom role assignment. To find the right license for your requirements, see\nComparing generally available features of the Free and Premium editions\n.\nNext steps\nUnderstand Microsoft Entra roles\nAssign Microsoft Entra roles\nMicrosoft Entra forum\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Role-Based Access Control", @@ -1088,34 +1088,34 @@ "https://learn.microsoft.com/en-us/entra/identity/role-based-access-control/permissions-reference": { "content_hash": "sha256:8a478a7c0daca12c206996a8169d1d92bbd32fd6274ed749e9df250fc3fce1d4", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nMicrosoft Entra built-in roles\nFeedback\nSummarize this article for me\nIn Microsoft Entra ID, if another administrator or nonadministrator needs to manage Microsoft Entra resources, you assign them a Microsoft Entra role that provides the permissions they need. 
For example, you can assign roles to allow adding or changing users, resetting user passwords, managing user licenses, or managing domain names.\nThis article lists the Microsoft Entra built-in roles you can assign to allow management of Microsoft Entra resources. For information about how to assign roles, see\nAssign Microsoft Entra roles\n. If you are looking for roles to manage Azure resources, see\nAzure built-in roles\n.\nAll roles\nRole\nDescription\nTemplate ID\nAgent ID Administrator\nManage all aspects of agents in a tenant including identity lifecycle operations for agent blueprints, agent service principals, agent identities, and agentic users.\ndb506228-d27e-4b7d-95e5-295956d6615f\nAgent ID Developer\nCreate an agent blueprint and its service principal in a tenant. User will be added as an owner of the agent blueprint and its service principal.\nadb2368d-a9be-41b5-8667-d96778e081b0\nAgent Registry Administrator\nManage all aspects of the Agent Registry service in Microsoft Entra ID\n6b942400-691f-4bf0-9d12-d8a254a2baf5\nAI Administrator\nManage all aspects of Microsoft 365 Copilot and AI-related enterprise services in Microsoft 365.\nd2562ede-74db-457e-a7b6-544e236ebb61\nApplication Administrator\nCan create and manage all aspects of app registrations and enterprise apps.\n9b895d92-2cd3-44c7-9d02-a6ac2d5ea5c3\nApplication Developer\nCan create application registrations independent of the 'Users can register applications' setting.\ncf1c38e5-3621-4004-a7cb-879624dced7c\nAttack Payload Author\nCan create attack payloads that an administrator can initiate later.\n9c6df0f2-1e7c-4dc3-b195-66dfbd24aa8f\nAttack Simulation Administrator\nCan create and manage all aspects of attack simulation campaigns.\nc430b396-e693-46cc-96f3-db01bf8bb62a\nAttribute Assignment Administrator\nAssign custom security attribute keys and values to supported Microsoft Entra objects.\n58a13ea3-c632-46ae-9ee0-9c0d43cd7f3d\nAttribute Assignment Reader\nRead custom security attribute 
keys and values for supported Microsoft Entra objects.\nffd52fa5-98dc-465c-991d-fc073eb59f8f\nAttribute Definition Administrator\nDefine and manage the definition of custom security attributes.\n8424c6f0-a189-499e-bbd0-26c1753c96d4\nAttribute Definition Reader\nRead the definition of custom security attributes.\n1d336d2c-4ae8-42ef-9711-b3604ce3fc2c\nAttribute Log Administrator\nRead audit logs and configure diagnostic settings for events related to custom security attributes.\n5b784334-f94b-471a-a387-e7219fc49ca2\nAttribute Log Reader\nRead audit logs related to custom security attributes.\n9c99539d-8186-4804-835f-fd51ef9e2dcd\nAttribute Provisioning Administrator\nRead and edit the provisioning configuration of all active custom security attributes for an application.\necb2c6bf-0ab6-418e-bd87-7986f8d63bbe\nAttribute Provisioning Reader\nRead the provisioning configuration of all active custom security attributes for an application.\n422218e4-db15-4ef9-bbe0-8afb41546d79\nAuthentication Administrator\nCan access to view, set and reset authentication method information for any non-admin user.\nc4e39bd9-1100-46d3-8c65-fb160da0071f\nAuthentication Extensibility Administrator\nCustomize sign in and sign up experiences for users by creating and managing custom authentication extensions.\n25a516ed-2fa0-40ea-a2d0-12923a21473a\nAuthentication Policy Administrator\nCan create and manage the authentication methods policy, tenant-wide MFA settings, password protection policy, and verifiable credentials.\n0526716b-113d-4c15-b2c8-68e3c22b9f80\nAzure DevOps Administrator\nCan manage Azure DevOps policies and settings.\ne3973bdf-4987-49ae-837a-ba8e231c7286\nAzure Information Protection Administrator\nCan manage all aspects of the Azure Information Protection product.\n7495fdc4-34c4-4d15-a289-98788ce399fd\nB2C IEF Keyset Administrator\nCan manage secrets for federation and encryption in the Identity Experience Framework (IEF).\naaf43236-0c0d-4d5f-883a-6955382ac081\nB2C IEF Policy 
Administrator\nCan create and manage trust framework policies in the Identity Experience Framework (IEF).\n3edaf663-341e-4475-9f94-5c398ef6c070\nBilling Administrator\nCan perform common billing related tasks like updating payment information.\nb0f54661-2d74-4c50-afa3-1ec803f12efe\nCloud App Security Administrator\nCan manage all aspects of the Defender for Cloud Apps product.\n892c5842-a9a6-463a-8041-72aa08ca3cf6\nCloud Application Administrator\nCan create and manage all aspects of app registrations and enterprise apps except App Proxy.\n158c047a-c907-4556-b7ef-446551a6b5f7\nCloud Device Administrator\nLimited access to manage devices in Microsoft Entra ID.\n7698a772-787b-4ac8-901f-60d6b08affd2\nCompliance Administrator\nCan read and manage compliance configuration and reports in Microsoft Entra ID and Microsoft 365.\n17315797-102d-40b4-93e0-432062caca18\nCompliance Data Administrator\nCreates and manages compliance content.\ne6d1a23a-da11-4be4-9570-befc86d067a7\nConditional Access Administrator\nCan manage Conditional Access capabilities.\nb1be1c3e-b65d-4f19-8427-f6fa0d97feb9\nCustomer LockBox Access Approver\nCan approve Microsoft support requests to access customer organizational data.\n5c4f9dcd-47dc-4cf7-8c9a-9e4207cbfc91\nDesktop Analytics Administrator\nCan access and manage Desktop management tools and services.\n38a96431-2bdf-4b4c-8b6e-5d3d8abac1a4\nDirectory Readers\nCan read basic directory information. Commonly used to grant directory read access to applications and guests.\n88d8e3e3-8f55-4a1e-953a-9b9898b8876b\nDirectory Synchronization Accounts\nOnly used by Microsoft Entra Connect service.\nd29b2b05-8046-44ba-8758-1e26182fcf32\nDirectory Writers\nCan read and write basic directory information. 
For granting access to applications, not intended for users.\n9360feb5-f418-4baa-8175-e2a00bac4301\nDomain Name Administrator\nCan manage domain names in cloud and on-premises.\n8329153b-31d0-4727-b945-745eb3bc5f31\nDragon Administrator\nManage all aspects of the Microsoft Dragon admin center.\ne93e3737-fa85-474a-aee4-7d3fb86510f3\nDynamics 365 Administrator\nCan manage all aspects of the Dynamics 365 product.\n44367163-eba1-44c3-98af-f5787879f96a\nDynamics 365 Business Central Administrator\nAccess and perform all administrative tasks on Dynamics 365 Business Central environments.\n963797fb-eb3b-4cde-8ce3-5878b3f32a3f\nEdge Administrator\nManage all aspects of Microsoft Edge.\n3f1acade-1e04-4fbc-9b69-f0302cd84aef\nExchange Administrator\nCan manage all aspects of the Exchange product.\n29232cdf-9323-42fd-ade2-1d097af3e4de\nExchange Backup Administrator\nBack up and restore content (including granular restore) for Exchange in Microsoft 365 Backup\n49eb8f75-97e9-4e37-9b2b-6c3ebfcffa31\nExchange Recipient Administrator\nCan create or update Exchange Online recipients within the Exchange Online organization.\n31392ffb-586c-42d1-9346-e59415a2cc4e\nExtended Directory User Administrator\nManage all aspects of external user profiles in the extended directory for Teams.\ndd13091a-6207-4fc0-82ba-3641e056ab95\nExternal ID User Flow Administrator\nCan create and manage all aspects of user flows.\n6e591065-9bad-43ed-90f3-e9424366d2f0\nExternal ID User Flow Attribute Administrator\nCan create and manage the attribute schema available to all user flows.\n0f971eea-41eb-4569-a71e-57bb8a3eff1e\nExternal Identity Provider Administrator\nCan configure identity providers for use in direct federation.\nbe2f45a1-457d-42af-a067-6ec1fa63bc45\nFabric Administrator\nCan manage all aspects of the Fabric and Power BI products.\na9ea8996-122f-4c74-9520-8edcd192826c\nGlobal Administrator\nCan manage all aspects of Microsoft Entra ID and Microsoft services that use Microsoft Entra 
identities.\n62e90394-69f5-4237-9190-012177145e10\nGlobal Reader\nCan read everything that a Global Administrator can, but not update anything.\nf2ef992c-3afb-46b9-b7cf-a126ee74c451\nGlobal Secure Access Administrator\nCreate and manage all aspects of Global Secure Internet Access and Microsoft Global Secure Private Access, including managing access to public and private endpoints.\nac434307-12b9-4fa1-a708-88bf58caabc1\nGlobal Secure Access Log Reader\nProvides designated security personnel with read-only access to network traffic logs in Microsoft Entra Internet Access and Microsoft Entra Private Access for detailed analysis.\n843318fb-79a6-4168-9e6f-aa9a07481cc4\nGroups Administrator\nMembers of this role can create/manage groups, create/manage groups settings like naming and expiration policies, and view groups activity and audit reports.\nfdd7a751-b60b-444a-984c-02652fe8fa1c\nGuest Inviter\nCan invite guest users independent of the 'members can invite guests' setting.\n95e79109-95c0-4d8e-aee3-d01accf2d47b\nHelpdesk Administrator\nCan reset passwords for non-administrators and Helpdesk Administrators.\n729827e3-9c14-49f7-bb1b-9608f156bbb8\nHybrid Identity Administrator\nManage Active Directory to Microsoft Entra cloud provisioning, Microsoft Entra Connect, pass-through authentication (PTA), password hash synchronization (PHS), seamless single sign-on (seamless SSO), and federation settings. 
Does not have access to manage Microsoft Entra Connect Health.\n8ac3fc64-6eca-42ea-9e69-59f4c7b60eb2\nIdentity Governance Administrator\nManage access using Microsoft Entra ID for identity governance scenarios.\n45d8d3c5-c802-45c6-b32a-1d70b5e1e86e\nInsights Administrator\nHas administrative access in the Microsoft 365 Insights app.\neb1f4a8d-243a-41f0-9fbd-c7cdf6c5ef7c\nInsights Analyst\nAccess the analytical capabilities in Microsoft Viva Insights and run custom queries.\n25df335f-86eb-4119-b717-0ff02de207e9\nInsights Business Leader\nCan view and share dashboards and insights via the Microsoft 365 Insights app.\n31e939ad-9672-4796-9c2e-873181342d2d\nIntune Administrator\nCan manage all aspects of the Intune product.\n3a2c62db-5318-420d-8d74-23affee5d9d5\nIoT Device Administrator\nProvision new IoT devices, manage their lifecycle, configure certificates, and manage device templates.\n2ea5ce4c-b2d8-4668-bd81-3680bd2d227a\nKaizala Administrator\nCan manage settings for Microsoft Kaizala.\n74ef975b-6605-40af-a5d2-b9539d836353\nKnowledge Administrator\nCan configure knowledge, learning, and other intelligent features.\nb5a8dcf3-09d5-43a9-a639-8e29ef291470\nKnowledge Manager\nCan organize, create, manage, and promote topics and knowledge.\n744ec460-397e-42ad-a462-8b3f9747a02c\nLicense Administrator\nCan manage product licenses on users and groups.\n4d6ac14f-3453-41d0-bef9-a3e0c569773a\nLifecycle Workflows Administrator\nCreate and manage all aspects of workflows and tasks associated with Lifecycle Workflows in Microsoft Entra ID.\n59d46f88-662b-457b-bceb-5c3809e5908f\nMessage Center Privacy Reader\nCan read security messages and updates in Office 365 Message Center only.\nac16e43d-7b2d-40e0-ac05-243ff356ab5b\nMessage Center Reader\nCan read messages and updates for their organization in Office 365 Message Center only.\n790c1fb9-7f7d-4f88-86a1-ef1f95c05c1b\nMicrosoft 365 Backup Administrator\nBack up and restore content across supported services (SharePoint, OneDrive, 
and Exchange Online) in Microsoft 365 Backup\n1707125e-0aa2-4d4d-8655-a7c786c76a25\nMicrosoft 365 Migration Administrator\nPerform all migration functionality to migrate content to Microsoft 365 using Migration Manager.\n8c8b803f-96e1-4129-9349-20738d9f9652\nMicrosoft Entra Joined Device Local Administrator\nUsers assigned to this role are added to the local administrators group on Microsoft Entra joined devices.\n9f06204d-73c1-4d4c-880a-6edb90606fd8\nMicrosoft Graph Data Connect Administrator\nManage aspects of Microsoft Graph Data Connect service in a tenant.\nee67aa9c-e510-4759-b906-227085a7fd4d\nMicrosoft Hardware Warranty Administrator\nCreate and manage all aspects of warranty claims and entitlements for Microsoft manufactured hardware, like Surface and HoloLens.\n1501b917-7653-4ff9-a4b5-203eaf33784f\nMicrosoft Hardware Warranty Specialist\nCreate and read warranty claims for Microsoft manufactured hardware, like Surface and HoloLens.\n281fe777-fb20-4fbb-b7a3-ccebce5b0d96\nNetwork Administrator\nCan manage network locations and review enterprise network design insights for Microsoft 365 Software as a Service applications.\nd37c8bed-0711-4417-ba38-b4abe66ce4c2\nOffice Apps Administrator\nCan manage Office apps cloud services, including policy and settings management, and manage the ability to select, unselect and publish 'what's new' feature content to end-user's devices.\n2b745bdf-0803-4d80-aa65-822c4493daac\nOrganizational Branding Administrator\nManage all aspects of organizational branding in a tenant.\n92ed04bf-c94a-4b82-9729-b799a7a4c178\nOrganizational Data Source Administrator\nSet up and manage the ingestion of organizational data into Microsoft 365.\n9d70768a-0cbc-4b4c-aea3-2e124b2477f4\nOrganizational Messages Approver\nReview, approve, or reject new organizational messages for delivery in the Microsoft 365 admin center before they are sent to users.\ne48398e2-f4bb-4074-8f31-4586725e205b\nOrganizational Messages Writer\nWrite, publish, manage, and 
review the organizational messages for end-users through Microsoft product surfaces.\n507f53e4-4e52-4077-abd3-d2e1558b6ea2\nPartner Tier1 Support\nDo not use - not intended for general use.\n4ba39ca4-527c-499a-b93d-d9b492c50246\nPartner Tier2 Support\nDo not use - not intended for general use.\ne00e864a-17c5-4a4b-9c06-f5b95a8d5bd8\nPassword Administrator\nCan reset passwords for non-administrators and Password Administrators.\n966707d0-3269-4727-9be2-8c3a10f19b9d\nPeople Administrator\nManage profile photos of users and people settings for all users in the organization.\n024906de-61e5-49c8-8572-40335f1e0e10\nPermissions Management Administrator\nManage all aspects of Microsoft Entra Permissions Management.\naf78dc32-cf4d-46f9-ba4e-4428526346b5\nPlaces Administrator\nManage all aspects of the Microsoft Places service.\n78b0ccd1-afc2-4f92-9116-b41aedd09592\nPower Platform Administrator\nCan create and manage all aspects of Microsoft Dynamics 365, Power Apps and Power Automate.\n11648597-926c-4cf3-9c36-bcebb0ba8dcc\nPrinter Administrator\nCan manage all aspects of printers and printer connectors.\n644ef478-e28f-4e28-b9dc-3fdde9aa0b1f\nPrinter Technician\nCan register and unregister printers and update printer status.\ne8cef6f1-e4bd-4ea8-bc07-4b8d950f4477\nPrivileged Authentication Administrator\nCan view, set, and reset authentication method information for any user (admin or non-admin).\n7be44c8a-adaf-4e2a-84d6-ab2649e08a13\nPrivileged Role Administrator\nCan manage role assignments in Microsoft Entra ID, and all aspects of Privileged Identity Management.\ne8611ab8-c189-46e8-94e1-60213ab1f814\nReports Reader\nCan read sign-in and audit reports.\n4a5d8f65-41da-4de4-8968-e035b65339cf\nSearch Administrator\nCan create and manage all aspects of Microsoft Search settings.\n0964bb5e-9bdb-4d7b-ac29-58e794862a40\nSearch Editor\nCan create and manage the editorial content such as bookmarks, Q and As, locations, 
floorplan.\n8835291a-918c-4fd7-a9ce-faa49f0cf7d9\nSecurity Administrator\nCan read security information and reports, and manage configuration in Microsoft Entra ID and Office 365.\n194ae4cb-b126-40b2-bd5b-6091b380977d\nSecurity Operator\nCreates and manages security events.\n5f2222b1-57c3-48ba-8ad5-d4759f1fde6f\nSecurity Reader\nCan read security information and reports in Microsoft Entra ID and Office 365.\n5d6b6bb7-de71-4623-b4af-96380a352509\nService Support Administrator\nCan read service health information and manage support tickets.\nf023fd81-a637-4b56-95fd-791ac0226033\nSharePoint Administrator\nCan manage all aspects of the SharePoint service.\nf28a1f50-f6e7-4571-818b-6a12f2af6b6c\nSharePoint Advanced Management Administrator\nManage all aspects of SharePoint Advanced Management.\n99009c4a-3b3f-4957-82a9-9d35e12db77e\nSharePoint Backup Administrator\nBack up and restore content (including granular restore) for SharePoint and OneDrive in Microsoft 365 Backup\n9d3e04ba-3ee4-4d1b-a3a7-9aef423a09be\nSharePoint Embedded Administrator\nManage all aspects of SharePoint Embedded containers.\n1a7d78b6-429f-476b-b8eb-35fb715fffd4\nSkype for Business Administrator\nCan manage all aspects of the Skype for Business product.\n75941009-915a-4869-abe7-691bff18279e\nTeams Administrator\nCan manage the Microsoft Teams service.\n69091246-20e8-4a56-aa4d-066075b2a7a8\nTeams Communications Administrator\nCan manage calling and meetings features within the Microsoft Teams service.\nbaf37b3a-610e-45da-9e62-d9d1e5e8914b\nTeams Communications Support Engineer\nCan troubleshoot communications issues within Teams using advanced tools.\nf70938a0-fc10-4177-9e90-2178f8765737\nTeams Communications Support Specialist\nCan troubleshoot communications issues within Teams using basic tools.\nfcf91098-03e3-41a9-b5ba-6f0ec8188a12\nTeams Devices Administrator\nCan perform management related tasks on Teams certified devices.\n3d762c5a-1b6c-493f-843e-55a3b42923d4\nTeams Reader\nRead everything in 
the Teams admin center, but not update anything.\n1076ac91-f3d9-41a7-a339-dcdf5f480acc\nTeams Telephony Administrator\nManage voice and telephony features and troubleshoot communication issues within the Microsoft Teams service.\naa38014f-0993-46e9-9b45-30501a20909d\nTenant Creator\nCreate new Microsoft Entra or Azure AD B2C tenants.\n112ca1a2-15ad-4102-995e-45b0bc479a6a\nUsage Summary Reports Reader\nRead Usage reports and Adoption Score, but can't access user details.\n75934031-6c7e-415a-99d7-48dbd49e875e\nUser Administrator\nCan manage all aspects of users and groups, including resetting passwords for limited admins.\nfe930be7-5e62-47db-91af-98c3a49a38b1\nUser Experience Success Manager\nView product feedback, survey results, and reports to find training and communication opportunities.\n27460883-1df1-4691-b032-3b79643e5e63\nVirtual Visits Administrator\nManage and share Virtual Visits information and metrics from admin centers or the Virtual Visits app.\ne300d9e7-4a2b-4295-9eff-f1c78b36cc98\nViva Glint Tenant Administrator\nManage and configure Microsoft Viva Glint settings in the Microsoft 365 admin center.\n0ec3f692-38d6-4d14-9e69-0377ca7797ad\nViva Goals Administrator\nManage and configure all aspects of Microsoft Viva Goals.\n92b086b3-e367-4ef2-b869-1de128fb986e\nViva Pulse Administrator\nCan manage all settings for Microsoft Viva Pulse app.\n87761b17-1ed2-4af3-9acd-92a150038160\nWindows 365 Administrator\nCan provision and manage all aspects of Cloud PCs.\n11451d60-acb2-45eb-a7d6-43d0f0125c13\nWindows Update Deployment Administrator\nCan create and manage all aspects of Windows Update deployments through the Windows Update for Business deployment service.\n32696413-001a-46ae-978c-ce0f6b3620d2\nYammer Administrator\nManage all aspects of the Yammer service.\n810a2642-a034-447f-a5e8-41beaa378541\nAgent ID Administrator\nAssign the Agent ID Administrator role to users who need to do the following:\nManage all aspects of agents in a tenant including identity 
lifecycle operations for agent blueprints, agent service principals, agent identities, and agentic users.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/agentIdentities/appRoleAssignedTo/update\nUpdate agent identity role assignments.\nmicrosoft.directory/agentIdentities/basic/update\nUpdate basic properties of agent identities.\nmicrosoft.directory/agentIdentities/create\nCreate agent identities.\nmicrosoft.directory/agentIdentities/delete\nDelete agent identities.\nmicrosoft.directory/agentIdentities/disable\nDisable agent identities.\nmicrosoft.directory/agentIdentities/enable\nEnable agent identities.\nmicrosoft.directory/agentIdentities/owners/update\nAdd and remove owners to agent identities.\nmicrosoft.directory/agentIdentities/tag/update\nUpdate tags for agent identities.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/appRoleAssignedTo/update\nUpdate agent identity blueprint principal role assignments.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/basic/update\nUpdate basic properties of agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/create\nCreate agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/delete\nDelete agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/disable\nDisable agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/enable\nEnable agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/owners/update\nAdd and remove owners to agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/tag/update\nUpdate tags for agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprints/allProperties/read\nRead all 
properties and settings for agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/allProperties/update\nUpdate all properties and settings for agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/appRoles/update\nModify app roles defined on agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/authentication/update\nUpdate authentication related settings for agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/audience/update\nUpdate the sign-in audience setting for agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/basic/update\nUpdate basic properties of agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/create\nCreate agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/credentials/update\nAdd and remove credentials to agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/delete\nDelete agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/owners/update\nAdd and remove owners to agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/permissions/update\nModify exposed permissions on agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/tag/update\nUpdate tags for agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/verification/update\nUpdate publisher verification setting for agent identity blueprints.\nmicrosoft.directory/agentUsers/assignLicense\nManage agent user licenses\nmicrosoft.directory/agentUsers/basic/update\nUpdate basic properties on agent users\nmicrosoft.directory/agentUsers/create\nAdd agent users\nmicrosoft.directory/agentUsers/delete\nDelete agent users\nmicrosoft.directory/agentUsers/disable\nDisable agent users\nmicrosoft.directory/agentUsers/enable\nEnable agent users\nmicrosoft.directory/agentUsers/invalidateAllRefreshTokens\nForce sign-out by invalidating agent user refresh 
tokens\nmicrosoft.directory/agentUsers/lifeCycleInfo/read\nRead lifecycle information of agent users, such as employeeLeaveDateTime\nmicrosoft.directory/agentUsers/lifeCycleInfo/update\nUpdate lifecycle information of agent users, such as employeeLeaveDateTime\nmicrosoft.directory/agentUsers/manager/update\nUpdate manager for agent users\nmicrosoft.directory/agentUsers/restore\nRestore deleted agent users\nmicrosoft.directory/agentUsers/revokeSignInSessions\nRevoke sign-in sessions for an agent user\nmicrosoft.directory/agentUsers/sponsors/update\nUpdate sponsors of agent users\nmicrosoft.directory/agentUsers/usageLocation/update\nUpdate usage location of agent users\nmicrosoft.directory/agentUsers/userPrincipalName/update\nUpdate User Principal Name of agent users\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs.\nmicrosoft.directory/deletedItems.agentIdentityBlueprints/delete\nPermanently delete agent identity blueprints, which can no longer be restored\nmicrosoft.directory/deletedItems.agentIdentityBlueprints/restore\nRestore soft deleted agent identity blueprints to original state\nmicrosoft.directory/groups/hiddenMembers/read\nRead hidden members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groups.unified/createAsOwner\nCreate Microsoft 365 groups, excluding role-assignable groups. 
Creator is added as the first owner.\nmicrosoft.directory/policies/standard/read\nRead basic properties on policies\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties.\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nAgent ID Developer\nAssign the Agent ID Developer role to users who need to do the following:\nCreate agent blueprints and their service principals. The user is added as an owner of the agent blueprint and its service principal.\nActions\nDescription\nmicrosoft.directory/servicePrincipals/standard/read\nRead basic properties of service principals\nAgent Registry Administrator\nAssign the Agent Registry Administrator role to users who need to do the following tasks:\nManage metadata for AI agents in Microsoft Entra ID\nManage collections and visibility of agents\nAssign Agent Registry-specific roles to other users or agents to access the registry\nActions\nDescription\nmicrosoft.agentRegistry/allEntities/allProperties/allTasks\nManage all aspects of Agent Registry in Microsoft Entra ID\nAI Administrator\nAssign the AI Administrator role to users who need to do the following tasks:\nManage all aspects of Microsoft 365 Copilot\nManage AI-related enterprise services, extensibility, and copilot agents from the Integrated apps page in the Microsoft 365 admin center\nApprove and publish line-of-business copilot agents\nRead and configure Azure and Microsoft 365 service health dashboards\nView usage reports, adoption insights, and organizational insight\nCreate and manage support tickets in Azure and the Microsoft 365 admin center\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate 
and manage Azure support tickets\nmicrosoft.directory/entitlementManagement/allProperties/read\nRead all properties in Microsoft Entra entitlement management\nmicrosoft.office365.copilot/allEntities/allProperties/allTasks\nCreate and manage all settings for Microsoft 365 Copilot\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.search/content/manage\nCreate and delete content, and read and update all properties in Microsoft Search\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nApplication Administrator\nThis is a\nprivileged role\n. Users in this role can create and manage all aspects of enterprise applications, application registrations, and application proxy settings. Note that users assigned to this role are not added as owners when creating new application registrations or enterprise applications.\nThis role also grants the ability to consent for delegated permissions and application permissions, with the exception of application permissions for Azure AD Graph and Microsoft Graph.\nImportant\nThis exception means that you can still consent to application permissions for\nother\napps (for example, other Microsoft apps, 3rd-party apps, or apps that you have registered). 
You can still\nrequest\nthese permissions as part of the app registration, but\ngranting\n(that is, consenting to) these permissions requires a more privileged administrator, such as Privileged Role Administrator.\nThis role grants the ability to manage application credentials. Users assigned this role can add credentials to an application, and use those credentials to impersonate the application's identity. If the application's identity has been granted access to a resource, such as the ability to create or update User or other objects, then a user assigned to this role could perform those actions while impersonating the application. This ability to impersonate the application's identity may be an elevation of privilege over what the user can do via their role assignments. It is important to understand that assigning a user to the Application Administrator role gives them the ability to impersonate an application's identity.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/adminConsentRequestPolicy/allProperties/allTasks\nManage admin consent request policies in Microsoft Entra ID\nmicrosoft.directory/appConsent/appConsentRequests/allProperties/read\nRead all properties of consent requests for applications registered with Microsoft Entra ID\nmicrosoft.directory/applicationPolicies/basic/update\nUpdate standard properties of application policies\nmicrosoft.directory/applicationPolicies/create\nCreate application policies\nmicrosoft.directory/applicationPolicies/delete\nDelete application policies\nmicrosoft.directory/applicationPolicies/owners/read\nRead owners on application policies\nmicrosoft.directory/applicationPolicies/owners/update\nUpdate the owner property of application policies\nmicrosoft.directory/applicationPolicies/policyAppliedTo/read\nRead application policies applied to 
objects list\nmicrosoft.directory/applicationPolicies/standard/read\nRead standard properties of application policies\nmicrosoft.directory/applications/applicationProxy/read\nRead all application proxy properties\nmicrosoft.directory/applications/applicationProxy/update\nUpdate all application proxy properties\nmicrosoft.directory/applications/applicationProxyAuthentication/update\nUpdate authentication on all types of applications\nmicrosoft.directory/applications/applicationProxySslCertificate/update\nUpdate SSL certificate settings for application proxy\nmicrosoft.directory/applications/applicationProxyUrlSettings/update\nUpdate URL settings for application proxy\nmicrosoft.directory/applications/appRoles/update\nUpdate the appRoles property on all types of applications\nmicrosoft.directory/applications/audience/update\nUpdate the audience property for applications\nmicrosoft.directory/applications/authentication/update\nUpdate authentication on all types of applications\nmicrosoft.directory/applications/basic/update\nUpdate basic properties for applications\nmicrosoft.directory/applications/create\nCreate all types of applications\nmicrosoft.directory/applications/credentials/update\nUpdate application credentials\nmicrosoft.directory/applications/delete\nDelete all types of applications\nmicrosoft.directory/applications/extensionProperties/update\nUpdate extension properties on applications\nmicrosoft.directory/applications/notes/update\nUpdate notes of applications\nmicrosoft.directory/applications/owners/update\nUpdate owners of applications\nmicrosoft.directory/applications/permissions/update\nUpdate exposed permissions and required permissions on all types of applications\nmicrosoft.directory/applications/policies/update\nUpdate policies of applications\nmicrosoft.directory/applications/synchronization/standard/read\nRead provisioning settings associated with the application object\nmicrosoft.directory/applications/tag/update\nUpdate tags of 
applications\nmicrosoft.directory/applications/verification/update\nUpdate the verification property of applications\nmicrosoft.directory/applicationTemplates/instantiate\nInstantiate gallery applications from application templates\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/connectorGroups/allProperties/read\nRead all properties of application proxy connector groups\nmicrosoft.directory/connectorGroups/allProperties/update\nUpdate all properties of application proxy connector groups\nmicrosoft.directory/connectorGroups/create\nCreate application proxy connector groups\nmicrosoft.directory/connectorGroups/delete\nDelete application proxy connector groups\nmicrosoft.directory/connectors/allProperties/read\nRead all properties of application proxy connectors\nmicrosoft.directory/connectors/create\nCreate application proxy connectors\nmicrosoft.directory/customAuthenticationExtensions/allProperties/allTasks\nCreate and manage custom authentication extensions\nmicrosoft.directory/deletedItems.applications/delete\nPermanently delete applications, which can no longer be restored\nmicrosoft.directory/deletedItems.applications/restore\nRestore soft deleted applications to original state\nmicrosoft.directory/oAuth2PermissionGrants/allProperties/allTasks\nCreate and delete OAuth 2.0 permission grants, and read and update all properties\nmicrosoft.directory/provisioningLogs/allProperties/read\nRead all properties of provisioning logs\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/update\nUpdate service principal role assignments\nmicrosoft.directory/servicePrincipals/audience/update\nUpdate audience properties on service principals\nmicrosoft.directory/servicePrincipals/authentication/update\nUpdate authentication properties on service principals\nmicrosoft.directory/servicePrincipals/basic/update\nUpdate basic properties on service 
principals\nmicrosoft.directory/servicePrincipals/create\nCreate service principals\nmicrosoft.directory/servicePrincipals/credentials/update\nUpdate credentials of service principals\nmicrosoft.directory/servicePrincipals/delete\nDelete service principals\nmicrosoft.directory/servicePrincipals/disable\nDisable service principals\nmicrosoft.directory/servicePrincipals/enable\nEnable service principals\nmicrosoft.directory/servicePrincipals/getPasswordSingleSignOnCredentials\nRead password single sign-on credentials on service principals\nmicrosoft.directory/servicePrincipals/managePasswordSingleSignOnCredentials\nManage password single sign-on credentials on service principals\nmicrosoft.directory/servicePrincipals/managePermissionGrantsForAll.microsoft-application-admin\nGrant consent for application permissions and delegated permissions on behalf of any user or all users, except for application permissions for Microsoft Graph and Azure AD Graph\nmicrosoft.directory/servicePrincipals/notes/update\nUpdate notes of service principals\nmicrosoft.directory/servicePrincipals/owners/update\nUpdate owners of service principals\nmicrosoft.directory/servicePrincipals/permissions/update\nUpdate permissions of service principals\nmicrosoft.directory/servicePrincipals/policies/update\nUpdate policies of service principals\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/credentials/manage\nManage application provisioning secrets and credentials.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/jobs/manage\nStart, restart, and pause application provisioning synchronization jobs.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/schema/manage\nCreate and manage application provisioning synchronization jobs and schema.\nmicrosoft.directory/servicePrincipals/synchronization/standard/read\nRead provisioning settings associated with your service 
principal\nmicrosoft.directory/servicePrincipals/synchronizationCredentials/manage\nManage application provisioning secrets and credentials\nmicrosoft.directory/servicePrincipals/synchronizationJobs/manage\nStart, restart, and pause application provisioning synchronization jobs\nmicrosoft.directory/servicePrincipals/synchronizationSchema/manage\nCreate and manage application provisioning synchronization jobs and schema\nmicrosoft.directory/servicePrincipals/tag/update\nUpdate the tag property for service principals\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nApplication Developer\nThis is a\nprivileged role\n. Users in this role can create application registrations when the \"Users can register applications\" setting is set to No. This role also grants permission to consent on one's own behalf when the \"Users can consent to apps accessing company data on their behalf\" setting is set to No. Users assigned to this role are added as owners when creating new application registrations.\nActions\nDescription\nmicrosoft.directory/applications/createAsOwner\nCreate all types of applications, and creator is added as the first owner\nmicrosoft.directory/oAuth2PermissionGrants/createAsOwner\nCreate OAuth 2.0 permission grants, with creator as the first owner\nmicrosoft.directory/servicePrincipals/createAsOwner\nCreate service principals, with creator as the first owner\nAttack Payload Author\nUsers in this role can create attack payloads but not actually launch or schedule them. 
Attack payloads are then available to all administrators in the tenant who can use them to create a simulation. Access to reports is limited to simulations executed by the user, and this role doesn't grant access to aggregate reports such as Training efficacy, Repeat offenders, Training completion, or User coverage.\nFor more information, see these articles:\nGet started using Attack simulation training\nMicrosoft Defender for Office 365 permissions in the Microsoft Defender portal\nPermissions in the Microsoft Purview portal\nActions\nDescription\nmicrosoft.office365.protectionCenter/attackSimulator/payload/allProperties/allTasks\nCreate and manage attack payloads in Attack Simulator\nmicrosoft.office365.protectionCenter/attackSimulator/reports/allProperties/read\nRead reports of attack simulation, responses, and associated training\nAttack Simulation Administrator\nUsers in this role can create and manage all aspects of attack simulation creation, launch/scheduling of a simulation, and the review of simulation results. 
Members of this role have this access for all simulations in the tenant.\nFor more information, see these articles:\nMicrosoft Defender for Office 365 permissions in the Microsoft Defender portal\nPermissions in the Microsoft Purview portal\nActions\nDescription\nmicrosoft.office365.protectionCenter/attackSimulator/payload/allProperties/allTasks\nCreate and manage attack payloads in Attack Simulator\nmicrosoft.office365.protectionCenter/attackSimulator/reports/allProperties/read\nRead reports of attack simulation, responses, and associated training\nmicrosoft.office365.protectionCenter/attackSimulator/simulation/allProperties/allTasks\nCreate and manage attack simulation templates in Attack Simulator\nAttribute Assignment Administrator\nUsers with this role can assign and remove custom security attribute keys and values for supported Microsoft Entra objects such as users, service principals, and devices.\nImportant\nBy default,\nGlobal Administrator\nand other administrator roles do not have permissions to read, define, or assign custom security attributes.\nFor more information, see\nManage access to custom security attributes in Microsoft Entra ID\n.\nActions\nDescription\nmicrosoft.directory/attributeSets/allProperties/read\nRead all properties of attribute sets\nmicrosoft.directory/azureManagedIdentities/customSecurityAttributes/read\nRead custom security attribute values for Microsoft Entra managed identities\nmicrosoft.directory/azureManagedIdentities/customSecurityAttributes/update\nUpdate custom security attribute values for Microsoft Entra managed identities\nmicrosoft.directory/customSecurityAttributeDefinitions/allProperties/read\nRead all properties of custom security attribute definitions\nmicrosoft.directory/devices/customSecurityAttributes/read\nRead custom security attribute values for devices\nmicrosoft.directory/devices/customSecurityAttributes/update\nUpdate custom security attribute values for 
devices\nmicrosoft.directory/servicePrincipals/customSecurityAttributes/read\nRead custom security attribute values for service principals\nmicrosoft.directory/servicePrincipals/customSecurityAttributes/update\nUpdate custom security attribute values for service principals\nmicrosoft.directory/users/customSecurityAttributes/read\nRead custom security attribute values for users\nmicrosoft.directory/users/customSecurityAttributes/update\nUpdate custom security attribute values for users\nAttribute Assignment Reader\nUsers with this role can read custom security attribute keys and values for supported Microsoft Entra objects.\nImportant\nBy default,\nGlobal Administrator\nand other administrator roles do not have permissions to read, define, or assign custom security attributes.\nFor more information, see\nManage access to custom security attributes in Microsoft Entra ID\n.\nActions\nDescription\nmicrosoft.directory/attributeSets/allProperties/read\nRead all properties of attribute sets\nmicrosoft.directory/azureManagedIdentities/customSecurityAttributes/read\nRead custom security attribute values for Microsoft Entra managed identities\nmicrosoft.directory/customSecurityAttributeDefinitions/allProperties/read\nRead all properties of custom security attribute definitions\nmicrosoft.directory/devices/customSecurityAttributes/read\nRead custom security attribute values for devices\nmicrosoft.directory/servicePrincipals/customSecurityAttributes/read\nRead custom security attribute values for service principals\nmicrosoft.directory/users/customSecurityAttributes/read\nRead custom security attribute values for users\nAttribute Definition Administrator\nUsers with this role can define a valid set of custom security attributes that can be assigned to supported Microsoft Entra objects. 
This role can also activate and deactivate custom security attributes.\nImportant\nBy default,\nGlobal Administrator\nand other administrator roles do not have permissions to read, define, or assign custom security attributes.\nFor more information, see\nManage access to custom security attributes in Microsoft Entra ID\n.\nActions\nDescription\nmicrosoft.directory/attributeSets/allProperties/allTasks\nManage all aspects of attribute sets\nmicrosoft.directory/customSecurityAttributeDefinitions/allProperties/allTasks\nManage all aspects of custom security attribute definitions\nAttribute Definition Reader\nUsers with this role can read the definition of custom security attributes.\nImportant\nBy default,\nGlobal Administrator\nand other administrator roles do not have permissions to read, define, or assign custom security attributes.\nFor more information, see\nManage access to custom security attributes in Microsoft Entra ID\n.\nActions\nDescription\nmicrosoft.directory/attributeSets/allProperties/read\nRead all properties of attribute sets\nmicrosoft.directory/customSecurityAttributeDefinitions/allProperties/read\nRead all properties of custom security attribute definitions\nAttribute Log Administrator\nAssign the Attribute Log Reader role to users who need to do the following tasks:\nRead audit logs for custom security attribute value changes\nRead audit logs for custom security attribute definition changes and assignments\nConfigure diagnostic settings for custom security attributes\nUsers with this role\ncannot\nread audit logs for other events.\nImportant\nBy default,\nGlobal Administrator\nand other administrator roles do not have permissions to read, define, or assign custom security attributes.\nFor more information, see\nManage access to custom security attributes in Microsoft Entra ID\n.\nActions\nDescription\nmicrosoft.azure.customSecurityAttributeDiagnosticSettings/allEntities/allProperties/allTasks\nConfigure all aspects of custom security attributes 
diagnostic settings\nmicrosoft.directory/customSecurityAttributeAuditLogs/allProperties/read\nRead audit logs related to custom security attributes\nAttribute Log Reader\nAssign the Attribute Log Reader role to users who need to do the following tasks:\nRead audit logs for custom security attribute value changes\nRead audit logs for custom security attribute definition changes and assignments\nUsers with this role\ncannot\ndo the following tasks:\nConfigure diagnostic settings for custom security attributes\nRead audit logs for other events\nImportant\nBy default,\nGlobal Administrator\nand other administrator roles do not have permissions to read, define, or assign custom security attributes.\nFor more information, see\nManage access to custom security attributes in Microsoft Entra ID\n.\nActions\nDescription\nmicrosoft.directory/customSecurityAttributeAuditLogs/allProperties/read\nRead audit logs related to custom security attributes\nAttribute Provisioning Administrator\nThis is a\nprivileged role\n. Assign the Attribute Provisioning Administrator role to users who need to do the following tasks:\nRead and write attribute mappings for custom security attributes when provisioning in an application.\nRead and write provisioning and auditing logs for custom security attributes when provisioning in an application.\nUsers with this role cannot read audit logs for other events. This role must be used in conjunction with the Cloud Application Administrator or Application Administrator roles (from least to most privileged) to read provisioning configurations.\nImportant\nThis role does not have the ability to create custom security attribute sets or to directly assign or update custom security attribute values for the user object. 
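The read permissions granted by the attribute roles above surface through Microsoft Graph; for example, a caller holding Attribute Assignment Reader can read a user's assigned values via the `customSecurityAttributes` property on the `users` resource. A minimal sketch that only builds such a request (the user id and token are placeholders; the helper itself is illustrative, not part of any SDK):

```python
# Sketch: build a Microsoft Graph request to read a user's custom security
# attribute values. Requires a caller holding a role such as Attribute
# Assignment Reader. User id and token below are placeholders.
from urllib.parse import quote

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def custom_security_attributes_request(user_id: str, access_token: str):
    """Return the URL and headers for reading customSecurityAttributes."""
    url = f"{GRAPH_BASE}/users/{quote(user_id)}?$select=customSecurityAttributes"
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, headers

url, headers = custom_security_attributes_request("ann@contoso.example", "<token>")
print(url)
```

Sending the request (with a real token) would return only the attribute values the caller's role permits; by default, even Global Administrator cannot read them, as noted above.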
This role can only configure the flow of the custom security attributes in the provisioning app.\nLearn more\nActions\nDescription\nmicrosoft.directory/servicePrincipals/synchronization.customSecurityAttributes/schema/read\nRead all custom security attributes in the synchronization schema\nmicrosoft.directory/servicePrincipals/synchronization.customSecurityAttributes/schema/update\nUpdate custom security attribute mappings in the synchronization schema\nAttribute Provisioning Reader\nThis is a\nprivileged role\n. Assign the Attribute Provisioning Reader role to users who need to do the following tasks:\nRead the attribute mappings for custom security attributes when provisioning in an application.\nRead the provisioning and auditing logs for custom security attributes when provisioning in an application.\nUsers with this role can't read audit logs for other events. This role must be used with the Cloud Application Administrator or Application Administrator roles (from least to most privileged) to read provisioning configurations.\nLearn more\nActions\nDescription\nmicrosoft.directory/servicePrincipals/synchronization.customSecurityAttributes/schema/read\nRead all custom security attributes in the synchronization schema\nAuthentication Administrator\nThis is a\nprivileged role\n. Assign the Authentication Administrator role to users who need to do the following:\nSet or reset any authentication method (including passwords) for nonadministrators and some roles. For a list of the roles for which an Authentication Administrator can read or update authentication methods, see\nWho can reset passwords\n.\nRequire users who are nonadministrators or assigned to some roles to re-register against existing nonpassword credentials (for example, MFA or FIDO), and can also revoke\nremember MFA on the device\n, which prompts for MFA on the next sign-in.\nManage MFA settings in the legacy MFA management portal.\nPerform sensitive actions for some users. 
For more information, see\nWho can perform sensitive actions\n.\nCreate and manage support tickets in Azure and the Microsoft 365 admin center.\nUsers with this role\ncannot\ndo the following:\nCannot change the credentials or reset MFA for members and owners of a\nrole-assignable group\n.\nCannot manage Hardware OATH tokens.\nThe following table compares the capabilities of authentication-related roles.\nRole\nManage user's auth methods\nManage per-user MFA\nManage MFA settings\nManage auth method policy\nManage password protection policy\nUpdate sensitive properties\nDelete and restore users\nAuthentication Administrator\nYes for\nsome users\nNo\nNo\nNo\nNo\nYes for\nsome users\nYes for\nsome users\nPrivileged Authentication Administrator\nYes for all users\nNo\nNo\nNo\nNo\nYes for all users\nYes for all users\nAuthentication Policy Administrator\nNo\nYes\nYes\nYes\nYes\nNo\nNo\nUser Administrator\nNo\nNo\nNo\nNo\nNo\nYes for\nsome users\nYes for\nsome users\nImportant\nUsers with this role can change credentials for people who may have access to sensitive or private information or critical configuration inside and outside of Microsoft Entra ID. Changing the credentials of a user may mean the ability to assume that user's identity and permissions. For example:\nApplication Registration and Enterprise Application owners, who can manage credentials of apps they own. Those apps may have privileged permissions in Microsoft Entra ID and elsewhere not granted to Authentication Administrators. Through this path, an Authentication Administrator can assume the identity of an application owner and then further assume the identity of a privileged application by updating the credentials for the application.\nAzure subscription owners, who may have access to sensitive or private information or critical configuration in Azure.\nSecurity Group and Microsoft 365 group owners, who can manage group membership. 
Those groups may grant access to sensitive or private information or critical configuration in Microsoft Entra ID and elsewhere.\nAdministrators in other services outside of Microsoft Entra ID like Exchange Online, Microsoft Defender XDR portal, Microsoft Purview portal, and human resources systems.\nNonadministrators like executives, legal counsel, and human resources employees who may have access to sensitive or private information.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/deletedItems.users/restore\nRestore soft deleted users to original state\nmicrosoft.directory/users/authenticationMethods/basic/update\nUpdate basic properties of authentication methods for users\nmicrosoft.directory/users/authenticationMethods/create\nCreate authentication methods for users\nmicrosoft.directory/users/authenticationMethods/delete\nDelete authentication methods for users\nmicrosoft.directory/users/authenticationMethods/standard/restrictedRead\nRead standard properties of authentication methods that do not include personally identifiable information for users\nmicrosoft.directory/users/basic/update\nUpdate basic properties on users\nmicrosoft.directory/users/delete\nDelete users\nmicrosoft.directory/users/disable\nDisable users\nmicrosoft.directory/users/enable\nEnable users\nmicrosoft.directory/users/invalidateAllRefreshTokens\nForce sign-out by invalidating user refresh tokens\nmicrosoft.directory/users/manager/update\nUpdate manager for users\nmicrosoft.directory/users/password/update\nReset passwords for all users\nmicrosoft.directory/users/restore\nRestore deleted users\nmicrosoft.directory/users/userPrincipalName/update\nUpdate User Principal Name of users\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin 
center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nAuthentication Extensibility Administrator\nThis is a\nprivileged role\n. Assign the Authentication Extensibility Administrator role to users who need to do the following tasks:\nCreate and manage all aspects of custom authentication extensions.\nUsers with this role\ncan't\ndo the following:\nCan't assign custom authentication extensions to applications to modify the authentication experiences, and can't consent to application permissions or create app registrations associated with the custom authentication extension. Instead, you must use the Application Administrator, Application Developer, or Cloud Application Administrator roles.\nA custom authentication extension is an API endpoint created by a developer for authentication events and is registered in Microsoft Entra ID. 
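A custom authentication extension, as described above, is in essence an HTTP endpoint that receives an event payload from Microsoft Entra ID and returns instructions such as extra claims. A minimal sketch of the handler logic; the payload and response shapes used here are simplified illustrations for the sketch, not the exact Entra event contract:

```python
# Sketch of custom-authentication-extension handler logic: receive an
# authentication event and return extra claims. Payload/response shapes
# below are simplified illustrations, not the exact Entra contract.
import json

def handle_token_issuance_start(event: dict) -> dict:
    # Hypothetical lookup of extra claims for the signing-in user.
    user_id = event.get("data", {}).get("user", {}).get("id", "unknown")
    return {
        "data": {
            "actions": [
                {
                    "claims": {
                        "employeeGroup": "engineering",  # illustrative claim
                        "correlatedUser": user_id,
                    }
                }
            ]
        }
    }

response = handle_token_issuance_start(
    {"data": {"user": {"id": "00000000-0000-0000-0000-000000000001"}}}
)
print(json.dumps(response, indent=2))
```

In a deployment, this logic would sit behind an HTTPS endpoint registered in Entra ID; the Authentication Extensibility Administrator role manages that registration, while wiring the extension into an application's sign-in flow requires one of the application administration roles, as noted above.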
Application administrators and application owners can use custom authentication extensions to customize their application's authentication experiences, such as sign in and sign up, or password reset.\nLearn more\nActions\nDescription\nmicrosoft.directory/customAuthenticationExtensions/allProperties/allTasks\nCreate and manage custom authentication extensions\nAuthentication Policy Administrator\nAssign the Authentication Policy Administrator role to users who need to do the following:\nConfigure the authentication methods policy, tenant-wide MFA settings, and password protection policy that determine which methods each user can register and use.\nManage Password Protection settings: smart lockout configurations and updating the custom banned passwords list.\nManage MFA settings in the legacy MFA management portal.\nCreate and manage verifiable credentials.\nCreate and manage Azure support tickets.\nUsers with this role\ncannot\ndo the following:\nCannot update sensitive properties. For more information, see\nWho can perform sensitive actions\n.\nCannot delete or restore users. 
For more information, see\nWho can perform sensitive actions\n.\nCannot manage Hardware OATH tokens.\nThe following table compares the capabilities of authentication-related roles.\nRole\nManage user's auth methods\nManage per-user MFA\nManage MFA settings\nManage auth method policy\nManage password protection policy\nUpdate sensitive properties\nDelete and restore users\nAuthentication Administrator\nYes for\nsome users\nNo\nNo\nNo\nNo\nYes for\nsome users\nYes for\nsome users\nPrivileged Authentication Administrator\nYes for all users\nNo\nNo\nNo\nNo\nYes for all users\nYes for all users\nAuthentication Policy Administrator\nNo\nYes\nYes\nYes\nYes\nNo\nNo\nUser Administrator\nNo\nNo\nNo\nNo\nNo\nYes for\nsome users\nYes for\nsome users\nActions\nDescription\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/organization/strongAuthentication/allTasks\nManage all aspects of strong authentication properties of an organization\nmicrosoft.directory/userCredentialPolicies/basic/update\nUpdate basic policies for users\nmicrosoft.directory/userCredentialPolicies/create\nCreate credential policies for users\nmicrosoft.directory/userCredentialPolicies/delete\nDelete credential policies for users\nmicrosoft.directory/userCredentialPolicies/owners/read\nRead owners of credential policies for users\nmicrosoft.directory/userCredentialPolicies/owners/update\nUpdate owners of credential policies for users\nmicrosoft.directory/userCredentialPolicies/policyAppliedTo/read\nRead policy.appliesTo navigation link\nmicrosoft.directory/userCredentialPolicies/standard/read\nRead standard properties of credential policies for users\nmicrosoft.directory/userCredentialPolicies/tenantDefault/update\nUpdate policy.isOrganizationDefault property\nmicrosoft.directory/verifiableCredentials/configuration/allProperties/read\nRead configuration required to create and manage verifiable 
credentials\nmicrosoft.directory/verifiableCredentials/configuration/allProperties/update\nUpdate configuration required to create and manage verifiable credentials\nmicrosoft.directory/verifiableCredentials/configuration/contracts/allProperties/read\nRead a verifiable credential contract\nmicrosoft.directory/verifiableCredentials/configuration/contracts/allProperties/update\nUpdate a verifiable credential contract\nmicrosoft.directory/verifiableCredentials/configuration/contracts/cards/allProperties/read\nRead a verifiable credential card\nmicrosoft.directory/verifiableCredentials/configuration/contracts/cards/revoke\nRevoke a verifiable credential card\nmicrosoft.directory/verifiableCredentials/configuration/contracts/create\nCreate a verifiable credential contract\nmicrosoft.directory/verifiableCredentials/configuration/create\nCreate configuration required to create and manage verifiable credentials\nmicrosoft.directory/verifiableCredentials/configuration/delete\nDelete configuration required to create and manage verifiable credentials and delete all of its verifiable credentials\nAzure DevOps Administrator\nUsers with this role can manage all enterprise Azure DevOps policies, applicable to all Azure DevOps organizations backed by Microsoft Entra ID. Users in this role can manage these policies by navigating to any Azure DevOps organization that is backed by the company's Microsoft Entra ID. Additionally, users in this role can claim ownership of orphaned Azure DevOps organizations. This role grants no other Azure DevOps-specific permissions (for example, Project Collection Administrators) inside any of the Azure DevOps organizations backed by the company's Microsoft Entra organization.\nActions\nDescription\nmicrosoft.azure.devOps/allEntities/allTasks\nRead and configure Azure DevOps\nAzure Information Protection Administrator\nUsers with this role have all permissions in the Azure Information Protection service. 
This role allows configuring labels for the Azure Information Protection policy, managing protection templates, and activating protection. This role doesn't grant any permissions in Microsoft Entra ID Protection, Privileged Identity Management, Monitor Microsoft 365 Service Health, Microsoft Defender XDR portal, or Microsoft Purview portal.\nActions\nDescription\nmicrosoft.azure.informationProtection/allEntities/allTasks\nManage all aspects of Azure Information Protection\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nB2C IEF Keyset Administrator\nThis is a\nprivileged role\n. Users assigned to this role can create and manage policy keys and secrets used for token encryption, token signing, and claim encryption/decryption. They can add new keys to existing key containers, enabling secret rollover without affecting existing applications. Additionally, users in this role can view the complete details of these secrets, including their expiration dates, even after creation.\nImportant\nThis is a sensitive role. 
The keyset administrator role should be carefully audited and assigned with care during preproduction and production.\nActions\nDescription\nmicrosoft.directory/b2cTrustFrameworkKeySet/allProperties/allTasks\nRead and configure key sets in Azure Active Directory B2C\nB2C IEF Policy Administrator\nUsers in this role have the ability to create, read, update, and delete all custom policies in Azure AD B2C and therefore have full control over the Identity Experience Framework in the relevant Azure AD B2C organization. By editing policies, this user can establish direct federation with external identity providers, change the directory schema, change all user-facing content (HTML, CSS, JavaScript), change the requirements to complete an authentication, create new users, send user data to external systems including full migrations, and edit all user information including sensitive fields like passwords and phone numbers. Conversely, this role cannot change the encryption keys or edit the secrets used for federation in the organization.\nImportant\nThe B2C IEF Policy Administrator is a highly sensitive role that should be assigned on a very limited basis for organizations in production. 
Activities by these users should be closely audited, especially for organizations in production.\nActions\nDescription\nmicrosoft.directory/b2cTrustFrameworkPolicy/allProperties/allTasks\nRead and configure custom policies in Azure Active Directory B2C\nBilling Administrator\nMakes purchases, manages subscriptions, manages support tickets, and monitors service health.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.commerce.billing/allEntities/allProperties/allTasks\nManage all aspects of Office 365 billing\nmicrosoft.directory/organization/basic/update\nUpdate basic properties on organization\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nCloud App Security Administrator\nUsers with this role have full permissions in Defender for Cloud Apps. They can add administrators, add Microsoft Defender for Cloud Apps policies and settings, upload logs, and perform governance actions.\nActions\nDescription\nmicrosoft.directory/cloudAppSecurity/allProperties/allTasks\nCreate and delete all resources, and read and update standard properties in Microsoft Defender for Cloud Apps\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nCloud Application Administrator\nThis is a\nprivileged role\n. Users in this role have the same permissions as the Application Administrator role, excluding the ability to manage application proxy. 
This role grants the ability to create and manage all aspects of enterprise applications and application registrations. Users assigned to this role are not added as owners when creating new application registrations or enterprise applications.\nThis role also grants the ability to consent for delegated permissions and application permissions, with the exception of application permissions for Azure AD Graph and Microsoft Graph.\nImportant\nThis exception means that you can still consent to application permissions for\nother\napps (for example, other Microsoft apps, 3rd-party apps, or apps that you have registered). You can still\nrequest\nthese permissions as part of the app registration, but\ngranting\n(that is, consenting to) these permissions requires a more privileged administrator, such as Privileged Role Administrator.\nThis role grants the ability to manage application credentials. Users assigned this role can add credentials to an application, and use those credentials to impersonate the application's identity. If the application's identity has been granted access to a resource, such as the ability to create or update User or other objects, then a user assigned to this role could perform those actions while impersonating the application. This ability to impersonate the application's identity may be an elevation of privilege over what the user can do via their role assignments. 
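The impersonation risk described above comes down to the OAuth 2.0 client credentials grant: anyone who can add a secret to an application can then request tokens as that application. A sketch that builds (but does not send) such a token request against the Microsoft identity platform token endpoint; the tenant, client id, and secret values are placeholders:

```python
# Sketch: why microsoft.directory/applications/credentials/update is
# sensitive. A newly added client secret lets the holder obtain tokens
# *as the application* via the OAuth 2.0 client credentials grant.
# This only constructs the request; all values are placeholders.
from urllib.parse import urlencode

def client_credentials_request(tenant: str, client_id: str, client_secret: str):
    token_url = f"https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://graph.microsoft.com/.default",
    })
    return token_url, body

token_url, body = client_credentials_request(
    "contoso.example", "app-id", "newly-added-secret"
)
```

The returned token carries whatever application permissions were previously consented to the app, which is why credential management by this role can amount to an elevation of privilege.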
It is important to understand that assigning a user to the Application Administrator role gives them the ability to impersonate an application's identity.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/adminConsentRequestPolicy/allProperties/allTasks\nManage admin consent request policies in Microsoft Entra ID\nmicrosoft.directory/appConsent/appConsentRequests/allProperties/read\nRead all properties of consent requests for applications registered with Microsoft Entra ID\nmicrosoft.directory/applicationPolicies/basic/update\nUpdate standard properties of application policies\nmicrosoft.directory/applicationPolicies/create\nCreate application policies\nmicrosoft.directory/applicationPolicies/delete\nDelete application policies\nmicrosoft.directory/applicationPolicies/owners/read\nRead owners on application policies\nmicrosoft.directory/applicationPolicies/owners/update\nUpdate the owner property of application policies\nmicrosoft.directory/applicationPolicies/policyAppliedTo/read\nRead application policies applied to objects list\nmicrosoft.directory/applicationPolicies/standard/read\nRead standard properties of application policies\nmicrosoft.directory/applications/appRoles/update\nUpdate the appRoles property on all types of applications\nmicrosoft.directory/applications/audience/update\nUpdate the audience property for applications\nmicrosoft.directory/applications/authentication/update\nUpdate authentication on all types of applications\nmicrosoft.directory/applications/basic/update\nUpdate basic properties for applications\nmicrosoft.directory/applications/create\nCreate all types of applications\nmicrosoft.directory/applications/credentials/update\nUpdate application credentials\nmicrosoft.directory/applications/delete\nDelete all types of 
applications\nmicrosoft.directory/applications/extensionProperties/update\nUpdate extension properties on applications\nmicrosoft.directory/applications/notes/update\nUpdate notes of applications\nmicrosoft.directory/applications/owners/update\nUpdate owners of applications\nmicrosoft.directory/applications/permissions/update\nUpdate exposed permissions and required permissions on all types of applications\nmicrosoft.directory/applications/policies/update\nUpdate policies of applications\nmicrosoft.directory/applications/synchronization/standard/read\nRead provisioning settings associated with the application object\nmicrosoft.directory/applications/tag/update\nUpdate tags of applications\nmicrosoft.directory/applications/verification/update\nUpdate the verification property on applications\nmicrosoft.directory/applicationTemplates/instantiate\nInstantiate gallery applications from application templates\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/deletedItems.applications/delete\nPermanently delete applications, which can no longer be restored\nmicrosoft.directory/deletedItems.applications/restore\nRestore soft deleted applications to original state\nmicrosoft.directory/oAuth2PermissionGrants/allProperties/allTasks\nCreate and delete OAuth 2.0 permission grants, and read and update all properties\nmicrosoft.directory/provisioningLogs/allProperties/read\nRead all properties of provisioning logs\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/update\nUpdate service principal role assignments\nmicrosoft.directory/servicePrincipals/audience/update\nUpdate audience properties on service principals\nmicrosoft.directory/servicePrincipals/authentication/update\nUpdate authentication properties on service principals\nmicrosoft.directory/servicePrincipals/basic/update\nUpdate basic properties on service 
principals\nmicrosoft.directory/servicePrincipals/create\nCreate service principals\nmicrosoft.directory/servicePrincipals/credentials/update\nUpdate credentials of service principals\nmicrosoft.directory/servicePrincipals/delete\nDelete service principals\nmicrosoft.directory/servicePrincipals/disable\nDisable service principals\nmicrosoft.directory/servicePrincipals/enable\nEnable service principals\nmicrosoft.directory/servicePrincipals/getPasswordSingleSignOnCredentials\nRead password single sign-on credentials on service principals\nmicrosoft.directory/servicePrincipals/managePasswordSingleSignOnCredentials\nManage password single sign-on credentials on service principals\nmicrosoft.directory/servicePrincipals/managePermissionGrantsForAll.microsoft-application-admin\nGrant consent for application permissions and delegated permissions on behalf of any user or all users, except for application permissions for Microsoft Graph and Azure AD Graph\nmicrosoft.directory/servicePrincipals/notes/update\nUpdate notes of service principals\nmicrosoft.directory/servicePrincipals/owners/update\nUpdate owners of service principals\nmicrosoft.directory/servicePrincipals/permissions/update\nUpdate permissions of service principals\nmicrosoft.directory/servicePrincipals/policies/update\nUpdate policies of service principals\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/credentials/manage\nManage application provisioning secrets and credentials.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/jobs/manage\nStart, restart, and pause application provisioning synchronization jobs.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/schema/manage\nCreate and manage application provisioning synchronization jobs and schema.\nmicrosoft.directory/servicePrincipals/synchronization/standard/read\nRead provisioning settings associated with your service 
principal\nmicrosoft.directory/servicePrincipals/synchronizationCredentials/manage\nManage application provisioning secrets and credentials\nmicrosoft.directory/servicePrincipals/synchronizationJobs/manage\nStart, restart, and pause application provisioning synchronization jobs\nmicrosoft.directory/servicePrincipals/synchronizationSchema/manage\nCreate and manage application provisioning synchronization jobs and schema\nmicrosoft.directory/servicePrincipals/tag/update\nUpdate the tag property for service principals\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nCloud Device Administrator\nThis is a\nprivileged role\n. Users in this role can enable, disable, and delete devices in Microsoft Entra ID and read Windows 10 BitLocker keys (if present) in the Azure portal. 
The role does not grant permissions to manage any other properties on the device.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.directory/bitlockerKeys/key/read\nRead bitlocker metadata and key on devices\nmicrosoft.directory/deletedItems.devices/delete\nPermanently delete devices, which can no longer be restored\nmicrosoft.directory/deletedItems.devices/restore\nRestore soft deleted devices to original state\nmicrosoft.directory/deviceLocalCredentials/password/read\nRead all properties of the backed up local administrator account credentials for Microsoft Entra joined devices, including the password\nmicrosoft.directory/deviceManagementPolicies/basic/update\nUpdate basic properties on mobile device management and mobile app management policies\nmicrosoft.directory/deviceManagementPolicies/standard/read\nRead standard properties on mobile device management and mobile app management policies\nmicrosoft.directory/deviceRegistrationPolicy/basic/update\nUpdate basic properties on device registration policies\nmicrosoft.directory/deviceRegistrationPolicy/standard/read\nRead standard properties on device registration policies\nmicrosoft.directory/devices/delete\nDelete devices from Microsoft Entra ID\nmicrosoft.directory/devices/disable\nDisable devices in Microsoft Entra ID\nmicrosoft.directory/devices/enable\nEnable devices in Microsoft Entra ID\nmicrosoft.directory/devices/permissions/update\nUpdate the alternative name property on an IoT device\nmicrosoft.directory/deviceTemplates/owners/read\nRead owners on Internet of Things (IoT) device templates\nmicrosoft.directory/deviceTemplates/owners/update\nUpdate owners on Internet of Things (IoT) device 
templates\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nCompliance Administrator\nUsers with this role have permissions to manage compliance-related features in the Microsoft Purview portal, Microsoft 365 admin center, Azure, and Microsoft 365 Defender portal. Assignees can also manage all features within the Exchange admin center and create support tickets for Azure and Microsoft 365. For more information, see\nRoles and role groups in Microsoft Defender for Office 365 and Microsoft Purview compliance\n.\nIn\nCan do\nMicrosoft Purview portal\nProtect and manage your organization's data across Microsoft 365 services\nManage compliance alerts\nMicrosoft Purview Compliance Manager\nTrack, assign, and verify your organization's regulatory compliance activities\nMicrosoft 365 Defender portal\nManage data governance\nPerform legal and data investigation\nManage Data Subject Request\nThis role has the same permissions as the\nCompliance Administrator role group\nin Microsoft 365 Defender portal role-based access control.\nIntune\nView all Intune audit data\nMicrosoft Defender for Cloud Apps\nHas read-only permissions and can manage alerts\nCan create and modify file policies and allow file governance actions\nCan view all the built-in reports under Data Management\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/entitlementManagement/allProperties/read\nRead all properties in Microsoft Entra entitlement management\nmicrosoft.office365.complianceManager/allEntities/allTasks\nManage all aspects of Office 365 Compliance Manager\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and 
configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nCompliance Data Administrator\nUsers with this role have permissions to track data in the Microsoft Purview portal, Microsoft 365 admin center, and Azure. Users can also track compliance data within the Exchange admin center, Compliance Manager, and Teams & Skype for Business admin center and create support tickets for Azure and Microsoft 365. For more information about the differences between Compliance Administrator and Compliance Data Administrator, see\nRoles and role groups in Microsoft Defender for Office 365 and Microsoft Purview compliance\n.\nIn\nCan do\nMicrosoft Purview portal\nMonitor compliance-related policies across Microsoft 365 services\nManage compliance alerts\nMicrosoft Purview Compliance Manager\nTrack, assign, and verify your organization's regulatory compliance activities\nMicrosoft 365 Defender portal\nManage data governance\nPerform legal and data investigation\nManage Data Subject Request\nThis role has the same permissions as the\nCompliance Data Administrator role group\nin Microsoft 365 Defender portal role-based access control.\nIntune\nView all Intune audit data\nMicrosoft Defender for Cloud Apps\nHas read-only permissions and can manage alerts\nCan create and modify file policies and allow file governance actions\nCan view all the built-in reports under Data Management\nActions\nDescription\nmicrosoft.azure.informationProtection/allEntities/allTasks\nManage all aspects of Azure Information Protection\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support 
tickets\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.directory/cloudAppSecurity/allProperties/allTasks\nCreate and delete all resources, and read and update standard properties in Microsoft Defender for Cloud Apps\nmicrosoft.office365.complianceManager/allEntities/allTasks\nManage all aspects of Office 365 Compliance Manager\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nConditional Access Administrator\nThis is a\nprivileged role\n. Users with this role have the ability to manage Microsoft Entra Conditional Access settings.\nActions\nDescription\nmicrosoft.directory/conditionalAccessPolicies/basic/update\nUpdate basic properties for Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/create\nCreate Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/delete\nDelete Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/owners/read\nRead the owners of Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/owners/update\nUpdate owners for Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/policyAppliedTo/read\nRead the \"applied to\" property for Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/standard/read\nRead Conditional Access for policies\nmicrosoft.directory/conditionalAccessPolicies/tenantDefault/update\nUpdate the default tenant for Conditional Access policies\nmicrosoft.directory/namedLocations/basic/update\nUpdate basic properties of custom rules that define network locations\nmicrosoft.directory/namedLocations/create\nCreate custom rules 
that define network locations\nmicrosoft.directory/namedLocations/delete\nDelete custom rules that define network locations\nmicrosoft.directory/namedLocations/standard/read\nRead basic properties of custom rules that define network locations\nmicrosoft.directory/resourceNamespaces/resourceActions/authenticationContext/update\nUpdate Conditional Access authentication context of Microsoft 365 role-based access control (RBAC) resource actions\nCustomer LockBox Access Approver\nManages\nMicrosoft Purview Customer Lockbox requests\nin your organization. They receive email notifications for Customer Lockbox requests and can approve and deny requests from the Microsoft 365 admin center. They can also turn the Customer Lockbox feature on or off. Only Global Administrators can reset the passwords of people assigned to this role.\nActions\nDescription\nmicrosoft.office365.lockbox/allEntities/allTasks\nManage all aspects of Customer Lockbox\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nDesktop Analytics Administrator\nUsers in this role can manage the Desktop Analytics service. This includes the ability to view asset inventory, create deployment plans, and view deployment and health status.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.office365.desktopAnalytics/allEntities/allTasks\nManage all aspects of Desktop Analytics\nDirectory Readers\nUsers in this role can read basic directory information. 
This role should be used for:\nGranting a specific set of guest users read access instead of granting it to all guest users.\nGranting a specific set of non-admin users access to Microsoft Entra admin center when \"Restrict access to Microsoft Entra admin center\" is set to \"Yes\".\nGranting service principals access to directory where Directory.Read.All is not an option.\nActions\nDescription\nmicrosoft.directory/administrativeUnits/members/read\nRead members of administrative units\nmicrosoft.directory/administrativeUnits/standard/read\nRead basic properties on administrative units\nmicrosoft.directory/applicationPolicies/standard/read\nRead standard properties of application policies\nmicrosoft.directory/applications/owners/read\nRead owners of applications\nmicrosoft.directory/applications/policies/read\nRead policies of applications\nmicrosoft.directory/applications/standard/read\nRead standard properties of applications\nmicrosoft.directory/contacts/memberOf/read\nRead the group membership for all contacts in Microsoft Entra ID\nmicrosoft.directory/contacts/standard/read\nRead basic properties on contacts in Microsoft Entra ID\nmicrosoft.directory/contracts/standard/read\nRead basic properties on partner contracts\nmicrosoft.directory/devices/memberOf/read\nRead device memberships\nmicrosoft.directory/devices/registeredOwners/read\nRead registered owners of devices\nmicrosoft.directory/devices/registeredUsers/read\nRead registered users of devices\nmicrosoft.directory/devices/standard/read\nRead basic properties on devices\nmicrosoft.directory/directoryRoles/eligibleMembers/read\nRead the eligible members of Microsoft Entra roles\nmicrosoft.directory/directoryRoles/members/read\nRead all members of Microsoft Entra roles\nmicrosoft.directory/directoryRoles/standard/read\nRead basic properties of Microsoft Entra roles\nmicrosoft.directory/domains/standard/read\nRead basic properties on domains\nmicrosoft.directory/groups/appRoleAssignments/read\nRead 
application role assignments of groups\nmicrosoft.directory/groups/memberOf/read\nRead the memberOf property on Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groups/members/read\nRead members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groups/owners/read\nRead owners of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groups/settings/read\nRead settings of groups\nmicrosoft.directory/groups/standard/read\nRead standard properties of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groupSettings/standard/read\nRead basic properties on group settings\nmicrosoft.directory/groupSettingTemplates/standard/read\nRead basic properties on group setting templates\nmicrosoft.directory/oAuth2PermissionGrants/standard/read\nRead basic properties on OAuth 2.0 permission grants\nmicrosoft.directory/organization/standard/read\nRead basic properties on an organization\nmicrosoft.directory/organization/trustedCAsForPasswordlessAuth/read\nRead trusted certificate authorities for passwordless authentication\nmicrosoft.directory/roleAssignments/standard/read\nRead basic properties on role assignments\nmicrosoft.directory/roleDefinitions/standard/read\nRead basic properties on role definitions\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/read\nRead service principal role assignments\nmicrosoft.directory/servicePrincipals/appRoleAssignments/read\nRead role assignments assigned to service principals\nmicrosoft.directory/servicePrincipals/memberOf/read\nRead the group memberships on service principals\nmicrosoft.directory/servicePrincipals/oAuth2PermissionGrants/read\nRead delegated permission grants on service principals\nmicrosoft.directory/servicePrincipals/ownedObjects/read\nRead owned objects of service principals\nmicrosoft.directory/servicePrincipals/owners/read\nRead owners 
of service principals\nmicrosoft.directory/servicePrincipals/policies/read\nRead policies of service principals\nmicrosoft.directory/servicePrincipals/standard/read\nRead basic properties of service principals\nmicrosoft.directory/subscribedSkus/standard/read\nRead basic properties on subscriptions\nmicrosoft.directory/users/appRoleAssignments/read\nRead application role assignments for users\nmicrosoft.directory/users/deviceForResourceAccount/read\nRead deviceForResourceAccount of users\nmicrosoft.directory/users/directReports/read\nRead the direct reports for users\nmicrosoft.directory/users/invitedBy/read\nRead the user that invited an external user to a tenant\nmicrosoft.directory/users/licenseDetails/read\nRead license details of users\nmicrosoft.directory/users/manager/read\nRead manager of users\nmicrosoft.directory/users/memberOf/read\nRead the group memberships of users\nmicrosoft.directory/users/oAuth2PermissionGrants/read\nRead delegated permission grants on users\nmicrosoft.directory/users/ownedDevices/read\nRead owned devices of users\nmicrosoft.directory/users/ownedObjects/read\nRead owned objects of users\nmicrosoft.directory/users/photo/read\nRead photo of users\nmicrosoft.directory/users/registeredDevices/read\nRead registered devices of users\nmicrosoft.directory/users/scopedRoleMemberOf/read\nRead user's membership of a Microsoft Entra role, that is scoped to an administrative unit\nmicrosoft.directory/users/sponsors/read\nRead sponsors of users\nmicrosoft.directory/users/standard/read\nRead basic properties on users\nDirectory Synchronization Accounts\nDo not use. This role is automatically assigned to the Microsoft Entra Connect service, and is not intended or supported for any other use.\nActions\nDescription\nmicrosoft.directory/onPremisesSynchronization/standard/read\nRead standard on-premises directory synchronization information\nDirectory Writers\nThis is a\nprivileged role\n. 
Users in this role can read and update basic information of users, groups, and service principals.\nActions\nDescription\nmicrosoft.directory/applications/extensionProperties/update\nUpdate extension properties on applications\nmicrosoft.directory/contacts/create\nCreate contacts\nmicrosoft.directory/groups/assignedLabels/update\nUpdate the assigned labels property on groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups/assignLicense\nAssign product licenses to groups for group-based licensing\nmicrosoft.directory/groups/basic/update\nUpdate basic properties on Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/classification/update\nUpdate the classification property on Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/create\nCreate Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/dynamicMembershipRule/update\nUpdate the dynamic membership rule on Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/groupType/update\nUpdate properties that would affect the group type of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/members/update\nUpdate members of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/onPremWriteBack/update\nUpdate Microsoft Entra groups to be written back to on-premises with Microsoft Entra Connect\nmicrosoft.directory/groups/owners/update\nUpdate owners of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/reprocessLicenseAssignment\nReprocess license assignments for group-based licensing\nmicrosoft.directory/groups/settings/update\nUpdate settings of groups\nmicrosoft.directory/groups/visibility/update\nUpdate the visibility property of 
Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groupSettings/basic/update\nUpdate basic properties on group settings\nmicrosoft.directory/groupSettings/create\nCreate group settings\nmicrosoft.directory/groupSettings/delete\nDelete group settings\nmicrosoft.directory/oAuth2PermissionGrants/basic/update\nUpdate OAuth 2.0 permission grants\nmicrosoft.directory/oAuth2PermissionGrants/create\nCreate OAuth 2.0 permission grants\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/update\nUpdate service principal role assignments\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToCloudTenant/credentials/manage\nManage cloud tenant to cloud tenant application provisioning secrets and credentials.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToCloudTenant/jobs/manage\nStart, restart, and pause cloud tenant to cloud tenant application provisioning synchronization jobs.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToCloudTenant/schema/manage\nCreate and manage cloud tenant to cloud tenant application provisioning synchronization jobs and schema.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/credentials/manage\nManage application provisioning secrets and credentials.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/jobs/manage\nStart, restart, and pause application provisioning synchronization jobs.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/schema/manage\nCreate and manage application provisioning synchronization jobs and schema.\nmicrosoft.directory/servicePrincipals/synchronizationCredentials/manage\nManage application provisioning secrets and credentials\nmicrosoft.directory/servicePrincipals/synchronizationJobs/manage\nStart, restart, and pause application provisioning synchronization 
jobs\nmicrosoft.directory/servicePrincipals/synchronizationSchema/manage\nCreate and manage application provisioning synchronization jobs and schema\nmicrosoft.directory/users/assignLicense\nManage user licenses\nmicrosoft.directory/users/basic/update\nUpdate basic properties on users\nmicrosoft.directory/users/create\nAdd users\nmicrosoft.directory/users/disable\nDisable users\nmicrosoft.directory/users/enable\nEnable users\nmicrosoft.directory/users/invalidateAllRefreshTokens\nForce sign-out by invalidating user refresh tokens\nmicrosoft.directory/users/inviteGuest\nInvite guest users\nmicrosoft.directory/users/manager/update\nUpdate manager for users\nmicrosoft.directory/users/photo/update\nUpdate photo of users\nmicrosoft.directory/users/reprocessLicenseAssignment\nReprocess license assignments for users\nmicrosoft.directory/users/sponsors/update\nUpdate sponsors of users\nmicrosoft.directory/users/userPrincipalName/update\nUpdate User Principal Name of users\nDomain Name Administrator\nThis is a\nprivileged role\n. Users with this role can manage (read, add, verify, update, and delete) domain names. They can also read directory information about users, groups, and applications, as these objects possess domain dependencies. For on-premises environments, users with this role can configure domain names for federation so that associated users are always authenticated on-premises. These users can then sign into Microsoft Entra based services with their on-premises passwords via single sign-on. 
Federation settings need to be synced via Microsoft Entra Connect, so users also have permissions to manage Microsoft Entra Connect.\nActions\nDescription\nmicrosoft.directory/domains/allProperties/allTasks\nCreate and delete domains, and read and update all properties\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nDragon Administrator\nAssign the Dragon Administrator role to users who need to do the following tasks:\nManage all aspects of the administrative experience in the Dragon admin center\nProvision clinical applications\nCreate and manage the organization hierarchy\nOversee healthcare groups\nManage experiences of various clinical applications embedded in Electronic Health Record (EHR) systems\nConfigure clinical applications, such as manage settings, view analytics, and handle library objects\nCreate, manage, and view support tickets for their organization in the Dragon admin center\nCreate, view, manage, and monitor billing plans for licenses purchased by their organization (additional roles may be required)\nLearn more\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.healthPlatform/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Dragon admin center\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead 
Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nDynamics 365 Administrator\nAssign the Dynamics 365 Administrator role to users who need to manage all aspects of Dynamics 365 services, including configuration, user management, and support tickets.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.dynamics365/allEntities/allTasks\nManage all aspects of Dynamics 365\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nDynamics 365 Business Central Administrator\nAssign the Dynamics 365 Business Central Administrator role to users who need to do the following tasks:\nAccess Dynamics 365 Business Central environments\nPerform all administrative tasks on environments\nManage the lifecycle of customer's environments\nSupervise the extensions installed on environments\nControl upgrades of environments\nPerform data exports of environments\nRead and configure Azure and Microsoft 365 service health dashboards\nThis role does not provide any permissions for other Dynamics 365 products.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.directory/domains/standard/read\nRead basic properties on domains\nmicrosoft.directory/organization/standard/read\nRead basic properties on an organization\nmicrosoft.directory/subscribedSkus/standard/read\nRead basic properties on subscriptions\nmicrosoft.directory/users/standard/read\nRead 
basic properties on users\nmicrosoft.dynamics365.businessCentral/allEntities/allProperties/allTasks\nManage all aspects of Dynamics 365 Business Central\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nEdge Administrator\nUsers in this role can create and manage the enterprise site list required for Internet Explorer mode on Microsoft Edge. This role grants permissions to create, edit, and publish the site list and additionally allows access to manage support tickets.\nLearn more\nActions\nDescription\nmicrosoft.edge/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Edge\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nExchange Administrator\nUsers with this role have global permissions within Microsoft Exchange Online, when the service is present. Also has the ability to create and manage all Microsoft 365 groups, manage support tickets, and monitor service health. For more information, see\nAbout admin roles in the Microsoft 365 admin center\n.\nNote\nIn the Microsoft Graph API and Microsoft Graph PowerShell, this role is named Exchange Service Administrator. In the\nAzure portal\n, it is named Exchange Administrator. 
In the\nExchange admin center\n, it is named Exchange Online administrator.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.backup/exchangeProtectionPolicies/allProperties/allTasks\nCreate and manage Exchange Online protection policy in Microsoft 365 Backup\nmicrosoft.backup/exchangeRestoreSessions/allProperties/allTasks\nRead and configure restore session for Exchange Online in Microsoft 365 Backup\nmicrosoft.backup/restorePoints/userMailboxes/allProperties/allTasks\nManage all restore points associated with selected Exchange Online mailboxes in M365 Backup\nmicrosoft.backup/userMailboxProtectionUnits/allProperties/allTasks\nManage mailboxes added to Exchange Online protection policy in Microsoft 365 Backup\nmicrosoft.backup/userMailboxRestoreArtifacts/allProperties/allTasks\nManage mailboxes added to restore session for Exchange Online in Microsoft 365 Backup\nmicrosoft.directory/contacts/allProperties/read\nRead all properties for contacts\nmicrosoft.directory/contacts/memberOf/read\nRead the group membership for all contacts in Microsoft Entra ID\nmicrosoft.directory/contacts/standard/read\nRead basic properties on contacts in Microsoft Entra ID\nmicrosoft.directory/groups.unified/assignedLabels/update\nUpdate the assigned labels property on Microsoft 365 groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups.unified/basic/update\nUpdate basic properties on Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/create\nCreate Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/delete\nDelete Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/members/update\nUpdate members of Microsoft 365 groups, excluding role-assignable 
groups\nmicrosoft.directory/groups.unified/owners/update\nUpdate owners of Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/restore\nRestore Microsoft 365 groups from soft-deleted container, excluding role-assignable groups\nmicrosoft.directory/groups/hiddenMembers/read\nRead hidden members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/onPremisesSynchronization/standard/read\nRead standard on-premises directory synchronization information\nmicrosoft.office365.exchange/allEntities/basic/allTasks\nManage all aspects of Exchange Online\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nExchange Backup Administrator\nAssign the Exchange Backup Administrator role to users who need to do the following tasks:\nManage all aspects of Microsoft 365 Backup for Exchange Online\nBack up and restore content including granular restore for Exchange Online\nCreate, edit, and manage backup configuration policies for Exchange Online\nPerform restore operations for Exchange Online\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.backup/exchangeProtectionPolicies/allProperties/allTasks\nCreate and manage Exchange Online protection policy in Microsoft 365 
Backup\nmicrosoft.backup/exchangeRestoreSessions/allProperties/allTasks\nRead and configure restore session for Exchange Online in Microsoft 365 Backup\nmicrosoft.backup/restorePoints/userMailboxes/allProperties/allTasks\nManage all restore points associated with selected Exchange Online mailboxes in M365 Backup\nmicrosoft.backup/userMailboxProtectionUnits/allProperties/allTasks\nManage mailboxes added to Exchange Online protection policy in Microsoft 365 Backup\nmicrosoft.backup/userMailboxRestoreArtifacts/allProperties/allTasks\nManage mailboxes added to restore session for Exchange Online in Microsoft 365 Backup\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nExchange Recipient Administrator\nUsers with this role have read access to recipients and write access to the attributes of those recipients in Exchange Online. 
For more information, see\nRecipients in Exchange Server\n.\nActions\nDescription\nmicrosoft.office365.exchange/migration/allProperties/allTasks\nManage all tasks related to migration of recipients in Exchange Online\nmicrosoft.office365.exchange/recipients/allProperties/allTasks\nCreate and delete all recipients, and read and update all properties of recipients in Exchange Online\nExtended Directory User Administrator\nActions\nDescription\nmicrosoft.directory/externalUserProfiles/basic/update\nUpdate basic properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/externalUserProfiles/delete\nDelete external user profiles in the extended directory for Teams\nmicrosoft.directory/externalUserProfiles/standard/read\nRead standard properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/pendingExternalUserProfiles/basic/update\nUpdate basic properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/pendingExternalUserProfiles/create\nCreate external user profiles in the extended directory for Teams\nmicrosoft.directory/pendingExternalUserProfiles/delete\nDelete external user profiles in the extended directory for Teams\nmicrosoft.directory/pendingExternalUserProfiles/standard/read\nRead standard properties of external user profiles in the extended directory for Teams\nExternal ID User Flow Administrator\nUsers with this role can create and manage user flows (also called \"built-in\" policies) in the Azure portal. These users can customize HTML/CSS/JavaScript content, change MFA requirements, select claims in the token, manage API connectors and their credentials, and configure session settings for all user flows in the Microsoft Entra organization. On the other hand, this role does not include the ability to review user data or make changes to the attributes that are included in the organization schema. 
Changes to Identity Experience Framework policies (also known as custom policies) are also outside the scope of this role.\nActions\nDescription\nmicrosoft.directory/b2cUserFlow/allProperties/allTasks\nRead and configure user flow in Azure Active Directory B2C\nExternal ID User Flow Attribute Administrator\nUsers with this role add or delete custom attributes available to all user flows in the Microsoft Entra organization. As such, users with this role can change or add new elements to the end-user schema and impact the behavior of all user flows, and indirectly result in changes to what data may be asked of end users and ultimately sent as claims to applications. This role can't edit user flows.\nActions\nDescription\nmicrosoft.directory/b2cUserAttribute/allProperties/allTasks\nRead and configure user attribute in Azure Active Directory B2C\nExternal Identity Provider Administrator\nThis is a\nprivileged role\n. This administrator manages federation between Microsoft Entra organizations and external identity providers. With this role, users can add new identity providers and configure all available settings (e.g. authentication path, service ID, assigned key containers). This user can enable the Microsoft Entra organization to trust authentications from external identity providers. The resulting impact on end-user experiences depends on the type of organization:\nMicrosoft Entra organizations for employees and partners: The addition of a federation (e.g. with Gmail) will immediately impact all guest invitations not yet redeemed. See\nAdding Google as an identity provider for B2B guest users\n.\nAzure Active Directory B2C organizations: The addition of a federation (for example, with Facebook, or with another Microsoft Entra organization) does not immediately impact end-user flows until the identity provider is added as an option in a user flow (also called a built-in policy). See\nConfiguring a Microsoft account as an identity provider\nfor an example. 
To change user flows, the limited role of \"B2C User Flow Administrator\" is required.\nActions\nDescription\nmicrosoft.directory/domains/federation/update\nUpdate federation property of domains\nmicrosoft.directory/identityProviders/allProperties/allTasks\nRead and configure identity providers in Azure Active Directory B2C\nFabric Administrator\nUsers with this role have global permissions within Microsoft Fabric and Power BI, when the service is present, as well as the ability to manage support tickets and monitor service health. For more information, see\nUnderstanding Fabric admin roles\n.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.powerApps.powerBI/allEntities/allTasks\nManage all aspects of Fabric and Power BI\nGlobal Administrator\nThis is a\nprivileged role\n. Users with this role have access to all administrative features in Microsoft Entra ID, as well as services that use Microsoft Entra identities like the Microsoft 365 Defender portal, the Microsoft Purview portal, Exchange Online, SharePoint Online, and Skype for Business Online. Global Administrators can view Directory Activity logs. Furthermore, Global Administrators can\nelevate their access\nto manage all Azure subscriptions and management groups. This allows Global Administrators to get full access to all Azure resources using the respective Microsoft Entra tenant. The person who signs up for the Microsoft Entra organization becomes a Global Administrator. 
There can be more than one Global Administrator at your company. Global Administrators can reset the password for any user and all other administrators. A Global Administrator cannot remove their own Global Administrator assignment. This is to prevent a situation where an organization has zero Global Administrators.\nNote\nAs a best practice, Microsoft recommends that you assign the Global Administrator role to fewer than five people in your organization. For more information, see\nBest practices for Microsoft Entra roles\n.\nActions\nDescription\nmicrosoft.agentRegistry/allEntities/allProperties/allTasks\nManage all aspects of Agent Registry in Microsoft Entra ID\nmicrosoft.azure.advancedThreatProtection/allEntities/allTasks\nManage all aspects of Azure Advanced Threat Protection\nmicrosoft.azure.informationProtection/allEntities/allTasks\nManage all aspects of Azure Information Protection\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.backup/allEntities/allProperties/allTasks\nManage all aspects of Microsoft 365 Backup\nmicrosoft.cloudPC/allEntities/allProperties/allTasks\nManage all aspects of Windows 365\nmicrosoft.commerce.billing/allEntities/allProperties/allTasks\nManage all aspects of Office 365 billing\nmicrosoft.commerce.billing/purchases/standard/read\nRead purchase services in Microsoft 365 admin center.\nmicrosoft.directory/accessReviews/allProperties/allTasks\nCreate and delete access reviews, and read and update all properties of access reviews in Microsoft Entra ID\nmicrosoft.directory/accessReviews/definitions/allProperties/allTasks\nManage access reviews of all reviewable resources in Microsoft Entra ID\nmicrosoft.directory/adminConsentRequestPolicy/allProperties/allTasks\nManage admin consent request policies in Microsoft Entra 
ID\nmicrosoft.directory/administrativeUnits/allProperties/allTasks\nCreate and manage administrative units (including members)\nmicrosoft.directory/agentIdentities/appRoleAssignedTo/update\nUpdate agent identity role assignments.\nmicrosoft.directory/agentIdentities/basic/update\nUpdate basic properties of agent identities.\nmicrosoft.directory/agentIdentities/create\nCreate agent identities.\nmicrosoft.directory/agentIdentities/delete\nDelete agent identities.\nmicrosoft.directory/agentIdentities/disable\nDisable agent identities.\nmicrosoft.directory/agentIdentities/enable\nEnable agent identities.\nmicrosoft.directory/agentIdentities/owners/update\nAdd and remove owners to agent identities.\nmicrosoft.directory/agentIdentities/tag/update\nUpdate tags for agent identities.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/appRoleAssignedTo/update\nUpdate agent identity blueprint principal role assignments.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/basic/update\nUpdate basic properties of agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/create\nCreate agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/delete\nDelete agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/disable\nDisable agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/enable\nEnable agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/owners/update\nAdd and remove owners to agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprintPrincipals/tag/update\nUpdate tags for agent identity blueprint principals.\nmicrosoft.directory/agentIdentityBlueprints/allProperties/read\nRead all properties and settings for agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/allProperties/update\nUpdate all properties and settings for agent identity 
blueprints.\nmicrosoft.directory/agentIdentityBlueprints/appRoles/update\nModify app roles defined on agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/authentication/update\nUpdate authentication related settings for agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/audience/update\nUpdate the sign-in audience setting for agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/basic/update\nUpdate basic properties of agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/create\nCreate agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/credentials/update\nAdd and remove credentials to agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/delete\nDelete agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/owners/update\nAdd and remove owners to agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/permissions/update\nModify exposed permissions on agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/tag/update\nUpdate tags for agent identity blueprints.\nmicrosoft.directory/agentIdentityBlueprints/verification/update\nUpdate publisher verification setting for agent identity blueprints.\nmicrosoft.directory/agentUsers/assignLicense\nManage agent user licenses\nmicrosoft.directory/agentUsers/basic/update\nUpdate basic properties on agent users\nmicrosoft.directory/agentUsers/create\nAdd agent users\nmicrosoft.directory/agentUsers/delete\nDelete agent users\nmicrosoft.directory/agentUsers/disable\nDisable agent users\nmicrosoft.directory/agentUsers/enable\nEnable agent users\nmicrosoft.directory/agentUsers/invalidateAllRefreshTokens\nForce sign-out by invalidating agent user refresh tokens\nmicrosoft.directory/agentUsers/lifeCycleInfo/read\nRead lifecycle information of agent users, such as employeeLeaveDateTime\nmicrosoft.directory/agentUsers/lifeCycleInfo/update\nUpdate lifecycle information of 
agent users, such as employeeLeaveDateTime\nmicrosoft.directory/agentUsers/manager/update\nUpdate manager for agent users\nmicrosoft.directory/agentUsers/photo/update\nUpdate photo of agent users\nmicrosoft.directory/agentUsers/reprocessLicenseAssignment\nReprocess license assignments for agent users\nmicrosoft.directory/agentUsers/restore\nRestore deleted agent users\nmicrosoft.directory/agentUsers/revokeSignInSessions\nRevoke sign-in sessions for an agent user\nmicrosoft.directory/agentUsers/sponsors/update\nUpdate sponsors of agent users\nmicrosoft.directory/agentUsers/usageLocation/update\nUpdate usage location of agent users\nmicrosoft.directory/agentUsers/userPrincipalName/update\nUpdate User Principal Name of agent users\nmicrosoft.directory/appConsent/appConsentRequests/allProperties/read\nRead all properties of consent requests for applications registered with Microsoft Entra ID\nmicrosoft.directory/applications/allProperties/allTasks\nCreate and delete applications, and read and update all properties\nmicrosoft.directory/applications/synchronization/standard/read\nRead provisioning settings associated with the application object\nmicrosoft.directory/applicationTemplates/instantiate\nInstantiate gallery applications from application templates\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/authorizationPolicy/allProperties/allTasks\nManage all aspects of authorization policy\nmicrosoft.directory/bitlockerKeys/key/read\nRead bitlocker metadata and key on devices\nmicrosoft.directory/bulkJobs/basic/update\nUpdate all the bulk jobs in a directory\nmicrosoft.directory/bulkJobs/create\nCreate all bulk jobs in a directory\nmicrosoft.directory/cloudAppSecurity/allProperties/allTasks\nCreate and delete all resources, and read and update standard properties in Microsoft Defender for Cloud 
Apps\nmicrosoft.directory/conditionalAccessPolicies/allProperties/allTasks\nManage all properties of Conditional Access policies\nmicrosoft.directory/connectorGroups/allProperties/read\nRead all properties of application proxy connector groups\nmicrosoft.directory/connectorGroups/allProperties/update\nUpdate all properties of application proxy connector groups\nmicrosoft.directory/connectorGroups/create\nCreate application proxy connector groups\nmicrosoft.directory/connectorGroups/delete\nDelete application proxy connector groups\nmicrosoft.directory/connectors/allProperties/read\nRead all properties of application proxy connectors\nmicrosoft.directory/connectors/create\nCreate application proxy connectors\nmicrosoft.directory/contacts/allProperties/allTasks\nCreate and delete contacts, and read and update all properties\nmicrosoft.directory/contracts/allProperties/allTasks\nCreate and delete partner contracts, and read and update all properties\nmicrosoft.directory/crossTenantAccessPolicy/allowedCloudEndpoints/update\nUpdate allowed cloud endpoints of cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/basic/update\nUpdate basic settings of cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/b2bCollaboration/update\nUpdate Microsoft Entra B2B collaboration settings of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/b2bDirectConnect/update\nUpdate Microsoft Entra B2B direct connect settings of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/crossCloudMeetings/update\nUpdate cross-cloud Teams meeting settings of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/standard/read\nRead basic properties of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/tenantRestrictions/update\nUpdate tenant restrictions of the default cross-tenant access 
policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/b2bCollaboration/update\nUpdate Microsoft Entra B2B collaboration settings of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/b2bDirectConnect/update\nUpdate Microsoft Entra B2B direct connect settings of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/create\nCreate cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/crossCloudMeetings/update\nUpdate cross-cloud Teams meeting settings of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/delete\nDelete cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/identitySynchronization/basic/update\nUpdate basic settings of cross-tenant sync policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/identitySynchronization/create\nCreate cross-tenant sync policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/identitySynchronization/standard/read\nRead basic properties of cross-tenant sync policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/standard/read\nRead basic properties of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationIdentitySynchronization/basic/update\nUpdate cross tenant sync policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationIdentitySynchronization/resetToDefaultSettings\nReset cross tenant sync policy template for multi-tenant organization to default settings\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationIdentitySynchronization/standard/read\nRead basic properties of cross tenant sync policy templates for multi-tenant 
organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationPartnerConfiguration/basic/update\nUpdate cross tenant access policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationPartnerConfiguration/resetToDefaultSettings\nReset cross tenant access policy template for multi-tenant organization to default settings\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationPartnerConfiguration/standard/read\nRead basic properties of cross tenant access policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/tenantRestrictions/update\nUpdate tenant restrictions of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/standard/read\nRead basic properties of cross-tenant access policy\nmicrosoft.directory/customAuthenticationExtensions/allProperties/allTasks\nCreate and manage custom authentication extensions\nmicrosoft.directory/deletedItems/delete\nPermanently delete objects, which can no longer be restored\nmicrosoft.directory/deletedItems/restore\nRestore soft deleted objects to original state\nmicrosoft.directory/deviceLocalCredentials/password/read\nRead all properties of the backed up local administrator account credentials for Microsoft Entra joined devices, including the password\nmicrosoft.directory/deviceManagementPolicies/basic/update\nUpdate basic properties on mobile device management and mobile app management policies\nmicrosoft.directory/deviceManagementPolicies/standard/read\nRead standard properties on mobile device management and mobile app management policies\nmicrosoft.directory/deviceRegistrationPolicy/basic/update\nUpdate basic properties on device registration policies\nmicrosoft.directory/deviceRegistrationPolicy/standard/read\nRead standard properties on device registration 
policies\nmicrosoft.directory/devices/allProperties/allTasks\nCreate and delete devices, and read and update all properties\nmicrosoft.directory/devices/permissions/update\nUpdate the alternative name property on an IoT device\nmicrosoft.directory/deviceTemplates/owners/read\nRead owners on Internet of Things (IoT) device templates\nmicrosoft.directory/deviceTemplates/owners/update\nUpdate owners on Internet of Things (IoT) device templates\nmicrosoft.directory/directoryRoles/allProperties/allTasks\nCreate and delete directory roles, and read and update all properties\nmicrosoft.directory/directoryRoleTemplates/allProperties/allTasks\nCreate and delete Microsoft Entra role templates, and read and update all properties\nmicrosoft.directory/domains/allProperties/allTasks\nCreate and delete domains, and read and update all properties\nmicrosoft.directory/domains/federationConfiguration/basic/update\nUpdate basic federation configuration for domains\nmicrosoft.directory/domains/federationConfiguration/create\nCreate federation configuration for domains\nmicrosoft.directory/domains/federationConfiguration/delete\nDelete federation configuration for domains\nmicrosoft.directory/domains/federationConfiguration/standard/read\nRead standard properties of federation configuration for domains\nmicrosoft.directory/entitlementManagement/allProperties/allTasks\nCreate and delete resources, and read and update all properties in Microsoft Entra entitlement management\nmicrosoft.directory/externalUserProfiles/basic/update\nUpdate basic properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/externalUserProfiles/delete\nDelete external user profiles in the extended directory for Teams\nmicrosoft.directory/externalUserProfiles/standard/read\nRead standard properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/groups/allProperties/allTasks\nCreate and delete groups, and read and update all 
properties\nmicrosoft.directory/groupsAssignableToRoles/allProperties/update\nUpdate role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/assignLicense\nAssign a license to role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/create\nCreate role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/delete\nDelete role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/reprocessLicenseAssignment\nReprocess license assignments to role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/restore\nRestore role-assignable groups\nmicrosoft.directory/groupSettings/allProperties/allTasks\nCreate and delete group settings, and read and update all properties\nmicrosoft.directory/groupSettingTemplates/allProperties/allTasks\nCreate and delete group setting templates, and read and update all properties\nmicrosoft.directory/hybridAuthenticationPolicy/allProperties/allTasks\nManage hybrid authentication policy in Microsoft Entra ID\nmicrosoft.directory/identityProtection/allProperties/allTasks\nCreate and delete all resources, and read and update standard properties in Microsoft Entra ID Protection\nmicrosoft.directory/lifecycleWorkflows/workflows/allProperties/allTasks\nManage all aspects of lifecycle workflows and tasks in Microsoft Entra ID\nmicrosoft.directory/loginOrganizationBranding/allProperties/allTasks\nCreate and delete loginTenantBranding, and read and update all properties\nmicrosoft.directory/multiTenantOrganization/basic/update\nUpdate basic properties of a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/create\nCreate a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/joinRequest/organizationDetails/update\nJoin a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/joinRequest/standard/read\nRead properties of a multi-tenant organization join request\nmicrosoft.directory/multiTenantOrganization/standard/read\nRead basic properties of a 
multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/create\nCreate a tenant in a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/delete\nDelete a tenant participating in a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/organizationDetails/read\nRead organization details of a tenant participating in a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/organizationDetails/update\nUpdate basic properties of a tenant participating in a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/standard/read\nRead basic properties of a tenant participating in a multi-tenant organization\nmicrosoft.directory/namedLocations/basic/update\nUpdate basic properties of custom rules that define network locations\nmicrosoft.directory/namedLocations/create\nCreate custom rules that define network locations\nmicrosoft.directory/namedLocations/delete\nDelete custom rules that define network locations\nmicrosoft.directory/namedLocations/standard/read\nRead basic properties of custom rules that define network locations\nmicrosoft.directory/oAuth2PermissionGrants/allProperties/allTasks\nCreate and delete OAuth 2.0 permission grants, and read and update all properties\nmicrosoft.directory/onPremisesSynchronization/basic/update\nUpdate basic on-premises directory synchronization information\nmicrosoft.directory/onPremisesSynchronization/standard/read\nRead standard on-premises directory synchronization information\nmicrosoft.directory/organization/allProperties/allTasks\nRead and update all properties for an organization\nmicrosoft.directory/passwordHashSync/allProperties/allTasks\nManage all aspects of Password Hash Synchronization (PHS) in Microsoft Entra ID\nmicrosoft.directory/pendingExternalUserProfiles/basic/update\nUpdate basic properties of external user profiles in the extended directory for 
Teams\nmicrosoft.directory/pendingExternalUserProfiles/create\nCreate external user profiles in the extended directory for Teams\nmicrosoft.directory/pendingExternalUserProfiles/delete\nDelete external user profiles in the extended directory for Teams\nmicrosoft.directory/pendingExternalUserProfiles/standard/read\nRead standard properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/permissionGrantPolicies/basic/update\nUpdate basic properties of permission grant policies\nmicrosoft.directory/permissionGrantPolicies/create\nCreate permission grant policies\nmicrosoft.directory/permissionGrantPolicies/delete\nDelete permission grant policies\nmicrosoft.directory/permissionGrantPolicies/standard/read\nRead standard properties of permission grant policies\nmicrosoft.directory/policies/allProperties/allTasks\nCreate and delete policies, and read and update all properties\nmicrosoft.directory/privilegedIdentityManagement/allProperties/read\nRead all resources in Privileged Identity Management\nmicrosoft.directory/provisioningLogs/allProperties/read\nRead all properties of provisioning logs\nmicrosoft.directory/resourceNamespaces/resourceActions/authenticationContext/update\nUpdate Conditional Access authentication context of Microsoft 365 role-based access control (RBAC) resource actions\nmicrosoft.directory/roleAssignments/allProperties/allTasks\nCreate and delete role assignments, and read and update all role assignment properties\nmicrosoft.directory/roleDefinitions/allProperties/allTasks\nCreate and delete role definitions, and read and update all properties\nmicrosoft.directory/scopedRoleMemberships/allProperties/allTasks\nCreate and delete scopedRoleMemberships, and read and update all properties\nmicrosoft.directory/serviceAction/activateService\nCan perform the \"activate service\" action for a service\nmicrosoft.directory/serviceAction/disableDirectoryFeature\nCan perform the \"disable directory feature\" service 
action\nmicrosoft.directory/serviceAction/enableDirectoryFeature\nCan perform the \"enable directory feature\" service action\nmicrosoft.directory/serviceAction/getAvailableExtentionProperties\nCan perform the getAvailableExtentionProperties service action\nmicrosoft.directory/servicePrincipalCreationPolicies/basic/update\nUpdate basic properties of service principal creation policies\nmicrosoft.directory/servicePrincipalCreationPolicies/create\nCreate service principal creation policies\nmicrosoft.directory/servicePrincipalCreationPolicies/delete\nDelete service principal creation policies\nmicrosoft.directory/servicePrincipalCreationPolicies/standard/read\nRead standard properties of service principal creation policies\nmicrosoft.directory/servicePrincipals/allProperties/allTasks\nCreate and delete service principals, and read and update all properties\nmicrosoft.directory/servicePrincipals/managePermissionGrantsForAll.microsoft-company-admin\nGrant consent for any permission to any application\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToCloudTenant/credentials/manage\nManage cloud tenant to cloud tenant application provisioning secrets and credentials.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToCloudTenant/jobs/manage\nStart, restart, and pause cloud tenant to cloud tenant application provisioning synchronization jobs.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToCloudTenant/schema/manage\nCreate and manage cloud tenant to cloud tenant application provisioning synchronization jobs and schema.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/credentials/manage\nManage application provisioning secrets and credentials.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/jobs/manage\nStart, restart, and pause application provisioning synchronization 
jobs.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/schema/manage\nCreate and manage application provisioning synchronization jobs and schema.\nmicrosoft.directory/servicePrincipals/synchronization/standard/read\nRead provisioning settings associated with your service principal\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.directory/subscribedSkus/allProperties/allTasks\nBuy and manage subscriptions and delete subscriptions\nmicrosoft.directory/tenantManagement/tenants/create\nCreate new tenants in Microsoft Entra ID\nmicrosoft.directory/users/allProperties/allTasks\nCreate and delete users, and read and update all properties\nmicrosoft.directory/users/authenticationMethods/basic/update\nUpdate basic properties of authentication methods for users\nmicrosoft.directory/users/authenticationMethods/create\nUpdate authentication methods for users\nmicrosoft.directory/users/authenticationMethods/delete\nDelete authentication methods for users\nmicrosoft.directory/users/authenticationMethods/standard/read\nRead standard properties of authentication methods for users\nmicrosoft.directory/users/convertExternalToInternalMemberUser\nConvert external user to internal user\nmicrosoft.directory/verifiableCredentials/configuration/allProperties/read\nRead configuration required to create and manage verifiable credentials\nmicrosoft.directory/verifiableCredentials/configuration/allProperties/update\nUpdate configuration required to create and manage verifiable credentials\nmicrosoft.directory/verifiableCredentials/configuration/contracts/allProperties/read\nRead a verifiable credential contract\nmicrosoft.directory/verifiableCredentials/configuration/contracts/allProperties/update\nUpdate a verifiable credential contract\nmicrosoft.directory/verifiableCredentials/configuration/contracts/cards/allProperties/read\nRead a verifiable credential 
card\nmicrosoft.directory/verifiableCredentials/configuration/contracts/cards/revoke\nRevoke a verifiable credential card\nmicrosoft.directory/verifiableCredentials/configuration/contracts/create\nCreate a verifiable credential contract\nmicrosoft.directory/verifiableCredentials/configuration/create\nCreate configuration required to create and manage verifiable credentials\nmicrosoft.directory/verifiableCredentials/configuration/delete\nDelete configuration required to create and manage verifiable credentials and delete all of its verifiable credentials\nmicrosoft.dynamics365/allEntities/allTasks\nManage all aspects of Dynamics 365\nmicrosoft.edge/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Edge\nmicrosoft.flow/allEntities/allTasks\nManage all aspects of Microsoft Power Automate\nmicrosoft.graph.dataConnect/allEntities/allProperties/allTasks\nManage aspects of Microsoft Graph Data Connect\nmicrosoft.hardware.support/shippingAddress/allProperties/allTasks\nCreate, read, update, and delete shipping addresses for Microsoft hardware warranty claims, including shipping addresses created by others\nmicrosoft.hardware.support/shippingStatus/allProperties/read\nRead shipping status for open Microsoft hardware warranty claims\nmicrosoft.hardware.support/warrantyClaims/allProperties/allTasks\nCreate and manage all aspects of Microsoft hardware warranty claims\nmicrosoft.healthPlatform/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Dragon admin center\nmicrosoft.insights/allEntities/allProperties/allTasks\nManage all aspects of Insights app\nmicrosoft.intune/allEntities/allTasks\nManage all aspects of Microsoft Intune\nmicrosoft.microsoft365.organizationalData/allEntities/allProperties/allTasks\nManage all aspects of organizational data in Microsoft 365\nmicrosoft.networkAccess/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Entra Network Access\nmicrosoft.networkAccess/trafficLogs/standard/read\nRead standard 
properties of traffic logs such as DeviceId, DestinationIp and PolicyRuleId\nmicrosoft.office365.complianceManager/allEntities/allTasks\nManage all aspects of Office 365 Compliance Manager\nmicrosoft.office365.copilot/allEntities/allProperties/allTasks\nCreate and manage all settings for Microsoft 365 Copilot\nmicrosoft.office365.desktopAnalytics/allEntities/allTasks\nManage all aspects of Desktop Analytics\nmicrosoft.office365.exchange/allEntities/basic/allTasks\nManage all aspects of Exchange Online\nmicrosoft.office365.fileStorageContainers/allEntities/allProperties/allTasks\nManage all aspects of SharePoint Embedded containers\nmicrosoft.office365.knowledge/contentUnderstanding/allProperties/allTasks\nRead and update all properties of content understanding in Microsoft 365 admin center\nmicrosoft.office365.knowledge/contentUnderstanding/analytics/allProperties/read\nRead analytics reports of content understanding in Microsoft 365 admin center\nmicrosoft.office365.knowledge/knowledgeNetwork/allProperties/allTasks\nRead and update all properties of knowledge network in Microsoft 365 admin center\nmicrosoft.office365.knowledge/knowledgeNetwork/topicVisibility/allProperties/allTasks\nManage topic visibility of knowledge network in Microsoft 365 admin center\nmicrosoft.office365.knowledge/learningSources/allProperties/allTasks\nManage learning sources and all their properties in Learning App.\nmicrosoft.office365.lockbox/allEntities/allTasks\nManage all aspects of Customer Lockbox\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.messageCenter/securityMessages/read\nRead security messages in Message Center in the Microsoft 365 admin center\nmicrosoft.office365.migrations/allEntities/allProperties/allTasks\nManage all aspects of Microsoft 365 migrations\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties 
in the Microsoft 365 admin center\nmicrosoft.office365.organizationalMessages/allEntities/allProperties/allTasks\nManage all authoring aspects of Microsoft 365 Organizational Messages\nmicrosoft.office365.protectionCenter/allEntities/allProperties/allTasks\nManage all aspects of the Security and Compliance centers\nmicrosoft.office365.search/content/manage\nCreate and delete content, and read and update all properties in Microsoft Search\nmicrosoft.office365.securityComplianceCenter/allEntities/allTasks\nCreate and delete all resources, and read and update standard properties in the Microsoft 365 Security and Compliance Center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.sharePoint/allEntities/allTasks\nCreate and delete all resources, and read and update standard properties in SharePoint\nmicrosoft.office365.sharePointAdvancedManagement/allEntities/allProperties/allTasks\nManage all aspects of SharePoint Advanced Management\nmicrosoft.office365.skypeForBusiness/allEntities/allTasks\nManage all aspects of Skype for Business Online\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.userCommunication/allEntities/allTasks\nRead and update what's new messages visibility\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.office365.yammer/allEntities/allProperties/allTasks\nManage all aspects of Yammer\nmicrosoft.people/users/photo/read\nRead profile photo of user\nmicrosoft.people/users/photo/update\nUpdate profile photo of user\nmicrosoft.peopleAdmin/organization/allProperties/read\nRead people settings for users, such as pronouns, name pronunciation, and profile card 
settings\nmicrosoft.peopleAdmin/organization/allProperties/update\nUpdate people settings for users, such as pronouns, name pronunciation, and profile card settings\nmicrosoft.permissionsManagement/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Entra Permissions Management\nmicrosoft.powerApps.powerBI/allEntities/allTasks\nManage all aspects of Fabric and Power BI\nmicrosoft.powerApps/allEntities/allTasks\nManage all aspects of Power Apps\nmicrosoft.teams/allEntities/allProperties/allTasks\nManage all resources in Teams\nmicrosoft.virtualVisits/allEntities/allProperties/allTasks\nManage and share Virtual Visits information and metrics from admin centers or the Virtual Visits app\nmicrosoft.viva.glint/allEntities/allProperties/allTasks\nManage and configure all Microsoft Viva Glint settings in the Microsoft 365 admin center\nmicrosoft.viva.goals/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Viva Goals\nmicrosoft.viva.pulse/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Viva Pulse\nmicrosoft.windows.defenderAdvancedThreatProtection/allEntities/allTasks\nManage all aspects of Microsoft Defender for Endpoint\nmicrosoft.windows.updatesDeployments/allEntities/allProperties/allTasks\nRead and configure all aspects of Windows Update Service\nGlobal Reader\nThis is a\nprivileged role\n. Users in this role can read settings and administrative information across Microsoft 365 services but can't take management actions. Global Reader is the read-only counterpart to Global Administrator. Assign Global Reader instead of Global Administrator for planning, audits, or investigations. Use Global Reader in combination with other limited admin roles like Exchange Administrator to make it easier to get work done without assigning the Global Administrator role. 
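The read-only distinction between Global Reader and Global Administrator is visible directly in the action strings listed above: both roles cover the same resource paths, but Global Administrator's entries end in privileges like allTasks or update while Global Reader's end in read. As a minimal sketch (not an official Microsoft API; the helper names and the treatment of the trailing segment as the privilege verb are assumptions for illustration), the structure of an action string can be pulled apart like this:

```python
# Illustrative sketch: an Entra role action string takes the form
# <namespace>/<resource path>/<privilege>, e.g.
# microsoft.directory/users/allProperties/allTasks.
# parse_action and is_read_only are hypothetical helpers, not Microsoft APIs.

def parse_action(action: str) -> dict:
    """Split a role action string into namespace, resource path, and privilege."""
    parts = action.split("/")
    return {
        "namespace": parts[0],              # e.g. "microsoft.directory"
        "resource": "/".join(parts[1:-1]),  # e.g. "users/allProperties"
        "privilege": parts[-1],             # e.g. "allTasks", "update", "read"
    }

def is_read_only(action: str) -> bool:
    # Assumption for this sketch: only a trailing "read" verb counts as read-only.
    return parse_action(action)["privilege"] == "read"

# Two actions taken from the tables above: Global Administrator holds the
# write-capable form, Global Reader holds the read-only counterpart.
admin_action = "microsoft.directory/users/allProperties/allTasks"
reader_action = "microsoft.directory/devices/allProperties/read"
```

Under this reading, auditing a role assignment for least privilege reduces to checking whether every granted action parses to a read privilege.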
Global Reader works with Microsoft 365 admin center, Exchange admin center, SharePoint admin center, Teams admin center, Microsoft 365 Defender portal, Microsoft Purview portal, Azure portal, and Device Management admin center.\nUsers with this role\ncannot\ndo the following:\nCannot access the Purchase Services area in the Microsoft 365 admin center.\nNote\nGlobal Reader role has the following limitations:\nOneDrive admin center - OneDrive admin center does not support the Global Reader role\nMicrosoft 365 Defender portal\n- Global Reader can't do content search or see Secure Score.\nTeams admin center\n- Global Reader cannot read\nTeams lifecycle\n,\nAnalytics & reports\n,\nIP phone device management\n, and\nApp catalog\n. For more information, see\nUse Microsoft Teams administrator roles to manage Teams\n.\nPrivileged Access Management\ndoesn't support the Global Reader role.\nAzure Information Protection\n- Global Reader is supported\nfor central reporting\nonly, and when your Microsoft Entra organization isn't on the\nunified labeling platform\n.\nSharePoint\n- Global Reader has read access to SharePoint Online PowerShell cmdlets and Read APIs.\nPower Platform admin center\n- Global Reader is not yet supported in the Power Platform admin center.\nMicrosoft Purview doesn't support the Global Reader role.\nActions\nDescription\nmicrosoft.agentRegistry/allEntities/allProperties/read\nRead all properties of Agent Registry in Microsoft Entra ID\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.backup/allEntities/allProperties/read\nRead all aspects of Microsoft 365 Backup\nmicrosoft.cloudPC/allEntities/allProperties/read\nRead all aspects of Windows 365\nmicrosoft.commerce.billing/allEntities/allProperties/read\nRead all resources of Office 365 billing\nmicrosoft.commerce.billing/purchases/standard/read\nRead purchase services in Microsoft 365 admin 
center.\nmicrosoft.directory/accessReviews/allProperties/read\nRead all properties of access reviews\nmicrosoft.directory/accessReviews/definitions/allProperties/read\nRead all properties of access reviews of all reviewable resources in Microsoft Entra ID\nmicrosoft.directory/adminConsentRequestPolicy/allProperties/read\nRead all properties of admin consent request policies in Microsoft Entra ID\nmicrosoft.directory/administrativeUnits/allProperties/read\nRead all properties of administrative units, including members\nmicrosoft.directory/appConsent/appConsentRequests/allProperties/read\nRead all properties of consent requests for applications registered with Microsoft Entra ID\nmicrosoft.directory/applications/allProperties/read\nRead all properties (including privileged properties) on all types of applications\nmicrosoft.directory/applications/synchronization/standard/read\nRead provisioning settings associated with the application object\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.directory/bitlockerKeys/key/read\nRead bitlocker metadata and key on devices\nmicrosoft.directory/cloudAppSecurity/allProperties/read\nRead all properties for Cloud app security\nmicrosoft.directory/conditionalAccessPolicies/allProperties/read\nRead all properties of Conditional Access policies\nmicrosoft.directory/connectorGroups/allProperties/read\nRead all properties of application proxy connector groups\nmicrosoft.directory/connectors/allProperties/read\nRead all properties of application proxy connectors\nmicrosoft.directory/contacts/allProperties/read\nRead all properties for contacts\nmicrosoft.directory/crossTenantAccessPolicy/default/standard/read\nRead basic properties of the default cross-tenant access 
policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/identitySynchronization/standard/read\nRead basic properties of cross-tenant sync policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/standard/read\nRead basic properties of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationIdentitySynchronization/standard/read\nRead basic properties of cross tenant sync policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationPartnerConfiguration/standard/read\nRead basic properties of cross tenant access policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/standard/read\nRead basic properties of cross-tenant access policy\nmicrosoft.directory/customAuthenticationExtensions/allProperties/read\nRead custom authentication extensions\nmicrosoft.directory/deviceLocalCredentials/standard/read\nRead all properties of the backed up local administrator account credentials for Microsoft Entra joined devices, except the password\nmicrosoft.directory/deviceManagementPolicies/standard/read\nRead standard properties on mobile device management and mobile app management policies\nmicrosoft.directory/deviceRegistrationPolicy/standard/read\nRead standard properties on device registration policies\nmicrosoft.directory/devices/allProperties/read\nRead all properties of devices\nmicrosoft.directory/directoryRoles/allProperties/read\nRead all properties of directory roles\nmicrosoft.directory/directoryRoleTemplates/allProperties/read\nRead all properties of directory role templates\nmicrosoft.directory/domains/allProperties/read\nRead all properties of domains\nmicrosoft.directory/domains/federationConfiguration/standard/read\nRead standard properties of federation configuration for domains\nmicrosoft.directory/entitlementManagement/allProperties/read\nRead all properties in Microsoft Entra 
entitlement management\nmicrosoft.directory/externalUserProfiles/standard/read\nRead standard properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/groups/allProperties/read\nRead all properties (including privileged properties) on Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groupSettings/allProperties/read\nRead all properties of group settings\nmicrosoft.directory/groupSettingTemplates/allProperties/read\nRead all properties of group setting templates\nmicrosoft.directory/identityProtection/allProperties/read\nRead all resources in Microsoft Entra ID Protection\nmicrosoft.directory/lifecycleWorkflows/workflows/allProperties/read\nRead all properties of lifecycle workflows and tasks in Microsoft Entra ID\nmicrosoft.directory/loginOrganizationBranding/allProperties/read\nRead all properties for your organization's branded sign-in page\nmicrosoft.directory/multiTenantOrganization/joinRequest/standard/read\nRead properties of a multi-tenant organization join request\nmicrosoft.directory/multiTenantOrganization/standard/read\nRead basic properties of a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/organizationDetails/read\nRead organization details of a tenant participating in a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/standard/read\nRead basic properties of a tenant participating in a multi-tenant organization\nmicrosoft.directory/namedLocations/standard/read\nRead basic properties of custom rules that define network locations\nmicrosoft.directory/oAuth2PermissionGrants/allProperties/read\nRead all properties of OAuth 2.0 permission grants\nmicrosoft.directory/onPremisesSynchronization/standard/read\nRead standard on-premises directory synchronization information\nmicrosoft.directory/organization/allProperties/read\nRead all properties for an 
organization\nmicrosoft.directory/pendingExternalUserProfiles/standard/read\nRead standard properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/permissionGrantPolicies/standard/read\nRead standard properties of permission grant policies\nmicrosoft.directory/policies/allProperties/read\nRead all properties of policies\nmicrosoft.directory/privilegedIdentityManagement/allProperties/read\nRead all resources in Privileged Identity Management\nmicrosoft.directory/provisioningLogs/allProperties/read\nRead all properties of provisioning logs\nmicrosoft.directory/roleAssignments/allProperties/read\nRead all properties of role assignments\nmicrosoft.directory/roleDefinitions/allProperties/read\nRead all properties of role definitions\nmicrosoft.directory/scopedRoleMemberships/allProperties/read\nView members in administrative units\nmicrosoft.directory/serviceAction/getAvailableExtentionProperties\nCan perform the getAvailableExtentionProperties service action\nmicrosoft.directory/servicePrincipalCreationPolicies/standard/read\nRead standard properties of service principal creation policies\nmicrosoft.directory/servicePrincipals/allProperties/read\nRead all properties (including privileged properties) on servicePrincipals\nmicrosoft.directory/servicePrincipals/synchronization/standard/read\nRead provisioning settings associated with your service principal\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.directory/subscribedSkus/allProperties/read\nRead all properties of product subscriptions\nmicrosoft.directory/users/allProperties/read\nRead all properties of users\nmicrosoft.directory/users/authenticationMethods/standard/restrictedRead\nRead standard properties of authentication methods that do not include personally identifiable information for users\nmicrosoft.directory/verifiableCredentials/configuration/allProperties/read\nRead 
configuration required to create and manage verifiable credentials\nmicrosoft.directory/verifiableCredentials/configuration/contracts/allProperties/read\nRead a verifiable credential contract\nmicrosoft.directory/verifiableCredentials/configuration/contracts/cards/allProperties/read\nRead a verifiable credential card\nmicrosoft.edge/allEntities/allProperties/read\nRead all aspects of Microsoft Edge\nmicrosoft.graph.dataConnect/allEntities/allProperties/read\nRead aspects of Microsoft Graph Data Connect\nmicrosoft.hardware.support/shippingAddress/allProperties/read\nRead shipping addresses for Microsoft hardware warranty claims, including existing shipping addresses created by others\nmicrosoft.hardware.support/shippingStatus/allProperties/read\nRead shipping status for open Microsoft hardware warranty claims\nmicrosoft.hardware.support/warrantyClaims/allProperties/read\nRead Microsoft hardware warranty claims\nmicrosoft.healthPlatform/allEntities/allProperties/read\nRead all aspects of Microsoft Dragon admin center\nmicrosoft.insights/allEntities/allProperties/read\nRead all aspects of Viva Insights\nmicrosoft.microsoft365.organizationalData/allEntities/allProperties/read\nRead all aspects of organizational data in Microsoft 365\nmicrosoft.networkAccess/allEntities/allProperties/read\nRead all aspects of Microsoft Entra Network Access\nmicrosoft.office365.copilot/allEntities/allProperties/read\nRead all settings for Microsoft 365 Copilot\nmicrosoft.office365.fileStorageContainers/allEntities/allProperties/read\nRead entities and permissions of SharePoint Embedded containers\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.messageCenter/securityMessages/read\nRead security messages in Message Center in the Microsoft 365 admin center\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 
365 admin center\nmicrosoft.office365.organizationalMessages/allEntities/allProperties/read\nRead all aspects of Microsoft 365 Organizational Messages\nmicrosoft.office365.protectionCenter/allEntities/allProperties/read\nRead all properties in the Security and Compliance centers\nmicrosoft.office365.securityComplianceCenter/allEntities/read\nRead standard properties in Microsoft 365 Security and Compliance Center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.office365.yammer/allEntities/allProperties/read\nRead all aspects of Yammer\nmicrosoft.permissionsManagement/allEntities/allProperties/read\nRead all aspects of Microsoft Entra Permissions Management\nmicrosoft.teams/allEntities/allProperties/read\nRead all properties of Microsoft Teams\nmicrosoft.virtualVisits/allEntities/allProperties/read\nRead all aspects of Virtual Visits\nmicrosoft.viva.glint/allEntities/allProperties/read\nRead all Microsoft Viva Glint settings in the Microsoft 365 admin center\nmicrosoft.viva.goals/allEntities/allProperties/read\nRead all aspects of Microsoft Viva Goals\nmicrosoft.viva.pulse/allEntities/allProperties/read\nRead all aspects of Microsoft Viva Pulse\nmicrosoft.windows.updatesDeployments/allEntities/allProperties/read\nRead all aspects of Windows Update Service\nGlobal Secure Access Administrator\nAssign the Global Secure Access Administrator role to users who need to do the following:\nCreate and manage all aspects of Microsoft Entra Internet Access and Microsoft Entra Private Access\nManage access to public and private endpoints\nUsers with this role\ncannot\ndo the following:\nCannot manage enterprise applications, application registrations, Conditional 
Access, or application proxy settings\nLearn more\nActions\nDescription\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/applicationPolicies/standard/read\nRead standard properties of application policies\nmicrosoft.directory/applications/applicationProxy/read\nRead all application proxy properties\nmicrosoft.directory/applications/owners/read\nRead owners of applications\nmicrosoft.directory/applications/policies/read\nRead policies of applications\nmicrosoft.directory/applications/standard/read\nRead standard properties of applications\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/conditionalAccessPolicies/standard/read\nRead Conditional Access for policies\nmicrosoft.directory/connectorGroups/allProperties/read\nRead all properties of application proxy connector groups\nmicrosoft.directory/connectors/allProperties/read\nRead all properties of application proxy connectors\nmicrosoft.directory/crossTenantAccessPolicy/default/standard/read\nRead basic properties of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/standard/read\nRead basic properties of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/standard/read\nRead basic properties of cross-tenant access policy\nmicrosoft.directory/namedLocations/standard/read\nRead basic properties of custom rules that define network locations\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.networkAccess/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Entra Network Access\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security 
messages\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nGlobal Secure Access Log Reader\nAssign the Global Secure Access Log Reader role to users who need to do the following:\nRead network traffic logs in Microsoft Entra Internet Access and Microsoft Entra Private Access for analysis by designated security personnel\nView log details such as session, connection, and transaction\nFilter logs based on criteria such as IP address and domain\nLearn more\nActions\nDescription\nmicrosoft.networkAccess/trafficLogs/standard/read\nRead standard properties of traffic logs such as DeviceId, DestinationIp and PolicyRuleId\nGroups Administrator\nUsers in this role can create and manage groups and their settings, such as naming and expiration policies. It is important to understand that assigning a user to this role gives them the ability to manage all groups in the organization across various workloads like Teams, SharePoint, and Yammer, in addition to Outlook. 
The user can also manage group settings across admin portals such as the Microsoft 365 admin center and the Azure portal, as well as workload-specific ones such as the Teams and SharePoint admin centers.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/bulkJobs.groups/basic/update\nUpdate bulk jobs related to groups\nmicrosoft.directory/bulkJobs.groups/create\nCreate bulk jobs related to groups\nmicrosoft.directory/bulkJobs.groups/standard/read\nRead bulk jobs related to groups\nmicrosoft.directory/deletedItems.groups/delete\nPermanently delete groups, which can no longer be restored\nmicrosoft.directory/deletedItems.groups/restore\nRestore soft deleted groups to original state\nmicrosoft.directory/groups/assignedLabels/update\nUpdate the assigned labels property on groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups/assignLicense\nAssign product licenses to groups for group-based licensing\nmicrosoft.directory/groups/basic/update\nUpdate basic properties on Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/classification/update\nUpdate the classification property on Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/create\nCreate Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/delete\nDelete Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/dynamicMembershipRule/update\nUpdate the dynamic membership rule on Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/groupType/update\nUpdate properties that would affect the group type of Security groups and Microsoft 365 groups, 
excluding role-assignable groups\nmicrosoft.directory/groups/hiddenMembers/read\nRead hidden members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groups/members/update\nUpdate members of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/onPremWriteBack/update\nUpdate Microsoft Entra groups to be written back to on-premises with Microsoft Entra Connect\nmicrosoft.directory/groups/owners/update\nUpdate owners of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/reprocessLicenseAssignment\nReprocess license assignments for group-based licensing\nmicrosoft.directory/groups/restore\nRestore groups from soft-deleted container\nmicrosoft.directory/groups/settings/update\nUpdate settings of groups\nmicrosoft.directory/groups/visibility/update\nUpdate the visibility property of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nGuest Inviter\nUsers in this role can manage Microsoft Entra B2B guest user invitations when the\nMembers can invite\nuser setting is set to No. For more information about B2B collaboration, see\nAbout Microsoft Entra B2B collaboration\n. 
It does not include any other permissions.\nActions\nDescription\nmicrosoft.directory/users/appRoleAssignments/read\nRead application role assignments for users\nmicrosoft.directory/users/deviceForResourceAccount/read\nRead deviceForResourceAccount of users\nmicrosoft.directory/users/directReports/read\nRead the direct reports for users\nmicrosoft.directory/users/invitedBy/read\nRead the user that invited an external user to a tenant\nmicrosoft.directory/users/inviteGuest\nInvite guest users\nmicrosoft.directory/users/licenseDetails/read\nRead license details of users\nmicrosoft.directory/users/manager/read\nRead manager of users\nmicrosoft.directory/users/memberOf/read\nRead the group memberships of users\nmicrosoft.directory/users/oAuth2PermissionGrants/read\nRead delegated permission grants on users\nmicrosoft.directory/users/ownedDevices/read\nRead owned devices of users\nmicrosoft.directory/users/ownedObjects/read\nRead owned objects of users\nmicrosoft.directory/users/photo/read\nRead photo of users\nmicrosoft.directory/users/registeredDevices/read\nRead registered devices of users\nmicrosoft.directory/users/scopedRoleMemberOf/read\nRead user's membership of a Microsoft Entra role, that is scoped to an administrative unit\nmicrosoft.directory/users/sponsors/read\nRead sponsors of users\nmicrosoft.directory/users/standard/read\nRead basic properties on users\nHelpdesk Administrator\nThis is a\nprivileged role\n. Users with this role can change passwords, invalidate refresh tokens, create and manage support requests with Microsoft for Azure and Microsoft 365 services, and monitor service health. Invalidating a refresh token forces the user to sign in again. Whether a Helpdesk Administrator can reset a user's password and invalidate refresh tokens depends on the role the user is assigned. 
For a list of the roles for which a Helpdesk Administrator can reset passwords and invalidate refresh tokens, see\nWho can reset passwords\n.\nUsers with this role\ncannot\ndo the following:\nCannot change the credentials or reset MFA for members and owners of a\nrole-assignable group\n.\nImportant\nUsers with this role can change passwords for people who may have access to sensitive or private information or critical configuration inside and outside of Microsoft Entra ID. Changing the password of a user may mean the ability to assume that user's identity and permissions. For example:\nApplication Registration and Enterprise Application owners, who can manage credentials of apps they own. Those apps may have privileged permissions in Microsoft Entra ID and elsewhere not granted to Helpdesk Administrators. Through this path a Helpdesk Administrator may be able to assume the identity of an application owner and then further assume the identity of a privileged application by updating the credentials for the application.\nAzure subscription owners, who might have access to sensitive or private information or critical configuration in Azure.\nSecurity Group and Microsoft 365 group owners, who can manage group membership. Those groups may grant access to sensitive or private information or critical configuration in Microsoft Entra ID and elsewhere.\nAdministrators in other services outside of Microsoft Entra ID like Exchange Online, Microsoft 365 Defender portal, Microsoft Purview portal, and human resources systems.\nNon-administrators like executives, legal counsel, and human resources employees who may have access to sensitive or private information.\nDelegating administrative permissions over subsets of users and applying policies to a subset of users is possible with\nAdministrative Units\n.\nThis role was previously named Password Administrator in the\nAzure portal\n. 
It was renamed to Helpdesk Administrator to align with the existing name in the Microsoft Graph API and Microsoft Graph PowerShell.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/bitlockerKeys/key/read\nRead bitlocker metadata and key on devices\nmicrosoft.directory/deviceLocalCredentials/standard/read\nRead all properties of the backed up local administrator account credentials for Microsoft Entra joined devices, except the password\nmicrosoft.directory/users/invalidateAllRefreshTokens\nForce sign-out by invalidating user refresh tokens\nmicrosoft.directory/users/password/update\nReset passwords for all users\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nHybrid Identity Administrator\nThis is a\nprivileged role\n. Users in this role can create, manage, and deploy provisioning configuration from Active Directory to Microsoft Entra ID using Cloud Provisioning, and can also manage Microsoft Entra Connect, pass-through authentication (PTA), password hash synchronization (PHS), seamless single sign-on (seamless SSO), and federation settings. This role does not have access to manage Microsoft Entra Connect Health. 
Users can also troubleshoot and monitor logs using this role.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/applications/appRoles/update\nUpdate the appRoles property on all types of applications\nmicrosoft.directory/applications/audience/update\nUpdate the audience property for applications\nmicrosoft.directory/applications/authentication/update\nUpdate authentication on all types of applications\nmicrosoft.directory/applications/basic/update\nUpdate basic properties for applications\nmicrosoft.directory/applications/create\nCreate all types of applications\nmicrosoft.directory/applications/delete\nDelete all types of applications\nmicrosoft.directory/applications/notes/update\nUpdate notes of applications\nmicrosoft.directory/applications/owners/update\nUpdate owners of applications\nmicrosoft.directory/applications/permissions/update\nUpdate exposed permissions and required permissions on all types of applications\nmicrosoft.directory/applications/policies/update\nUpdate policies of applications\nmicrosoft.directory/applications/synchronization/standard/read\nRead provisioning settings associated with the application object\nmicrosoft.directory/applications/tag/update\nUpdate tags of applications\nmicrosoft.directory/applicationTemplates/instantiate\nInstantiate gallery applications from application templates\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/cloudProvisioning/allProperties/allTasks\nRead and configure all properties of Microsoft Entra cloud provisioning service.\nmicrosoft.directory/deletedItems.applications/delete\nPermanently delete applications, which can no longer be restored\nmicrosoft.directory/deletedItems.applications/restore\nRestore soft deleted 
applications to original state\nmicrosoft.directory/domains/allProperties/read\nRead all properties of domains\nmicrosoft.directory/domains/federation/update\nUpdate federation property of domains\nmicrosoft.directory/domains/federationConfiguration/basic/update\nUpdate basic federation configuration for domains\nmicrosoft.directory/domains/federationConfiguration/create\nCreate federation configuration for domains\nmicrosoft.directory/domains/federationConfiguration/delete\nDelete federation configuration for domains\nmicrosoft.directory/domains/federationConfiguration/standard/read\nRead standard properties of federation configuration for domains\nmicrosoft.directory/hybridAuthenticationPolicy/allProperties/allTasks\nManage hybrid authentication policy in Microsoft Entra ID\nmicrosoft.directory/onPremisesSynchronization/basic/update\nUpdate basic on-premises directory synchronization information\nmicrosoft.directory/onPremisesSynchronization/standard/read\nRead standard on-premises directory synchronization information\nmicrosoft.directory/organization/dirSync/update\nUpdate the organization directory sync property\nmicrosoft.directory/passwordHashSync/allProperties/allTasks\nManage all aspects of Password Hash Synchronization (PHS) in Microsoft Entra ID\nmicrosoft.directory/provisioningLogs/allProperties/read\nRead all properties of provisioning logs\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/update\nUpdate service principal role assignments\nmicrosoft.directory/servicePrincipals/audience/update\nUpdate audience properties on service principals\nmicrosoft.directory/servicePrincipals/authentication/update\nUpdate authentication properties on service principals\nmicrosoft.directory/servicePrincipals/basic/update\nUpdate basic properties on service principals\nmicrosoft.directory/servicePrincipals/create\nCreate service principals\nmicrosoft.directory/servicePrincipals/delete\nDelete service 
principals\nmicrosoft.directory/servicePrincipals/disable\nDisable service principals\nmicrosoft.directory/servicePrincipals/enable\nEnable service principals\nmicrosoft.directory/servicePrincipals/notes/update\nUpdate notes of service principals\nmicrosoft.directory/servicePrincipals/owners/update\nUpdate owners of service principals\nmicrosoft.directory/servicePrincipals/permissions/update\nUpdate permissions of service principals\nmicrosoft.directory/servicePrincipals/policies/update\nUpdate policies of service principals\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToCloudTenant/credentials/manage\nManage cloud tenant to cloud tenant application provisioning secrets and credentials.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToCloudTenant/jobs/manage\nStart, restart, and pause cloud tenant to cloud tenant application provisioning synchronization jobs.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToCloudTenant/schema/manage\nCreate and manage cloud tenant to cloud tenant application provisioning synchronization jobs and schema.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/credentials/manage\nManage application provisioning secrets and credentials.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/jobs/manage\nStart, restart, and pause application provisioning synchronization jobs.\nmicrosoft.directory/servicePrincipals/synchronization.cloudTenantToExternalSystem/schema/manage\nCreate and manage application provisioning synchronization jobs and schema.\nmicrosoft.directory/servicePrincipals/synchronization/standard/read\nRead provisioning settings associated with your service principal\nmicrosoft.directory/servicePrincipals/synchronizationCredentials/manage\nManage application provisioning secrets and credentials\nmicrosoft.directory/servicePrincipals/synchronizationJobs/manage\nStart, restart, and pause application provisioning 
synchronization jobs\nmicrosoft.directory/servicePrincipals/synchronizationSchema/manage\nCreate and manage application provisioning synchronization jobs and schema\nmicrosoft.directory/servicePrincipals/tag/update\nUpdate the tag property for service principals\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.directory/users/authorizationInfo/update\nUpdate the multivalued Certificate user IDs property of users\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nIdentity Governance Administrator\nUsers with this role can manage Microsoft Entra ID Governance configuration, including access packages, access reviews, catalogs and policies, ensuring access is approved and reviewed and guest users who no longer need access are removed.\nActions\nDescription\nmicrosoft.directory/accessReviews/allProperties/allTasks\nCreate and delete access reviews, and read and update all properties of access reviews in Microsoft Entra ID\nmicrosoft.directory/accessReviews/definitions.applications/allProperties/allTasks\nManage access reviews of application role assignments in Microsoft Entra ID\nmicrosoft.directory/accessReviews/definitions.entitlementManagement/allProperties/allTasks\nManage access reviews for access package assignments in entitlement management\nmicrosoft.directory/accessReviews/definitions.groups/allProperties/read\nRead all properties of access reviews for membership in Security and Microsoft 365 groups, including 
role-assignable groups.\nmicrosoft.directory/accessReviews/definitions.groups/allProperties/update\nUpdate all properties of access reviews for membership in Security and Microsoft 365 groups, excluding role-assignable groups.\nmicrosoft.directory/accessReviews/definitions.groups/create\nCreate access reviews for membership in Security and Microsoft 365 groups.\nmicrosoft.directory/accessReviews/definitions.groups/delete\nDelete access reviews for membership in Security and Microsoft 365 groups.\nmicrosoft.directory/entitlementManagement/allProperties/allTasks\nCreate and delete resources, and read and update all properties in Microsoft Entra entitlement management\nmicrosoft.directory/groups/members/update\nUpdate members of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/update\nUpdate service principal role assignments\nInsights Administrator\nUsers in this role can access the full set of administrative capabilities in the Microsoft Viva Insights app. 
This role has the ability to read directory information, monitor service health, file support tickets, and access the Insights Administrator settings aspects.\nLearn more\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.insights/allEntities/allProperties/allTasks\nManage all aspects of Insights app\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nInsights Analyst\nAssign the Insights Analyst role to users who need to do the following:\nAnalyze data in the Microsoft Viva Insights app, but can't manage any configuration settings\nCreate, manage, and run queries\nView basic settings and reports in the Microsoft 365 admin center\nCreate and manage service requests in the Microsoft 365 admin center\nLearn more\nActions\nDescription\nmicrosoft.insights/queries/allProperties/allTasks\nRun and manage queries in Viva Insights\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nInsights Business Leader\nUsers in this role can access a set of dashboards and insights via the Microsoft Viva Insights app. This includes full access to all dashboards and presented insights and data exploration functionality. 
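An aside on the Actions columns used throughout this reference: each entry follows a recognizable `namespace/resource[/propertySet]/verb` shape (for example `microsoft.directory/users/password/update`). A minimal Python sketch of that reading — an illustrative assumption based on the strings in these tables, not an official schema:

```python
def parse_action(action: str) -> dict:
    """Split a role-permission action string into its informal parts.

    Assumes the namespace/resource[/propertySet]/verb pattern observed in
    these tables; this is an illustrative reading, not an official schema.
    """
    parts = action.split("/")
    return {
        "namespace": parts[0],                      # e.g. microsoft.directory
        "resource": parts[1],                       # e.g. users, allEntities
        "property_set": "/".join(parts[2:-1]) or None,  # e.g. password, or None
        "verb": parts[-1],                          # e.g. update, read, allTasks
    }
```

For example, `parse_action("microsoft.directory/users/password/update")` yields namespace `microsoft.directory`, resource `users`, property set `password`, and verb `update`.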
Users in this role do not have access to product configuration settings, which is the responsibility of the Insights Administrator role.\nLearn more\nActions\nDescription\nmicrosoft.insights/programs/allProperties/update\nDeploy and manage programs in Insights app\nmicrosoft.insights/reports/allProperties/read\nView reports and dashboard in Insights app\nIntune Administrator\nThis is a\nprivileged role\n. Users with this role have global permissions within Microsoft Intune Online, when the service is present. Additionally, this role contains the ability to manage users and devices in order to associate policy, as well as create and manage groups. For more information, see\nRole-based administration control (RBAC) with Microsoft Intune\n.\nThis role can create and manage all security groups. However, Intune Administrator does not have admin rights over Microsoft 365 groups. That means the admin cannot update owners or memberships of Microsoft 365 groups across the organization. However, admins can manage the Microsoft 365 groups that they create as part of their end-user privileges, and any Microsoft 365 group (not security group) they create counts against their quota of 250.\nNote\nIn the Microsoft Graph API and Microsoft Graph PowerShell, this role is named Intune Service Administrator. 
In the\nAzure portal\n, it is named Intune Administrator.\nActions\nDescription\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.cloudPC/allEntities/allProperties/allTasks\nManage all aspects of Windows 365\nmicrosoft.directory/bitlockerKeys/key/read\nRead bitlocker metadata and key on devices\nmicrosoft.directory/contacts/basic/update\nUpdate basic properties on contacts\nmicrosoft.directory/contacts/create\nCreate contacts\nmicrosoft.directory/contacts/delete\nDelete contacts\nmicrosoft.directory/deletedItems.devices/delete\nPermanently delete devices, which can no longer be restored\nmicrosoft.directory/deletedItems.devices/restore\nRestore soft deleted devices to original state\nmicrosoft.directory/deviceLocalCredentials/password/read\nRead all properties of the backed up local administrator account credentials for Microsoft Entra joined devices, including the password\nmicrosoft.directory/deviceManagementPolicies/standard/read\nRead standard properties on mobile device management and mobile app management policies\nmicrosoft.directory/deviceRegistrationPolicy/standard/read\nRead standard properties on device registration policies\nmicrosoft.directory/devices/basic/update\nUpdate basic properties on devices\nmicrosoft.directory/devices/create\nCreate devices (enroll in Microsoft Entra ID)\nmicrosoft.directory/devices/delete\nDelete devices from Microsoft Entra ID\nmicrosoft.directory/devices/disable\nDisable devices in Microsoft Entra ID\nmicrosoft.directory/devices/enable\nEnable devices in Microsoft Entra ID\nmicrosoft.directory/devices/extensionAttributeSet1/update\nUpdate the extensionAttribute1 to extensionAttribute5 properties on devices\nmicrosoft.directory/devices/extensionAttributeSet2/update\nUpdate the extensionAttribute6 to extensionAttribute10 properties on devices\nmicrosoft.directory/devices/extensionAttributeSet3/update\nUpdate the extensionAttribute11 to extensionAttribute15 properties on 
devices\nmicrosoft.directory/devices/registeredOwners/update\nUpdate registered owners of devices\nmicrosoft.directory/devices/registeredUsers/update\nUpdate registered users of devices\nmicrosoft.directory/groups.security/assignedLabels/update\nUpdate the assigned labels property on Security groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups.security/basic/update\nUpdate basic properties on Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/classification/update\nUpdate the classification property on Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/create\nCreate Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/delete\nDelete Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/dynamicMembershipRule/update\nUpdate the dynamic membership rule on Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/members/update\nUpdate members of Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/owners/update\nUpdate owners of Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/visibility/update\nUpdate the visibility property on Security groups, excluding role-assignable groups\nmicrosoft.directory/groups/hiddenMembers/read\nRead hidden members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/users/basic/update\nUpdate basic properties on users\nmicrosoft.directory/users/manager/update\nUpdate manager for users\nmicrosoft.directory/users/photo/update\nUpdate photo of users\nmicrosoft.intune/allEntities/allTasks\nManage all aspects of Microsoft Intune\nmicrosoft.office365.organizationalMessages/allEntities/allProperties/read\nRead all aspects of Microsoft 365 Organizational 
Messages\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nIoT Device Administrator\nAssign the IoT Device Administrator role to users who need to do the following tasks:\nProvision new IoT devices using device templates\nManage the lifecycle of IoT devices\nConfigure certificates used for IoT device authentication\nManage the lifecycle of IoT device templates\nLearn more\nActions\nDescription\nmicrosoft.directory/certificateBasedDeviceAuthConfigurations/create\nCreate certificate authority configurations for Internet of Things (IoT) device trust and authentication\nmicrosoft.directory/certificateBasedDeviceAuthConfigurations/credentials/update\nUpdate credential-related properties on certificate authority configurations for Internet of Things (IoT) device trust and authentication\nmicrosoft.directory/certificateBasedDeviceAuthConfigurations/delete\nDelete certificate authority configurations for Internet of Things (IoT) device trust and authentication\nmicrosoft.directory/certificateBasedDeviceAuthConfigurations/standard/read\nRead standard properties on certificate authority configurations for Internet of Things (IoT) device trust and authentication\nmicrosoft.directory/deviceTemplates/create\nCreate Internet of Things (IoT) device templates\nmicrosoft.directory/deviceTemplates/createDeviceFromTemplate\nCreate IoT devices from Internet of Things (IoT) device templates\nmicrosoft.directory/deviceTemplates/delete\nDelete Internet of Things (IoT) device templates\nmicrosoft.directory/deviceTemplates/deviceInstances/read\nRead device instances from Internet of Things (IoT) device links\nmicrosoft.directory/deviceTemplates/owners/read\nRead owners on Internet of Things (IoT) device templates\nmicrosoft.directory/deviceTemplates/owners/update\nUpdate owners on Internet of Things (IoT) device templates\nKaizala 
Administrator\nUsers with this role have global permissions to manage settings within Microsoft Kaizala, when the service is present, as well as the ability to manage support tickets and monitor service health. Additionally, the user can access reports related to adoption & usage of Kaizala by Organization members and business reports generated using the Kaizala actions.\nActions\nDescription\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nKnowledge Administrator\nUsers in this role have full access to all knowledge, learning and intelligent features settings in the Microsoft 365 admin center. They have a general understanding of the suite of products, licensing details and have responsibility to control access. Knowledge Administrator can create and manage content, like topics, acronyms and learning resources. Additionally, these users can create content centers, monitor service health, and create service requests.\nActions\nDescription\nmicrosoft.directory/groups.security/basic/update\nUpdate basic properties on Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/create\nCreate Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/createAsOwner\nCreate Security groups, excluding role-assignable groups. 
Creator is added as the first owner.\nmicrosoft.directory/groups.security/delete\nDelete Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/members/update\nUpdate members of Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/owners/update\nUpdate owners of Security groups, excluding role-assignable groups\nmicrosoft.office365.knowledge/contentUnderstanding/allProperties/allTasks\nRead and update all properties of content understanding in Microsoft 365 admin center\nmicrosoft.office365.knowledge/knowledgeNetwork/allProperties/allTasks\nRead and update all properties of knowledge network in Microsoft 365 admin center\nmicrosoft.office365.knowledge/learningSources/allProperties/allTasks\nManage learning sources and all their properties in Learning App.\nmicrosoft.office365.protectionCenter/sensitivityLabels/allProperties/read\nRead all properties of sensitivity labels in the Security and Compliance centers\nmicrosoft.office365.sharePoint/allEntities/allTasks\nCreate and delete all resources, and read and update standard properties in SharePoint\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nKnowledge Manager\nUsers in this role can create and manage content, like topics, acronyms and learning content. These users are primarily responsible for the quality and structure of knowledge. This user has full rights to topic management actions to confirm a topic, approve edits, or delete a topic. 
This role can also manage taxonomies as part of the term store management tool and create content centers.\nActions\nDescription\nmicrosoft.directory/groups.security/basic/update\nUpdate basic properties on Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/create\nCreate Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/createAsOwner\nCreate Security groups, excluding role-assignable groups. Creator is added as the first owner.\nmicrosoft.directory/groups.security/delete\nDelete Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/members/update\nUpdate members of Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/owners/update\nUpdate owners of Security groups, excluding role-assignable groups\nmicrosoft.office365.knowledge/contentUnderstanding/analytics/allProperties/read\nRead analytics reports of content understanding in Microsoft 365 admin center\nmicrosoft.office365.knowledge/knowledgeNetwork/topicVisibility/allProperties/allTasks\nManage topic visibility of knowledge network in Microsoft 365 admin center\nmicrosoft.office365.sharePoint/allEntities/allTasks\nCreate and delete all resources, and read and update standard properties in SharePoint\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nLicense Administrator\nUsers in this role can read, add, remove, and update license assignments on users, groups (using group-based licensing), and manage the usage location on users. The role does not grant the ability to purchase or manage subscriptions, create or manage groups, or create or manage users beyond the usage location. 
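Many of the broadest grants in these tables use `allEntities`, `allProperties`, and `allTasks` segments, which the descriptions consistently gloss as "all resources", "all properties", and "all tasks". A rough sketch of checking whether a granted action would cover a requested one under that wildcard reading — an assumption drawn from these descriptions, not Microsoft Entra's actual authorization logic:

```python
# Segments the descriptions treat as "everything at this position" (assumed).
WILDCARDS = {"allEntities", "allProperties", "allTasks"}

def covers(granted: str, requested: str) -> bool:
    """Rough check: does a granted action string cover a requested one?

    Treats allEntities/allProperties/allTasks segments as wildcards.
    Illustrative only; not Microsoft Entra's evaluation logic.
    """
    g, r = granted.split("/"), requested.split("/")
    if len(g) != len(r):
        return False
    return all(gs in WILDCARDS or gs == rs for gs, rs in zip(g, r))
```

Under this reading, `microsoft.office365.serviceHealth/allEntities/allTasks` covers a hypothetical `microsoft.office365.serviceHealth/incidents/read`, while the narrowly scoped `microsoft.directory/users/assignLicense` covers only itself.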
This role has no access to view, create, or manage support tickets.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.directory/groups/assignLicense\nAssign product licenses to groups for group-based licensing\nmicrosoft.directory/groups/reprocessLicenseAssignment\nReprocess license assignments for group-based licensing\nmicrosoft.directory/users/assignLicense\nManage user licenses\nmicrosoft.directory/users/reprocessLicenseAssignment\nReprocess license assignments for users\nmicrosoft.directory/users/usageLocation/update\nUpdate usage location of users\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nLifecycle Workflows Administrator\nThis is a\nprivileged role\n. Assign the Lifecycle Workflows Administrator role to users who need to do the following tasks:\nCreate and manage all aspects of workflows and tasks associated with Lifecycle Workflows in Microsoft Entra ID\nCheck the execution of scheduled workflows\nLaunch on-demand workflow runs\nInspect workflow execution logs\nActions\nDescription\nmicrosoft.directory/lifecycleWorkflows/workflows/allProperties/allTasks\nManage all aspects of lifecycle workflows and tasks in Microsoft Entra ID\nmicrosoft.directory/organization/strongAuthentication/read\nRead strong authentication properties of an organization\nmicrosoft.directory/users/lifeCycleInfo/read\nRead lifecycle information of users, such as employeeLeaveDateTime\nMessage Center Privacy Reader\nUsers in this role can monitor all notifications in the Message Center, including data privacy messages. 
Message Center Privacy Readers get email notifications, including those related to data privacy, and they can unsubscribe using Message Center Preferences. Only the Global Administrator and the Message Center Privacy Reader can read data privacy messages. Additionally, this role contains the ability to view groups, domains, and subscriptions. This role has no permission to view, create, or manage service requests.\nActions\nDescription\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.messageCenter/securityMessages/read\nRead security messages in Message Center in the Microsoft 365 admin center\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nMessage Center Reader\nUsers in this role can monitor notifications and advisory health updates in\nMessage center\nfor their organization on configured services such as Exchange, Intune, and Microsoft Teams. Message Center Readers receive weekly email digests of posts and updates, and can share Message center posts in Microsoft 365. In Microsoft Entra ID, users assigned to this role have only read-only access to Microsoft Entra services such as users and groups. 
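Roles like the ones in this reference are granted by creating a role assignment that ties a role definition to a principal at a scope; in Microsoft Graph this is a POST to `/roleManagement/directory/roleAssignments` with a `unifiedRoleAssignment` body. A minimal sketch of building that request body (the IDs are placeholders, and no request is actually sent here):

```python
import json

def role_assignment_body(role_definition_id: str, principal_id: str,
                         scope: str = "/") -> str:
    """Build the JSON body for POST /roleManagement/directory/roleAssignments.

    Field names follow the Microsoft Graph unifiedRoleAssignment resource;
    the IDs passed in are placeholders, not real directory objects.
    """
    return json.dumps({
        "@odata.type": "#microsoft.graph.unifiedRoleAssignment",
        "roleDefinitionId": role_definition_id,  # ID of the role definition
        "principalId": principal_id,             # user, group, or service principal
        "directoryScopeId": scope,               # "/" = tenant-wide
    })
```

A `directoryScopeId` of `"/"` assigns the role tenant-wide; administrative-unit-scoped assignments use an AU scope ID instead.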
This role has no access to view, create, or manage support tickets.\nActions\nDescription\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nMicrosoft 365 Backup Administrator\nAssign the Microsoft 365 Backup Administrator role to users who need to do the following tasks:\nManage all aspects of Microsoft 365 Backup\nCreate, edit, and manage backup configuration policies for SharePoint, OneDrive, and Exchange Online\nPerform restore operations for backed-up SharePoint sites, OneDrive accounts, and Exchange mailboxes\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.backup/allEntities/allProperties/allTasks\nManage all aspects of Microsoft 365 Backup\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nMicrosoft 365 Migration Administrator\nAssign the Microsoft 365 Migration Administrator role to users who need to do the following tasks:\nUse Migration Manager in the Microsoft 365 admin center to manage content migration to Microsoft 365, including Teams, OneDrive for Business, and SharePoint sites, from Google Drive, Dropbox, Box, and 
Egnyte\nSelect migration sources, create migration inventories (such as Google Drive user lists), schedule and execute migrations, and download reports\nCreate new SharePoint sites if the destination sites don't already exist, create SharePoint lists under the SharePoint admin sites, and create and update items in SharePoint lists\nManage migration project settings and migration lifecycle for tasks\nManage permission mappings from source to destination\nNote\nThis role doesn't allow you to migrate from file share sources using the SharePoint admin center. You can use the SharePoint Administrator role to migrate from file share sources.\nLearn more\nActions\nDescription\nmicrosoft.office365.migrations/allEntities/allProperties/allTasks\nManage all aspects of Microsoft 365 migrations\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nMicrosoft Entra Joined Device Local Administrator\nThis role is available for assignment only as an additional local administrator in\nDevice settings\n. Users with this role become local machine administrators on all Windows 10 devices that are joined to Microsoft Entra ID. 
They do not have the ability to manage device objects in Microsoft Entra ID.\nActions\nDescription\nmicrosoft.directory/groupSettings/standard/read\nRead basic properties on group settings\nmicrosoft.directory/groupSettingTemplates/standard/read\nRead basic properties on group setting templates\nMicrosoft Graph Data Connect Administrator\nAssign the Microsoft Graph Data Connect Administrator role to users who need to do the following tasks:\nAccess the full set of administrative capabilities of Microsoft Graph Data Connect\nManage Microsoft Graph Data Connect settings in a tenant\nEnable or disable the Microsoft Graph Data Connect service\nConfigure dataset workload selections in Microsoft Graph Data Connect\nConfigure cross-tenant data movement settings in Microsoft Graph Data Connect\nView, approve, or deny application authorization requests for Microsoft Graph Data Connect\nView, create, update, or delete application registrations for Microsoft Graph Data Connect\nLearn more\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.graph.dataConnect/allEntities/allProperties/allTasks\nManage aspects of Microsoft Graph Data Connect\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nMicrosoft Hardware Warranty Administrator\nAssign the Microsoft Hardware Warranty Administrator role to users who need to do the following tasks:\nCreate new warranty claims 
for Microsoft manufactured hardware, like Surface and HoloLens\nSearch and read opened or closed warranty claims\nSearch and read warranty claims by serial number\nCreate, read, update, and delete shipping addresses\nRead shipping status for open warranty claims\nCreate and manage service requests in the Microsoft 365 admin center\nRead Message center announcements in the Microsoft 365 admin center\nA warranty claim is a request to have the hardware repaired or replaced in accordance with the terms of the warranty. For more information, see\nSelf-serve your Surface warranty & service requests\n.\nActions\nDescription\nmicrosoft.hardware.support/shippingAddress/allProperties/allTasks\nCreate, read, update, and delete shipping addresses for Microsoft hardware warranty claims, including shipping addresses created by others\nmicrosoft.hardware.support/shippingStatus/allProperties/read\nRead shipping status for open Microsoft hardware warranty claims\nmicrosoft.hardware.support/warrantyClaims/allProperties/allTasks\nCreate and manage all aspects of Microsoft hardware warranty claims\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nMicrosoft Hardware Warranty Specialist\nAssign the Microsoft Hardware Warranty Specialist role to users who need to do the following tasks:\nCreate new warranty claims for Microsoft manufactured hardware, like Surface and HoloLens\nRead warranty claims that they created\nRead and update existing shipping addresses\nRead shipping status for open warranty claims they created\nCreate and manage service requests in the Microsoft 365 admin center\nA warranty claim is a request to have the hardware repaired or replaced 
in accordance with the terms of the warranty. For more information, see\nSelf-serve your Surface warranty & service requests\n.\nActions\nDescription\nmicrosoft.hardware.support/shippingAddress/allProperties/read\nRead shipping addresses for Microsoft hardware warranty claims, including existing shipping addresses created by others\nmicrosoft.hardware.support/shippingStatus/allProperties/read\nRead shipping status for open Microsoft hardware warranty claims\nmicrosoft.hardware.support/warrantyClaims/allProperties/read\nRead Microsoft hardware warranty claims\nmicrosoft.hardware.support/warrantyClaims/createAsOwner\nCreate Microsoft hardware warranty claims where creator is the owner\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nNetwork Administrator\nUsers in this role can review network perimeter architecture recommendations from Microsoft that are based on network telemetry from their user locations. Network performance for Microsoft 365 relies on careful enterprise customer network perimeter architecture which is generally user location specific. This role allows for editing of discovered user locations and configuration of network parameters for those locations to facilitate improved telemetry measurements and design recommendations.\nActions\nDescription\nmicrosoft.office365.network/locations/allProperties/allTasks\nManage all aspects of network locations\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nOffice Apps Administrator\nUsers in this role can manage Microsoft 365 apps' cloud settings. 
This includes managing cloud policies, self-service download management, and the ability to view reports related to Office apps. This role additionally grants the ability to manage support tickets and monitor service health within the main admin center. Users assigned to this role can also manage communication of new features in Office apps.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.userCommunication/allEntities/allTasks\nRead and update what's new messages visibility\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nOrganizational Branding Administrator\nAssign the Organizational Branding Administrator role to users who need to do the following tasks:\nManage all aspects of organizational branding in a tenant\nRead, create, update, and delete branding themes\nManage the default branding theme and all branding localization themes\nActions\nDescription\nmicrosoft.directory/loginOrganizationBranding/allProperties/allTasks\nCreate and delete loginTenantBranding, and read and update all properties\nOrganizational Data Source Administrator\nAssign the Organizational Data Source Administrator role to users who need to do the following tasks:\nManage settings related to ingesting and managing organizational data for Microsoft 365 and Microsoft Viva applications\nUpload, update, and delete the ingested organizational data in Microsoft 365 and 
Microsoft Viva applications\nExport organizational data from the authorized applications\nLearn more\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.microsoft365.organizationalData/allEntities/allProperties/allTasks\nManage all aspects of organizational data in Microsoft 365\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nOrganizational Messages Approver\nAssign the Organizational Messages Approver role to users who need to do the following tasks:\nReview, approve, or reject new organizational messages for delivery in the Microsoft 365 admin center before they are sent to users using the Microsoft 365 Organizational Messages platform\nRead all aspects of organizational messages\nRead basic properties on all resources in the Microsoft 365 admin center\nActions\nDescription\nmicrosoft.office365.organizationalMessages/allEntities/allProperties/read\nRead all aspects of Microsoft 365 Organizational Messages\nmicrosoft.office365.organizationalMessages/allEntities/allProperties/update\nApprove or reject new organizational messages for delivery in the Microsoft 365 admin center\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nOrganizational Messages Writer\nAssign the Organizational Messages Writer role to users who need to do the following tasks:\nWrite, publish, and 
delete organizational messages using Microsoft 365 admin center or Microsoft Intune\nManage organizational message delivery options using Microsoft 365 admin center or Microsoft Intune\nRead organizational message delivery results using Microsoft 365 admin center or Microsoft Intune\nView usage reports and most settings in the Microsoft 365 admin center, but can't make changes\nActions\nDescription\nmicrosoft.office365.organizationalMessages/allEntities/allProperties/allTasks\nManage all authoring aspects of Microsoft 365 Organizational Messages\nmicrosoft.office365.usageReports/allEntities/standard/read\nRead tenant-level aggregated Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nPartner Tier1 Support\nThis is a\nprivileged role\n. Do not use. This role has been deprecated and will be removed from Microsoft Entra ID in the future. This role is intended for use by a small number of Microsoft resale partners, and is not intended for general use.\nImportant\nThis role can reset passwords and invalidate refresh tokens for only non-administrators. 
This role should not be used because it is deprecated.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/applications/appRoles/update\nUpdate the appRoles property on all types of applications\nmicrosoft.directory/applications/audience/update\nUpdate the audience property for applications\nmicrosoft.directory/applications/authentication/update\nUpdate authentication on all types of applications\nmicrosoft.directory/applications/basic/update\nUpdate basic properties for applications\nmicrosoft.directory/applications/credentials/update\nUpdate application credentials\nmicrosoft.directory/applications/notes/update\nUpdate notes of applications\nmicrosoft.directory/applications/owners/update\nUpdate owners of applications\nmicrosoft.directory/applications/permissions/update\nUpdate exposed permissions and required permissions on all types of applications\nmicrosoft.directory/applications/policies/update\nUpdate policies of applications\nmicrosoft.directory/applications/tag/update\nUpdate tags of applications\nmicrosoft.directory/contacts/basic/update\nUpdate basic properties on contacts\nmicrosoft.directory/contacts/create\nCreate contacts\nmicrosoft.directory/contacts/delete\nDelete contacts\nmicrosoft.directory/deletedItems.groups/restore\nRestore soft deleted groups to original state\nmicrosoft.directory/deletedItems.users/restore\nRestore soft deleted users to original state\nmicrosoft.directory/groups.unified/assignedLabels/update\nUpdate the assigned labels property on Microsoft 365 groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups/create\nCreate Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/delete\nDelete Security groups and Microsoft 365 groups, excluding role-assignable 
groups\nmicrosoft.directory/groups/members/update\nUpdate members of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/owners/update\nUpdate owners of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/restore\nRestore groups from soft-deleted container\nmicrosoft.directory/oAuth2PermissionGrants/allProperties/allTasks\nCreate and delete OAuth 2.0 permission grants, and read and update all properties\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/update\nUpdate service principal role assignments\nmicrosoft.directory/users/assignLicense\nManage user licenses\nmicrosoft.directory/users/basic/update\nUpdate basic properties on users\nmicrosoft.directory/users/create\nAdd users\nmicrosoft.directory/users/delete\nDelete users\nmicrosoft.directory/users/disable\nDisable users\nmicrosoft.directory/users/enable\nEnable users\nmicrosoft.directory/users/invalidateAllRefreshTokens\nForce sign-out by invalidating user refresh tokens\nmicrosoft.directory/users/manager/update\nUpdate manager for users\nmicrosoft.directory/users/password/update\nReset passwords for all users\nmicrosoft.directory/users/photo/update\nUpdate photo of users\nmicrosoft.directory/users/restore\nRestore deleted users\nmicrosoft.directory/users/userPrincipalName/update\nUpdate User Principal Name of users\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nPartner Tier2 Support\nThis is a\nprivileged role\n. Do not use. This role has been deprecated and will be removed from Microsoft Entra ID in the future. 
This role is intended for use by a small number of Microsoft resale partners, and is not intended for general use.\nImportant\nThis role can reset passwords and invalidate refresh tokens for all non-administrators and administrators (including Global Administrators). This role should not be used because it is deprecated.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/applications/appRoles/update\nUpdate the appRoles property on all types of applications\nmicrosoft.directory/applications/audience/update\nUpdate the audience property for applications\nmicrosoft.directory/applications/authentication/update\nUpdate authentication on all types of applications\nmicrosoft.directory/applications/basic/update\nUpdate basic properties for applications\nmicrosoft.directory/applications/credentials/update\nUpdate application credentials\nmicrosoft.directory/applications/notes/update\nUpdate notes of applications\nmicrosoft.directory/applications/owners/update\nUpdate owners of applications\nmicrosoft.directory/applications/permissions/update\nUpdate exposed permissions and required permissions on all types of applications\nmicrosoft.directory/applications/policies/update\nUpdate policies of applications\nmicrosoft.directory/applications/tag/update\nUpdate tags of applications\nmicrosoft.directory/contacts/basic/update\nUpdate basic properties on contacts\nmicrosoft.directory/contacts/create\nCreate contacts\nmicrosoft.directory/contacts/delete\nDelete contacts\nmicrosoft.directory/deletedItems.groups/restore\nRestore soft deleted groups to original state\nmicrosoft.directory/deletedItems.users/restore\nRestore soft deleted users to original state\nmicrosoft.directory/domains/allProperties/allTasks\nCreate and delete domains, and read and update all 
properties\nmicrosoft.directory/groups.unified/assignedLabels/update\nUpdate the assigned labels property on Microsoft 365 groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups/create\nCreate Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/delete\nDelete Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/members/update\nUpdate members of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/owners/update\nUpdate owners of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/restore\nRestore groups from soft-deleted container\nmicrosoft.directory/oAuth2PermissionGrants/allProperties/allTasks\nCreate and delete OAuth 2.0 permission grants, and read and update all properties\nmicrosoft.directory/organization/basic/update\nUpdate basic properties on organization\nmicrosoft.directory/roleAssignments/allProperties/allTasks\nCreate and delete role assignments, and read and update all role assignment properties\nmicrosoft.directory/roleDefinitions/allProperties/allTasks\nCreate and delete role definitions, and read and update all properties\nmicrosoft.directory/scopedRoleMemberships/allProperties/allTasks\nCreate and delete scopedRoleMemberships, and read and update all properties\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/update\nUpdate service principal role assignments\nmicrosoft.directory/subscribedSkus/standard/read\nRead basic properties on subscriptions\nmicrosoft.directory/users/assignLicense\nManage user licenses\nmicrosoft.directory/users/basic/update\nUpdate basic properties on users\nmicrosoft.directory/users/create\nAdd users\nmicrosoft.directory/users/delete\nDelete users\nmicrosoft.directory/users/disable\nDisable users\nmicrosoft.directory/users/enable\nEnable 
users\nmicrosoft.directory/users/invalidateAllRefreshTokens\nForce sign-out by invalidating user refresh tokens\nmicrosoft.directory/users/manager/update\nUpdate manager for users\nmicrosoft.directory/users/password/update\nReset passwords for all users\nmicrosoft.directory/users/photo/update\nUpdate photo of users\nmicrosoft.directory/users/restore\nRestore deleted users\nmicrosoft.directory/users/userPrincipalName/update\nUpdate User Principal Name of users\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nPassword Administrator\nThis is a\nprivileged role\n. Users with this role have limited ability to manage passwords. This role does not grant the ability to manage service requests or monitor service health. Whether a Password Administrator can reset a user's password depends on the role the user is assigned. 
For a list of the roles that a Password Administrator can reset passwords for, see\nWho can reset passwords\n.\nUsers with this role\ncannot\ndo the following:\nCannot change the credentials or reset MFA for members and owners of a\nrole-assignable group\n.\nActions\nDescription\nmicrosoft.directory/users/password/update\nReset passwords for all users\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nPeople Administrator\nAssign the People Administrator role to users who need to do the following tasks:\nUpdate profile photos for all users including administrators\nUpdate people settings for all users, such as pronouns, name pronunciation, and profile card settings\nActions\nDescription\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.people/users/photo/read\nRead profile photo of user\nmicrosoft.people/users/photo/update\nUpdate profile photo of user\nmicrosoft.peopleAdmin/organization/allProperties/read\nRead people settings for users, such as pronouns, name pronunciation, and profile card settings\nmicrosoft.peopleAdmin/organization/allProperties/update\nUpdate people settings for users, such as pronouns, name pronunciation, and profile card settings\nPermissions Management Administrator\nAssign the Permissions Management Administrator role to users who need to do the following tasks:\nManage all aspects of Microsoft Entra Permissions Management, when the service is present\nLearn more about Permissions Management roles and policies at\nView information about roles/policies\n.\nActions\nDescription\nmicrosoft.permissionsManagement/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Entra Permissions Management\nPlaces Administrator\nAssign the Places Administrator role to users who need to do the following tasks:\nManage all aspects of the Microsoft Places service\nConfigure 
and manage buildings, floors, rooms, and desks\nOversee and manage associated booking policies\nLearn more\nActions\nDescription\nmicrosoft.places/allEntities/allProperties/allTasks\nManage all aspects of the Microsoft Places service\nPower Platform Administrator\nUsers in this role can create and manage all aspects of environments, Power Apps, Flows, and Data Loss Prevention policies. Additionally, users with this role can manage support tickets and monitor service health.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.dynamics365/allEntities/allTasks\nManage all aspects of Dynamics 365\nmicrosoft.flow/allEntities/allTasks\nManage all aspects of Microsoft Power Automate\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.powerApps/allEntities/allTasks\nManage all aspects of Power Apps\nPrinter Administrator\nUsers in this role can register printers and manage all aspects of all printer configurations in the Microsoft Universal Print solution, including the Universal Print Connector settings. They can consent to all delegated print permission requests. Printer Administrators also have access to print reports.\nActions\nDescription\nmicrosoft.azure.print/allEntities/allProperties/allTasks\nCreate and delete printers and connectors, and read and update all properties in Microsoft Print\nPrinter Technician\nUsers with this role can register printers and manage printer status in the Microsoft Universal Print solution. They can also read all connector information. 
The key tasks a Printer Technician cannot do are setting user permissions on printers and sharing printers.\nActions\nDescription\nmicrosoft.azure.print/connectors/allProperties/read\nRead all properties of connectors in Microsoft Print\nmicrosoft.azure.print/printers/allProperties/read\nRead all properties of printers in Microsoft Print\nmicrosoft.azure.print/printers/basic/update\nUpdate basic properties of printers in Microsoft Print\nmicrosoft.azure.print/printers/register\nRegister printers in Microsoft Print\nmicrosoft.azure.print/printers/unregister\nUnregister printers in Microsoft Print\nPrivileged Authentication Administrator\nThis is a\nprivileged role\n. Assign the Privileged Authentication Administrator role to users who need to do the following:\nSet or reset any authentication method (including passwords) for any user, including Global Administrators.\nDelete or restore any users, including Global Administrators. For more information, see\nWho can perform sensitive actions\n.\nForce users to re-register against an existing non-password credential (such as MFA or FIDO2) and revoke\nremember MFA on the device\n, prompting for MFA on the next sign-in of all users.\nUpdate sensitive properties for all users. 
For more information, see\nWho can perform sensitive actions\n.\nCreate and manage support tickets in Azure and the Microsoft 365 admin center.\nConfigure certificate authorities with a PKI-based trust store (preview)\nUsers with this role\ncannot\ndo the following:\nCannot manage per-user MFA in the legacy MFA management portal.\nThe following table compares the capabilities of authentication-related roles.\nRole\nManage user's auth methods\nManage per-user MFA\nManage MFA settings\nManage auth method policy\nManage password protection policy\nUpdate sensitive properties\nDelete and restore users\nAuthentication Administrator\nYes for\nsome users\nNo\nNo\nNo\nNo\nYes for\nsome users\nYes for\nsome users\nPrivileged Authentication Administrator\nYes for all users\nNo\nNo\nNo\nNo\nYes for all users\nYes for all users\nAuthentication Policy Administrator\nNo\nYes\nYes\nYes\nYes\nNo\nNo\nUser Administrator\nNo\nNo\nNo\nNo\nNo\nYes for\nsome users\nYes for\nsome users\nImportant\nUsers with this role can change credentials for people who may have access to sensitive or private information or critical configuration inside and outside of Microsoft Entra ID. Changing the credentials of a user may mean the ability to assume that user's identity and permissions. For example:\nApplication Registration and Enterprise Application owners, who can manage credentials of apps they own. Those apps may have privileged permissions in Microsoft Entra ID and elsewhere not granted to Authentication Administrators. Through this path an Authentication Administrator can assume the identity of an application owner and then further assume the identity of a privileged application by updating the credentials for the application.\nAzure subscription owners, who may have access to sensitive or private information or critical configuration in Azure.\nSecurity Group and Microsoft 365 group owners, who can manage group membership. 
Those groups may grant access to sensitive or private information or critical configuration in Microsoft Entra ID and elsewhere.\nAdministrators in other services outside of Microsoft Entra ID, like Exchange Online, the Microsoft 365 Defender portal, the Microsoft Purview portal, and human resources systems.\nNon-administrators like executives, legal counsel, and human resources employees who may have access to sensitive or private information.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/deletedItems.users/restore\nRestore soft deleted users to original state\nmicrosoft.directory/users/authenticationMethods/basic/update\nUpdate basic properties of authentication methods for users\nmicrosoft.directory/users/authenticationMethods/create\nCreate authentication methods for users\nmicrosoft.directory/users/authenticationMethods/delete\nDelete authentication methods for users\nmicrosoft.directory/users/authenticationMethods/standard/read\nRead standard properties of authentication methods for users\nmicrosoft.directory/users/authorizationInfo/update\nUpdate the multivalued Certificate user IDs property of users\nmicrosoft.directory/users/basic/update\nUpdate basic properties on users\nmicrosoft.directory/users/delete\nDelete users\nmicrosoft.directory/users/disable\nDisable users\nmicrosoft.directory/users/enable\nEnable users\nmicrosoft.directory/users/invalidateAllRefreshTokens\nForce sign-out by invalidating user refresh tokens\nmicrosoft.directory/users/manager/update\nUpdate manager for users\nmicrosoft.directory/users/password/update\nReset passwords for all users\nmicrosoft.directory/users/restore\nRestore deleted users\nmicrosoft.directory/users/userPrincipalName/update\nUpdate User Principal Name of users\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure 
Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nPrivileged Role Administrator\nThis is a\nprivileged role\n. Users with this role can manage role assignments in Microsoft Entra ID, as well as within Microsoft Entra Privileged Identity Management. They can create and manage groups that can be assigned to Microsoft Entra roles. In addition, this role allows management of all aspects of Privileged Identity Management and administrative units.\nImportant\nThis role grants the ability to manage assignments for all Microsoft Entra roles including the Global Administrator role. This role does not include any other privileged abilities in Microsoft Entra ID like creating or updating users. However, users assigned to this role can grant themselves or others additional privilege by assigning additional roles.\nActions\nDescription\nmicrosoft.directory/accessReviews/definitions.applications/allProperties/read\nRead all properties of access reviews of application role assignments in Microsoft Entra ID\nmicrosoft.directory/accessReviews/definitions.directoryRoles/allProperties/allTasks\nManage access reviews for Microsoft Entra role assignments\nmicrosoft.directory/accessReviews/definitions.groups/allProperties/read\nRead all properties of access reviews for membership in Security and Microsoft 365 groups, including role-assignable groups.\nmicrosoft.directory/accessReviews/definitions.groupsAssignableToRoles/allProperties/update\nUpdate all properties of access reviews for membership in groups that are assignable to Microsoft Entra roles\nmicrosoft.directory/accessReviews/definitions.groupsAssignableToRoles/create\nCreate access reviews for membership in groups that are assignable to Microsoft Entra 
roles\nmicrosoft.directory/accessReviews/definitions.groupsAssignableToRoles/delete\nDelete access reviews for membership in groups that are assignable to Microsoft Entra roles\nmicrosoft.directory/administrativeUnits/allProperties/allTasks\nCreate and manage administrative units (including members)\nmicrosoft.directory/authorizationPolicy/allProperties/allTasks\nManage all aspects of authorization policy\nmicrosoft.directory/directoryRoles/allProperties/allTasks\nCreate and delete directory roles, and read and update all properties\nmicrosoft.directory/groupsAssignableToRoles/allProperties/update\nUpdate role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/assignLicense\nAssign a license to role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/create\nCreate role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/delete\nDelete role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/reprocessLicenseAssignment\nReprocess license assignments to role-assignable groups\nmicrosoft.directory/groupsAssignableToRoles/restore\nRestore role-assignable groups\nmicrosoft.directory/oAuth2PermissionGrants/allProperties/allTasks\nCreate and delete OAuth 2.0 permission grants, and read and update all properties\nmicrosoft.directory/permissionGrantPolicies/allProperties/read\nRead all properties of permission grant policies\nmicrosoft.directory/permissionGrantPolicies/allProperties/update\nUpdate all properties of permission grant policies\nmicrosoft.directory/permissionGrantPolicies/create\nCreate permission grant policies\nmicrosoft.directory/permissionGrantPolicies/delete\nDelete permission grant policies\nmicrosoft.directory/privilegedIdentityManagement/allProperties/allTasks\nCreate and delete all resources, and read and update standard properties in Privileged Identity Management\nmicrosoft.directory/roleAssignments/allProperties/allTasks\nCreate and delete role assignments, and read and update all role assignment 
properties\nmicrosoft.directory/roleDefinitions/allProperties/allTasks\nCreate and delete role definitions, and read and update all properties\nmicrosoft.directory/scopedRoleMemberships/allProperties/allTasks\nCreate and delete scopedRoleMemberships, and read and update all properties\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/update\nUpdate service principal role assignments\nmicrosoft.directory/servicePrincipals/managePermissionGrantsForAll.microsoft-company-admin\nGrant consent for any permission to any application\nmicrosoft.directory/servicePrincipals/permissions/update\nUpdate permissions of service principals\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nReports Reader\nUsers with this role can view usage reporting data and the reports dashboard in Microsoft 365 admin center and the adoption context pack in Fabric and Power BI. Additionally, the role provides access to all sign-in logs, audit logs, and activity reports in Microsoft Entra ID and data returned by the Microsoft Graph reporting API. A user assigned to the Reports Reader role can access only relevant usage and adoption metrics. They don't have any admin permissions to configure settings or access the product-specific admin centers like Exchange. 
This role has no access to view, create, or manage support tickets.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/provisioningLogs/allProperties/read\nRead all properties of provisioning logs\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nSearch Administrator\nUsers in this role have full access to all Microsoft Search management features in the Microsoft 365 admin center. 
Additionally, these users can view the message center, monitor service health, and create service requests.\nActions\nDescription\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.search/content/manage\nCreate and delete content, and read and update all properties in Microsoft Search\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nSearch Editor\nUsers in this role can create, manage, and delete content for Microsoft Search in the Microsoft 365 admin center, including bookmarks, Q&As, and locations.\nActions\nDescription\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.search/content/manage\nCreate and delete content, and read and update all properties in Microsoft Search\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nSecurity Administrator\nThis is a\nprivileged role\n. Users with this role have permissions to manage security-related features in the Microsoft 365 Defender portal, Microsoft Entra ID Protection, Microsoft Entra Authentication, Azure Information Protection, and Microsoft Purview portal. 
For more information about Office 365 permissions, see\nRoles and role groups in Microsoft Defender for Office 365 and Microsoft Purview compliance\n.\nIn\nCan do\nMicrosoft 365 Defender portal\nMonitor security-related policies across Microsoft 365 services\nManage security threats and alerts\nView reports\nMicrosoft Entra ID Protection\nAll permissions of the Security Reader role\nPerform all ID Protection operations except for resetting passwords\nPrivileged Identity Management\nAll permissions of the Security Reader role\nCannot\nmanage Microsoft Entra role assignments or settings\nMicrosoft Purview portal\nManage security policies\nView, investigate, and respond to security threats\nView reports\nAzure Advanced Threat Protection\nMonitor and respond to suspicious security activity\nMicrosoft Defender for Endpoint\nAssign roles\nManage machine groups\nConfigure endpoint threat detection and automated remediation\nView, investigate, and respond to alerts\nView machines/device inventory\nIntune\nMaps to the\nIntune Endpoint Security Manager role\nMicrosoft Defender for Cloud Apps\nAdd admins, add policies and settings, upload logs and perform governance actions\nMicrosoft 365 service health\nView the health of Microsoft 365 services\nSmart lockout\nDefine the threshold and duration for lockouts when failed sign-in events happen.\nPassword Protection\nConfigure custom banned password list or on-premises password protection.\nCross-tenant synchronization\nConfigure cross-tenant access settings for users in another tenant. 
Security Administrators can't directly create and delete users, but can indirectly create and delete synchronized users from another tenant when both tenants are configured for cross-tenant synchronization, which is a privileged permission.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/applications/policies/update\nUpdate policies of applications\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.directory/bitlockerKeys/key/read\nRead bitlocker metadata and key on devices\nmicrosoft.directory/conditionalAccessPolicies/basic/update\nUpdate basic properties for Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/create\nCreate Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/delete\nDelete Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/owners/read\nRead the owners of Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/owners/update\nUpdate owners for Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/policyAppliedTo/read\nRead the \"applied to\" property for Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/standard/read\nRead Conditional Access for policies\nmicrosoft.directory/conditionalAccessPolicies/tenantDefault/update\nUpdate the default tenant for Conditional Access policies\nmicrosoft.directory/crossTenantAccessPolicy/allowedCloudEndpoints/update\nUpdate allowed cloud endpoints of cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/basic/update\nUpdate basic settings of cross-tenant access 
policy\nmicrosoft.directory/crossTenantAccessPolicy/default/b2bCollaboration/update\nUpdate Microsoft Entra B2B collaboration settings of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/b2bDirectConnect/update\nUpdate Microsoft Entra B2B direct connect settings of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/crossCloudMeetings/update\nUpdate cross-cloud Teams meeting settings of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/standard/read\nRead basic properties of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/tenantRestrictions/update\nUpdate tenant restrictions of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/b2bCollaboration/update\nUpdate Microsoft Entra B2B collaboration settings of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/b2bDirectConnect/update\nUpdate Microsoft Entra B2B direct connect settings of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/create\nCreate cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/crossCloudMeetings/update\nUpdate cross-cloud Teams meeting settings of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/delete\nDelete cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/identitySynchronization/basic/update\nUpdate basic settings of cross-tenant sync policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/identitySynchronization/create\nCreate cross-tenant sync policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/identitySynchronization/standard/read\nRead basic properties of cross-tenant sync 
policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/standard/read\nRead basic properties of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationIdentitySynchronization/basic/update\nUpdate cross tenant sync policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationIdentitySynchronization/resetToDefaultSettings\nReset cross tenant sync policy template for multi-tenant organization to default settings\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationIdentitySynchronization/standard/read\nRead basic properties of cross tenant sync policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationPartnerConfiguration/basic/update\nUpdate cross tenant access policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationPartnerConfiguration/resetToDefaultSettings\nReset cross tenant access policy template for multi-tenant organization to default settings\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationPartnerConfiguration/standard/read\nRead basic properties of cross tenant access policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/tenantRestrictions/update\nUpdate tenant restrictions of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/standard/read\nRead basic properties of cross-tenant access policy\nmicrosoft.directory/deviceLocalCredentials/standard/read\nRead all properties of the backed up local administrator account credentials for Microsoft Entra joined devices, except the password\nmicrosoft.directory/domains/federation/update\nUpdate federation property of 
domains\nmicrosoft.directory/domains/federationConfiguration/basic/update\nUpdate basic federation configuration for domains\nmicrosoft.directory/domains/federationConfiguration/create\nCreate federation configuration for domains\nmicrosoft.directory/domains/federationConfiguration/delete\nDelete federation configuration for domains\nmicrosoft.directory/domains/federationConfiguration/standard/read\nRead standard properties of federation configuration for domains\nmicrosoft.directory/entitlementManagement/allProperties/read\nRead all properties in Microsoft Entra entitlement management\nmicrosoft.directory/identityProtection/allProperties/read\nRead all resources in Microsoft Entra ID Protection\nmicrosoft.directory/identityProtection/allProperties/update\nUpdate all resources in Microsoft Entra ID Protection\nmicrosoft.directory/multiTenantOrganization/basic/update\nUpdate basic properties of a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/create\nCreate a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/joinRequest/organizationDetails/update\nJoin a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/joinRequest/standard/read\nRead properties of a multi-tenant organization join request\nmicrosoft.directory/multiTenantOrganization/standard/read\nRead basic properties of a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/create\nCreate a tenant in a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/delete\nDelete a tenant participating in a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/organizationDetails/read\nRead organization details of a tenant participating in a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/organizationDetails/update\nUpdate basic properties of a tenant participating in a multi-tenant 
organization\nmicrosoft.directory/multiTenantOrganization/tenants/standard/read\nRead basic properties of a tenant participating in a multi-tenant organization\nmicrosoft.directory/namedLocations/basic/update\nUpdate basic properties of custom rules that define network locations\nmicrosoft.directory/namedLocations/create\nCreate custom rules that define network locations\nmicrosoft.directory/namedLocations/delete\nDelete custom rules that define network locations\nmicrosoft.directory/namedLocations/standard/read\nRead basic properties of custom rules that define network locations\nmicrosoft.directory/policies/basic/update\nUpdate basic properties on policies\nmicrosoft.directory/policies/create\nCreate policies in Microsoft Entra ID\nmicrosoft.directory/policies/delete\nDelete policies in Microsoft Entra ID\nmicrosoft.directory/policies/owners/update\nUpdate owners of policies\nmicrosoft.directory/policies/tenantDefault/update\nUpdate default organization policies\nmicrosoft.directory/privilegedIdentityManagement/allProperties/read\nRead all resources in Privileged Identity Management\nmicrosoft.directory/provisioningLogs/allProperties/read\nRead all properties of provisioning logs\nmicrosoft.directory/resourceNamespaces/resourceActions/authenticationContext/update\nUpdate Conditional Access authentication context of Microsoft 365 role-based access control (RBAC) resource actions\nmicrosoft.directory/servicePrincipals/policies/update\nUpdate policies of service principals\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.networkAccess/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Entra Network Access\nmicrosoft.office365.protectionCenter/allEntities/basic/update\nUpdate basic properties of all resources in the Security and Compliance centers\nmicrosoft.office365.protectionCenter/allEntities/standard/read\nRead standard properties of all resources in the 
Security and Compliance centers\nmicrosoft.office365.protectionCenter/attackSimulator/payload/allProperties/allTasks\nCreate and manage attack payloads in Attack Simulator\nmicrosoft.office365.protectionCenter/attackSimulator/reports/allProperties/read\nRead reports of attack simulation, responses, and associated training\nmicrosoft.office365.protectionCenter/attackSimulator/simulation/allProperties/allTasks\nCreate and manage attack simulation templates in Attack Simulator\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nSecurity Operator\nThis is a\nprivileged role\n. Users with this role can manage alerts and have global read-only access on security-related features, including all information in Microsoft 365 Defender portal, Microsoft Entra ID Protection, Privileged Identity Management, and Microsoft Purview portal. 
For more information about Office 365 permissions, see\nRoles and role groups in Microsoft Defender for Office 365 and Microsoft Purview compliance\n.\nIn\nCan do\nMicrosoft 365 Defender portal\nAll permissions of the Security Reader role\nView, investigate, and respond to security threat alerts\nManage security settings in Microsoft 365 Defender portal\nMicrosoft Entra ID Protection\nAll permissions of the Security Reader role\nPerform all ID Protection operations except for configuring or changing risk-based policies, resetting passwords, and configuring alert emails.\nPrivileged Identity Management\nAll permissions of the Security Reader role\nMicrosoft Purview portal\nAll permissions of the Security Reader role\nView, investigate, and respond to security alerts\nMicrosoft Defender for Endpoint\nAll permissions of the Security Reader role\nView, investigate, and respond to security alerts\nWhen you turn on role-based access control in Microsoft Defender for Endpoint, users with read-only permissions such as the Security Reader role lose access until they're assigned a Microsoft Defender for Endpoint role.\nIntune\nAll permissions of the Security Reader role\nMicrosoft Defender for Cloud Apps\nAll permissions of the Security Reader role\nView, investigate, and respond to security alerts\nMicrosoft 365 service health\nView the health of Microsoft 365 services\nActions\nDescription\nmicrosoft.azure.advancedThreatProtection/allEntities/allTasks\nManage all aspects of Azure Advanced Threat Protection\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.directory/cloudAppSecurity/allProperties/allTasks\nCreate and delete all resources, and read and update standard properties in 
Microsoft Defender for Cloud Apps\nmicrosoft.directory/identityProtection/allProperties/allTasks\nCreate and delete all resources, and read and update standard properties in Microsoft Entra ID Protection\nmicrosoft.directory/privilegedIdentityManagement/allProperties/read\nRead all resources in Privileged Identity Management\nmicrosoft.directory/provisioningLogs/allProperties/read\nRead all properties of provisioning logs\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.intune/allEntities/read\nRead all resources in Microsoft Intune\nmicrosoft.office365.securityComplianceCenter/allEntities/allTasks\nCreate and delete all resources, and read and update standard properties in the Microsoft 365 Security and Compliance Center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.windows.defenderAdvancedThreatProtection/allEntities/allTasks\nManage all aspects of Microsoft Defender for Endpoint\nSecurity Reader\nThis is a\nprivileged role\n. Users with this role have global read-only access on security-related features, including all information in the Microsoft 365 Defender portal, Microsoft Entra ID Protection, Privileged Identity Management, and the Microsoft Purview portal, as well as the ability to read Microsoft Entra sign-in reports and audit logs. 
For more information about Office 365 permissions, see\nRoles and role groups in Microsoft Defender for Office 365 and Microsoft Purview compliance\n.\nIn\nCan do\nMicrosoft 365 Defender portal\nView security-related policies across Microsoft 365 services\nView security threats and alerts\nView reports\nMicrosoft Entra ID Protection\nView all ID Protection reports and Overview\nPrivileged Identity Management\nHas read-only access to all information surfaced in Microsoft Entra Privileged Identity Management: Policies and reports for Microsoft Entra role assignments and security reviews.\nCannot\nsign up for Microsoft Entra Privileged Identity Management or make any changes to it. In the Privileged Identity Management portal or via PowerShell, someone in this role can activate additional roles (for example, Privileged Role Administrator), if the user is eligible for them.\nMicrosoft Purview portal\nView security policies\nView and investigate security threats\nView reports\nMicrosoft Defender for Endpoint\nView and investigate alerts\nWhen you turn on role-based access control in Microsoft Defender for Endpoint, users with read-only permissions such as the Security Reader role lose access until they're assigned a Microsoft Defender for Endpoint role.\nIntune\nViews user, device, enrollment, configuration, and application information. 
Cannot make changes to Intune.\nMicrosoft Defender for Cloud Apps\nHas read permissions.\nMicrosoft 365 service health\nView the health of Microsoft 365 services\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.directory/accessReviews/definitions/allProperties/read\nRead all properties of access reviews of all reviewable resources in Microsoft Entra ID\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.directory/bitlockerKeys/key/read\nRead bitlocker metadata and key on devices\nmicrosoft.directory/conditionalAccessPolicies/owners/read\nRead the owners of Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/policyAppliedTo/read\nRead the \"applied to\" property for Conditional Access policies\nmicrosoft.directory/conditionalAccessPolicies/standard/read\nRead Conditional Access for policies\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationIdentitySynchronization/standard/read\nRead basic properties of cross tenant sync policy templates for multi-tenant organization\nmicrosoft.directory/crossTenantAccessPolicy/partners/templates/multiTenantOrganizationPartnerConfiguration/standard/read\nRead basic properties of cross tenant access policy templates for multi-tenant organization\nmicrosoft.directory/deviceLocalCredentials/standard/read\nRead all properties of the backed up local administrator account credentials for Microsoft Entra joined devices, except the password\nmicrosoft.directory/domains/federationConfiguration/standard/read\nRead standard properties of federation configuration for domains\nmicrosoft.directory/entitlementManagement/allProperties/read\nRead all properties in Microsoft Entra entitlement 
management\nmicrosoft.directory/identityProtection/allProperties/read\nRead all resources in Microsoft Entra ID Protection\nmicrosoft.directory/multiTenantOrganization/joinRequest/standard/read\nRead properties of a multi-tenant organization join request\nmicrosoft.directory/multiTenantOrganization/standard/read\nRead basic properties of a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/organizationDetails/read\nRead organization details of a tenant participating in a multi-tenant organization\nmicrosoft.directory/multiTenantOrganization/tenants/standard/read\nRead basic properties of a tenant participating in a multi-tenant organization\nmicrosoft.directory/namedLocations/standard/read\nRead basic properties of custom rules that define network locations\nmicrosoft.directory/policies/owners/read\nRead owners of policies\nmicrosoft.directory/policies/policyAppliedTo/read\nRead policies.policyAppliedTo property\nmicrosoft.directory/policies/standard/read\nRead basic properties on policies\nmicrosoft.directory/privilegedIdentityManagement/allProperties/read\nRead all resources in Privileged Identity Management\nmicrosoft.directory/provisioningLogs/allProperties/read\nRead all properties of provisioning logs\nmicrosoft.directory/signInReports/allProperties/read\nRead all properties on sign-in reports, including privileged properties\nmicrosoft.networkAccess/allEntities/allProperties/read\nRead all aspects of Microsoft Entra Network Access\nmicrosoft.office365.protectionCenter/allEntities/standard/read\nRead standard properties of all resources in the Security and Compliance centers\nmicrosoft.office365.protectionCenter/attackSimulator/payload/allProperties/read\nRead all properties of attack payloads in Attack Simulator\nmicrosoft.office365.protectionCenter/attackSimulator/reports/allProperties/read\nRead reports of attack simulation, responses, and associated 
training\nmicrosoft.office365.protectionCenter/attackSimulator/simulation/allProperties/read\nRead all properties of attack simulation templates in Attack Simulator\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nService Support Administrator\nUsers with this role can create and manage support requests with Microsoft for Azure and Microsoft 365 services, and view the service dashboard and message center in the\nAzure portal\nand\nMicrosoft 365 admin center\n. For more information, see\nAbout admin roles in the Microsoft 365 admin center\n.\nNote\nThis role was previously named Service Administrator in the\nAzure portal\nand\nMicrosoft 365 admin center\n. It was renamed to Service Support Administrator to align with the existing name in the Microsoft Graph API and Microsoft Graph PowerShell.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nSharePoint Administrator\nUsers with this role have global permissions within Microsoft SharePoint Online, when the service is present, as well as the ability to create and manage all Microsoft 365 groups, manage support tickets, and monitor service health. 
For more information, see\nAbout admin roles in the Microsoft 365 admin center\n.\nNote\nIn the Microsoft Graph API and Microsoft Graph PowerShell, this role is named SharePoint Service Administrator. In the\nAzure portal\n, it is named SharePoint Administrator.\nNote\nThis role also grants scoped permissions to the Microsoft Graph API for Microsoft Intune, allowing the management and configuration of policies related to SharePoint and OneDrive resources.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.backup/oneDriveForBusinessProtectionPolicies/allProperties/allTasks\nCreate and manage OneDrive protection policy in Microsoft 365 Backup\nmicrosoft.backup/oneDriveForBusinessRestoreSessions/allProperties/allTasks\nRead and configure restore session for OneDrive in Microsoft 365 Backup\nmicrosoft.backup/restorePoints/sites/allProperties/allTasks\nManage all restore points associated with selected SharePoint sites in Microsoft 365 Backup\nmicrosoft.backup/restorePoints/userDrives/allProperties/allTasks\nManage all restore points associated with selected OneDrive accounts in Microsoft 365 Backup\nmicrosoft.backup/sharePointProtectionPolicies/allProperties/allTasks\nCreate and manage SharePoint protection policy in Microsoft 365 Backup\nmicrosoft.backup/sharePointRestoreSessions/allProperties/allTasks\nRead and configure restore session for SharePoint in Microsoft 365 Backup\nmicrosoft.backup/siteProtectionUnits/allProperties/allTasks\nManage sites added to SharePoint protection policy in Microsoft 365 Backup\nmicrosoft.backup/siteRestoreArtifacts/allProperties/allTasks\nManage sites added to restore session for SharePoint in Microsoft 365 Backup\nmicrosoft.backup/userDriveProtectionUnits/allProperties/allTasks\nManage accounts added to OneDrive protection policy in Microsoft 365 
Backup\nmicrosoft.backup/userDriveRestoreArtifacts/allProperties/allTasks\nManage accounts added to restore session for OneDrive in Microsoft 365 Backup\nmicrosoft.directory/groups.unified/assignedLabels/update\nUpdate the assigned labels property on Microsoft 365 groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups.unified/basic/update\nUpdate basic properties on Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/create\nCreate Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/delete\nDelete Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/members/update\nUpdate members of Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/owners/update\nUpdate owners of Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/restore\nRestore Microsoft 365 groups from soft-deleted container, excluding role-assignable groups\nmicrosoft.directory/groups/hiddenMembers/read\nRead hidden members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.office365.migrations/allEntities/allProperties/allTasks\nManage all aspects of Microsoft 365 migrations\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.sharePoint/allEntities/allTasks\nCreate and delete all resources, and read and update standard properties in SharePoint\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic 
properties on all resources in the Microsoft 365 admin center\nSharePoint Advanced Management Administrator\nAssign the SharePoint Advanced Management Administrator role to users who need to do the following tasks:\nPerform all actions available to SharePoint Administrators, including global management of SharePoint Online, support ticket handling, and service health monitoring\nView names, paths, and URLs of files, folders, libraries, documents, and lists within SharePoint sites, without accessing file or item content\nRemove permissions from files, folders, libraries, documents, and lists within SharePoint sites\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.backup/oneDriveForBusinessProtectionPolicies/allProperties/allTasks\nCreate and manage OneDrive protection policy in Microsoft 365 Backup\nmicrosoft.backup/oneDriveForBusinessRestoreSessions/allProperties/allTasks\nRead and configure restore session for OneDrive in Microsoft 365 Backup\nmicrosoft.backup/restorePoints/sites/allProperties/allTasks\nManage all restore points associated with selected SharePoint sites in Microsoft 365 Backup\nmicrosoft.backup/restorePoints/userDrives/allProperties/allTasks\nManage all restore points associated with selected OneDrive accounts in Microsoft 365 Backup\nmicrosoft.backup/sharePointProtectionPolicies/allProperties/allTasks\nCreate and manage SharePoint protection policy in Microsoft 365 Backup\nmicrosoft.backup/sharePointRestoreSessions/allProperties/allTasks\nRead and configure restore session for SharePoint in Microsoft 365 Backup\nmicrosoft.backup/siteProtectionUnits/allProperties/allTasks\nManage sites added to SharePoint protection policy in Microsoft 365 Backup\nmicrosoft.backup/siteRestoreArtifacts/allProperties/allTasks\nManage sites added to restore session for SharePoint in Microsoft 365 
Backup\nmicrosoft.backup/userDriveProtectionUnits/allProperties/allTasks\nManage accounts added to OneDrive protection policy in Microsoft 365 Backup\nmicrosoft.backup/userDriveRestoreArtifacts/allProperties/allTasks\nManage accounts added to restore session for OneDrive in Microsoft 365 Backup\nmicrosoft.directory/groups.unified/assignedLabels/update\nUpdate the assigned labels property on Microsoft 365 groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups.unified/basic/update\nUpdate basic properties on Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/create\nCreate Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/delete\nDelete Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/members/update\nUpdate members of Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/owners/update\nUpdate owners of Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/restore\nRestore Microsoft 365 groups from soft-deleted container, excluding role-assignable groups\nmicrosoft.directory/groups/hiddenMembers/read\nRead hidden members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.office365.migrations/allEntities/allProperties/allTasks\nManage all aspects of Microsoft 365 migrations\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.sharePoint/allEntities/allTasks\nCreate and delete all resources, and read and update standard properties in SharePoint\nmicrosoft.office365.sharePointAdvancedManagement/allEntities/allProperties/allTasks\nManage all aspects of SharePoint Advanced 
Management\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nSharePoint Backup Administrator\nAssign the SharePoint Backup Administrator role to users who need to do the following tasks:\nManage all aspects of Microsoft 365 Backup for SharePoint and OneDrive\nBack up and restore content including granular restore across SharePoint and OneDrive\nCreate, edit, and manage backup configuration policies for SharePoint and OneDrive\nPerform restore operations for SharePoint and OneDrive\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.backup/oneDriveForBusinessProtectionPolicies/allProperties/allTasks\nCreate and manage OneDrive protection policy in Microsoft 365 Backup\nmicrosoft.backup/oneDriveForBusinessRestoreSessions/allProperties/allTasks\nRead and configure restore session for OneDrive in Microsoft 365 Backup\nmicrosoft.backup/restorePoints/sites/allProperties/allTasks\nManage all restore points associated with selected SharePoint sites in Microsoft 365 Backup\nmicrosoft.backup/restorePoints/userDrives/allProperties/allTasks\nManage all restore points associated with selected OneDrive accounts in Microsoft 365 Backup\nmicrosoft.backup/sharePointProtectionPolicies/allProperties/allTasks\nCreate and manage SharePoint protection policy in Microsoft 365 Backup\nmicrosoft.backup/sharePointRestoreSessions/allProperties/allTasks\nRead and configure restore session for SharePoint in Microsoft 365 Backup\nmicrosoft.backup/siteProtectionUnits/allProperties/allTasks\nManage sites added to SharePoint protection policy in 
Microsoft 365 Backup\nmicrosoft.backup/siteRestoreArtifacts/allProperties/allTasks\nManage sites added to restore session for SharePoint in Microsoft 365 Backup\nmicrosoft.backup/userDriveProtectionUnits/allProperties/allTasks\nManage accounts added to OneDrive protection policy in Microsoft 365 Backup\nmicrosoft.backup/userDriveRestoreArtifacts/allProperties/allTasks\nManage accounts added to restore session for OneDrive in Microsoft 365 Backup\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nSharePoint Embedded Administrator\nAssign the SharePoint Embedded Administrator role to users who need to do the following tasks:\nPerform all tasks using PowerShell, Microsoft Graph API, or SharePoint admin center\nManage, configure, and maintain SharePoint Embedded containers\nEnumerate and manage SharePoint Embedded containers\nEnumerate and manage permissions for SharePoint Embedded containers\nManage storage of SharePoint Embedded containers in a tenant\nAssign security and compliance policies on SharePoint Embedded containers\nApply security and compliance policies on SharePoint Embedded containers in a tenant\nActions\nDescription\nmicrosoft.office365.fileStorageContainers/allEntities/allProperties/allTasks\nManage all aspects of SharePoint Embedded containers\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin 
center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nSkype for Business Administrator\nUsers with this role have global permissions within Microsoft Skype for Business, when the service is present, as well as the ability to manage Skype-specific user attributes in Microsoft Entra ID. Additionally, this role grants the ability to manage support tickets and monitor service health, and to access the Teams and Skype for Business admin center. The account must also be licensed for Teams or it can't run Teams PowerShell cmdlets. For more information, see\nSkype for Business Online Admin\nand Teams licensing information at\nSkype for Business add-on licensing\n.\nNote\nIn the Microsoft Graph API and Microsoft Graph PowerShell, this role is named Lync Service Administrator. 
In the\nAzure portal\n, it is named Skype for Business Administrator.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.skypeForBusiness/allEntities/allTasks\nManage all aspects of Skype for Business Online\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nTeams Administrator\nUsers in this role can manage all aspects of the Microsoft Teams workload via the Microsoft Teams & Skype for Business admin center and the respective PowerShell modules. This includes, among other areas, all management tools related to telephony, messaging, meetings, and the teams themselves. 
This role additionally grants the ability to create and manage all Microsoft 365 groups, manage support tickets, and monitor service health.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.directory/crossTenantAccessPolicy/allowedCloudEndpoints/update\nUpdate allowed cloud endpoints of cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/crossCloudMeetings/update\nUpdate cross-cloud Teams meeting settings of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/default/standard/read\nRead basic properties of the default cross-tenant access policy\nmicrosoft.directory/crossTenantAccessPolicy/partners/create\nCreate cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/crossCloudMeetings/update\nUpdate cross-cloud Teams meeting settings of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/partners/standard/read\nRead basic properties of cross-tenant access policy for partners\nmicrosoft.directory/crossTenantAccessPolicy/standard/read\nRead basic properties of cross-tenant access policy\nmicrosoft.directory/externalUserProfiles/basic/update\nUpdate basic properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/externalUserProfiles/delete\nDelete external user profiles in the extended directory for Teams\nmicrosoft.directory/externalUserProfiles/standard/read\nRead standard properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/groups.unified/assignedLabels/update\nUpdate the assigned labels property on Microsoft 365 groups of assigned membership type, excluding role-assignable 
groups\nmicrosoft.directory/groups.unified/basic/update\nUpdate basic properties on Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/create\nCreate Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/delete\nDelete Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/members/update\nUpdate members of Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/owners/update\nUpdate owners of Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/restore\nRestore Microsoft 365 groups from soft-deleted container, excluding role-assignable groups\nmicrosoft.directory/groups/hiddenMembers/read\nRead hidden members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/pendingExternalUserProfiles/basic/update\nUpdate basic properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/pendingExternalUserProfiles/create\nCreate external user profiles in the extended directory for Teams\nmicrosoft.directory/pendingExternalUserProfiles/delete\nDelete external user profiles in the extended directory for Teams\nmicrosoft.directory/pendingExternalUserProfiles/standard/read\nRead standard properties of external user profiles in the extended directory for Teams\nmicrosoft.directory/permissionGrantPolicies/standard/read\nRead standard properties of permission grant policies\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.skypeForBusiness/allEntities/allTasks\nManage all aspects of Skype for Business Online\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 
service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.teams/allEntities/allProperties/allTasks\nManage all resources in Teams\nTeams Communications Administrator\nUsers in this role can manage aspects of the Microsoft Teams workload related to voice & telephony. This includes the management tools for telephone number assignment, voice and meeting policies, and full access to the call analytics toolset.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.skypeForBusiness/allEntities/allTasks\nManage all aspects of Skype for Business Online\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.teams/callQuality/allProperties/read\nRead all data in the Call Quality Dashboard (CQD)\nmicrosoft.teams/meetings/allProperties/allTasks\nManage meetings including meeting policies, configurations, and conference bridges\nmicrosoft.teams/voice/allProperties/allTasks\nManage voice including calling policies and phone number inventory and assignment\nTeams Communications Support Engineer\nUsers in this role can troubleshoot communication issues within Microsoft Teams & Skype for Business using the 
user call troubleshooting tools in the Microsoft Teams & Skype for Business admin center. Users in this role can view full call record information for all participants involved. This role has no access to view, create, or manage support tickets.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.skypeForBusiness/allEntities/allTasks\nManage all aspects of Skype for Business Online\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.teams/callQuality/allProperties/read\nRead all data in the Call Quality Dashboard (CQD)\nTeams Communications Support Specialist\nUsers in this role can troubleshoot communication issues within Microsoft Teams & Skype for Business using the user call troubleshooting tools in the Microsoft Teams & Skype for Business admin center. Users in this role can only view user details in the call for the specific user they have looked up. 
This role has no access to view, create, or manage support tickets.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.skypeForBusiness/allEntities/allTasks\nManage all aspects of Skype for Business Online\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.teams/callQuality/standard/read\nRead basic data in the Call Quality Dashboard (CQD)\nTeams Devices Administrator\nUsers with this role can manage\nTeams-certified devices\nfrom the Teams admin center. This role allows viewing all devices at a single glance, with the ability to search and filter devices. The user can check details of each device including the logged-in account, and the make and model of the device. The user can change the settings on the device and update the software versions. 
This role does not grant permissions to check Teams activity and call quality of the device.\nActions\nDescription\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.teams/devices/standard/read\nManage all aspects of Teams-certified devices including configuration policies\nTeams Reader\nAssign the Teams Reader role to users who need to do the following tasks:\nRead settings and administrative information in the Teams admin center, but not perform any management actions\nRead the Microsoft Call Quality Dashboard (CQD), but not access any troubleshooting capabilities\nUsers with this role\ncannot\ndo the following tasks:\nCannot view Teams management\nCannot access Meetings & Calls details of users\nCannot access Notifications & Rules management\nCannot access Frontline worker deployment management\nCannot access the advanced collaboration insights dashboard\nLearn more\nActions\nDescription\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.teams/allEntities/allProperties/read\nRead all properties of Microsoft Teams\nTeams Telephony Administrator\nAssign the Teams Telephony Administrator role to users who need to do the following tasks:\nManage voice and telephony, including calling policies, phone number management and assignment, and voice applications\nAccess to only Public Switched Telephone Network (PSTN) usage reports from Teams admin center\nView user profile page\nCreate and manage support tickets in Azure and the Microsoft 365 admin center\nLearn more\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization 
policy\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.skypeForBusiness/allEntities/allTasks\nManage all aspects of Skype for Business Online\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.teams/callQuality/allProperties/read\nRead all data in the Call Quality Dashboard (CQD)\nmicrosoft.teams/voice/allProperties/allTasks\nManage voice including calling policies and phone number inventory and assignment\nTenant Creator\nAssign the Tenant Creator role to users who need to do the following tasks:\nCreate both Microsoft Entra and Azure Active Directory B2C tenants even if the tenant creation toggle is turned off in the user settings\nNote\nThe tenant creators will be assigned the Global Administrator role on the new tenants they create.\nActions\nDescription\nmicrosoft.directory/tenantManagement/tenants/create\nCreate new tenants in Microsoft Entra ID\nUsage Summary Reports Reader\nAssign the Usage Summary Reports Reader role to users who need to do the following tasks in the Microsoft 365 admin center:\nView the Usage reports and Adoption Score\nRead organizational insights, but not personally identifiable information (PII) of users\nThis role only allows users to view organizational-level data with the following exceptions:\nMember users can view user management data and settings.\nGuest users assigned this role cannot view user management data and settings.\nActions\nDescription\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.usageReports/allEntities/standard/read\nRead 
tenant-level aggregated Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nUser Administrator\nThis is a\nprivileged role\n. Assign the User Administrator role to users who need to do the following:\nPermission\nMore information\nCreate users\nUpdate most user properties for all users, including all administrators\nWho can perform sensitive actions\nUpdate sensitive properties (including user principal name) for some users\nWho can perform sensitive actions\nDisable or enable some users\nWho can perform sensitive actions\nDelete or restore some users\nWho can perform sensitive actions\nCreate and manage user views\nCreate and manage all groups\nAssign and read licenses for all users, including all administrators\nReset passwords\nWho can reset passwords\nInvalidate refresh tokens\nWho can reset passwords\nUpdate (FIDO) device keys\nUpdate password expiration policies\nCreate and manage support tickets in Azure and the Microsoft 365 admin center\nMonitor service health\nUsers with this role\ncannot\ndo the following:\nCannot manage MFA.\nCannot change the credentials or reset MFA for members and owners of a\nrole-assignable group\n.\nCannot manage shared mailboxes.\nCannot modify security questions for password reset operation.\nImportant\nUsers with this role can change passwords for people who may have access to sensitive or private information or critical configuration inside and outside of Microsoft Entra ID. Changing the password of a user may mean the ability to assume that user's identity and permissions. For example:\nApplication Registration and Enterprise Application owners, who can manage credentials of apps they own. Those apps may have privileged permissions in Microsoft Entra ID and elsewhere not granted to User Administrators. 
Through this path a User Administrator may be able to assume the identity of an application owner and then further assume the identity of a privileged application by updating the credentials for the application.\nAzure subscription owners, who may have access to sensitive or private information or critical configuration in Azure.\nSecurity Group and Microsoft 365 group owners, who can manage group membership. Those groups may grant access to sensitive or private information or critical configuration in Microsoft Entra ID and elsewhere.\nAdministrators in other services outside of Microsoft Entra ID like Exchange Online, Microsoft 365 Defender portal, Microsoft Purview portal, and human resources systems.\nNon-administrators like executives, legal counsel, and human resources employees who may have access to sensitive or private information.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.directory/accessReviews/definitions.applications/allProperties/allTasks\nManage access reviews of application role assignments in Microsoft Entra ID\nmicrosoft.directory/accessReviews/definitions.directoryRoles/allProperties/read\nRead all properties of access reviews for Microsoft Entra role assignments\nmicrosoft.directory/accessReviews/definitions.entitlementManagement/allProperties/allTasks\nManage access reviews for access package assignments in entitlement management\nmicrosoft.directory/accessReviews/definitions.groups/allProperties/read\nRead all properties of access reviews for membership in Security and Microsoft 365 groups, including role-assignable groups.\nmicrosoft.directory/accessReviews/definitions.groups/allProperties/update\nUpdate all properties of access reviews for membership in Security and Microsoft 365 groups, excluding role-assignable 
groups.\nmicrosoft.directory/accessReviews/definitions.groups/create\nCreate access reviews for membership in Security and Microsoft 365 groups.\nmicrosoft.directory/accessReviews/definitions.groups/delete\nDelete access reviews for membership in Security and Microsoft 365 groups.\nmicrosoft.directory/contacts/basic/update\nUpdate basic properties on contacts\nmicrosoft.directory/contacts/create\nCreate contacts\nmicrosoft.directory/contacts/delete\nDelete contacts\nmicrosoft.directory/deletedItems.groups/restore\nRestore soft deleted groups to original state\nmicrosoft.directory/deletedItems.users/restore\nRestore soft deleted users to original state\nmicrosoft.directory/entitlementManagement/allProperties/allTasks\nCreate and delete resources, and read and update all properties in Microsoft Entra entitlement management\nmicrosoft.directory/groups.unified/assignedLabels/update\nUpdate the assigned labels property on Microsoft 365 groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups/assignLicense\nAssign product licenses to groups for group-based licensing\nmicrosoft.directory/groups/basic/update\nUpdate basic properties on Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/classification/update\nUpdate the classification property on Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/create\nCreate Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/delete\nDelete Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/dynamicMembershipRule/update\nUpdate the dynamic membership rule on Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/groupType/update\nUpdate properties that would affect the group type of Security groups and Microsoft 365 groups, excluding role-assignable 
groups\nmicrosoft.directory/groups/hiddenMembers/read\nRead hidden members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groups/members/update\nUpdate members of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/onPremWriteBack/update\nUpdate Microsoft Entra groups to be written back to on-premises with Microsoft Entra Connect\nmicrosoft.directory/groups/owners/update\nUpdate owners of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups/reprocessLicenseAssignment\nReprocess license assignments for group-based licensing\nmicrosoft.directory/groups/restore\nRestore groups from soft-deleted container\nmicrosoft.directory/groups/settings/update\nUpdate settings of groups\nmicrosoft.directory/groups/visibility/update\nUpdate the visibility property of Security groups and Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/oAuth2PermissionGrants/allProperties/allTasks\nCreate and delete OAuth 2.0 permission grants, and read and update all properties\nmicrosoft.directory/onPremisesSynchronization/standard/read\nRead standard on-premises directory synchronization information\nmicrosoft.directory/policies/standard/read\nRead basic properties on policies\nmicrosoft.directory/servicePrincipals/appRoleAssignedTo/update\nUpdate service principal role assignments\nmicrosoft.directory/users/assignLicense\nManage user licenses\nmicrosoft.directory/users/basic/update\nUpdate basic properties on users\nmicrosoft.directory/users/convertExternalToInternalMemberUser\nConvert external user to internal user\nmicrosoft.directory/users/create\nAdd users\nmicrosoft.directory/users/delete\nDelete users\nmicrosoft.directory/users/disable\nDisable users\nmicrosoft.directory/users/enable\nEnable users\nmicrosoft.directory/users/invalidateAllRefreshTokens\nForce sign-out by invalidating user refresh 
tokens\nmicrosoft.directory/users/inviteGuest\nInvite guest users\nmicrosoft.directory/users/lifeCycleInfo/read\nRead lifecycle information of users, such as employeeLeaveDateTime\nmicrosoft.directory/users/manager/update\nUpdate manager for users\nmicrosoft.directory/users/password/update\nReset passwords for all users\nmicrosoft.directory/users/photo/update\nUpdate photo of users\nmicrosoft.directory/users/reprocessLicenseAssignment\nReprocess license assignments for users\nmicrosoft.directory/users/restore\nRestore deleted users\nmicrosoft.directory/users/sponsors/update\nUpdate sponsors of users\nmicrosoft.directory/users/usageLocation/update\nUpdate usage location of users\nmicrosoft.directory/users/userPrincipalName/update\nUpdate User Principal Name of users\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nUser Experience Success Manager\nAssign the User Experience Success Manager role to users who need to do the following tasks:\nRead organizational-level usage reports for Microsoft 365 Apps and services, but not user details\nView your organization's product feedback, Net Promoter Score (NPS) survey results, and help article views to identify communication and training opportunities\nRead message center posts and service health data\nLearn more\nActions\nDescription\nmicrosoft.commerce.billing/purchases/standard/read\nRead purchase services in Microsoft 365 admin center.\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin 
center\nmicrosoft.office365.organizationalMessages/allEntities/allProperties/read\nRead all aspects of Microsoft 365 Organizational Messages\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.usageReports/allEntities/standard/read\nRead tenant-level aggregated Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nVirtual Visits Administrator\nUsers with this role can do the following tasks:\nManage and configure all aspects of Virtual Visits in Bookings in the Microsoft 365 admin center, and in the Teams EHR connector\nView usage reports for Virtual Visits in the Teams admin center, Microsoft 365 admin center, Fabric, and Power BI\nView features and settings in the Microsoft 365 admin center, but can't edit any settings\nVirtual Visits are a simple way to schedule and manage online and video appointments for staff and attendees. 
For example, usage reporting can show how sending SMS text messages before appointments can reduce the number of people who don't show up for appointments.\nActions\nDescription\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.virtualVisits/allEntities/allProperties/allTasks\nManage and share Virtual Visits information and metrics from admin centers or the Virtual Visits app\nActions\nDescription\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.virtualVisits/allEntities/allProperties/allTasks\nManage and share Virtual Visits information and metrics from admin centers or the Virtual Visits app\nViva Glint Tenant Administrator\nAssign the Viva Glint Tenant Administrator role to users who need to do the following tasks:\nRead and configure Viva Glint settings in the Microsoft 365 admin center\nAssign or remove Viva Glint service admins\nCreate and manage Viva Feature Access Management policies\nView and manage Viva Glint experiences (if applicable)\nCreate and manage Azure support tickets\nFor more information, see\nKey roles for Viva Glint\nand\nAssign Viva Glint Tenant and Service Administrators\n.\nActions\nDescription\nmicrosoft.azure.serviceHealth/allEntities/allTasks\nRead and configure Azure Service Health\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin 
center\nmicrosoft.viva.glint/allEntities/allProperties/allTasks\nManage and configure all Microsoft Viva Glint settings in the Microsoft 365 admin center\nViva Goals Administrator\nAssign the Viva Goals Administrator role to users who need to do the following tasks:\nManage and configure all aspects of the Microsoft Viva Goals application\nConfigure Microsoft Viva Goals admin settings\nRead Microsoft Entra tenant information\nMonitor Microsoft 365 service health\nCreate and manage Microsoft 365 service requests\nFor more information, see\nRoles and permissions in Viva Goals\nand\nIntroduction to Microsoft Viva Goals\n.\nActions\nDescription\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.viva.goals/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Viva Goals\nViva Pulse Administrator\nAssign the Viva Pulse Administrator role to users who need to do the following tasks:\nRead and configure all settings of Viva Pulse\nRead basic properties on all resources in the Microsoft 365 admin center\nRead and configure Azure Service Health\nCreate and manage Azure support tickets\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nRead usage reports in the Microsoft 365 admin center\nFor more information, see\nAssign a Viva Pulse admin in the Microsoft 365 admin center\n.\nActions\nDescription\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security messages\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service 
requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.viva.pulse/allEntities/allProperties/allTasks\nManage all aspects of Microsoft Viva Pulse\nWindows 365 Administrator\nUsers with this role have global permissions on Windows 365 resources, when the service is present. Additionally, this role contains the ability to manage users and devices in order to associate policy, as well as create and manage groups.\nThis role can create and manage security groups, but does not have administrator rights over Microsoft 365 groups. That means administrators cannot update owners or memberships of Microsoft 365 groups in the organization. However, they can manage the Microsoft 365 group they create, which is a part of their end-user privileges. So, any Microsoft 365 group (not security group) they create is counted against their quota of 250.\nAssign the Windows 365 Administrator role to users who need to do the following tasks:\nManage Windows 365 Cloud PCs in Microsoft Intune\nEnroll and manage devices in Microsoft Entra ID, including assigning users and policies\nCreate and manage security groups, but not role-assignable groups\nView basic properties in the Microsoft 365 admin center\nRead usage reports in the Microsoft 365 admin center\nCreate and manage support tickets in Azure and the Microsoft 365 admin center\nActions\nDescription\nmicrosoft.azure.supportTickets/allEntities/allTasks\nCreate and manage Azure support tickets\nmicrosoft.cloudPC/allEntities/allProperties/allTasks\nManage all aspects of Windows 365\nmicrosoft.directory/deletedItems.devices/delete\nPermanently delete devices, which can no longer be restored\nmicrosoft.directory/deletedItems.devices/restore\nRestore soft deleted devices to original state\nmicrosoft.directory/deviceManagementPolicies/standard/read\nRead 
standard properties on mobile device management and mobile app management policies\nmicrosoft.directory/deviceRegistrationPolicy/standard/read\nRead standard properties on device registration policies\nmicrosoft.directory/devices/basic/update\nUpdate basic properties on devices\nmicrosoft.directory/devices/create\nCreate devices (enroll in Microsoft Entra ID)\nmicrosoft.directory/devices/delete\nDelete devices from Microsoft Entra ID\nmicrosoft.directory/devices/disable\nDisable devices in Microsoft Entra ID\nmicrosoft.directory/devices/enable\nEnable devices in Microsoft Entra ID\nmicrosoft.directory/devices/extensionAttributeSet1/update\nUpdate the extensionAttribute1 to extensionAttribute5 properties on devices\nmicrosoft.directory/devices/extensionAttributeSet2/update\nUpdate the extensionAttribute6 to extensionAttribute10 properties on devices\nmicrosoft.directory/devices/extensionAttributeSet3/update\nUpdate the extensionAttribute11 to extensionAttribute15 properties on devices\nmicrosoft.directory/devices/registeredOwners/update\nUpdate registered owners of devices\nmicrosoft.directory/devices/registeredUsers/update\nUpdate registered users of devices\nmicrosoft.directory/groups.security/assignedLabels/update\nUpdate the assigned labels property on Security groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups.security/basic/update\nUpdate basic properties on Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/classification/update\nUpdate the classification property on Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/create\nCreate Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/delete\nDelete Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/dynamicMembershipRule/update\nUpdate the dynamic membership rule on Security groups, excluding role-assignable 
groups\nmicrosoft.directory/groups.security/members/update\nUpdate members of Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/owners/update\nUpdate owners of Security groups, excluding role-assignable groups\nmicrosoft.directory/groups.security/visibility/update\nUpdate the visibility property on Security groups, excluding role-assignable groups\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nWindows Update Deployment Administrator\nUsers in this role can create and manage all aspects of Windows Update deployments through the Windows Update for Business deployment service. The deployment service enables users to define settings for when and how updates are deployed, and specify which updates are offered to groups of devices in their tenant. 
It also allows users to monitor the update progress.\nActions\nDescription\nmicrosoft.windows.updatesDeployments/allEntities/allProperties/allTasks\nRead and configure all aspects of Windows Update Service\nYammer Administrator\nAssign the Yammer Administrator role to users who need to do the following tasks:\nManage all aspects of Yammer\nCreate, manage, and restore Microsoft 365 Groups, but not role-assignable groups\nView the hidden members of Security groups and Microsoft 365 groups, including role assignable groups\nRead usage reports in the Microsoft 365 admin center\nCreate and manage service requests in the Microsoft 365 admin center\nView announcements in the Message center, but not security announcements\nView service health\nLearn more\nActions\nDescription\nmicrosoft.directory/groups.unified/assignedLabels/update\nUpdate the assigned labels property on Microsoft 365 groups of assigned membership type, excluding role-assignable groups\nmicrosoft.directory/groups.unified/basic/update\nUpdate basic properties on Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/create\nCreate Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/delete\nDelete Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/members/update\nUpdate members of Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/owners/update\nUpdate owners of Microsoft 365 groups, excluding role-assignable groups\nmicrosoft.directory/groups.unified/restore\nRestore Microsoft 365 groups from soft-deleted container, excluding role-assignable groups\nmicrosoft.directory/groups/hiddenMembers/read\nRead hidden members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.office365.messageCenter/messages/read\nRead messages in Message Center in the Microsoft 365 admin center, excluding security 
messages\nmicrosoft.office365.network/performance/allProperties/read\nRead all network performance properties in the Microsoft 365 admin center\nmicrosoft.office365.serviceHealth/allEntities/allTasks\nRead and configure Service Health in the Microsoft 365 admin center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.usageReports/allEntities/allProperties/read\nRead Office 365 usage reports\nmicrosoft.office365.webPortal/allEntities/standard/read\nRead basic properties on all resources in the Microsoft 365 admin center\nmicrosoft.office365.yammer/allEntities/allProperties/allTasks\nManage all aspects of Yammer\nDeprecated roles\nThe following roles should not be used. They have been deprecated and will be removed from Microsoft Entra ID in the future.\nAdHoc License Administrator\nDevice Join\nDevice Managers\nDevice Users\nEmail Verified User Creator\nMailbox Administrator\nWorkplace Device Join\nRoles not shown in the portal\nNot every role returned by PowerShell or Microsoft Graph API is visible in Microsoft Entra roles interface. 
The following table organizes those differences.\nAPI name\nMicrosoft Entra admin center portal name\nNotes\nAgent User\nNot shown because it's implicitly assigned to users of agents\nNA\nDevice Join\nDeprecated\nDeprecated roles documentation\nDevice Managers\nDeprecated\nDeprecated roles documentation\nDevice Users\nDeprecated\nDeprecated roles documentation\nDirectory Synchronization Accounts\nNot shown because it shouldn't be used\nDirectory Synchronization Accounts documentation\nGuest User\nNot shown because it can't be used\nNA\nMicrosoft 365 Support Engineer\nNot shown because it shouldn't be used\nMicrosoft 365 Support Engineer documentation\nModern Commerce Administrator\nNot shown because it can't be used\nModern Commerce Administrator\nPartner Tier 1 Support\nNot shown because it shouldn't be used\nPartner Tier1 Support documentation\nPartner Tier 2 Support\nNot shown because it shouldn't be used\nPartner Tier2 Support documentation\nRestricted Guest User\nNot shown because it can't be used\nNA\nUser\nNot shown because it can't be used\nNA\nWorkplace Device Join\nDeprecated\nDeprecated roles documentation\nMicrosoft 365 Support Engineer\nTemplate ID: 00cf5c54-4693-4f59-a0ac-ab79ef0a974d\nDo not use - not intended for general use.\nActions\nDescription\nmicrosoft.directory/applications/allProperties/read\nRead all properties (including privileged properties) on all types of applications\nmicrosoft.directory/auditLogs/allProperties/read\nRead all properties on audit logs, excluding custom security attributes audit logs\nmicrosoft.directory/authorizationPolicy/standard/read\nRead standard properties of authorization policy\nmicrosoft.directory/conditionalAccessPolicies/standard/read\nRead Conditional Access for policies\nmicrosoft.directory/crossTenantAccessPolicy/default/standard/read\nRead basic properties of the default cross-tenant access policy\nmicrosoft.directory/deviceManagementPolicies/standard/read\nRead standard properties on mobile device 
management and mobile app management policies\nmicrosoft.directory/deviceRegistrationPolicy/standard/read\nRead standard properties on device registration policies\nmicrosoft.directory/devices/standard/read\nRead basic properties on devices\nmicrosoft.directory/directoryRoles/allProperties/read\nRead all properties of directory roles\nmicrosoft.directory/directoryRoles/members/read\nRead all members of Microsoft Entra roles\nmicrosoft.directory/domains/allProperties/read\nRead all properties of domains\nmicrosoft.directory/domains/standard/read\nRead basic properties on domains\nmicrosoft.directory/groups/allProperties/read\nRead all properties (including privileged properties) on Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groupSettings/allProperties/read\nRead all properties of group settings\nmicrosoft.directory/groups/members/read\nRead members of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groups/owners/read\nRead owners of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/groups/standard/read\nRead standard properties of Security groups and Microsoft 365 groups, including role-assignable groups\nmicrosoft.directory/organization/allProperties/read\nRead all properties for an organization\nmicrosoft.directory/policies/standard/read\nRead basic properties on policies\nmicrosoft.directory/securityRiskPolicy/standard/read\nRead basic properties of security risk policy that includes Microsoft Entra security defaults, strong authentication, and account compromise\nmicrosoft.directory/servicePrincipals/allProperties/read\nRead all properties (including privileged properties) on servicePrincipals\nmicrosoft.directory/servicePrincipals/appRoleAssignments/limitedRead\nRead application roles assigned to a specific service principal, but cannot enumerate service 
principals\nmicrosoft.directory/servicePrincipals/standard/read\nRead basic properties of service principals\nmicrosoft.directory/subscribedSkus/allProperties/read\nRead all properties of product subscriptions\nmicrosoft.directory/users/directReports/read\nRead the direct reports for users\nmicrosoft.directory/users/licenseDetails/read\nRead license details of users\nmicrosoft.directory/users/manager/read\nRead manager of users\nmicrosoft.directory/users/memberOf/read\nRead the group memberships of users\nmicrosoft.directory/users/registeredDevices/read\nRead registered devices of users\nmicrosoft.directory/users/standard/read\nRead basic properties on users\nmicrosoft.office365.protectionCenter/attackSimulator/payload/allProperties/read\nRead all properties of attack payloads in Attack Simulator\nmicrosoft.office365.protectionCenter/attackSimulator/reports/allProperties/read\nRead reports of attack simulation, responses, and associated training\nmicrosoft.teams/allEntities/allProperties/read\nRead all properties of Microsoft Teams\nModern Commerce Administrator\nTemplate ID: d24aef57-1500-4070-84db-2666f29cf966\nDon't use. This role isn't returned by PowerShell or the Microsoft Graph API. It's automatically assigned from Commerce, and isn't intended or supported for any other use.\nThe Modern Commerce Administrator role gives certain users permission to access Microsoft 365 admin center and see the left navigation entries for\nHome\n,\nBilling\n, and\nSupport\n. The content available in these areas is controlled by\ncommerce-specific roles\nassigned to users to manage products that they bought for themselves or your organization. 
This might include tasks like paying bills, or for access to billing accounts and billing profiles.\nUsers with the Modern Commerce Administrator role typically have administrative permissions in other Microsoft purchasing systems, but don't have Global Administrator or Billing Administrator roles used to access the admin center.\nWhen is the Modern Commerce Administrator role assigned?\nSelf-service purchase in Microsoft 365 admin center\n– Self-service purchase gives users a chance to try out new products by buying or signing up for them on their own. These products are managed in the admin center. Users who make a self-service purchase are assigned a role in the commerce system, and the Modern Commerce Administrator role so they can manage their purchases in admin center. Admins can block self-service purchases (for Fabric, Power BI, Power Apps, Power Automate) through\nPowerShell\n. For more information, see\nSelf-service purchase FAQ\n.\nPurchases from Microsoft commercial marketplace\n– Similar to self-service purchase, when a user buys a product or service from Microsoft AppSource or Azure Marketplace, the Modern Commerce Administrator role is assigned if they don't have the Global Administrator or Billing Administrator role. In some cases, users might be blocked from making these purchases. For more information, see\nMicrosoft commercial marketplace\n.\nProposals from Microsoft\n– A proposal is a formal offer from Microsoft for your organization to buy Microsoft products and services. When the person who is accepting the proposal doesn't have a Global Administrator or Billing Administrator role in Microsoft Entra ID, they're assigned both a commerce-specific role to complete the proposal and the Modern Commerce Administrator role to access admin center. When they access the admin center, they can only use features that are authorized by their commerce-specific role.\nCommerce-specific roles\n– Some users are assigned commerce-specific roles. 
If a user isn't a Global Administrator or Billing Administrator, they get the Modern Commerce Administrator role so they can access the admin center.\nIf the Modern Commerce Administrator role is unassigned from a user, they lose access to Microsoft 365 admin center. If they were managing any products, either for themselves or for your organization, they won't be able to manage them. This might include assigning licenses, changing payment methods, paying bills, or other tasks for managing subscriptions.\nActions\nDescription\nmicrosoft.commerce.billing/partners/read\nmicrosoft.commerce.volumeLicenseServiceCenter/allEntities/allTasks\nManage all aspects of Volume Licensing Service Center\nmicrosoft.office365.supportTickets/allEntities/allTasks\nCreate and manage Microsoft 365 service requests\nmicrosoft.office365.webPortal/allEntities/basic/read\nRead basic properties on all resources in the Microsoft 365 admin center\nNext steps\nAssign Microsoft Entra roles\nUnderstand the different roles\nAssign a user as an administrator of an Azure subscription\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Admin Roles", "section": "Microsoft Entra ID" }, "https://learn.microsoft.com/en-us/entra/id-governance/access-reviews-overview": { - "content_hash": "sha256:ae080c9864f448acdc745810f860aebdfe62c8343a3c013dceab082244e97cd4", - "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. 
You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nWhat are access reviews?\nFeedback\nSummarize this article for me\nAccess reviews in Microsoft Entra ID, part of Microsoft Entra, enable organizations to efficiently manage group memberships, access to enterprise applications, and role assignments. User access can be reviewed regularly to make sure only the right people have continued access.\nHere's a video that provides a quick overview of access reviews:\nWhy are access reviews important?\nMicrosoft Entra ID enables you to collaborate with users from inside your organization, and with external users. Users can join groups, invite guests, connect to cloud apps, and work remotely from either their work or personal devices. The convenience of using self-service has led to a need for better access management capabilities.\nAs new employees join, how do you ensure they have the access they need to be productive?\nAs people move teams or leave the company, how do you make sure that their old access is removed?\nExcessive access rights can lead to compromises.\nExcessive access right can also lead audit findings as they indicate a lack of control over access.\nYou have to proactively engage with resource owners to ensure they regularly review who has access to their resources.\nWhen should you use access reviews?\nToo many users in privileged roles:\nIt's a good idea to check how many users have administrative access, how many of them are Global Administrators, and if there are any invited guests or partners that haven't been removed after being assigned to do an administrative task. 
You can recertify the role assignment users in\nMicrosoft Entra roles\nsuch as Global Administrators, or\nAzure resources roles\nsuch as User Access Administrator in the\nMicrosoft Entra Privileged Identity Management (PIM)\nexperience.\nWhen automation is not possible:\nYou can create rules for dynamic membership groups, security groups, or Microsoft 365 Groups, but what if the HR data isn't in Microsoft Entra ID or if users still need access after leaving the group to train their replacement? You can then create a review on that group to ensure those who still need access keeps access.\nWhen a group is used for a new purpose:\nIf you have a group that is going to be synced to Microsoft Entra ID, or if you plan to enable the application Salesforce for everyone in the Sales team group, it would be useful to ask the group owner to review the dynamic membership group before it's used in a different risk content.\nBusiness critical data access:\nfor certain resources, such as\nbusiness critical applications\n, it might be required as part of compliance processes to ask people to regularly reconfirm and give a justification on why they need continued access.\nTo maintain a policy's exception list:\nIn an ideal world, all users would follow the access policies to secure access to your organization's resources. However, sometimes there are business cases that require you to make exceptions. As the IT admin, you can manage this task, avoid oversight of policy exceptions, and provide auditors with proof that these exceptions are reviewed regularly.\nAsk group owners to confirm they still need guests in their groups:\nEmployee access might be automated with other identity and access management features such lifecycle workflows based on data from an HR source, but not invited guests. 
If a group gives guests access to business sensitive content, then it's the group owner's responsibility to confirm the guests still have a legitimate business need for access.\nHave reviews recur periodically:\nYou can set up recurring access reviews of users at set frequencies such as weekly, monthly, quarterly or annually, and the reviewers are notified at the start of each review. Reviewers can approve or deny access with a friendly interface and with the help of smart recommendations.\nNote\nIf you're ready to try Access reviews take a look at\nCreate an access review of groups or applications\nWhere do you create reviews?\nDepending on what you want to review, you either create your access review in access reviews, Microsoft Entra enterprise apps, PIM, or entitlement management.\nAccess rights of users\nReviewers can be\nReview created in\nReviewer experience\nSecurity group members\nOffice group members\nSpecified reviewers\nGroup owners\nSelf-review\naccess reviews\nMicrosoft Entra groups\nAccess panel\nAssigned to a connected app\nSpecified reviewers\nSelf-review\naccess reviews\nMicrosoft Entra enterprise apps\nAccess panel\nMicrosoft Entra role\nSpecified reviewers\nSelf-review\nPIM\nMicrosoft Entra admin center\nAzure resource role\nSpecified reviewers\nSelf-review\nPIM\nMicrosoft Entra admin center\nAccess package assignments\nSpecified reviewers\nGroup members\nSelf-review\nentitlement management\nAccess panel\nAccess rights from custom data resources (preview)\nManagers\naccess reviews\nAccess panel\nLicense requirements\nThis feature requires Microsoft Entra ID Governance or Microsoft Entra Suite subscriptions, for your organization's users. Some capabilities, within this feature, may operate with a Microsoft Entra ID P2 subscription. For more information, see the articles of each capability for more details. 
To find the right license for your requirements, see\nMicrosoft Entra ID Governance licensing fundamentals\n.\nNote\nCreating a review on inactive users and with\nuser-to-group affiliation\nrecommendations, or an\naccess review of multiple resources together (preview)\n, requires a Microsoft Entra ID Governance license.\nAccess Review Agent (Preview)\nThe Access Review Agent works for your reviewers by automatically gathering insights and generating recommendations. It then guides reviewers through the review process in Microsoft Teams with natural language, with simple summaries and proposed decisions, so they can make the final call with confidence and clarity. For more information, see\nAccess Review Agent\n.\nNext steps\nPrepare for an access review of users' access to an application\nCreate an access review of groups or applications\nCreate an access review of users in a Microsoft Entra administrative role\nCreate an access review to multiple resources in a catalog (preview)\nReview access to groups or applications\nComplete an access review of groups or applications\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "content_hash": "sha256:15753cdfcd96a6e31a38d3b95179893ead8cd55eea0d9c5c895210b9d943db89", + "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nWhat are access reviews?\nFeedback\nSummarize this article for me\nAccess reviews in Microsoft Entra ID, part of Microsoft Entra, enable organizations to efficiently manage group memberships, access to enterprise applications, and role assignments. User access can be reviewed regularly to make sure only the right people have continued access.\nHere's a video that provides a quick overview of access reviews:\nWhy are access reviews important?\nMicrosoft Entra ID enables you to collaborate with users from inside your organization, and with external users. Users can join groups, invite guests, connect to cloud apps, and work remotely from either their work or personal devices. The convenience of using self-service has led to a need for better access management capabilities.\nAs new employees join, how do you ensure they have the access they need to be productive?\nAs people move teams or leave the company, how do you make sure that their old access is removed?\nExcessive access rights can lead to compromises.\nExcessive access rights can also lead to audit findings as they indicate a lack of control over access.\nYou have to proactively engage with resource owners to ensure they regularly review who has access to their resources.\nWhen should you use access reviews?\nToo many users in privileged roles:\nIt's a good idea to check how many users have administrative access, how many of them are Global Administrators, and if there are any invited guests or partners that haven't been removed after being assigned to do an administrative task. 
You can recertify the role assignment users in\nMicrosoft Entra roles\nsuch as Global Administrators, or\nAzure resources roles\nsuch as User Access Administrator in the\nMicrosoft Entra Privileged Identity Management (PIM)\nexperience.\nWhen automation is not possible:\nYou can create rules for dynamic membership groups, security groups, or Microsoft 365 Groups, but what if the HR data isn't in Microsoft Entra ID or if users still need access after leaving the group to train their replacements? You can then create a review on that group to ensure those who still need access keeps access.\nWhen a group is used for a new purpose:\nIf you have a group that is going to be synced to Microsoft Entra ID, or if you plan to enable the application Salesforce for everyone in the Sales team group, it would be useful to ask the group owner to review the dynamic membership group before it's used in a different risk context.\nBusiness critical data access:\nfor certain resources, such as\nbusiness critical applications\n, it might be required as part of compliance processes to ask people to regularly reconfirm and give a justification on why they need continued access.\nTo maintain a policy's exception list:\nIn an ideal world, all users would follow the access policies to secure access to your organization's resources. However, sometimes there are business cases that require you to make exceptions. As the IT admin, you can manage this task, avoid oversight of policy exceptions, and provide auditors with proof that these exceptions are reviewed regularly.\nAsk group owners to confirm they still need guests in their groups:\nEmployee access might be automated with other identity and access management features such lifecycle workflows based on data from an HR source, but not invited guests. 
If a group gives guests access to business sensitive content, then it's the group owner's responsibility to confirm the guests still have a legitimate business need for access.\nHave reviews recur periodically:\nYou can set up recurring access reviews of users at set frequencies such as weekly, monthly, quarterly or annually, and the reviewers are notified at the start of each review. Reviewers can approve or deny access with a friendly interface and with the help of smart recommendations.\nNote\nIf you're ready to try access reviews, take a look at\nCreate an access review of groups or applications\n.\nWhere do you create reviews?\nDepending on what you want to review, you create your access review in access reviews, Microsoft Entra enterprise apps, PIM, or entitlement management.\nAccess rights of users\nReviewers can be\nReview created in\nReviewer experience\nSecurity group members\nOffice group members\nSpecified reviewers\nGroup owners\nSelf-review\naccess reviews\nMicrosoft Entra groups\nAccess panel\nAssigned to a connected app\nSpecified reviewers\nSelf-review\naccess reviews\nMicrosoft Entra enterprise apps\nAccess panel\nMicrosoft Entra role\nSpecified reviewers\nSelf-review\nPIM\nMicrosoft Entra admin center\nAzure resource role\nSpecified reviewers\nSelf-review\nPIM\nMicrosoft Entra admin center\nAccess package assignments\nSpecified reviewers\nGroup members\nSelf-review\nentitlement management\nAccess panel\nAccess rights from custom data resources (preview)\nManagers\naccess reviews\nAccess panel\nLicense requirements\nThis feature requires Microsoft Entra ID Governance or Microsoft Entra Suite subscriptions, for your organization's users. Some capabilities, within this feature, may operate with a Microsoft Entra ID P2 subscription. For more information, see the articles of each capability for more details. 
To find the right license for your requirements, see\nMicrosoft Entra ID Governance licensing fundamentals\n.\nNote\nCreating a review on inactive users and with\nuser-to-group affiliation\nrecommendations, or an\naccess review of multiple resources together (preview)\n, requires a Microsoft Entra ID Governance license.\nAccess Review Agent (Preview)\nThe Access Review Agent works for your reviewers by automatically gathering insights and generating recommendations. It then guides reviewers through the review process in Microsoft Teams with natural language, with simple summaries and proposed decisions, so they can make the final call with confidence and clarity. For more information, see\nAccess Review Agent\n.\nNext steps\nPrepare for an access review of users' access to an application\nCreate an access review of groups or applications\nCreate an access review of users in a Microsoft Entra administrative role\nCreate an access review to multiple resources in a catalog (preview)\nReview access to groups or applications\nComplete an access review of groups or applications\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Access Reviews", "section": "Microsoft Entra ID" }, "https://learn.microsoft.com/en-us/entra/id-governance/create-access-review": { - "content_hash": "sha256:3b68fb9a181d504f4181ab9c3559d57f686e70ac03caba122e88252b38d9e3f1", - "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. 
You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nCreate an access review of groups and applications in Microsoft Entra ID\nFeedback\nSummarize this article for me\nAccess to groups and applications for employees and guests changes over time. To reduce the risk associated with stale access assignments, administrators can use Microsoft Entra ID to create access reviews for group members or application access.\nMicrosoft 365 and Security group owners can also use Microsoft Entra ID to create access reviews for group members as long as a user with at least the Identity Governance Administrator role enables the setting via the\nAccess Reviews Settings\npane. For more information about these scenarios, see\nManage access reviews\n.\nWatch a short video that talks about enabling access reviews.\nThis article describes how to create one or more access reviews for group members or application access.\nTo review access package assignments, see\nconfigure an access review in entitlement management\n.\nTo review Azure resource or Microsoft Entra roles, see\nCreate an access review of Azure resource and Microsoft Entra roles in Privileged Identity Management\n.\nFor reviews of PIM for Groups, see\ncreate an access review of PIM for Groups\n.\nFor reviews across multiple groups, applications and custom data providers, see\ncatalog access reviews (preview)\n.\nPrerequisites\nUsing this feature requires Microsoft Entra ID Governance or Microsoft Entra Suite licenses. 
To find the right license for your requirements, see\nMicrosoft Entra ID Governance licensing fundamentals\n.\nIf you're reviewing access to an application, then before you create the review, see the article on how to\nprepare for an access review of users' access to an application\nto ensure the application is integrated with Microsoft Entra ID in your tenant.\nNote\nAccess reviews capture a snapshot of access at the beginning of each review instance. Any changes made during the review process will be reflected in the subsequent review cycle. Essentially, with the commencement of each new recurrence, pertinent data regarding the users, resources under review, and their respective reviewers is retrieved.\nNote\nIn a group review, nested groups are automatically flattened, so users from nested groups appear as individual users. If a user is flagged for removal due to their membership in a nested group, they won't be automatically removed from the nested group, but only from direct group membership.\nCreate a single-stage access review\nScope\nSign in to the\nMicrosoft Entra admin center\nas at least an\nIdentity Governance Administrator\n.\nBrowse to\nID Governance\n>\nAccess Reviews\n.\nSelect\nNew access review\nto create a new access review.\nOn the Access reviews template screen, select\nReview access to a resource type\n.\nIn the\nSelect what to review\nbox, select which resource you want to review.\nIf you selected\nTeams + Groups\n, you have two options:\nAll Microsoft 365 groups with guest users\n: Select this option if you want to create recurring reviews on all your guest users across all your Microsoft Teams and Microsoft 365 groups in your organization. Dynamic groups and role-assignable groups aren't included. You can also choose to exclude individual groups by selecting\nSelect group(s) to exclude\n.\nSelect Teams + groups\n: Select this option if you want to specify a finite set of teams or groups to review. 
A list of groups to choose from appears on the right.\nIf you selected\nApplications\n, select one or more applications.\nNote\nSelecting multiple groups or applications results in the creation of multiple access reviews. For example, if you select five groups to review, the result is five separate access reviews.\nNow you can select a scope for the review. Your options are:\nGuest users only\n: This option limits the access review to only the Microsoft Entra B2B guest users in your directory.\nEveryone\n: This option scopes the access review to all user objects associated with the resource.\nNote\nIf you selected\nAll Microsoft 365 groups with guest users\n, your only option is to review\nGuest users only\n.\nOr if you're conducting group membership review, you can create access reviews for only the inactive users in the group. In the\nUsers scope\nsection, check the box next to\nInactive users (on tenant level)\n. If you check the box, the scope of the review focuses on inactive users only, those who haven't signed in either interactively or non-interactively to the tenant. Then, specify\nDays inactive\nwith many days inactive up to 730 days (two years). Users in the group inactive for the specified number of days are the only users in the review.\nNote\nRecently created users aren't affected when configuring the inactivity time. The Access Review checks if a user has been created in the time frame configured and disregard users who haven’t existed for at least that amount of time. For example, if you set the inactivity time as 90 days and a guest user was created or invited less than 90 days ago, the guest user won't be in scope of the Access Review. This ensures that a user can sign in at least once before being removed.\nSelect\nNext: Reviews\n.\nNext: Reviews\nYou can create a single-stage or multi-stage review. For a single stage review, continue here. 
To create a multi-stage access review, follow the steps in\nCreate a multi-stage access review\nIn the\nSpecify reviewers\nsection, in the\nSelect reviewers\nbox, select either one or more people to make decisions in the access reviews. You can choose from:\nGroup owner(s)\n: This option is only available when you do a review on a team or group.\nSelected user(s) or groups(s)\nUsers review their own access\nManagers of users\nIf you choose either\nManagers of users\nor\nGroup owner(s)\n, you can also specify a fallback reviewer. Fallback reviewers are asked to do a review when the user has no manager specified in the directory or if the group doesn't have an owner.\nNote\nIn a team or group access review, only the group owners (at the time a review starts) are considered as reviewers. During a review, if the list of group owners is updated, new group owners won't be considered reviewers and old group owners will still be considered reviewers. However, in the case of a recurring review, any changes on the group owners list will be considered in the next instance of that review.\nImportant\nFor access reviews of PIM for Groups (preview), when selecting the group owner as the reviewer, it's mandatory to assign at least one fallback reviewer. The review will only assign active owner(s) as the reviewer(s). Eligible owners aren't included. If there are no active owners when the review begins, the fallback reviewer(s) will be assigned to the review.\nIn the\nSpecify recurrence of review\nsection, specify the following selections:\nDuration (in days)\n: How long a review is open for input from reviewers.\nStart date\n: When the series of reviews begins.\nEnd date\n: When the series of reviews ends. You can specify that it\nNever\nends. 
Or, you can select\nEnd on a specific date\nor\nEnd after number of occurrences\n.\nSelect\nNext: Settings\n.\nNote\nWhen creating an access review, you're able to specify the start date, but the start time could vary a few hours based on system processing. For example, if you create an access review at 03:00 UTC on 09/09 that is set to run on 09/12, then the review is scheduled to run at 03:00 UTC on the start date, but could be delayed due to system processing.\nYou're able to specify the start date, but the start time can vary a few hours based on system processing.\nNext: Settings\nIn the\nUpon completion settings\nsection, you can specify what happens after the review finishes.\nAuto apply results to resource\n: Select this checkbox if you want access of denied users to be removed automatically after the review duration ends. If the option is disabled, you have to manually apply the results when the review finishes. To learn more about applying the results of the review, see\nManage access reviews\n.\nIf reviewers don't respond\n: Use this option to specify what happens for users not reviewed by any reviewer within the review period. This setting doesn't affect users who were reviewed by a reviewer. 
The dropdown list shows the following options:\nNo change\n: Leaves a user's access unchanged.\nRemove access\n: Removes a user's access.\nApprove access\n: Approves a user's access.\nTake recommendations\n: Takes the system's recommendation to deny or approve the user's continued access.\nWarning\nIf the settings\nIf reviewers don't respond\nis set to\nRemove access\nor\nTake recommendations\nand\nAuto apply results to resource\nis enabled, all access to this resource could potentially be revoked if the reviewers fail to respond.\nAction to apply on denied guest users\n: This option is only available if the access review is scoped to include only guest users to specify what happens to guest users if they're denied either by a reviewer or by the\nIf reviewers don't respond\nsetting.\nRemove user's membership from the resource\n: This option removes a denied guest user's access to the group or application being reviewed. They can still sign in to the tenant and won't lose any other access.\nBlock user from signing-in for 30 days, then remove user from the tenant\n: This option blocks a denied guest user from signing in to the tenant, no matter if they have access to other resources. If this action was taken in error, admins can reenable the guest user's access within 30 days after the guest user was disabled. If no action is taken on the disabled guest user after 30 days, they're deleted from the tenant.\nTo learn more about best practices for removing guest users who no longer have access to resources in your organization, see\nUse Microsoft Entra ID Governance to review and remove external users who no longer have resource access\n.\nNote\nAction to apply on denied guest users\nisn't configurable on reviews scoped to more than guest users. 
It's also not configurable for reviews of\nAll Microsoft 365 groups with guest users.\nWhen not configurable, the default option of removing a user's membership from the resource is used on denied users.\nUse the\nAt end of review, send notification to\noption to send notifications to other users or groups with completion updates. This feature allows for stakeholders other than the review creator to be updated on the progress of the review. To use this feature, choose\nSelect User(s) or Group(s)\nand add another user or group for which you want to receive the status of completion.\nIn the\nEnable review decision helpers\nsection choose whether you want your reviewer to receive recommendations during the review process:\nIf you select\nNo sign-in within 30 days\n, users who have signed in during the previous 30-day period are recommended for approval. Users who haven't signed in during the past 30 days are recommended for denial. This 30-day interval is irrespective of whether the sign-ins were interactive or not. The last sign-in date for the specified user will also display along with the recommendation.\nIf you select\nUser-to-Group Affiliation\n, reviewers get the recommendation to Approve or Deny access for the users based on user’s average distance in the organization’s reporting-structure. 
Users who are distant from all the other users within the group are considered to have \"low affiliation\" and will get a deny recommendation in the group access reviews.\nNote\nIf you create an access review based on applications, your recommendations are based on the 30-day interval period depending on when the user last signed in to the application rather than the tenant.\nIn the\nAdvanced settings\nsection, you can choose the following:\nJustification required\n: Select this checkbox to require the reviewer to supply a reason for approval or denial.\nEmail notifications\n: Select this checkbox to have Microsoft Entra ID send email notifications to reviewers when an access review starts and to administrators when a review finishes.\nReminders\n: Select this checkbox to have Microsoft Entra ID send reminders of access reviews in progress to all reviewers. Reviewers receive the reminders halfway through the review, no matter if they've finished their review or not.\nAdditional content for reviewer email\n: The content of the email sent to reviewers is autogenerated based on the review details, such as review name, resource name, and due date. If you need to communicate more information, you can specify details such as instructions or contact information in the box. The information that you enter is included in the invitation, and reminder emails are sent to assigned reviewers. The section highlighted in the following image shows where this information appears.\nAccess Review Agent (Preview)\n: Select this checkbox to allow reviewers to complete the access review in Microsoft Teams with natural language, insights, and recommendations.\nNote\nThis setting is only available for review configurations currently supported by the Access Review Agent and additional setup steps are required. For more information, see:\nAccess Review Agent with Microsoft Security Copilot\n.\nSelect\nNext: Review + Create\n.\nNext: Review + Create\nName the access review. 
Optionally, give the review a description. The name and description are shown to the reviewers.\nReview the information and select\nCreate\n.\nCreate a multi-stage access review\nA multi-stage review allows the administrator to define two or three sets of reviewers to complete a review one after another. In a single-stage review, all reviewers make a decision within the same period and the last reviewer to make a decision, has their decision applied. In a multi-stage review, two or three independent sets of reviewers each make a decision within their own stage. The stages are sequential, and the next stage doesn't happen until a decision is recorded in the previous stage. Multi-stage reviews can be used to reduce the burden on later-stage reviewers, allow for escalation of reviewers, or have independent groups of reviewers agree on decisions.\nNote\nData of users included in multi-stage access reviews are a part of the audit record at the start of the review. Administrators can delete the data at any time by deleting the multi-stage access review series. For general information about GDPR and protecting user data, see the\nGDPR section of the Microsoft Trust Center\nand the\nGDPR section of the Service Trust portal\n.\nAfter you have selected the resource and scope of your review, move on to the\nReviews\ntab.\nSelect the checkbox next to\nMulti-stage review\n.\nUnder\nFirst stage review\n, select the reviewers from the dropdown menu next to\nSelect reviewers\n.\nIf you select\nGroup owner(s)\nor\nManagers of Users\n, you have the option to add a fallback reviewer. To add a fallback, select\nSelect fallback reviewers\nand add the users you want to be fallback reviewers.\nAdd the duration for the first stage. To add the duration, enter a number in the field next to\nStage duration (in days)\n. 
This is the number of days you wish for the first stage to be open to the first stage reviewers to make decisions.\nUnder\nSecond stage review\n, select the reviewers from the dropdown menu next to\nSelect reviewers\n. These reviewers will be asked to review after the duration of the first stage review ends.\nAdd any fallback reviewers if necessary.\nAdd the duration for the second stage.\nBy default, you see two stages when you create a multi-stage review. However, you can add up to three stages. If you want to add a third stage, select\n+ Add a stage\nand complete the required fields.\nYou can decide to allow 2nd and 3rd stage reviewers to see decisions made in the previous stage(s). If you want to allow them to see the prior decisions, select the box next to\nShow previous stage(s) decisions to later stage reviewers\nunder\nReveal review results\n. Leave the box unchecked to disable this setting if you’d like your reviewers to review independently.\nThe duration of each recurrence is set to the sum of the duration day(s) you specified in each stage.\nSpecify the\nReview recurrence\n, the\nStart date\n, and\nEnd date\nfor the review. 
The recurrence type must be at least as long as the total duration of the recurrence (that is, the max duration for a weekly review recurrence is 7 days).\nTo specify which reviewees will continue from stage to stage, select one or multiple of the following options next to\nSpecify reviewees to go to next stage\n:\nApproved reviewees\n- Only reviewees that were approved move on to the next stage(s).\nDenied reviewees\n- Only reviewees that were denied move on to the next stage(s).\nNot reviewed reviewees\n- Only reviewees that haven't been reviewed will move on to the next stage(s).\nReviewees marked as \"Don't Know\"\n- Only reviewees marked as \"Don't know\" move on to the next stage(s).\nAll\n: everyone moves on to the next stage if you’d like all stages of reviewers to make a decision.\nContinue on to the\nsettings tab\nand finish the rest of the settings and create the review. Follow the instructions in\nNext: Settings\n.\nInclude B2B direct connect users and teams accessing Teams Shared Channels in access reviews\nYou can create access reviews for B2B direct connect users via shared channels in Microsoft Teams. As you collaborate externally, you can use Microsoft Entra access reviews to make sure external access to shared channels stays current. External users in the shared channels are called B2B direct connect users. To learn more about Teams Shared Channels and B2B direct connect users, read the\nB2B direct connect\narticle.\nWhen you create an access review on a Team with shared channels, your reviewers can review continued need for access of those external users and Teams in the shared channels. You can review access of B2B connect users and other supported B2B collaboration users and non-B2B internal users in the same review.\nNote\nCurrently, B2B direct connect users and teams are only included in single-stage reviews. 
If multi-stage reviews are enabled, the B2B direct connect users and teams won't be included in the access review.\nB2B direct connect users and teams are included in access reviews of the Teams-enabled Microsoft 365 group that the shared channels are a part of. To create the review, you must have at least the role of User Administrator or Identity Governance Administrator.\nUse the following instructions to create an access review on a team with shared channels:\nSign in to the\nMicrosoft Entra admin center\nas at least an\nIdentity Governance Administrator\n.\nBrowse to\nID Governance\n>\nAccess Reviews\n.\nSelect\n+ New access review\n.\nSelect\nTeams + Groups\nand then select\nSelect teams + groups\nto set the\nReview scope\n. B2B direct connect users and teams aren't included in reviews of\nAll Microsoft 365 groups with guest users\n.\nSelect a Team that has shared channels shared with 1 or more B2B direct connect users or Teams.\nSet the\nScope\n.\nChoose\nAll users\nto include:\nAll internal users\nB2B collaboration users that are members of the Team\nB2B direct connect users\nTeams that access shared channels\nOr, choose\nGuest users only\nto only include B2B direct connect users and Teams and B2B collaboration users.\nContinue on to the\nReviews\ntab. Select a reviewer to complete the review, then specify the\nDuration\nand\nReview recurrence\n.\nNote\nIf you set\nSelect reviewers\nto\nUsers review their own access\nor\nManagers of users\n, B2B direct connect users and Teams won't be able to review their own access in your tenant. The owner of the Team under review gets an email that asks the owner to review the B2B direct connect user and Teams.\nIf you select\nManagers of users\n, a selected fallback reviewer reviews any user without a manager in the home tenant. This includes B2B direct connect users and Teams without a manager.\nGo on to the\nSettings\ntab and configure additional settings. 
Then go to the\nReview and Create\ntab to start your access review. For more detailed information about creating a review and configuration settings, see our\nCreate a single-stage access review\n.\nAllow group owners to create and manage access reviews of their groups\nSign in to the\nMicrosoft Entra admin center\nas at least an\nIdentity Governance Administrator\n.\nBrowse to\nID Governance\n>\nAccess Reviews\n>\nSettings\n.\nOn the\nDelegate who can create and manage access reviews\npage, set\nGroup owners can create and manage access reviews for groups they own\nto\nYes\n.\nNote\nBy default, the setting is set to\nNo\n. To allow group owners to create and manage access reviews, change the setting to\nYes\n.\nCreate an access review programmatically\nYou can also create an access review using Microsoft Graph or PowerShell.\nTo create an access review using Graph, call the Graph API to\ncreate an access review schedule definition\n. The caller must either be a user in an appropriate role with an application that has the delegated\nAccessReview.ReadWrite.All\npermission, or an application with the\nAccessReview.ReadWrite.All\napplication permission. For more information, see the\nOverview of access reviews APIs\nand the tutorials for how to\nreview members of a security group\nor\nreview guests in Microsoft 365 groups\n.\nYou can also create an access review in PowerShell with the\nNew-MgIdentityGovernanceAccessReviewDefinition\ncmdlet from the\nMicrosoft Graph PowerShell cmdlets for Identity Governance\nmodule. For more information, see the\nexamples\n.\nWhen an access review starts\nAfter you've specified the settings for an access review, and created it, the access review appears in your list with an indicator of its status.\nBy default, Microsoft Entra ID sends an email to reviewers shortly after a one-time review, or a recurrence of a recurring review, starts. 
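The programmatic path above posts an access review schedule definition to Microsoft Graph. As an illustrative sketch only (not a sample captured from the page), the following Python builds such a request body for `POST /identityGovernance/accessReviews/definitions`; the property names follow the Graph `accessReviewScheduleDefinition` resource, while the group ID, display name, and recurrence values are placeholder assumptions:

```python
def build_access_review_definition(group_id: str, duration_days: int = 3) -> dict:
    """Build a request body for POST /identityGovernance/accessReviews/definitions.

    Field names follow the Microsoft Graph accessReviewScheduleDefinition
    resource; the concrete values (names, dates, the group ID) are
    illustrative placeholders, not real tenant objects.
    """
    return {
        "displayName": "Quarterly review of group members",
        "descriptionForAdmins": "Review direct and nested members of the group",
        "scope": {
            # The set of users under review: members of the group
            "query": f"/groups/{group_id}/transitiveMembers",
            "queryType": "MicrosoftGraph",
        },
        "reviewers": [
            {
                # Group owners act as the reviewers
                "query": f"/groups/{group_id}/owners",
                "queryType": "MicrosoftGraph",
            }
        ],
        "settings": {
            "mailNotificationsEnabled": True,
            "reminderNotificationsEnabled": True,
            "justificationRequiredOnApproval": True,
            "autoApplyDecisionsEnabled": True,
            # Applied to users no reviewer acted on
            # (the "If reviewers don't respond" setting)
            "defaultDecisionEnabled": True,
            "defaultDecision": "Recommendation",
            "instanceDurationInDays": duration_days,
            "recurrence": {
                "pattern": {"type": "absoluteMonthly", "interval": 3},
                "range": {"type": "noEnd", "startDate": "2026-04-01"},
            },
        },
    }

body = build_access_review_definition("00000000-0000-0000-0000-000000000000")
```

A caller holding the delegated or application `AccessReview.ReadWrite.All` permission, as described above, would then send this body to the definitions endpoint.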
If you choose not to have Microsoft Entra ID send the email, be sure to inform the reviewers that an access review is waiting for them to complete. You can show them the instructions for how to\nreview access to groups or applications\n. If your review is for guests to review their own access, show them the instructions for how to\nreview access for yourself to groups or applications\n.\nIf you've assigned guests as reviewers and they haven't accepted their invitation to the tenant, they won't receive an email from access reviews. They must first accept the invitation before they can begin reviewing.\nUpdate the access review\nAfter one or more access reviews have started, you might want to modify or update the settings of your existing access reviews. Here are some common scenarios to consider:\nUpdate settings or reviewers:\nIf an access review is recurring, there are separate settings under\nCurrent\nand under\nSeries\n. Updating the settings or reviewers under\nCurrent\nonly applies changes to the current access review. Updating the settings under\nSeries\nupdates the settings for all future recurrences.\nAdd and remove reviewers:\nWhen you update access reviews, you might choose to add a fallback reviewer in addition to the primary reviewer. Primary reviewers might be removed when you update an access review. Fallback reviewers aren't removable by design.\nNote\nFallback reviewers can only be added when the reviewer type is a manager or a group owner. Primary reviewers can be added when the reviewer type is the selected user.\nRemind the reviewers:\nWhen you update access reviews, you might choose to enable the\nReminders\noption under\nAdvanced settings\n. 
Users then receive an email notification at the midpoint of the review period, whether they've finished the review or not.\nNote\nOnce the access review is initiated, you can use the\ncontactedReviewers\nAPI call to see the list of all reviewers notified, or who would be if notifications are turned off, via email for an access review. Time stamps for when these users were notified are also provided.\nNote\nGroups and users in a restricted management administrative unit can't be managed with Microsoft Entra ID Governance features such as\nAccess reviews\n.\nNext steps\nComplete an access review of groups or applications\nAccess Review Agent (preview)\nCreate an access review of PIM for Groups (preview)\nReview access to groups or applications\nReview access for yourself to groups or applications\nCreate an access review of Azure resource and Microsoft Entra roles in PIM",
- "last_checked": "2026-03-11T12:59:29.613756+00:00",
+ "content_hash": "sha256:448638d802fe234740dba712e5f3d7e960e5200d37717c0fd37e21a4df2b169e",
+ "normalized_content": "Create an access review of groups and applications in Microsoft Entra ID\nAccess to groups and applications for employees and guests changes over time. 
To reduce the risk associated with stale access assignments, administrators can use Microsoft Entra ID to create access reviews for group members or application access.\nMicrosoft 365 and Security group owners can also use Microsoft Entra ID to create access reviews for group members as long as a user with at least the Identity Governance Administrator role enables the setting via the\nAccess Reviews Settings\npane. For more information about these scenarios, see\nManage access reviews\n.\nWatch a short video that talks about enabling access reviews.\nThis article describes how to create one or more access reviews for group members or application access.\nTo review access package assignments, see\nconfigure an access review in entitlement management\n.\nTo review Azure resource or Microsoft Entra roles, see\nCreate an access review of Azure resource and Microsoft Entra roles in Privileged Identity Management\n.\nFor reviews of PIM for Groups, see\ncreate an access review of PIM for Groups\n.\nFor reviews across multiple groups, applications and custom data providers, see\ncatalog access reviews (preview)\n.\nPrerequisites\nUsing this feature requires Microsoft Entra ID Governance or Microsoft Entra Suite licenses. To find the right license for your requirements, see\nMicrosoft Entra ID Governance licensing fundamentals\n.\nIf you're reviewing access to an application, then before you create the review, see the article on how to\nprepare for an access review of users' access to an application\nto ensure the application is integrated with Microsoft Entra ID in your tenant.\nNote\nAccess reviews capture a snapshot of access at the beginning of each review instance. Any changes made during the review process will be reflected in the subsequent review cycle. 
Essentially, with the commencement of each new recurrence, pertinent data regarding the users, resources under review, and their respective reviewers is retrieved.\nNote\nIn a group review, nested groups are automatically flattened, so users from nested groups appear as individual users. If a user is flagged for removal due to their membership in a nested group, they won't be automatically removed from the nested group, but only from direct group membership.\nCreate a single-stage access review\nScope\nSign in to the\nMicrosoft Entra admin center\nas at least an\nIdentity Governance Administrator\n.\nBrowse to\nID Governance\n>\nAccess Reviews\n.\nSelect\nNew access review\nto create a new access review.\nOn the Access reviews template screen, select\nReview access to a resource type\n.\nIn the\nSelect what to review\nbox, select which resource you want to review.\nIf you selected\nTeams + Groups\n, you have two options:\nAll Microsoft 365 groups with guest users\n: Select this option if you want to create recurring reviews on all your guest users across all your Microsoft Teams and Microsoft 365 groups in your organization. Dynamic groups and role-assignable groups aren't included. You can also choose to exclude individual groups by selecting\nSelect group(s) to exclude\n.\nSelect Teams + groups\n: Select this option if you want to specify a finite set of teams or groups to review. A list of groups to choose from appears on the right.\nIf you selected\nApplications\n, select one or more applications.\nNote\nSelecting multiple groups or applications results in the creation of multiple access reviews. For example, if you select five groups to review, the result is five separate access reviews.\nNow you can select a scope for the review. 
Your options are:\nGuest users only\n: This option limits the access review to only the Microsoft Entra B2B guest users in your directory.\nEveryone\n: This option scopes the access review to all user objects associated with the resource.\nNote\nIf you selected\nAll Microsoft 365 groups with guest users\n, your only option is to review\nGuest users only\n.\nOr, if you're conducting a group membership review, you can create access reviews for only the inactive users in the group. In the\nUsers scope\nsection, check the box next to\nInactive users (on tenant level)\n. If you check the box, the scope of the review focuses on inactive users only, those who haven't signed in either interactively or non-interactively to the tenant. Then, specify\nDays inactive\nwith the number of days inactive up to 730 days (two years). Users in the group inactive for the specified number of days are the only users in the review.\nNote\nRecently created users aren't affected when configuring the inactivity time. The Access Review checks if a user has been created in the time frame configured and disregards users who haven’t existed for at least that amount of time. For example, if you set the inactivity time as 90 days and a guest user was created or invited less than 90 days ago, the guest user won't be in scope of the Access Review. This ensures that a user can sign in at least once before being removed.\nSelect\nNext: Reviews\n.\nNext: Reviews\nYou can create a single-stage or multi-stage review. For a single stage review, continue here. To create a multi-stage access review, follow the steps in\nCreate a multi-stage access review\n.\nIn the\nSpecify reviewers\nsection, in the\nSelect reviewers\nbox, select either one or more people to make decisions in the access reviews. 
You can choose from:\nGroup owner(s)\n: This option is only available when you do a review on a team or group.\nSelected user(s) or group(s)\nUsers review their own access\nManagers of users\nIf you choose either\nManagers of users\nor\nGroup owner(s)\n, you can also specify a fallback reviewer. Fallback reviewers are asked to do a review when the user has no manager specified in the directory or if the group doesn't have an owner.\nNote\nIn a team or group access review, only the group owners (at the time a review starts) are considered as reviewers. During a review, if the list of group owners is updated, new group owners won't be considered reviewers and old group owners will still be considered reviewers. However, in the case of a recurring review, any changes on the group owners list will be considered in the next instance of that review.\nImportant\nFor access reviews of PIM for Groups (preview), when selecting the group owner as the reviewer, it's mandatory to assign at least one fallback reviewer. The review will only assign active owner(s) as the reviewer(s). Eligible owners aren't included. If there are no active owners when the review begins, the fallback reviewer(s) will be assigned to the review.\nIn the\nSpecify recurrence of review\nsection, specify the following selections:\nDuration (in days)\n: How long a review is open for input from reviewers.\nStart date\n: When the series of reviews begins.\nEnd date\n: When the series of reviews ends. You can specify that it\nNever\nends. Or, you can select\nEnd on a specific date\nor\nEnd after number of occurrences\n.\nSelect\nNext: Settings\n.\nNote\nWhen creating an access review, you're able to specify the start date, but the start time could vary a few hours based on system processing. 
For example, if you create an access review at 03:00 UTC on 09/09 that is set to run on 09/12, then the review is scheduled to run at 03:00 UTC on the start date, but could be delayed due to system processing.\nNext: Settings\nIn the\nUpon completion settings\nsection, you can specify what happens after the review finishes.\nAuto apply results to resource\n: Select this checkbox if you want access of denied users to be removed automatically after the review duration ends. If the option is disabled, you have to manually apply the results when the review finishes. To learn more about applying the results of the review, see\nManage access reviews\n.\nIf reviewers don't respond\n: Use this option to specify what happens for users not reviewed by any reviewer within the review period. This setting doesn't affect users who were reviewed by a reviewer. The dropdown list shows the following options:\nNo change\n: Leaves a user's access unchanged.\nRemove access\n: Removes a user's access.\nApprove access\n: Approves a user's access.\nTake recommendations\n: Takes the system's recommendation to deny or approve the user's continued access.\nWarning\nIf the setting\nIf reviewers don't respond\nis set to\nRemove access\nor\nTake recommendations\nand\nAuto apply results to resource\nis enabled, all access to this resource could potentially be revoked if the reviewers fail to respond.\nAction to apply on denied guest users\n: This option is only available if the access review is scoped to include only guest users to specify what happens to guest users if they're denied either by a reviewer or by the\nIf reviewers don't respond\nsetting.\nRemove user's membership from the resource\n: This option removes a denied guest user's access to the group or application being reviewed. 
They can still sign in to the tenant and won't lose any other access.\nBlock user from signing-in for 30 days, then remove user from the tenant\n: This option blocks a denied guest user from signing in to the tenant, no matter if they have access to other resources. If this action was taken in error, admins can reenable the guest user's access within 30 days after the guest user was disabled. If no action is taken on the disabled guest user after 30 days, they're deleted from the tenant.\nTo learn more about best practices for removing guest users who no longer have access to resources in your organization, see\nUse Microsoft Entra ID Governance to review and remove external users who no longer have resource access\n.\nNote\nAction to apply on denied guest users\nisn't configurable on reviews scoped to more than guest users. It's also not configurable for reviews of\nAll Microsoft 365 groups with guest users.\nWhen not configurable, the default option of removing a user's membership from the resource is used on denied users.\nUse the\nAt end of review, send notification to\noption to send notifications to other users or groups with completion updates. This feature allows for stakeholders other than the review creator to be updated on the progress of the review. To use this feature, choose\nSelect User(s) or Group(s)\nand add another user or group for which you want to receive the status of completion.\nIn the\nEnable review decision helpers\nsection choose whether you want your reviewer to receive recommendations during the review process:\nIf you select\nNo sign-in within 30 days\n, users who have signed in during the previous 30-day period are recommended for approval. Users who haven't signed in during the past 30 days are recommended for denial. This 30-day interval is irrespective of whether the sign-ins were interactive or not. 
The last sign-in date for the specified user will also display along with the recommendation.\nIf you select\nUser-to-Group Affiliation\n, reviewers get the recommendation to Approve or Deny access for the users based on user’s average distance in the organization’s reporting-structure. Users who are distant from all the other users within the group are considered to have \"low affiliation\" and will get a deny recommendation in the group access reviews.\nNote\nIf you create an access review based on applications, your recommendations are based on the 30-day interval period depending on when the user last signed in to the application rather than the tenant.\nIn the\nAdvanced settings\nsection, you can choose the following:\nJustification required\n: Select this checkbox to require the reviewer to supply a reason for approval or denial.\nEmail notifications\n: Select this checkbox to have Microsoft Entra ID send email notifications to reviewers when an access review starts and to administrators when a review finishes.\nReminders\n: Select this checkbox to have Microsoft Entra ID send reminders of access reviews in progress to all reviewers. Reviewers receive the reminders halfway through the review, no matter if they've finished their review or not.\nAdditional content for reviewer email\n: The content of the email sent to reviewers is autogenerated based on the review details, such as review name, resource name, and due date. If you need to communicate more information, you can specify details such as instructions or contact information in the box. The information that you enter is included in the invitation, and reminder emails are sent to assigned reviewers. 
The section highlighted in the following image shows where this information appears.\nAccess Review Agent (Preview)\n: Select this checkbox to allow reviewers to complete the access review in Microsoft Teams with natural language, insights, and recommendations.\nNote\nThis setting is only available for review configurations currently supported by the Access Review Agent and additional setup steps are required. For more information, see:\nAccess Review Agent with Microsoft Security Copilot\n.\nSelect\nNext: Review + Create\n.\nNext: Review + Create\nName the access review. Optionally, give the review a description. The name and description are shown to the reviewers.\nReview the information and select\nCreate\n.\nCreate a multi-stage access review\nA multi-stage review allows the administrator to define two or three sets of reviewers to complete a review one after another. In a single-stage review, all reviewers make a decision within the same period and the last reviewer to make a decision, has their decision applied. In a multi-stage review, two or three independent sets of reviewers each make a decision within their own stage. The stages are sequential, and the next stage doesn't happen until a decision is recorded in the previous stage. Multi-stage reviews can be used to reduce the burden on later-stage reviewers, allow for escalation of reviewers, or have independent groups of reviewers agree on decisions.\nNote\nData of users included in multi-stage access reviews are a part of the audit record at the start of the review. Administrators can delete the data at any time by deleting the multi-stage access review series. 
For general information about GDPR and protecting user data, see the\nGDPR section of the Microsoft Trust Center\nand the\nGDPR section of the Service Trust portal\n.\nAfter you have selected the resource and scope of your review, move on to the\nReviews\ntab.\nSelect the checkbox next to\nMulti-stage review\n.\nUnder\nFirst stage review\n, select the reviewers from the dropdown menu next to\nSelect reviewers\n.\nIf you select\nGroup owner(s)\nor\nManagers of Users\n, you have the option to add a fallback reviewer. To add a fallback, select\nSelect fallback reviewers\nand add the users you want to be fallback reviewers.\nAdd the duration for the first stage. To add the duration, enter a number in the field next to\nStage duration (in days)\n. This is the number of days you wish for the first stage to be open to the first stage reviewers to make decisions.\nUnder\nSecond stage review\n, select the reviewers from the dropdown menu next to\nSelect reviewers\n. These reviewers will be asked to review after the duration of the first stage review ends.\nAdd any fallback reviewers if necessary.\nAdd the duration for the second stage.\nBy default, you see two stages when you create a multi-stage review. However, you can add up to three stages. If you want to add a third stage, select\n+ Add a stage\nand complete the required fields.\nYou can decide to allow 2nd and 3rd stage reviewers to see decisions made in the previous stage(s). If you want to allow them to see the prior decisions, select the box next to\nShow previous stage(s) decisions to later stage reviewers\nunder\nReveal review results\n. Leave the box unchecked to disable this setting if you’d like your reviewers to review independently.\nThe duration of each recurrence is set to the sum of the duration day(s) you specified in each stage.\nSpecify the\nReview recurrence\n, the\nStart date\n, and\nEnd date\nfor the review. 
The recurrence type must be at least as long as the total duration of the recurrence (that is, the max duration for a weekly review recurrence is 7 days).\nTo specify which reviewees will continue from stage to stage, select one or multiple of the following options next to\nSpecify reviewees to go to next stage\n:\nApproved reviewees\n- Only reviewees that were approved move on to the next stage(s).\nDenied reviewees\n- Only reviewees that were denied move on to the next stage(s).\nNot reviewed reviewees\n- Only reviewees that haven't been reviewed will move on to the next stage(s).\nReviewees marked as \"Don't Know\"\n- Only reviewees marked as \"Don't know\" move on to the next stage(s).\nAll\n: everyone moves on to the next stage if you’d like all stages of reviewers to make a decision.\nContinue on to the\nsettings tab\nand finish the rest of the settings and create the review. Follow the instructions in\nNext: Settings\n.\nInclude B2B direct connect users and teams accessing Teams Shared Channels in access reviews\nYou can create access reviews for B2B direct connect users via shared channels in Microsoft Teams. As you collaborate externally, you can use Microsoft Entra access reviews to make sure external access to shared channels stays current. External users in the shared channels are called B2B direct connect users. To learn more about Teams Shared Channels and B2B direct connect users, read the\nB2B direct connect\narticle.\nWhen you create an access review on a Team with shared channels, your reviewers can review continued need for access of those external users and Teams in the shared channels. You can review access of B2B connect users and other supported B2B collaboration users and non-B2B internal users in the same review.\nNote\nCurrently, B2B direct connect users and teams are only included in single-stage reviews. 
If multi-stage reviews are enabled, the B2B direct connect users and teams won't be included in the access review.\nB2B direct connect users and teams are included in access reviews of the Teams-enabled Microsoft 365 group that the shared channels are a part of. To create the review, you must have at least the role of User Administrator or Identity Governance Administrator.\nUse the following instructions to create an access review on a team with shared channels:\nSign in to the\nMicrosoft Entra admin center\nas at least an\nIdentity Governance Administrator\n.\nBrowse to\nID Governance\n>\nAccess Reviews\n.\nSelect\n+ New access review\n.\nSelect\nTeams + Groups\nand then select\nSelect teams + groups\nto set the\nReview scope\n. B2B direct connect users and teams aren't included in reviews of\nAll Microsoft 365 groups with guest users\n.\nSelect a Team that has shared channels shared with 1 or more B2B direct connect users or Teams.\nSet the\nScope\n.\nChoose\nAll users\nto include:\nAll internal users\nB2B collaboration users that are members of the Team\nB2B direct connect users\nTeams that access shared channels\nOr, choose\nGuest users only\nto only include B2B direct connect users and Teams and B2B collaboration users.\nContinue on to the\nReviews\ntab. Select a reviewer to complete the review, then specify the\nDuration\nand\nReview recurrence\n.\nNote\nIf you set\nSelect reviewers\nto\nUsers review their own access\nor\nManagers of users\n, B2B direct connect users and Teams won't be able to review their own access in your tenant. The owner of the Team under review gets an email that asks the owner to review the B2B direct connect user and Teams.\nIf you select\nManagers of users\n, a selected fallback reviewer reviews any user without a manager in the home tenant. This includes B2B direct connect users and Teams without a manager.\nGo on to the\nSettings\ntab and configure additional settings. 
Then go to the\nReview and Create\ntab to start your access review. For more detailed information about creating a review and configuration settings, see\nCreate a single-stage access review\n.\nAllow group owners to create and manage access reviews of their groups\nSign in to the\nMicrosoft Entra admin center\nas at least an\nIdentity Governance Administrator\n.\nBrowse to\nID Governance\n>\nAccess Reviews\n>\nSettings\n.\nOn the\nDelegate who can create and manage access reviews\npage, set\nGroup owners can create and manage access reviews for groups they own\nto\nYes\n.\nNote\nBy default, the setting is set to\nNo\n. To allow group owners to create and manage access reviews, change the setting to\nYes\n.\nCreate an access review programmatically\nYou can also create an access review using Microsoft Graph or PowerShell.\nTo create an access review using Graph, call the Graph API to\ncreate an access review schedule definition\n. The caller must either be a user in an appropriate role with an application that has the delegated\nAccessReview.ReadWrite.All\npermission, or an application with the\nAccessReview.ReadWrite.All\napplication permission. For more information, see the\nOverview of access reviews APIs\nand the tutorials for how to\nreview members of a security group\nor\nreview guests in Microsoft 365 groups\n.\nYou can also create an access review in PowerShell with the\nNew-MgIdentityGovernanceAccessReviewDefinition\ncmdlet from the\nMicrosoft Graph PowerShell cmdlets for Identity Governance\nmodule. For more information, see the\nexamples\n.\nWhen an access review starts\nAfter you've specified the settings for an access review, and created it, the access review appears in your list with an indicator of its status.\nBy default, Microsoft Entra ID sends an email to reviewers shortly after a one-time review, or a recurrence of a recurring review, starts. 
If you choose not to have Microsoft Entra ID send the email, be sure to inform the reviewers that an access review is waiting for them to complete. You can show them the instructions for how to\nreview access to groups or applications\n. If your review is for guests to review their own access, show them the instructions for how to\nreview access for yourself to groups or applications\n.\nIf you've assigned guests as reviewers and they haven't accepted their invitation to the tenant, they won't receive an email from access reviews. They must first accept the invitation before they can begin reviewing.\nUpdate the access review\nAfter one or more access reviews have started, you might want to modify or update the settings of your existing access reviews. Here are some common scenarios to consider:\nUpdate settings or reviewers:\nIf an access review is recurring, there are separate settings under\nCurrent\nand under\nSeries\n. Updating the settings or reviewers under\nCurrent\nonly applies changes to the current access review. Updating the settings under\nSeries\nupdates the settings for all future recurrences.\nAdd and remove reviewers:\nWhen you update access reviews, you might choose to add a fallback reviewer in addition to the primary reviewer. Primary reviewers might be removed when you update an access review. Fallback reviewers aren't removable by design.\nNote\nFallback reviewers can only be added when the reviewer type is a manager or a group owner. Primary reviewers can be added when the reviewer type is the selected user.\nRemind the reviewers:\nWhen you update access reviews, you might choose to enable the\nReminders\noption under\nAdvanced settings\n. 
Users then receive an email notification at the midpoint of the review period, whether they've finished the review or not.\nNote\nOnce the access review is initiated, you can use the\ncontactedReviewers\nAPI call to see the list of all reviewers notified, or who would be if notifications are turned off, via email for an access review. Time stamps for when these users were notified are also provided.\nNote\nGroups and users in a restricted management administrative unit can't be managed with Microsoft Entra ID Governance features such as\nAccess reviews\n.\nNext steps\nComplete an access review of groups or applications\nAccess Review Agent (preview)\nCreate an access review of PIM for Groups (preview)\nReview access to groups or applications\nReview access for yourself to groups or applications\nCreate an access review of Azure resource and Microsoft Entra roles in PIM\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, - "last_changed": "2026-03-11T12:59:29.613756+00:00", + "last_changed": "2026-03-14T06:51:10.083468+00:00", "topic": "Create Access Review", "section": "Microsoft Entra ID" }, "https://learn.microsoft.com/en-us/entra/id-governance/privileged-identity-management/pim-configure": { "content_hash": "sha256:c19f0040590efd084bc110f8937fa5bd9850b9255a23cbc6243ab389edf4388b", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nWhat is Microsoft Entra Privileged Identity Management?\nFeedback\nSummarize this article for me\nPrivileged Identity Management (PIM) is a service in Microsoft Entra ID that enables you to manage, control, and monitor access to important resources in your organization. These resources include resources in Microsoft Entra ID, Azure, and other Microsoft Online Services such as Microsoft 365 or Microsoft Intune. The following video explains important PIM concepts and features.\nReasons to use\nOrganizations want to minimize the number of people who have access to secure information or resources, because that reduces the chance of\na malicious actor getting access\nan authorized user inadvertently impacting a sensitive resource\nHowever, users still need to carry out privileged operations in Microsoft Entra ID, Azure, Microsoft 365, or SaaS apps. Organizations can give users just-in-time privileged access to Azure and Microsoft Entra resources and can oversee what those users are doing with their privileged access.\nLicense requirements\nUsing Privileged Identity Management requires licenses. For more information on licensing, see\nMicrosoft Entra ID Governance licensing fundamentals\n.\nWhat does it do?\nPrivileged Identity Management provides time-based and approval-based role activation to mitigate the risks of excessive, unnecessary, or misused access permissions on resources that you care about. 
Here are some of the key features of Privileged Identity Management:\nProvide\njust-in-time\nprivileged access to Microsoft Entra ID and Azure resources\nAssign\ntime-bound\naccess to resources using start and end dates\nRequire\napproval\nto activate privileged roles\nEnforce\nmultifactor authentication\nto activate any role\nUse\njustification\nto understand why users activate\nGet\nnotifications\nwhen privileged roles are activated\nConduct\naccess reviews\nto ensure users still need roles\nDownload\naudit history\nfor internal or external audit\nPrevents removal of the\nlast active Global Administrator\nand\nPrivileged Role Administrator\nrole assignments\nWhat can I do with it?\nOnce you set up Privileged Identity Management, you'll see\nTasks\n,\nManage\n, and\nActivity\noptions in the left navigation menu. As an administrator, you can choose between options such as managing\nMicrosoft Entra roles\n, managing\nAzure resource\nroles, or PIM for Groups. When you choose what you want to manage, you see the appropriate set of options for that option.\nWho can do what?\nFor Microsoft Entra roles in Privileged Identity Management, only a user who is in the Privileged Role Administrator or Global Administrator role can manage assignments for other administrators. Global Administrators, Security Administrators, Global Readers, and Security Readers can also view assignments to Microsoft Entra roles in Privileged Identity Management.\nFor Azure resource roles in Privileged Identity Management, only a subscription administrator, a resource Owner, or a resource User Access Administrator can manage assignments for other administrators. 
Users who are Privileged Role Administrators, Security Administrators, or Security Readers don't by default have access to view assignments to Azure resource roles in Privileged Identity Management.\nTerminology\nTo better understand Privileged Identity Management and its documentation, you should review the following terms.\nTerm or concept\nRole assignment category\nDescription\neligible\nType\nA role assignment that requires a user to perform one or more actions to use the role. If a user is eligible for a role, they can activate the role when they need to perform privileged tasks. There's no difference in the access given to someone with a permanent versus an eligible role assignment. The only difference is that some people don't need that access all the time.\nactive\nType\nA role assignment that doesn't require a user to perform any action to use the role. Users assigned as active have the privileges assigned to the role.\nactivate\nThe process of performing one or more actions to use a role that a user is eligible for. Actions might include performing a multifactor authentication (MFA) check, providing a business justification, or requesting approval from designated approvers.\nassigned\nState\nA user that has an active role assignment.\nactivated\nState\nA user that has an eligible role assignment, performed the actions to activate the role, and is now active. 
Once activated, the user can use the role for a preconfigured period of time before they need to activate again.\npermanent eligible\nDuration\nA role assignment where a user is always eligible to activate the role.\npermanent active\nDuration\nA role assignment where a user can always use the role without performing any actions.\ntime-bound eligible\nDuration\nA role assignment where a user is eligible to activate the role only within start and end dates.\ntime-bound active\nDuration\nA role assignment where a user can use the role only within start and end dates.\njust-in-time (JIT) access\nA model in which users receive temporary permissions to perform privileged tasks, which prevents malicious or unauthorized users from gaining access after the permissions expire. Access is granted only when users need it.\nprinciple of least privilege access\nA recommended security practice in which every user is provided with only the minimum privileges needed to accomplish the tasks they're authorized to perform. This practice minimizes the number of Global Administrators and instead uses specific administrator roles for certain scenarios.\nRole assignment overview\nThe PIM role assignments give you a secure way to grant access to resources in your organization. This section describes the assignment process. It covers assigning roles to members, activating assignments, approving or denying requests, and extending and renewing assignments.\nPIM keeps you informed by sending you and other participants\nemail notifications\n. These emails might also include links to relevant tasks, such as activating, approving, or denying a request.\nThe following screenshot shows an email message sent by PIM. The email informs Patti that Alex updated a role assignment for Emily.\nAssign\nThe assignment process starts by assigning roles to members. To grant access to a resource, the administrator assigns roles to users, groups, service principals, or managed identities. 
The assignment includes the following data:\nThe members or owners to assign the role.\nThe scope of the assignment. The scope limits the assigned role to a particular set of resources.\nThe type of the assignment\nEligible\nassignments require the member of the role to perform an action to use the role. Actions might include activation or requesting approval from designated approvers.\nActive\nassignments don't require the member to perform any action to use the role. Members assigned as active have the privileges assigned to the role.\nThe duration of the assignment, using start and end dates or permanent. For eligible assignments, the members can activate or request approval within the start and end dates. For active assignments, the members can use the assigned role during this period of time.\nThe following screenshot shows how an administrator assigns a role to members.\nFor more information, check out the following articles:\nAssign Microsoft Entra roles\n,\nAssign Azure resource roles\n, and\nAssign eligibility for a PIM for Groups\nActivate\nIf users are eligible for a role, then they must activate the role assignment before using the role. To activate the role, users select a specific activation duration within the maximum (configured by administrators) and provide a reason for the activation request.\nThe following screenshot shows how members activate their role for a limited time.\nIf the role requires\napproval\nto activate, a notification appears in the upper right corner of the user's browser informing them the request is pending approval. If an approval isn't required, the member can start using the role.\nFor more information, check out the following articles:\nActivate Microsoft Entra roles\n,\nActivate my Azure resource roles\n, and\nActivate my PIM for Groups roles\nApprove or deny\nDelegated approvers receive email notifications when a role request is pending their approval. Approvers can view, approve, or deny these pending requests in PIM. 
After the request is approved, the member can start using the role. For example, if a user or a group was assigned the Contributor role on a resource group, they're able to manage that particular resource group.\nFor more information, check out the following articles:\nApprove or deny requests for Microsoft Entra roles\n,\nApprove or deny requests for Azure resource roles\n, and\nApprove activation requests for PIM for Groups\nExtend and renew assignments\nAfter administrators set up time-bound owner or member assignments, the first question you might ask is what happens if an assignment expires? In this new version, we provide two options for this scenario:\nExtend\n– When a role assignment nears expiration, the user can use Privileged Identity Management to request an extension for the role assignment\nRenew\n– When a role assignment expires, the user can use Privileged Identity Management to request a renewal for the role assignment\nBoth user-initiated actions require an approval from a Global Administrator or Privileged Role Administrator. Admins don't need to be in the business of managing assignment expirations. 
You can just wait for the extension or renewal requests to arrive for simple approval or denial.\nFor more information, check out the following articles:\nExtend or renew Microsoft Entra role assignments\n,\nExtend or renew Azure resource role assignments\n, and\nExtend or renew PIM for Groups assignments\nScenarios\nPrivileged Identity Management supports the following scenarios:\nPrivileged Role Administrator permissions\nEnable approval for specific roles\nSpecify approver users or groups to approve requests\nView request and approval history for all privileged roles\nApprover permissions\nView pending approvals (requests)\nApprove or reject requests for role elevation (single and bulk)\nProvide justification for my approval or rejection\nEligible role user permissions\nRequest activation of a role that requires approval\nView the status of your request to activate\nComplete your task in Microsoft Entra ID if activation was approved\nMicrosoft Graph APIs\nYou can use Privileged Identity Management programmatically through the following Microsoft Graph APIs:\nPIM for Microsoft Entra roles APIs\nPIM for groups APIs\nNext steps\nLicense requirements to use Privileged Identity Management\nSecuring privileged access for hybrid and cloud deployments in Microsoft Entra ID\nDeploy Privileged Identity Management\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Privileged Identity Management", @@ -1124,7 +1124,7 @@ "https://learn.microsoft.com/en-us/entra/identity/users/": { "content_hash": "sha256:4e79ca0128d4634567c01f8f89ae01ca8d1a2e1e7198c3abee14070a82513e02", "normalized_content": "Table of contents\nRead in 
English\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nEnterprise user management documentation\nMicrosoft Entra ID provides user management services so that you can assign licenses, manage your groups and users, and add or manage domain names.\nAbout enterprise user management\nOverview\nUsers, groups, domains, and licenses\nCustom roles overview\nConcept\nMicrosoft Entra organizational independence\nManage access using groups\nGet started\nQuickstart\nAdd a user\nSet expiration policy for groups\nAssign licenses to users\nManage Microsoft Entra domain names\nHow-To Guide\nAdd your custom domain name\nManaging custom domain names\nManage groups\nHow-To Guide\nCreate a dynamic group\nGroup settings in PowerShell\nSet naming policy for groups\nAdd domains\nHow-To Guide\nAdd a custom domain name\nManaging custom domain names", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-01-25T23:12:28.455268+00:00", "topic": "User Management", @@ -1133,7 +1133,7 @@ "https://learn.microsoft.com/en-us/entra/agent-id/": { "content_hash": "sha256:358f7020477818f3798f4e5931e968d809426870b73c3fd7fb75c6bdd1aec444", "normalized_content": "Microsoft Entra Agent ID documentation\nSecure and govern AI agents at enterprise scale with Microsoft Entra Agent ID and the Microsoft agent identity platform. Create agent identities, apply Zero Trust controls, and manage agent access to your organization's resources.\nOverview\nWhat is Microsoft Entra Agent ID?\nOverview\nMicrosoft agent identity platform for developers\nOverview\nWhat is an agent identity?\nHow-To Guide\nCreate an agent identity\nMicrosoft Entra Agent ID\nThe comprehensive solution for protecting and governing AI agents in enterprise environments. 
Includes advanced security controls, and governance policies for agent identities and access management.\nManage agents\nWhat is Microsoft Entra Agent ID?\nOwners, sponsors, and managers\nManage agents in the end user experience\nAgent governance and lifecycles\nIdentity Governance for agents\nAgent identity lifecycle management\nProtect agent access to resources\nConditional Access for agents\nIdentity Protection for agents\nNetwork controls for agents\nMicrosoft agent identity platform for developers\nThe platform that enables you to create and manage agent identities.\nLearn about key concepts\nWhat is an agent identity blueprint?\nWhat is an agent identity?\nWhat are agent users?\nBuild with the Microsoft agent identity platform\nCreate an agent identity blueprint\nCreate an agent identity\nConfigure Microsoft Entra SDK for agent identities\nAuthenticate and call APIs\nRequest tokens for autonomous agents\nCall Microsoft Graph from an agent\nGrant app permissions to an autonomous agent\nLearn more\nExplore SDK references, API documentation, and code samples to build and manage agent identities.\nTechnical references\nSDK Reference\nMicrosoft Graph API reference\nOAuth protocols for Agent identities\nRelated content\nSecurity for AI\nSecurity Copilot + Microsoft Entra", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Agent ID Overview", @@ -1142,7 +1142,7 @@ "https://learn.microsoft.com/en-us/entra/agent-id/identity-professional/microsoft-entra-agent-identities-for-ai-agents": { "content_hash": "sha256:b9dcf456a2bafcf15789f5cd823010333d3259e9daddec3b3191acd521d82fae", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires 
authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nWhat is Microsoft Entra Agent ID?\nFeedback\nSummarize this article for me\nAs assistive and autonomous agents become more prevalent in organizations, new security, governance, and compliance challenges must be addressed. Microsoft Entra Agent ID extends the comprehensive security capabilities of Microsoft Entra to agents, enabling organizations to build, discover, govern, and protect agent identities.\nSecurity for AI\nspans multiple Microsoft Entra features and is integrated through Microsoft Entra Agent ID and the\nMicrosoft agent identity platform for developers\n.\nThis article explains how Microsoft Entra Agent ID extends security capabilities to agents through conditional access policies, identity protection, identity governance, network-level controls, and the agent identity platform.\nImportant\nMicrosoft Entra Agent ID\nis currently in PREVIEW.\nThis information relates to a prerelease product that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here.\nConditional Access for agents\nConditional Access enables organizations to define and enforce adaptive policies that evaluate agent context and risk before granting access to resources. 
It's achieved by:\nEnforcing adaptive access control policies for all agent patterns across assistive, autonomous, and agent user types.\nUsing real-time signals, such as agent identity risk, to control agent access to resources, with Microsoft Managed Policies providing a secure baseline by blocking high-risk agents.\nDeploying conditional access policies at scale using custom security attributes, while still supporting fine-grained controls for individual agents.\nFor more information, see\nConditional Access\n.\nID Governance for agents\nMicrosoft Entra Agent ID brings agent identities into similar identity governance processes as users, enabling them to be managed at scale. You can establish controls for the agent access lifecycle using features such as entitlement management access packages.\nGovern agent identities at scale, from deployment to expiration.\nEnsure sponsors and owners are assigned and maintained for each agent ID, preventing orphaned agent IDs.\nEnforce that agent access to resources is intentional, auditable, and time-bound through access packages.\nFor more information, see\nidentity governance for agents\n.\nID Protection for agents\nMicrosoft Entra ID Protection detects and blocks threats by flagging anomalous activities involving agents. Risk signals are used to enforce risk-based access policies and inform agent discoverability.\nDetect agent identity risk derived from user risk and based on agents' own actions, including unusual or unauthorized activities.\nProvide risk signals to conditional access to enforce risk-based policies and session management controls.\nProvide risk signals to the Agent Registry to inform agent discoverability and access, with automatic remediation of compromised agents using preconfigured policies.\nFor more information, see\nidentity protection for agents\n.\nNetwork controls for agents\nNetwork controls enforce consistent network security policies across users and agents on any platform or application. 
Provide full network visibility to all agent actions, filter malicious web content, enable network-based security controls, and prevent data exfiltration.\nLog agent network activity to remote tools for audit and threat detection, and apply web categorization to control access to APIs and MCP servers.\nRestrict file uploads and downloads using file-type policies to minimize risk, and automatically block and alert on malicious destinations using threat intelligence-based filtering.\nDetect and block prompt injection attacks that attempt to manipulate agent behavior through malicious instructions.\nFor more information, see\nNetwork controls for agents\n.\nMicrosoft Entra Agent identity platform for developers\nThe Microsoft Entra Agent identity platform enables you to assign identities to agents, autodiscover them across your organization, and manage all agent metadata in one place including capabilities, tasks, and protocols.\nProvides visibility into all organization agents with agent-to-agent discovery and authorization based on standard protocols such as MCP and A2A.\nAssign secure, scalable identities to every agent.​ Authenticates and authorizes agents based on standard protocols.\nLog and monitor agent activity for compliance.​\nFor more information, see\nMicrosoft Entra Agent Identity Platform\nHow to get started\nMicrosoft Entra Agent ID is part of\nMicrosoft Agent 365\n. Both are available through the\nFrontier program\nin Microsoft 365. To access these features you must have a license for Microsoft 365 Copilot and have enabled Frontier for your users.\nFollow the\nFrontier getting started guide\nor use the following steps to check if Frontier is enabled:\nSign in to the\nMicrosoft 365 admin center\nas a\nBilling Administrator\n.\nBrowse to\nCopilot\n>\nSettings\n>\nUser access\n>\nCopilot Frontier\nand make sure it's enabled for users. 
If you don't see these options, contact your administrator to check your Microsoft 365 Copilot licensing.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Agent Identities for AI Agents", @@ -1151,7 +1151,7 @@ "https://learn.microsoft.com/en-us/entra/id-governance/agent-id-governance-overview": { "content_hash": "sha256:7b60ed4019571c7056b0b5bbcb194ac4695fbc2feb7f4e7d742d1ff5af7b3d3b", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nGoverning Agent Identities (Preview)\nFeedback\nSummarize this article for me\nMicrosoft Entra allows you to ensure that the right people have the right access to the right apps and services at the right time. With the addition of the Microsoft agent identity platform, managing the access rights of agents in the same way is just as important in the governance lifecycle of your organization's identities. The Microsoft agent identity platform introduces the concept of Agent Identities (IDs). Agent identities are accounts within Microsoft Entra ID that provide unique identification and authentication capabilities for AI agents.\nThis allows agent identities to be governed with Microsoft Entra features in the same style as you would govern human identities. 
With Agent identities, you can govern and manage the identity and access lifecycle of agents, ensuring that agents have a responsible person providing oversight throughout the agent lifecycle and that an agent's access doesn't persist longer than it's needed. This article provides an overview of how Microsoft Entra can be used to govern agent identities.\nLicense requirements\nMicrosoft Entra Agent ID is part of\nMicrosoft Agent 365\n. Both are available through the\nFrontier program\nin Microsoft 365. To access these features you must have a license for Microsoft 365 Copilot and have enabled Frontier for your users.\nFollow the\nFrontier getting started guide\nor use the following steps to check if Frontier is enabled:\nSign in to the\nMicrosoft 365 admin center\nas a\nBilling Administrator\n.\nBrowse to\nCopilot\n>\nSettings\n>\nUser access\n>\nCopilot Frontier\nand make sure it's enabled for users. If you don't see these options, contact your administrator to check your Microsoft 365 Copilot licensing.\nAgent identities basics\nHistorically, AI agents would rely upon tools to interact with various applications and systems, and each of those tools would have their own identities in those applications and systems. Some of those tools would use service principals to authenticate to Microsoft services via Microsoft Graph or Microsoft Azure APIs.\nMicrosoft Entra Agent ID\nintroduces support for identities for the agents themselves, with four new types of object: agent identity blueprint, agent identity blueprint principal, agent identity, and agent user. Through the\nagent identity blueprint\n, the agent can create one or more agent identities, and optionally an agent user for each agent identity. 
Each agent identity and agent user can have distinct access rights.\nFor a multitenant-capable agent, an agent identity blueprint principal can be brought into the tenant with resources so it can create agent identities in that tenant, similar to how a multitenant application can have a service principal in each tenant.\nThe agent identity and the agent user allow AI agents to take on digital identities within Microsoft Entra. Once agent identities are created, these agent identities are able to be governed using lifecycle and access features. Sponsors can be assigned to agent identities after creation. Sponsors of agent identities are human users accountable for making decisions about its lifecycle and access. For more information about the role of a sponsor of agent identities, see:\nAdministrative relationships for agent IDs\n.\nAgent identities in other Microsoft products and portals\nMicrosoft Foundry\nautomatically provisions and manages agent identities throughout the agent lifecycle. When the first agent in a Foundry project is created, Microsoft Foundry provisions a default agent identity blueprint and a default agent identity for the project, and agents in the project authenticate by using the shared project's agent identity. Publishing an agent automatically creates a dedicated agent identity blueprint and agent identity, and the agent will authenticate by using the unique agent identity. Foundry supports use of the agent identity for authentication in Model Context Protocol (MCP) and Agent-to-Agent (A2A) tools. For more information, see\nAgent identity concepts in Microsoft Foundry\n.\nYou can configure an\nAzure App Service or Azure Functions app\nto use the Microsoft Entra agent identity platform to securely connect to resources as an agent. For more information, see\nHow to use an agent identity in App Service and Azure Functions\n.\nAgents created in\nMicrosoft Copilot Studio\ncan be configured to automatically be assigned to an agent identity. 
When an agent identity is first created in a Power Platform environment after enabling this setting, a Microsoft Copilot Studio agent identity blueprint, and an agent identity blueprint principal, are automatically created. For more information, see\nAutomatically create Entra agent identities for Copilot Studio agents (preview)\n.\nFor agents in the\nMicrosoft Teams\nplatform, a developer can create and manage agent identity blueprints in the Developer Portal for Teams. For more information, see\nManage your apps in Developer Portal\n.\nMicrosoft Agent 365\ngives each AI agent its own Microsoft Entra Agent ID, for identity, lifecycle, and access management. For more information, see\nAgent identity platform capabilities for Agent 365\n.\nAssigning access to agent identities\nWhen created, agent identities have limited permissions, such as OAuth 2 delegated permission scopes\ninherited from their parent agent identity blueprint\n. In addition, agent identities can have resource access assigned to them directly via access packages. Agents can request an access package for own agent IDs, or have their owner or sponsor request one on their behalf. With access packages, you're able to assign agent identities access to the following resources:\nSecurity Group memberships\nApplication OAuth API permissions\n, including Graph application permissions\nMicrosoft Entra roles\nTo use access packages for agent identities, configure an access package with the required policy settings. 
When creating an access package assignment policy, in the\nWho can get access\nsection, select\nFor users, service principals, and agent identities in your directory\n, and then select the option of\nAll agents (preview)\n.\nNote\nIf your agents aren't using Microsoft Entra agent IDs, then also create an access package assignment policy with the option\nAll Service principals (preview)\nto allow service principals in your directory to be able to request this access package.\nAgents can then be assigned access packages through three different request pathways.\nThe agent identity itself can programmatically request an access package when needed for its operations, by creating an\naccessPackageAssignmentRequest\n.\nThe agent's sponsor can request access on behalf of the agent ID, providing human oversight in the access request process. For more information, see\nRequest an access package on behalf of an agent identity (Preview)\n.\nAn administrator can\ndirectly assign the agent identity or agent user to the access package\n.\nAfter submission, the access request is routed to designated approvers based on the access package configuration.\nWhen the agent identity has received an access package assignment with an expiry date, and if a sponsor is set on the agent identity, as the expiry date approaches, the sponsor receives notifications about the pending expiration. The sponsor then has two options: they can request an extension of the access package (if permitted by policy), or they can allow the access package assignment to expire. If the sponsor requests an extension, this request can trigger a new approval cycle, where approvers again confirm whether continued access is appropriate. 
If the sponsor takes no action, the access package assignment automatically expires on its end date, and the agent identity loses access to the target resources.\nFor a guide on creating an access package for agents, see:\naccess packages for agent identities in Microsoft Entra Agent ID\n. For a guide on assigning identities to an existing access package, see:\nView, add, and remove assignments for an access package in entitlement management\n.\nManagement of agents\nWhen agent identities are created, owners and sponsors of the agent can manually make decisions for the agent identity via both the My Account portal, and the My Access Portal.\nFrom the\nMy Account portal\n, Sponsors and Owners are able to manage the identity lifecycle of agents such as enabling and disabling the agent. You are also able to see information about its access, activity, and lifecycle. For more information about Managing agents, see:\nManage Agents in Microsoft Entra ID (Preview)\n.\nFrom the\nMy Access portal\n, Sponsors and Owners of agent identities are able to request access packages on behalf of their agent identities. For a guide on requesting access packages, see:\nRequest an access package on behalf of an agent identity (Preview)\n.\nAgent identities sponsor administration\nOne of the most important parts of governing agent identities is making sure that a delegated human user is always assigned to make sure the agent identity's access to resources are current. If the sponsor is leaving the organization, sponsorship of the agent identities is automatically transferred to their manager. With sponsorship transferred, there's always a human user accountable for managing the access and lifecycle of the agent identities. Microsoft Entra ID Governance features can help streamline this process within your organization. Lifecycle workflows include multiple tasks around notifying cosponsors, and managers of sponsors, of impending sponsorship changes. 
For a guide on setting up a workflow for agent identities sponsors, see:\nAgent identity sponsor tasks in Lifecycle Workflows (Preview)\n.\nRelated content\nWhat is entitlement management?\nWhat is Microsoft Entra ID Governance?\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Governing Agent Identities", @@ -1160,7 +1160,7 @@ "https://learn.microsoft.com/en-us/sharepoint/sharepoint-admin-role": { "content_hash": "sha256:2d357563ed7a716a3f99bc7143f53d807c02c608172ba393a4c35966f135df5c", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nAbout the SharePoint Administrator role in Microsoft 365\nFeedback\nSummarize this article for me\nUsers assigned the SharePoint Administrator role have access to the\nSharePoint admin center\nand can create and manage sites, designate site admins, manage sharing settings, and manage Microsoft 365 groups, including creating, deleting, and restoring groups, and changing group owners.\nGlobal Administrators in Microsoft 365 can assign users the SharePoint Administrator. The Global Administrator role already has all the permissions of the SharePoint Administrator role.\nFor info about assigning a user the SharePoint administrator role, see\nAssign admin roles in the Microsoft 365 admin center\n. 
If a user's role is changed so they gain or lose access to the SharePoint admin center, it takes about an hour for the change to take effect.\nImportant\nMicrosoft recommends that you use roles with the fewest permissions. Using lower permissioned accounts helps improve security for your organization. Global Administrator is a highly privileged role that should be limited to emergency scenarios when you can't use an existing role.\nSite management\nGlobal Administrators and SharePoint Administrators don't have automatic access to all sites and each user's OneDrive, but they can give themselves access to any site or OneDrive. They can also use Microsoft PowerShell to manage SharePoint and OneDrive. See more about this role's\nKey tasks of the SharePoint admin\n.\nSite admins have permission to manage sites, but they don't need to have an admin role in Microsoft 365, and don't have access to the SharePoint admin center.\nFor info about adding or removing a site admin, see\nManage site admins\n.\nTerm store administration\nThere's a separate role within SharePoint called the Term Store administrator. Users assigned to this role can add or change terms in the term store (a directory of common terms you want to use across your organization). To learn more, see\nAssign roles and permissions to manage term sets\n.\nAPI access\nTo manage API access in the SharePoint admin center, you need at least a\napplication administrator role\n. 
For more information, see\nManage access to Microsoft Entra ID-secured APIs\n.\nKey tasks of the SharePoint admin\nHere are some of the key tasks users can do when they're assigned to the SharePoint Administrator role:\nCreate sites\nDelete sites\nManage sharing settings at the organization level\nAdd and remove site admins\nManage site storage limits\nRelated articles\nAbout Microsoft 365 admin roles\nGetting started with SharePoint Online Management Shell\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "SharePoint Admin Center", @@ -1169,7 +1169,7 @@ "https://learn.microsoft.com/en-us/sharepoint/site-permissions": { "content_hash": "sha256:ffad727d1c318868d7e5703724e4268958af2b40bfcbec1a923cdacf486a97cd", "normalized_content": "Admin center site permissions reference\nOn the\nMembership\ntab, you can manage permissions for the site and also for any associated Microsoft 365 group or Microsoft Teams team. These roles are specific to the selected site or group and don't give users access to the SharePoint admin center.\nOwners\nMicrosoft 365 group owners can manage group membership, privacy, and classification, as well as the associated SharePoint site. If the Microsoft 365 group is associated with a team, then the group owners are also team owners.\nMembers\nMicrosoft 365 group members can participate in the group and have access to the associated SharePoint site. 
If the Microsoft 365 group is associated with a team, then the group members are also team members and can send messages and participate in channels if allowed by the team owner.\nSite admins\nSite admins (formerly called site collection administrators) have the highest level of SharePoint permissions. They have the same Full Control permissions of a site owner, plus they can do more things, such as managing search, the recycle bin, and site collection features. They also have access to any items in the site, including in subsites, even if permissions inheritance has been broken.\nIf there's a Microsoft 365 group or team connected to the site, then group or team owners are automatically included as site admins and group or team members are automatically included as site members. Managing site permissions through group or team membership is recommended over giving people permissions directly to the site. This method allows for easier administration and consistent access for users across group and team resources.\nNon-primary admins\nAdditional admins beyond the\nPrimary\nadmin are site admins only and can only manage the SharePoint site. They have no access to the associated Microsoft 365 group or team unless they have also been directly added to the group or team.\nSite owners\nSite owners have full control of the SharePoint site. If the site has an associated Microsoft 365 group or team, then group or team owners are automatically included as site owners. However, people added directly to the site owners group don't have access to the Microsoft 365 group or team unless they are added there directly.\nSite members\nSite members have edit permissions to the SharePoint site and can add and remove files, lists, and libraries. If the site has an associated Microsoft 365 group or team, then group or team members are automatically included as site members. 
However, people added directly to the site members group don't have access to the Microsoft 365 group or team unless they are added there directly.\nSite visitors\nSite visitors have view-only permissions to the SharePoint site. This permission level is only used by SharePoint and isn't related to permissions in an associated Microsoft 365 group or team.\nNote\nFor information on how to manage Site owners, Site members and Site visitors permission groups, see\nSharing and permissions in the SharePoint modern experience\n.\nAdditional permissions\nThere are additional\npermission levels\nin SharePoint beyond those shown on this panel. Users may have access to the site or its contents through those roles. Users may also have access to files or folders in the site through sharing links.\nSee also\nExternal sharing overview\nOverview of Microsoft 365 Groups for administrators", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-01-25T23:12:28.455268+00:00", "topic": "Site Permissions", @@ -1178,7 +1178,7 @@ "https://learn.microsoft.com/en-us/sharepoint/modern-experience-sharing-permissions": { "content_hash": "sha256:648c2c6a8ae6a730fab8468b8488201d0ad33fd9a71f27107bd01468ef8a58be", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nSharing and permissions in the SharePoint modern experience\nFeedback\nSummarize this article for me\nTraditionally, SharePoint permissions have been managed through a set of permissions groups within a site (Owners, Members, Visitors, etc.). 
In SharePoint in Microsoft 365, this remains true for some types of sites, but additional options are available and SharePoint is part of a much broader set of capabilities for\nsecure collaboration with Microsoft 365\n.\nThe main types of sites in SharePoint are:\nTeam sites\n- Team sites provide a collaboration environment for your teams and projects. Each team site, by default, is part of a Microsoft 365 group, which includes a mailbox, shared calendar, and other collaboration tools. Team sites may also be part of a team in Microsoft Teams. Permissions for team sites are best managed through the associated Microsoft 365 group or Teams team.\nChannel sites\n- Channel sites are team sites that are associated with a specific channel in a Teams team. Both private and shared channels create separate SharePoint sites just for the channel.\nCommunication sites\n- Communication sites are for broadcasting news and status across the organization. Communication site permissions are managed by using the SharePoint Owners, Members, and Visitors groups for the site.\nHub sites\n-\nHub sites\nare team sites or communication sites that the administrator has configured as the center of a hub. They're designed to provide connection between related sites through shared navigation. Permissions for hub sites can be managed through the Owners, Members, and Visitors groups, or through the associated Microsoft 365 group if there is one. Special permissions are needed to associate sites to a hub.\nTeam site permissions and Microsoft 365 Groups\nBy default, each SharePoint team site is part of an\nMicrosoft 365 group\n. A Microsoft 365 group is a single permissions group that is associated with various Microsoft 365 services. This includes a SharePoint site, an instance of Planner, a mailbox, a shared calendar, and others.\nWhen you add owners or members to the Microsoft 365 group, they're given access to the SharePoint site along with the other group-connected services. 
Group owners become site owners, and group members become site members.\nIt's possible to manage SharePoint site permissions separately from the Microsoft 365 group by using SharePoint groups, unless it's a channel site. (We recommend against this for the simplest management experience.) In such a case, group members will continue to have access to the site, but users added directly to the site won't have access to any of the group services. Microsoft 365 groups don't have view-only access, so any users you wish to have view permissions on the site must be added directly to the Visitors group on the site.\nUsing team sites with Teams\nMicrosoft Teams provides a hub for collaboration by bringing together various services including a SharePoint team site. Within the Teams experience, users can directly access SharePoint along with the other services. Each team is associated with a Microsoft 365 group and Teams uses that group to manage its permissions.\nFor scenarios where a SharePoint site is used with Teams, we recommend doing all permission management through Teams. As with Microsoft 365 groups, team owners become site owners and team members become site members.\nFor private or shared channel sites, permission management must be done in Teams. Channel owners become sites owners in SharePoint and channel members become site members. 
Permissions in SharePoint can't be managed separately and will display in read-only mode.\nFor details about how SharePoint and Teams interact, see\nOverview of Teams and SharePoint integration\nand\nManage settings and permissions when SharePoint and Teams are integrated\n.\nCommunication site permissions\nCommunication sites aren't connected to Microsoft 365 groups and use the standard SharePoint permissions groups:\nOwners\nMembers\nVisitors\nNormally with communication sites, you'll have one or more owners, a relatively small number of members who create the content for the site, and a large number of visitors who are the people you're sharing information with.\nYou can give people permissions to the site by adding individual users, security groups, or Microsoft 365 groups to one of the three SharePoint groups. (Nested security groups can cause performance issues and are not recommended.)\nIf a communication site is used by members of a team in Teams, you may want to add the Microsoft 365 group associated with the team to the members group of the communication site. This will allow members of the team to create content in the communication site.\nThe visitors group is a good place to use security groups. In many organizations, this is the easiest way to add large numbers of users to a site.\nFor information about how to share a site, see\nShare a site\n.\nHub site permissions\nManaging the permissions of a hub site is dependent on the underlying type of site. If the site is a group-connected team site, then you should manage permissions through the Microsoft 365 group. If it's a communication site, then you should manage permissions through the SharePoint groups.\nHub site owners define the shared experiences for hub navigation and theme. Hub site members create content on the hub as with any other SharePoint site. 
Owners and members of the sites associated with the hub create content on their individual sites.\nThe SharePoint Administrator must specify which users can connect other sites to the hub. This is done in the\nSharePoint admin center\nand cannot be changed by site owners.\nShareable links\nGiving people permissions to a site, group, or team gives them access to all site content. If you want to share an individual file or folder, you can do so with shareable links. There are three primary link types:\nAnyone\nlinks give access to the item to anyone who has the link, including people outside your organization. People using an\nAnyone\nlink don't have to authenticate, and their access can't be audited.\nAnyone\nlinks can't be used with files in a Teams shared channel site.\nPeople in your organization\nlinks work for only people inside your Microsoft 365 organization. (They don't work for guests or external participants in Teams shared channels).\nSpecific people\nlinks only work for the people that users specify when they share the item. For files in a Teams shared channel site,\nspecific people\nlinks can't be sent to people outside the organization unless they're in the channel.\nYou can\nchange the type of link that is presented to users by default\nfor each site.\nFor more about the different types of sharing links, see\nSecuring your data\n.\nGuest sharing\nThe external sharing features of SharePoint let users in your organization share content with people outside the organization (such as partners, vendors, clients, or customers). Planning for external sharing should be included as part of your overall permissions planning for SharePoint.\nSharePoint has external sharing settings at both the organization level and the site level (previously called the \"site collection\" level). To allow external sharing on any site, you must allow it at the organization level. 
You can then restrict external sharing for other sites.\nWhichever option you choose at the organization or site level, the more restrictive functionality is still available. For example, if you choose to allow sharing using\nAnyone\nlinks, users can still share with guests, who sign in, and with internal users.\nExternal sharing is turned on by default for your organization. Default settings for individual sites vary depending on the type of site. See\nSite level settings\nfor more information.\nShared channels in teams\ndo not use guest accounts for sharing with people outside the organization. However, external sharing must be enabled for people outside the organization to be invited to shared channels.\nTo set up guest sharing for a site, see\nCollaborate with guests in a site\n.\nSecurity and privacy\nIf you have confidential information that should never be shared externally, we recommend storing the information in a site that has external sharing turned off. Create additional sites as needed to use for external sharing. This helps you to manage security risk by preventing external access to sensitive information.\nSharePoint and OneDrive integration with Microsoft Entra B2B\nMicrosoft Entra B2B collaboration provides authentication and management of guests. Authentication happens via one-time passcode when they don't already have a work or school account or a Microsoft account (MSA).\nWith\nSharePoint and OneDrive integration with Microsoft Entra B2B\n, the Azure B2B collaboration one-time passcode feature is used for external sharing of files, folders, list items, document libraries, and sites. 
(Shared channels in Teams don't use Azure B2B collaboration, but rather\nAzure B2B direct connect\n.)\nRelated topics\nExternal sharing overview\nManage sharing settings\nCollaborating with people outside your organization\nShare SharePoint files or folders\nLimit sharing in Microsoft 365\nShared channels in Microsoft Teams\nMicrosoft 365 guest sharing settings reference\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Sharing Permissions", @@ -1187,7 +1187,7 @@ "https://learn.microsoft.com/en-us/sharepoint/external-sharing-overview": { "content_hash": "sha256:98c1ff3c759316865e112eed5ddda9c3c38461051fdc76425ab6db19a75c322e", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nOverview of external sharing in SharePoint and OneDrive in Microsoft 365\nFeedback\nSummarize this article for me\nNote\nAs of June 2024, invitations sent via the legacy SharePoint Invitation Manager no longer grant access. Users must reshare documents to generate valid invitations.\nExternal sharing in SharePoint and OneDrive allows users to share content with people outside your organization, such as partners, vendors, clients, or customers. You can also use external sharing to share between licensed users on multiple Microsoft 365 subscriptions. 
External sharing in SharePoint is part of\nsecure collaboration with Microsoft 365\n. Also read\nOverview of external collaboration options in Microsoft 365\n.\nImportant\nTrial tenants can utilize SharePoint's robust collaboration features, but the scope of external sharing will be restricted compared to licensed tenants. This is designed to prevent potential abuse and ensure a safe experience for all users.\nInclude external sharing as part of your overall permissions planning for SharePoint and OneDrive. This article describes what happens when users share, depending on what they're sharing and with whom.\nIf you want to get straight to setting up sharing, choose the scenario you want to enable:\nCollaborate with guests on a document\nCollaborate with guests in a site\nCollaborate with guests in a team\n(If you're trying to share a file or folder, see\nShare OneDrive files and folders\nor\nShare SharePoint files or folders in Microsoft 365\n.)\nNote\nExternal sharing is turned on by default for your entire SharePoint and OneDrive environment. 
You may want to\nturn it off globally\nbefore people start using sites or until you know exactly how you want to use the feature.\nHow do SharePoint and OneDrive integrate with Microsoft Entra B2B?\nThere are two external sharing models used in SharePoint and OneDrive:\nSharePoint external authentication\nSharePoint and OneDrive integration with Microsoft Entra B2B\nWhen using Microsoft Entra B2B integration, Microsoft Entra external collaboration settings, such as\nguest invite settings and collaboration restrictions\napply.\nThe following table shows the differences between the two sharing models.\nSharing method\nWhat happens when sharing files and folders?\nWhat happens when sharing sites?\nSharePoint external authentication\n(Microsoft Entra B2B integration not enabled)\nNo guest account created*\nMicrosoft Entra settings don't apply\nN/A\n(Microsoft Entra B2B always used)\nMicrosoft Entra B2B integration enabled\nGuest account always created\nMicrosoft Entra settings apply\nGuest account always created\nMicrosoft Entra settings apply\n*A guest account may already exist from another sharing workflow, such as sharing a team, in which case it's used for sharing.\nFor information on how to enable or disable Microsoft Entra B2B integration, see\nSharePoint and OneDrive integration with Microsoft Entra B2B\n.\nHow do external sharing settings work?\nSharePoint has external sharing settings at both the organization level and the site level (previously called the \"site collection\" level). To allow external sharing on any site, you must allow it at the organization level. You can then restrict external sharing for other sites. If a site's external sharing option and the organization-level sharing option don't match, the most restrictive value will always be applied. 
OneDrive sharing settings can be the same as or more restrictive than the SharePoint settings.\nWhichever option you choose at the organization or site level, the more restrictive functionality is still available. For example, if you choose to allow unauthenticated sharing using \"Anyone\" links, users can still share with guests, who sign in, and with internal users.\nNote\nEven if your organization-level setting allows external sharing, not all new sites allow it by default. See\nDefault site sharing settings\nfor more information.\nWhat are the security and privacy considerations?\nIf you have confidential information that should never be shared externally, we recommend storing the information in a site that has external sharing turned off. Create additional sites as needed to use for external sharing. This helps you to manage security risk by preventing external access to sensitive information.\nNote\nTo limit\ninternal\nsharing of contents on a site, you can prevent site members from sharing, and enable access requests. For info, see\nSet up and manage access requests\n.\nWhen users share a folder with multiple guests, the guests are able to see each other's names in the Manage Access panel for the folder (and any items within it).\nHow do I share Microsoft 365 group-connected team sites?\nWhen you or your users create Microsoft 365 groups (for example in Outlook, or by creating a team in Microsoft Teams), a SharePoint team site is created. Admins and users can also create team sites in SharePoint, which creates a Microsoft 365 group. For group-connected team sites, the group owners are added as site owners, and the group members are added as site members. In most cases, you want to share these sites by adding people to the Microsoft 365 group. However, you can share only the site.\nImportant\nIt's important that all group members have permission to access the team site. 
If you remove the group's permission, many collaboration tasks (such as sharing files in Teams chats) won't work. Only add guests to the group if you want them to be able to access the site. For info about guest access to Microsoft 365 groups, see\nManage guest access in Groups\n.\nWhat happens when users share content?\nWhen users share with people outside the organization, an invitation is sent to the person in email, which contains a link to the shared item.\nBecause these guests don't have a license in your organization, they're limited to basic collaboration tasks:\nThey can use Office.com for viewing and editing documents. If your plan includes Office Professional Plus, they can't install the desktop version of Office on their own computers unless you assign them a license.\nThey can perform tasks on a site based on the permission level that they've been given. For example, if you add a guest as a site member, they have Edit permissions and they are able to add, edit, and delete lists; they'll also be able to view, add, update, and delete list items and files.\nThey are able to see other types of content on sites, depending on the permissions they've been given. For example, they can navigate to different subsites within a shared site. They'll also be able to do things like view site feeds.\nIf your authenticated guests need greater capability such as OneDrive storage or creating a Power Automate flow, you must assign them an appropriate license.\nHow do I stop sharing?\nYou can stop sharing with guests by removing their permissions from the shared item, or by removing them as a guest in your directory.\nYou can stop sharing with people who have an\nAnyone\nlink by going to the file or folder that you shared and deleting the link or by turning off\nAnyone\nlinks for the site.\nLearn how to stop sharing an item\nNeed more help?\nIf you have technical questions about this topic, you might find it helpful to post them on the\nSharePoint discussion forum\n. 
It's a great resource for finding others who have worked with similar issues or who have encountered the same situation.\nSee also\nSearching for site content shared externally\nConfigure Teams with three tiers of protection\nCreate a secure guest sharing environment\nSettings interactions between Microsoft 365 Groups, Teams and SharePoint\nPricing - Active Directory External Identities\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "External Sharing", @@ -1196,7 +1196,7 @@ "https://learn.microsoft.com/en-us/sharepoint/turn-external-sharing-on-or-off": { "content_hash": "sha256:6e2fba3853051c6c6e9625303fd74756453753c3ed6ea8ae191727f8771d2f51", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nManage sharing settings for SharePoint and OneDrive in Microsoft 365\nFeedback\nSummarize this article for me\nAs a SharePoint Administrator in Microsoft 365, you can change the organization-level sharing settings for SharePoint and OneDrive. 
These settings control sharing at the organization level, and you can then configure more restrictive settings for specific sites if needed.\nFor end-to-end guidance around how to configure guest sharing in Microsoft 365, see:\nSet up secure collaboration with Microsoft 365\nCollaborate with guests on a document\nCollaborate with guests in a site\nCollaborate with guests in a team\nTo change the sharing settings for a site after you've set the organization-level sharing settings, see\nChange sharing settings for a site\n. To learn how to change the external sharing setting for a specific user's OneDrive, see\nChange the external sharing setting for a user's OneDrive\n.\nTip\nAs a companion to this article, see our\nMicrosoft Sharepoint setup guide\nto review best practices and understand the deployment process from site creation to data migration. Features include protection labels, data loss prevention policies, and file activity auditing. For a customized experience based on your environment, you can access the\nSet up SharePoint in Microsoft 365 guide\nin the Microsoft 365 admin center.\nHow do SharePoint and OneDrive integrate with Microsoft Entra B2B?\nThere are two external sharing models used in SharePoint and OneDrive:\nSharePoint external authentication\nSharePoint and OneDrive integration with Microsoft Entra B2B\nWhen using Microsoft Entra B2B integration, Microsoft Entra external collaboration settings, such as\nguest invite settings and collaboration restrictions\napply.\nThe following table shows the differences between the two sharing models.\nSharing method\nFiles and folders\nSites\nSharePoint external authentication\n(Microsoft Entra B2B integration not enabled)\nNo guest account created*\nMicrosoft Entra settings don't apply\nN/A\n(Microsoft Entra B2B always used)\nMicrosoft Entra B2B integration enabled\nGuest account always created\nMicrosoft Entra settings apply\nGuest account always created\nMicrosoft Entra settings apply\n*A guest account 
may already exist from another sharing workflow, such as sharing a team, in which case it's used for sharing.\nFor information on how to enable or disable Microsoft Entra B2B integration, see\nSharePoint and OneDrive integration with Microsoft Entra B2B\n.\nVideo demonstration\nThis video shows how the settings on the\nSharing\npage in the SharePoint admin center\naffect the sharing options available to users.\nHow do I change the organization-level external sharing setting?\nGo to\nSharing\nin the SharePoint admin center\n, and sign in with an account that has\nadmin permissions\nfor your organization.\nUnder\nExternal sharing\n, specify your sharing level for SharePoint and OneDrive. The default level for both is\nAnyone\n.\nNote\nThe SharePoint setting applies to all site types, including those connected to Microsoft 365 groups and teams. Groups and Teams guest sharing settings also affect connected SharePoint sites.\nThe OneDrive setting can be more restrictive than the SharePoint setting, but not more permissive.\nThis setting is for your organization overall. Each site has its own sharing setting that you can set independently, though it must be at the same or more restrictive setting as the organization. See\nChange the external sharing setting for a site\nfor more information.\nImportant\nMicrosoft Entra external collaboration settings\ndetermine who can invite guests in your organization for site sharing (always) and file and folder sharing (if Azure B2B collaboration is enabled). Be sure to review Microsoft Entra guest access settings as part of your SharePoint and OneDrive sharing setup.\nWhich sharing option should I select?\nSelect this option:\nIf you want to:\nAnyone\nAllow users to share files and folders by using links that let anyone who has the link access the files or folders without authenticating. This setting also allows users to share sites with new and existing guests who authenticate. 
If you select this setting, you can restrict the Anyone links so that they must expire within a specific number of days, or so that they can give only View permission.\nFile requests\nrequires that OneDrive be set to\nAnyone\nand edit permissions for\nAnyone\nlinks be enabled. OneDrive settings other than\nAnyone\ndisable file requests.\nSee\nBest practices for sharing files and folders with unauthenticated users\nfor more information.\nNew and existing guests\nRequire people who have received invitations to sign in with their work or school account (if their organization uses Microsoft 365) or a Microsoft account, or to provide a code to verify their identity. Users can share with guests already in your organization's directory, and they can send invitations to people who will be added to the directory if they sign in. For more info about verification codes, see\nSecure external sharing in SharePoint\nExisting guests\nAllow sharing only with guests who are already in your directory. These guests may exist in your directory because they previously accepted sharing invitations or because they were manually added, such as through\nAzure B2B collaboration\n. (To see the guests in your organization, go to the\nGuests page in the Microsoft 365 admin center\n).\nOnly people in your organization\nTurn off external sharing.\nNote\nIf you turn off external sharing for your organization and later turn it back on, guests who previously had access regain it. If you know that external sharing was previously turned on and in use for specific sites and you don't want guests to regain access, first turn off external sharing for those specific sites.\nIf you restrict or turn off external sharing, guests typically lose access within one hour of the change.\nWhat are the additional external sharing settings?\nLimit external sharing by domain\nThis is useful if you want to limit sharing with particular partners, or help prevent sharing with people at certain organizations. 
The organization-level setting on this page affects all SharePoint sites and each user's OneDrive. To use this setting, list the domains (maximum of 5000) in the box, using the format\ndomain.com\n. To list multiple domains, press Enter after adding each domain.\nYou can also limit external sharing by domain by using the\nSet-SPOTenant\nMicrosoft PowerShell cmdlet with -SharingDomainRestrictionMode and either -SharingAllowedDomainList or -SharingBlockedDomainList. For info about limiting external sharing by domain at the site level, see\nRestricted domains sharing\n.\nAllowed or blocked domains in Microsoft Entra ID\nalso affect SharePoint and OneDrive site sharing (always) and file and folder sharing (if Azure B2B collaboration is enabled). Be sure to review Microsoft Entra collaboration restrictions as part of your SharePoint and OneDrive sharing setup.\nAllow only users in specific security groups to share externally\nFor info about this setting, see\nManage security groups\n.\nAllow guests to share items they don't own\nBy default, guests must have full control permission to share items externally.\nGuest access to a site or OneDrive will expire automatically after this many days\nIf your administrator has set an expiration time for guest access, each guest that you invite to the site or with whom you share individual files and folders will be given access for a certain number of days. For more information visit,\nManage guest expiration for a site\nPeople who use a verification code must reauthenticate after this many days\nIf people who use a verification code have selected to \"stay signed in\" in the browser, they must prove they can still access the account they used to redeem the sharing invitation by entering a code sent to that account. 
If Azure B2B collaboration is enabled, the\nMicrosoft Entra setting\nis used instead of this setting.\nHow do I manage file and folder links?\nChoose the option you want to show by default when a user creates a sharing link.\nNote\nThis setting specifies the default for your organization, but you can choose a different default link type for a site.\nSpecific people\n- This option is most restrictive and impedes broad internal sharing. If you allow external sharing, this option lets users share with specific people outside the organization.\nOnly people in your organization\n- If links are forwarded, they'll work for anyone in the organization. This option is best if your organization shares broadly internally and rarely shares externally.\nAnyone with the link\n- This option is available only if your external sharing setting is set to\nAnyone\n. Forwarded links work internally or externally, but you can't track who has access to shared items or who has accessed shared items. This is best for friction-free sharing if most files and folders in SharePoint and OneDrive aren't sensitive.\nImportant\nIf you select\nAnyone with the link\n, but the site or OneDrive is set to allow sharing only with guests who sign in or provide a verification code, the default link is\nOnly people in your organization\n. Users need to change the link type to\nSpecific people\nto share files and folders in the site or OneDrive externally.\nWhat are the advanced settings for \"Anyone\" links?\nLink expiration\n- You can require all\nAnyone\nlinks to expire, and specify the maximum number of days allowed. 
If you change the expiration time, existing links will keep their current expiration time if the new setting is longer, or be updated to the new setting if the new setting is shorter.\nLink permissions\n- You can restrict\nAnyone\nlinks so that they can only provide view permission for files or folders.\nIf you would like to use the\nRequest Files\nfeature, the link permissions must be set to\nView, edit, and upload\nor\nView and Upload\nfor folders.\nWhat other sharing settings are available?\nShow owners the names of people who viewed their files in OneDrive\nThis setting lets you control whether the owner of a shared file can see on the file card the people who only view (and don't edit) the file in OneDrive. The file card appears when users hover over a file name or thumbnail in OneDrive. The info includes the number of views on the file, the number of people who viewed it, and the list of people who viewed it. To learn more about the file card, see\nSee files you shared in OneDrive\n.\nNote\nThis setting is selected by default. If you clear it, file viewer info is still recorded and available to you to audit as an admin. OneDrive owners can also still see people who have viewed their shared Office files by opening the files from Office.com or from the Office desktop apps.\nLet site owners choose to display the names of people who viewed files or pages in SharePoint\nThis setting lets you specify whether site owners can allow users who have access to a file, page, or news post to see on the file card who has viewed the item.\nThis setting is turned on by default at the organization level and off at the site level for existing sites. Viewer information is shown only when the setting is on at both the organization and site level. We recommend that site owners turn on this feature only on team sites that don't have sensitive information.\nLearn how site owners can turn on this feature\n.\nNote\nHistorical data is included when this setting is enabled. 
Likewise, if the setting is turned off and back on at the organization level or site level, the views during the off period are included in the history.\nUse short links for sharing files and folders\nUses a shorter link format for sharing files and folders. This may be useful if you have integrations that require a shorter URL.\nNeed more help?\nIf you have technical questions about this topic, you might find it helpful to post them on the\nSharePoint discussion forum\n. It's a great resource for finding others who have worked with similar issues or who have encountered the same situation.\nYou can also find help on security and permissions in these\nYouTube videos from SharePoint community experts\n.\nSee also\nLimit accidental exposure to files when sharing with guests\nCreate a secure guest sharing environment\nStop sharing files or folders or change permissions\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Manage Sharing Settings", @@ -1205,7 +1205,7 @@ "https://learn.microsoft.com/en-us/sharepoint/restricted-access-control": { "content_hash": "sha256:cb2ddd5e3a489be066e84bc394aa3bd24f65871068cc8168cb970f56fda0bc82", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nRestrict SharePoint site access with Microsoft 365 groups and Microsoft Entra security groups\nFeedback\nSummarize this article for me\nRestricted site access control lets you prevent oversharing by designating access of SharePoint sites and its content to users in a specific group. Users not in the specified group can't access the site or its content, even if they had prior permissions or a shared link. This policy can be applied on Microsoft 365 group-connected, Teams-connected, and nongroup connected sites using Microsoft 365 groups or Microsoft Entra security groups.\nSite access restriction policies are applied when a user attempts to open a site or access a file. Users with direct permissions to the file can still view files in search results. However, they can't access the files if they're not part of the specified group.\nRestricting site access via group membership can minimize the risk of oversharing content. For insights into data sharing, see\nData access governance reports\n.\nWhat do you need to restrict site access?\nWhat are the license requirements?\nYour organization needs to have the right license and meet certain administrative permissions or roles to use the feature described in this article.\nFirst, your organization must have one of the following base licenses:\nOffice 365 E3, E5, or A5\nMicrosoft 365 E1, E3, E5, or A5\nAdditionally, you need at least one of these licenses:\nMicrosoft 365 Copilot license:\nAt least one user in your organization must be assigned a Copilot license (this user doesn't need to be a SharePoint administrator).\nMicrosoft SharePoint Advanced Management license:\nAvailable as a standalone purchase.\nAdministrator requirements\nYou must be a\nSharePoint administrator\nor have equivalent permissions.\nAdditional information\nIf your organization has a Copilot license and at least one person in your organization is assigned a Copilot license, SharePoint administrators automatically 
gain access to the\nSharePoint Advanced Management features needed for Copilot deployment\n.\nFor organizations without a Copilot license, you can use SharePoint Advanced Management features\nby purchasing a standalone SharePoint Advanced Management license\n.\nEnable site-level access restriction for your organization\nYou must enable site-level access restriction for your organization before you can configure it for individual sites. You can also delegate site access restriction control to all the site admins of your organization.\nTo enable site-level access restriction for your organization in the SharePoint admin center, follow these steps:\nExpand\nPolicies\nand select\nAccess control\n.\nSelect\nSite-level access restriction\n.\nSelect\nAllow access restriction\nand then select\nSave\n.\nTo enable site-level access restriction for your organization using PowerShell, run the following command:\nSet-SPOTenant -EnableRestrictedAccessControl $true\nIt might take up to one hour for the command to take effect.\nDelegate Management of Restricted Access Control to Site Admins\nAs a SharePoint administrator, you can also delegate management of Restricted Access Control to site admin. Upon managing the policy, the site admins would need to provide appropriate justification on why the policy is being updated.\nBy default, the delegation is turned off. 
If you decide to enable it, run the following command:\nSet-SPOTenant -DelegateRestrictedAccessControlManagement $true\nOnce the Restricted Access Control is delegated to all the site admins, they can manage the policy.\nCheck status of Delegate management of Restricted Access Control to site admins\nTo check the delegation status, run the following command:\nGet-SPOTenant | Select-Object DelegateRestrictedAccessControlManagement\nNote\nFor Microsoft 365 Multi-Geo users, run this command separately for each desired geo-location.\nRestrict access to all SharePoint sites using Microsoft 365 group or Microsoft Entra security groups\nYou can restrict access to a SharePoint site by specifying Microsoft Entra security groups or Microsoft 365 groups as the Restricted Access Control group. The control group should have the users who should be allowed access to the site and its content.\nFor a site, you can configure up to 10 Microsoft Entra security groups or Microsoft 365 groups. Once the policy is applied, users in the specified group who have access permission to the content are allowed access.\nImportant\nAdding people to the Restricted Access Control group (Microsoft Entra security group or Microsoft 365 group) doesn't automatically give the users access permission to the site or the content. 
For a user to be able to access the content protected with this policy, the user would need to have both the site or content access permission AND be a member of the Restricted Access Control group.\nNote\nYou can also use dynamic security groups as a Restricted Access Control group if you want to base group membership on user properties.\nManage site access for a site\nTo manage site access for a site, follow these steps:\nIn SharePoint admin center, expand\nSites\nand select\nActive sites\n.\nSelect the site you want to manage and the site details panel appears.\nIn\nSettings\ntab, select\nEdit\nin the Restricted site access section.\nSelect the\nRestrict SharePoint site access to only users in specified groups\ncheck box.\nAdd or remove your security groups or Microsoft 365 groups and select\nSave\n.\nIn order for site access restriction to be applied to the site, you must add at least one group to the site access restriction policy.\nFor a group connected site, the Microsoft 365 group connected to the site is added as the default Restricted Access Control group. 
You can choose to keep this group and add more Microsoft 365 or Microsoft Entra Security groups as Restricted Access Control group.\nNote\nThere's a tag labeled as\nDefault group\nmarked against the Microsoft 365 group connected to the site as shown in the previous image.\nTo manage site access restriction for a SharePoint site using PowerShell, use the following commands:\nAction\nPowerShell command\nEnable site access restriction\nSet-SPOSite -Identity -RestrictedAccessControl $true\nAdd group\nSet-SPOSite -Identity -AddRestrictedAccessControlGroups \nEdit group\nSet-SPOSite -Identity -RestrictedAccessControlGroups \nView group\nGet-SPOSite -Identity Select RestrictedAccessControl, RestrictedAccessControlGroups\nRemove group\nSet-SPOSite -Identity -RemoveRestrictedAccessControlGroups \nReset site access restriction\nSet-SPOSite -Identity -ClearRestrictedAccessControl\nSite admin experience\nOnce the site access restriction control is delegated to site admins, they can configure the site access restriction setting at the Site Information panel.\nTo restrict access to a SharePoint site:\nYou can limit who can access a site by using Microsoft Entra security groups or Microsoft 365 groups.\nAdd the groups that contain the users who should have access.\nYou can add up to 10 groups for each site.\nWhen the site access restriction setting configuration is saved, only users in those groups and those who already have permission to the content can access the site.\nYou would also need to provide justification whenever the site access restriction setting is updated.\nSite owner experience:\nOnce the policy is applied to the site, the policy status and all configured control groups are displayed for site owners on the\nSite access\npanel in addition to the\nSite Information\nand\nPermissions\npanels.\nShared and private channel sites\nShared and private channel sites are separate from the Microsoft 365 group-connected site that standard channels use. 
Because shared and private channel sites aren't connected to the Microsoft 365 group, site access restriction policies applied to the team don't affect them. You must enable site access restriction for each shared or private channel site separately as nongroup connected sites.\nFor shared channel sites, only internal users in the resource tenant are subject to site access restriction. External channel participants are excluded from site access restriction policy and only evaluated per the site's existing site permissions.\nImportant\nAdding people to the security group or Microsoft 365 group doesn't give users access to the channel in Teams. We recommend adding or removing the same users of the Teams channel in Teams and the security group or Microsoft 365 group, so users have access to both Teams and SharePoint sites.\nAuditing\nAudit events are available in the Microsoft Purview portal to help you monitor site access restriction activities. Audit events are logged for the following activities:\nApplying site access restriction for site\nRemoving site access restriction for site\nChanging site access restriction groups for site\nSite admin's justification for updating site access restriction\nReporting\nRestricted site access policy insights\nAs an IT administrator, you can view the following reports to gain more insight about SharePoint sites protected with restricted site access policy:\nSites protected by restricted site access policy (RACProtectedSites)\nDetails of access denials due to restricted site access policy (ActionsBlockedByPolicy)\nRestricted site access policy reports are available in nongovernment cloud environments, and GCC, GCC-High, and DoD government cloud environments. 
The reports are currently unavailable for Gallatin, even if you have the required licenses.\nNote\nIt can take a few hours to generate each report.\nSites protected by restricted site access policy report\nYou can run the following commands in SharePoint PowerShell to generate, view, and download the reports:\nAction\nPowerShell command\nDescription\nGenerate report\nStart-SPORestrictedAccessForSitesInsights -RACProtectedSites\nGenerates a list of sites protected by restricted site access policy\nView report\nGet-SPORestrictedAccessForSitesInsights -RACProtectedSites -ReportId \nThe report shows the top 100 sites with the highest page views that are protected by the policy.\nDownload report\nGet-SPORestrictedAccessForSitesInsights -RACProtectedSites -ReportId -Action Download\nThis command must be run as an administrator. The downloaded report is located on the path where the command was run.\nPercentage of site protected with restricted site access report\nGet-SPORestrictedAccessForSitesInsights -RACProtectedSites -ReportId -InsightsSummary\nThis report shows the percentage of sites that are protected by the policy out of the total number of sites\nAccess denials due to restricted site access policy report\nYou can run the following commands to create, fetch, and view report for access denials due to restricted site access reports:\nAction\nPowerShell command\nDescription\nCreate access denials report\nStart-SPORestrictedAccessForSitesInsights -ActionsBlockedByPolicy\nCreates a new report for fetching access denial details\nFetch access denials report status\nGet-SPORestrictedAccessForSitesInsights -ActionsBlockedByPolicy\nFetches the status of the generated report.\nLatest access denials in the past 28 days\nGet-SPORestrictedAccessForSitesInsights -ActionsBlockedByPolicy -ReportId -Content AllDenials\nGets a list of the most recent 100 access denials that occurred in the past 28 days\nView list of top users who were denied 
access\nGet-SPORestrictedAccessForSitesInsights -ActionsBlockedByPolicy -ReportId -Content TopUsers\nGets a list of the top 100 users who received the most access denials\nView list of top sites that received the most access denials\nGet-SPORestrictedAccessForSitesInsights -ActionsBlockedByPolicy -ReportId -Content TopSites\nGets a list of the top 100 sites that had the most access denials\nDistribution of access denials across different types of sites\nGet-SPORestrictedAccessForSitesInsights -ActionsBlockedByPolicy -ReportId -Content SiteDistribution\nShows the distribution of access denials across different types of sites\nNote\nTo view up to 10,000 denials, you must download the reports. Run the download command as an administrator and the downloaded reports are located on the path from where command was run.\nSharing of site and content with users outside of Restricted Access Control Groups (opt-in capability)\nSharing of SharePoint sites and its content doesn't honor restricted site access policy by default. The SharePoint administrator can choose to restrict sharing of site and its content with users who aren't members of the Restricted Access Control group.\nTo restrict sharing capability with users outside of the Restricted Access Control group, enable it, run the following PowerShell command in SharePoint Online Management Shell as an Administrator:\nSet-SPOTenant -AllowSharingOutsideRestrictedAccessControlGroups $false\nSharing with users\nOnce sharing restriction is applied, sharing is blocked for users who aren't members of the Restricted Access Control group.\nSharing with groups\nSharing is allowed with Microsoft Entra Security or Microsoft 365 groups that are part of the Restricted Access Control groups list. 
Thus, sharing with all other groups including Everyone except external users or SharePoint groups aren't allowed.\nNote\nSharing of a site and its content is now allowed for the nested security groups that are part of the Restricted Access Control groups.\nConfigure the Learn more link for access denial error page (opt-in capability)\nConfigure the\nLearn more\nlink to inform users who were denied access to a SharePoint site due to the restricted site access control policy. With this customizable error link, you can provide more information and guidance to your users.\nNote\nThe\nLearn more\nlink is a tenant-level setting that applies to all sites with Restricted Access Control policy enabled.\nTo configure the link, run the following command in SharePoint PowerShell:\nSet-SPOTenant -RestrictedAccessControlForSitesErrorHelpLink \"\"\nTo fetch the value of the link, run the following command:\nGet-SPOTenant | select RestrictedAccessControlForSitesErrorHelpLink\nThe configured learn more link is launched when the user selects the\nKnow more about your organization's policies here\nlink.\nRelated articles\nConditional access policy for SharePoint sites and OneDrive\nData Access Governance reports\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Restricted Access Control", @@ -1214,7 +1214,7 @@ "https://learn.microsoft.com/en-us/sharepoint/restricted-content-discovery": { "content_hash": "sha256:6cb3c3d48882c4c344b62a13ce5769624097e42fe5c12a3274c9d80ad6ea4705", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare 
via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nRestrict discovery of SharePoint sites and content\nFeedback\nSummarize this article for me\nFor organizations onboarding to Microsoft 365 Copilot, maintaining strong data governance controls for SharePoint content is critical to deploying Copilot in a safe manner. Sites identified with the highest risk of oversharing can use Restricted Content Discovery to protect content while taking time to ensure that permissions are accurate and well-managed.\nWith Restricted Content Discovery, organizations can limit the ability of end users to search for files from specific SharePoint sites. Enabling Restricted Content Discovery for each site prevents the sites from surfacing in organization-wide search and Microsoft 365 Copilot Business Chat, unless a user had a recent interaction.\nRestricted Content Discovery is a site-level setting that needs to be propagated to the search index, a large number of transactions could lead to a long queue in the ingestion pipeline and higher update latency times.\nWhile child content is hidden by default, users in your organization can still discover files they own or recently interacted with. End users can still find relevant content they need for their day-to-day tasks, even if Restricted Content Discovery is applied to the parent site.\nRestricted Content Discovery doesn't affect searches originating from a site context or other intelligent features such as Microsoft 365 Feed and Recommendations.\nNote\nRestricted Content Discovery doesn't affect existing permissions on sites. 
Users with access can still open files on sites with Restricted Content Discovery toggled on.\nThis feature can't be applied to OneDrive sites.\nCaution\nOveruse of Restricted Content Discovery can negatively affect performance across search, SharePoint, and Copilot. Removing sites or files from tenant-wide discovery means that there's less content for search and Copilot to ground on, leading to inaccurate or incomplete results.\nUse cases for Restricted Content Discovery\nRestricted Content Discovery can be applied to any SharePoint site in your organization. The key use case for this feature is to prevent accidental discovery of high-risk sites.\nWe recommend using tools such as Data access governance reports and SharePoint admin center's\nActive sites\ntab to first compile a selective list of targeted sites.\nWhat you need to restrict a specific SharePoint access?\nWhat are the license requirements?\nYour organization needs to have the right license and meet certain administrative permissions or roles to use the feature described in this article.\nFirst, your organization must have one of the following base licenses:\nOffice 365 E3, E5, or A5\nMicrosoft 365 E1, E3, E5, or A5\nAdditionally, you need at least one of these licenses:\nMicrosoft 365 Copilot license:\nAt least one user in your organization must be assigned a Copilot license (this user doesn't need to be a SharePoint administrator).\nMicrosoft SharePoint Advanced Management license:\nAvailable as a standalone purchase.\nAdministrator requirements\nYou must be a\nSharePoint administrator\nor have equivalent permissions.\nAdditional information\nIf your organization has a Copilot license and at least one person in your organization is assigned a Copilot license, SharePoint administrators automatically gain access to the\nSharePoint Advanced Management features needed for Copilot deployment\n.\nFor organizations without a Copilot license, you can use SharePoint Advanced Management features\nby purchasing 
a standalone SharePoint Advanced Management license\n.\nIn addition to the above, you also need the latest version of the\nMicrosoft SharePoint Online Management Shell\n.\nConfigure Restricted Content Discovery\nBy default, Restricted Content Discovery is off for all sites. As an IT administrator, you can enable or disable this feature and check the current state of a given site. You can also delegate the Restricted Content Discovery setting to the site admins of your organization.\nEnable Restricted Content Discovery for a site\nYou can enable Restricted Content Discovery from the SharePoint admin center or via PowerShell.\nTo enable Restricted Content Discovery for a site using the SharePoint admin center:\nIn the SharePoint admin center, expand\nSites\nand select\nActive sites\n.\nSelect the site for which you want to restrict content discovery; the site details panel appears.\nIn the\nSettings\ntab, toggle the setting in the\nRestrict content from Microsoft 365 Copilot\nsection on or off.\nSelect\nSave\n.\nNote\nChanges can take time to take effect.\nTo enable Restricted Content Discovery for a site using PowerShell, run the following command (where <site URL> is the URL of the site to restrict):\nSet-SPOSite -Identity <site URL> -RestrictContentOrgWideSearch $true\nCheck status of Restricted Content Discovery\nTo check the status of Restricted Content Discovery, run the following command:\nGet-SPOSite -Identity <site URL> | Select RestrictContentOrgWideSearch\nRemove Restricted Content Discovery from a site\nTo remove Restricted Content Discovery on a SharePoint site, run the following command:\nSet-SPOSite -Identity <site URL> -RestrictContentOrgWideSearch $false\nDelegate management of Restricted Content Discovery to site admins\nAs a SharePoint administrator, you can also delegate management of the Restricted Content Discovery control to site admins. When managing the policy, site admins need to provide appropriate justification for why the policy is being updated.\nBy default, the delegation is turned off.  
If you decide to enable it, run the following command:\nSet-SPOTenant -DelegateRestrictedContentDiscoverabilityManagement $true\nCheck the delegation status\nTo check the delegation status, run the following command:\nGet-SPOTenant | Select-Object DelegateRestrictedContentDiscoverabilityManagement\nOnce the Restricted Content Discovery setting is delegated to the site admins, they can manage the policy.\nSite admins need to provide justification whenever they update the Restricted Content Discovery setting.\nOnce the policy is enabled on the site, the\nRestricted\ntag is visible on the Home tab of the site.\nAuditing\nAudit events are available in the Microsoft unified audit log to help you monitor activities related to managing Restricted Content Discovery. The following audit events are logged:\nTurning on the Restricted Content Discovery setting for a site\nTurning off the Restricted Content Discovery setting for a site\nJustification for updating the Restricted Content Discovery setting for a site\nRestricted Content Discovery policy insights\nYou can view the following reports to gain insights on the SharePoint sites protected with Restricted Content Discovery:\nGenerate insights report\nTo generate a list of sites with Restricted Content Discovery enabled, run the following command:\nStart-SPORestrictedContentDiscoverabilityReport\nView insights report\nTo view a report displaying the report GUID, created DateTime stamp, and status of the report generation, run the following command:\nGet-SPORestrictedContentDiscoverabilityReport\nDownload insights report\nTo download a Restricted Content Discovery insights report, run the following command as an administrator (where <report ID> is the GUID of the generated report):\nGet-SPORestrictedContentDiscoverabilityReport -Action Download -ReportId <report ID>\nThe downloaded report is saved in the path from which the command was run.\nNext steps\nRestricted Content Discovery gives 
organizations time to review and audit permissions and deploy access controls while onboarding Copilot in a safe manner.\nUltimately, for sites that are overshared, the goal is to ensure that proper controls are in place to manage access. SharePoint Advanced Management has a suite of features, such as advanced site content lifecycle management, to help site owners and admins create a robust SharePoint governance framework.\nFrequently Asked Questions\nIs my organization eligible to use Restricted Content Discovery?\nCustomers who are licensed for Copilot and have SharePoint Advanced Management available to them can configure Restricted Content Discovery.\nWhat search scenarios enforce Restricted Content Discovery?\nRestricted Content Discovery only affects tenant-wide search (SharePoint home, Office.com, Bing) and Microsoft 365 Copilot. Only Copilot Discovery scenarios are in scope; Copilot experiences that use data-in-use, such as \"summarize the current document\" in Word, aren't impacted.\nDoes Restricted Content Discovery impact other features with dependencies on the search index, such as the Microsoft Purview product suite?\nNo, Restricted Content Discovery doesn't remove content from the tenant search index, which means Microsoft Purview features such as eDiscovery and autolabeling aren't impacted.\nHow soon can I expect Search and Copilot to reflect an update made to the Restricted Content Discovery configuration of a site?\nRestricted Content Discovery is a site-level property. Index update latency is highly dependent on the number of items in the site and the number of sites getting updated at the same time. 
For sites with more than 500,000 items, the Restricted Content Discovery update could take more than a week to fully process and reflect in search and Copilot.\nHow does Restricted Content Discovery affect the end user experience in Copilot?\nBased on usage of this feature, Copilot has less information available to reference, which could negatively affect its ability to provide accurate and comprehensive responses.\nHow does Restricted Content Discovery fit into an overall approach to prepare SharePoint data for Microsoft 365 Copilot?\nRestricted Content Discovery is designed to limit the ability of end users to search for content from specific SharePoint sites. For a more comprehensive guidance on preparing your data for Copilot, check out this\nblueprint\n.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Restricted Content Discovery", @@ -1223,7 +1223,7 @@ "https://learn.microsoft.com/en-us/sharepoint/restricted-sharepoint-search": { "content_hash": "sha256:5759c6423a0272a992a8e57b5548da139d59357668f456168e886e7de134c83b", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nRestricted SharePoint Search\nFeedback\nSummarize this article for me\nImportant\nRestricted SharePoint Search is designed for customers of Microsoft 365 Copilot chat and agentic experiences. 
It's designed as a short-term solution to allow time for your organization's administrators to thoroughly review and audit site and file permissions, but it's not intended or scalable for long-term use. Comprehensive data security solutions are available, including\nSharePoint Advanced Management\nand\nMicrosoft Purview\n.\nWhat is Restricted SharePoint Search?\nRestricted SharePoint Search is a setting that enables you as a\nSharePoint Administrator\nor\nother Microsoft 365 administrator\nto maintain a list of SharePoint sites (an \"allowed list\") for which you have checked permissions and applied data governance. The allowed list defines which SharePoint sites can be used in organization-wide search queries, and, as a temporary measure, Copilot chat and agentic experiences.\nBy default, the Restricted SharePoint Search setting is turned off and the allowed list is empty. If Restricted SharePoint Search is enabled, users can interact with files and content they own or have previously accessed in Copilot.\nHow to use Restricted SharePoint Search\nYou can use Restricted SharePoint Search as a temporary measure to prevent certain content on SharePoint sites from being shared too widely. This feature lets you as an administrator decide which SharePoint sites appear in search results across your organization and Copilot chat or agentic experiences. However, it's important to note that Restricted SharePoint Search isn't a security boundary and doesn't change any permissions on SharePoint sites.\nRestricted SharePoint Search doesn't guarantee that only sites that are on the allowed list show up in search or in Copilot chat and agentic experiences. 
If a user has recently accessed a site, or the site was shared with that user in Teams or Outlook, the site will appear in the user's results and responses, even if it's not on the allowed list.\nRecommended process\n:\nIf you use Restricted SharePoint Search, use it to temporarily help limit results in search or Copilot chat and agentic experiences. Keep in mind that it's not intended to be a long-term solution.\nUse\nSharePoint Advanced Management\nto identify and remediate oversharing risks, such as broad access and unmanaged sharing.\nUse\nMicrosoft Purview\nto apply data security and AI governance controls with sensitivity labels, data loss prevention (DLP), and auditing.\nAfter you've validated permissions and governance controls, disable Restricted SharePoint Search.\nNotify Copilot agent owners and your IT team that Copilot and agentic responses will use existing SharePoint permissions. Users' search and Copilot responses can change after Restricted SharePoint Search is disabled, so it's important to keep them informed.\nWhy should Restricted SharePoint Search be disabled after validation?\nRestricted SharePoint Search is designed to be used as a temporary measure and not a long-term solution. If you keep Restricted SharePoint Search enabled, the user experience is affected in the following ways:\nRestricted SharePoint Search has a limit of 100 sites, which isn't sustainable as your organization scales Copilot and agentic operations.\nSearch results are limited to sites on the allowed list, users' frequently visited sites, sites that users have permissions to, and users' recently accessed files. 
Restricted SharePoint Search affects the overall search experience, even for users who aren't using Copilot chat or agents.\nWhen Restricted SharePoint Search is enabled, Copilot has less information available to reference, which can affect its ability to provide accurate and comprehensive responses.\nDisabling Restricted SharePoint Search enables Copilot chat and agentic experiences to rely on standard permission-based access, rather than a restricted hub-site scope. This configuration helps Copilot chat and agents retrieve SharePoint content with more consistency and accuracy, while still respecting the permissions set on SharePoint sites.\nNeither Copilot nor Restricted SharePoint Search prevents users from accessing content they own or have previously accessed.\nHow does Restricted SharePoint Search work?\nAs a\nSharePoint Administrator\nor\nother Microsoft 365 administrator\nyou can:\nCheck the current status of Restricted SharePoint Search to see if it's enabled or disabled\nEnable or disable Restricted SharePoint Search (it's disabled by default)\nCurate the allowed list\nby identifying the top 100 widely used sites\nAdd or remove sites from the allowed list by providing the site URL\nGet the full list of sites in the allowed list\nRestricted SharePoint Search is off by default. If you decide to enable it, users who search for content or use Copilot chat and agentic experiences can expect responses to surface content from the following sources:\nAn allowed list of curated SharePoint sites (with\nup to 100 SharePoint sites\n), honoring existing site permissions\nUsers' OneDrive files, chats, emails, calendars they have access to\nFiles from users' frequently visited SharePoint sites\nFiles that were shared directly with users\nFiles that users viewed, edited, or created\nNote\nThe allowed list limit of up to 100 SharePoint sites includes Hub sites, but not their associated sites. 
When you enable Hub sites, the associated sites of a Hub site are included in the allowed list, but they don't count towards the 100-site limit. This approach allows for greater flexibility while still adhering to the existing constraints.\nWhen you're picking Hub sites, make sure all the associated sites have proper permissions.\nThe total number of files included from the last three bullet points (frequently visited sites, files shared directly with the user, and files the users viewed, edited, or created) is limited to the last 2,000 entities.\nThe following diagram shows an example of an HR Hub site with eight associated sites:\nIn the preceding diagram, all eight associated sites plus the HR Hub site are counted as one site in the allowed list.\nExample: Contoso Electronics before and after using Restricted SharePoint Search\nLet's consider Alex Wilber, a marketing specialist at Contoso Electronics. Before Alex's organization uses Restricted SharePoint Search, Alex can see not only his own personal content, like his OneDrive files, chats, emails, and content that he owns or has visited, but also content from some sites that haven't undergone access permission review or Access Control List (ACL) hygiene and don't have data governance applied. For example, Contoso Electronics has a budgeting site with important business information. Most people don't know about this site, so the site owner hasn't set up proper permissions and hasn't followed a correct data governance process. The site might be open to some users who aren't allowed to see it, such as Alex. When Alex asks Copilot for some budgeting information, Copilot gets information from the budgeting site.\nThe IT administrator at Contoso Electronics sets up Restricted SharePoint Search to limit what sites can be searched through the allowed list. After reviewing reports, the IT administrator decides to exclude the budgeting site from the allowed list. 
But even though Restricted SharePoint Search is now turned on, Alex can still access things that he owns or has recently visited, or that are directly shared with him. He just can't access any other sites, unless the site is in the allowed list and he has permission to it. Now, when Alex asks Copilot the same questions about budgeting, Copilot chat and agentic experiences don't show information from that site.\nFrom here, Contoso Electronics administrators plan to use\nSharePoint Advanced Management\nand\nMicrosoft Purview\n, and eventually, disable Restricted SharePoint Search.\nNote\nSite scoped searches aren't affected by this control. This control impacts\nmodern search\nand Copilot experiences.\nFrequently asked questions\nCan I use RSS for creating a \"deny list\" instead?\nNo, this capability isn't part of Restricted SharePoint Search. However, SharePoint Advanced Management offers a similar feature called\nRestricted Access Control for SharePoint sites\n. If you're not ready to use SharePoint Advanced Management, you can consider specifying whether to allow specific sites to show up in search results. 
For more information, see\nAllow this site to appear in Search results\n.\nNote\nIf you turn off the \"Allow this site to appear in Search results\" setting, you block the site content from showing up in both the organization-wide search and the site-specific search.\nDoes Restricted SharePoint Search affect other services that don't depend on SharePoint, such as Exchange, To Do, Planner, Loop, etc.?\nYes, any product where Enterprise Search is enabled and could have SharePoint content and/or files as search results will be impacted.\nDoes Restricted SharePoint Search affect other features based on the Microsoft Index, such as Purview or SharePoint Advanced Management features?\nNo, Restricted SharePoint Search won't affect any other features based on the Microsoft Index.\nHow do I enable Restricted SharePoint Search?\nYou can enable Restricted SharePoint Search by using PowerShell scripts. For detailed steps, see\nEnable or disable Restricted Search\n.\nAfter enabling RSS, how long does it take to go into effect?\nRSS goes into effect within an hour after it's enabled.\nIf I give the URL of a hub site, will it also include all of the child sites or sites associated with it? Do these other sites count towards the 100 sites in the allowed list?\nOnly the hub site (the URL in the Allowed list) is included in the 100. The sub sites under the hub site aren't counted against the 100 limit, but RSS is effective on the sub sites.\nNext steps\nIf you decide to use Restricted SharePoint Search, start by curating the allowed list of sites. 
You can identify the top 100 widely used sites in your organization by using reports in the\nSharePoint admin center\nor by running the\nGet-SPOSite PowerShell cmdlet\n.\nAfter curating the allowed list, enable Restricted SharePoint Search by using the\nEnable-SPORestrictedSearch PowerShell script\n.\nMonitor the impact of Restricted SharePoint Search on your organization by evaluating the SharePoint sites\nactivities\nand\nusage\nreports.\nWork with site owners and admins to review permissions and apply data governance on the sites in the allowed list. You can use\nSharePoint Advanced Management\nto set up\nadvanced access policies\nand\nadvanced site content lifecycle management\nfor specific users and groups.\nUse\nMicrosoft Purview\nto apply data security and AI governance controls with sensitivity labels, data loss prevention (DLP), and auditing for the sites in the allowed list.\nAfter you've validated permissions and governance controls for the sites in the allowed list, disable Restricted SharePoint Search by using the\nDisable-SPORestrictedSearch PowerShell script\n.\nNotify Copilot agent owners and your IT team that Copilot and agentic responses will now use existing SharePoint permissions without the restrictions of the allowed list. 
Responses can change once Restricted SharePoint Search is disabled, so it's important to keep them informed.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Restricted SharePoint Search", @@ -1232,7 +1232,7 @@ "https://learn.microsoft.com/en-us/sharepoint/advanced-management": { "content_hash": "sha256:2e0a24371a07cf7229fcf7a75cbbb8fe9a33be72e2720235d4740265ef3cdc90", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nWhat is SharePoint Advanced Management?\nFeedback\nSummarize this article for me\nSharePoint Advanced Management (SAM) is a comprehensive governance solution for SharePoint and OneDrive. With SAM, you can efficiently manage content growth, secure access, and monitor changes across your organization. 
These capabilities help you maintain control over your digital workspace and prepare your environment for Microsoft 365 Copilot.\nIn this article, you learn how SAM enables you to:\nPrevent content sprawl\nwith automated policies and insights\nManage the content lifecycle\nthrough reporting and compliance tools\nStreamline permissions and access management\nfor SharePoint and OneDrive sites\nExplore each of the following sections to discover how SAM supports your content governance needs and enhances collaboration in Microsoft 365.\nWhat you need to get started\nBefore you get started, make sure SharePoint Advanced Management (SAM) features are available to your organizations through one of the following two licensing options:\nIf your organization has a Copilot license and at least one user is assigned a Copilot license, SharePoint administrators automatically gain access to the SharePoint Advanced Management features required for Copilot deployment. The only SAM feature not included with Copilot is\nRestricted Site Creation\n.\nOrganizations without a Copilot license can access SharePoint Advanced Management features by\npurchasing a standalone SharePoint Advanced Management license\n.\nSAM features are managed by\nIT administrators\nwith access to the\nSharePoint admin center\n. Some features can also be used by site owners.\nWith the right licensing in place, you can take full advantage of SharePoint Advanced Management's three core capabilities: preventing content sprawl, managing the content lifecycle, and streamlining permissions and access management for SharePoint and OneDrive sites. The following sections provide a detailed look at each area, helping you understand how SAM empowers you to govern your organization's information effectively and securely.\nWhat is content sprawl and how can you prevent it?\nContent sprawl happens when digital files and information accumulate across your organization without effective oversight. 
This can make it harder to find what you need, increase storage costs, and create security or compliance risks. To help you prevent content sprawl, SharePoint Advanced Management offers the following key features:\nSite ownership policy:\nEnsure every SharePoint site has clear ownership and accountability with automated policies that help you manage site lifecycle and governance.\nAI insights:\nUse AI-powered recommendations to identify patterns, spot potential issues, and take action to keep your content organized and secure.\nManage inactive sites:\nAutomatically detect and address inactive SharePoint sites, reducing clutter and optimizing your storage.\nRequest site attestation:\nRegularly prompt site owners to review and confirm the relevance of their sites, helping you maintain an organized and purposeful digital environment.\nBy using these features together, you can maintain control over your digital workspace and support secure, efficient collaboration.\nSite ownership policy\nSite ownership policies\nare a part of site lifecycle management. These policies help you automatically monitor and enforce site ownership requirements across your organization. You can create these policies to define who should be responsible for each site, set minimum owner or admin counts, and automate notifications when sites don't meet your criteria. By regularly identifying noncompliant sites and prompting users to take action, site ownership policies support effective site management, reduce the risk of ownerless sites, and help maintain security and compliance in your SharePoint environment.\nAI Insights\nThe\nAI insights\nfeature for\nSharePoint Advanced Management\nuses a language model to identify patterns and potential issues from reporting and provide actionable recommendations to solve them.\nYou can find the\nGet AI insights\nbutton next to various reports in the SharePoint admin center. 
Once selected, the AI insights feature extracts patterns from the report and offers a list of potential actions.\nInactive sites policy\nYou can run automated, rule-based policies to manage and reduce inactive sites with the\nInactive SharePoint sites policy\nfeature from SharePoint Advanced Management.\nThe inactive sites policy combats content sprawl by automatically identifying and managing inactive SharePoint sites. It operates by defining inactivity criteria, such as lack of updates or user activity over a set period. Once identified, site owners receive email notifications to confirm the active/inactive state of the site.\nRequest site attestation\nRequest site attestation\nis a feature that helps you ensure that SharePoint sites remain relevant and necessary over time. With this feature, you can set up periodic reviews where site owners and site admins are prompted to attest to the continued need for their sites. This process helps identify and clean up unused or unnecessary sites, reducing content sprawl and maintaining an organized digital environment.\nHow can you manage content lifecycle?\nYou can manage the content lifecycle for SharePoint sites by:\nUsing\nsite change history reports\nto track property changes across your sites, helping you monitor updates and maintain compliance.\nReviewing\nrecent site actions\nto see the latest changes you've made, making it easier to audit activity and ensure your governance policies are followed.\nTogether, these features give you visibility into site modifications and support effective lifecycle management.\nSite change history reports\nThe\nSite change history report\nfeature lets you create change history reports in the SharePoint admin center to review SharePoint site property changes made within the last 180 days. Create up to five reports for a given date range and filter by sites and users. 
You can download the report as a .csv file to view the site property changes.\nRecent site actions\nThe\nRecent SharePoint admin actions\npolicy lets you review and monitor the last 30 changes you've made to a SharePoint site's properties within the last 30 days in the SharePoint admin center. This feature only shows changes made by you and not other administrators.\nHow can you manage permissions and access?\nMicrosoft 365 collaboration and AI experiences depend on strong permission and access controls for SharePoint and OneDrive. SharePoint Advanced Management (SAM) provides a suite of features to help you govern access, prevent oversharing, and protect sensitive data:\nAssess content management status:\nEvaluate your organization's content management practices and receive actionable insights to improve governance all in one place.\nBlock download policy:\nRestrict file downloads from SharePoint and OneDrive sites, ensuring browser-only access and preventing offline copies.\nCatalog management:\nProvide a comprehensive view of content distribution across regions, departments, users, information barriers, and custom properties defined by you.\nCompare site policies:\nEvaluate and align site-level policies to maintain consistent governance across your environment.\nConditional access policies:\nUse authentication contexts to connect a Microsoft Entra Conditional Access policy to a SharePoint site.\nData access governance reports:\nIdentify sites with overshared or sensitive content and take action to mitigate risks.\nManage data access governance via PowerShell:\nAutomate and scale your data access governance tasks using PowerShell commands.\nAgent insights:\nGain visibility into agents created in SharePoint and their activities.\nInsights on agents accessing content:\nGet insights on how the agents are accessing content across all SharePoint and OneDrive sites in your organization.\nApp insights:\nMonitor and manage non-Microsoft applications registered in your 
Microsoft Entra admin center that access your SharePoint content.\nInitiate site access reviews:\nDelegate review of overshared sites to site owners, ensuring regular validation of access permissions.\nRestrict access to all OneDrive sites by security group:\nLimit OneDrive access to specific security groups, enhancing data protection.\nRestrict access to specific OneDrives:\nControl access to individual OneDrive accounts based on user roles or group memberships.\nRestrict content discovery of SharePoint sites:\nLimit the ability of end users to search for files from specific SharePoint sites.\nRestrict site creation by users:\nEnforce policies to control who can create new SharePoint sites, reducing unnecessary sprawl.\nRestrict site creation by apps:\nSpecify the non-Microsoft applications that can create SharePoint sites, ensuring only trusted apps have this capability.\nRestrict SharePoint access by security groups:\nApply security group-based policies to further refine who can access specific SharePoint sites.\nBy using these features together, you can ensure that only authorized users have access to your organization's data, reduce the risk of data leaks, and support secure, efficient collaboration with Microsoft 365 Copilot.\nAssess content management status\nThe\nContent management assessment\nfeature in SharePoint Advanced Management aggregates a comprehensive set of tools all in one place for you to quickly assess and improve your organization's content management practices with actionable insights and recommendations.\nBlock download policy for SharePoint and OneDrive sites\nYou can block the download of files from SharePoint sites or OneDrive without needing to use Microsoft Entra Conditional Access policies. Users have browser-only access with no ability to download, print, or sync files. 
They also won't be able to access content through apps, including the Microsoft Office desktop apps.\nCatalog management\nCatalog management\nhelps you organize and govern SharePoint sites by grouping them into logical categories based on regions, departments, users, information barriers, and custom properties. This feature uses built-in Microsoft 365 metadata to enable targeted actions like content monitoring, policy enforcement, and Copilot grounding, streamlining governance and reducing administrative overhead.\nCompare site policies\nCompare site policies\nlets you evaluate and align site-level policies to maintain consistent governance across your SharePoint sites. You can compare policies such as sharing settings, sensitivity labels, and access controls between different sites to identify discrepancies and ensure uniform application of your organization's security standards.\nConditional access policies for SharePoint sites\nConditional access policies for SharePoint sites\nlet you enforce stringent access conditions when users access SharePoint sites. Authentication contexts can be directly applied to sites or used with sensitivity labels to connect Microsoft Entra Conditional Access policies to labeled sites. This ensures that only authorized users can access sensitive content based on defined security requirements.\nData access governance reports\nData access governance reports\nlets you view reports that identify sites that contain potentially overshared or sensitive content. 
You can use these reports to assess and apply appropriate security and compliance policies.\nData Access Governance management via PowerShell\nWhile Data access governance is available in the SharePoint admin center portal, large organizations usually look for\nPowerShell support\nto manage at scale via scripting and automation.\nThis article discusses all appropriate PowerShell commands available via the SharePoint Online PowerShell module to manage reports from Data access governance.\nAgent insights\nAgent insights\nis a SharePoint Advanced Management feature that lets you gain visibility into agents created in SharePoint and their activities. This report can help you monitor and manage the agents accessing your SharePoint content.\nInsights on agents' access to content\nInsights on agents' access to content\nis a SharePoint Advanced Management feature that lets you gain insights on how agents are accessing content across all SharePoint and OneDrive sites in your organization. You can see how agents interact with your content, spot access patterns, and view agent distribution across sites.\nEnterprise app insight reports\nApp insights\nis a SharePoint Advanced Management feature that lets you gain insights on the various non-Microsoft applications registered in your Microsoft Entra admin center and how they access your SharePoint content. This report can help you maintain and protect the integrity of your content.\nSite access reviews\nThe\nSite access review\nfeature in the SharePoint admin center lets you delegate the review process of\ndata access governance reports\nto the site owners of overshared sites.\nSite access review involves site owners in the review process so they can address the concern of overshared sites identified in data access governance reports.\nRestricted Access Control for SharePoint\nYou can prevent sites and content from being discovered at the site level by enabling\nRestricted Access Control for SharePoint sites\n. 
Site access restriction allows only users in the specified security group or Microsoft 365 group to access content. This policy can be used with Microsoft 365 group-connected, Teams-connected, and nongroup connected sites.\nRestricted Access Control for OneDrive\nYou can limit access to shared content of a specific user's OneDrive to only people in a security group with the\nRestricted Access Control for OneDrive\npolicy. Once the policy is enabled, anyone who isn't in the designated security group won't be able to access content in that OneDrive even if it was previously shared with them.\nTo block users from accessing all OneDrive sites as a service, you can enable the\nRestrict OneDrive service access\nfeature.\nRestrict site creation by users\nYou can control who can create new SharePoint sites in your organization by enabling the\nRestrict site creation by users\nfeature. This helps reduce unnecessary sprawl and ensures that only authorized users can create sites.\nRestrict site creation by apps\nYou can control which non-Microsoft applications can create SharePoint sites in your organization by enabling the\nRestrict site creation by apps\nfeature. This ensures that only trusted apps have the capability to create sites, enhancing security and governance.\nLicensing\nTo use SharePoint Advanced Management (SAM), your organization must have the appropriate licensing in place. 
Learn about the main options for accessing SAM features\nhere\n.\nSharePoint Advanced Management features in Microsoft 365 Copilot licenses\nLearn about SharePoint Advanced Management features included in Microsoft 365 Copilot licenses\nhere\n.\nWhich Microsoft 365 Copilot SKUs include SharePoint Advanced Management?\nLearn more about what Microsoft 365 Copilot SKUs include SharePoint Advanced Management features\nhere\n.\nHow does SAM support Microsoft 365 Copilot deployment?\nWhether preparing for\nCopilot deployment\nor managing content post-implementation, SharePoint Advanced Management offers capabilities to help you govern your SharePoint and OneDrive content effectively.\nWe recommend utilizing SharePoint Advanced Management features along with our\nbest practices for Microsoft 365 Copilot\nto reduce the risk of oversharing, control content sprawl, and manage the content lifecycle.\nRelated articles\nMicrosoft 365 Government - how to buy\nGet started with Microsoft 365 Copilot\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Advanced Management", @@ -1241,7 +1241,7 @@ "https://learn.microsoft.com/en-us/sharepoint/data-access-governance-reports": { "content_hash": "sha256:7822c5b183553379a5c9531c7474f2e879642f964fcfd6c8c91503fa7f711087", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. 
You can try\nchanging directories\n.\nData access governance reports for SharePoint and OneDrive sites\nFeedback\nSummarize this article for me\nAs sprawl and oversharing of SharePoint sites increase with exponential data growth, organizations need help with governing their data. Data access governance reports can help you govern access to SharePoint data. The reports let you discover sites that contain potentially overshared or sensitive content. You can use these reports to assess and apply the appropriate security and compliance policies.\nWhat you need to create a data access governance report\nWhat are the license requirements?\nYour organization needs to have the right license and meet certain administrative permissions or roles to use the feature described in this article.\nFirst, your organization must have one of the following base licenses:\nOffice 365 E3, E5, or A5\nMicrosoft 365 E1, E3, E5, or A5\nAdditionally, you need at least one of these licenses:\nMicrosoft 365 Copilot license:\nAt least one user in your organization must be assigned a Copilot license (this user doesn't need to be a SharePoint administrator).\nMicrosoft SharePoint Advanced Management license:\nAvailable as a standalone purchase.\nAdministrator requirements\nYou must be a\nSharePoint administrator\nor have equivalent permissions.\nAdditional information\nIf your organization has a Copilot license and at least one person in your organization is assigned a Copilot license, SharePoint administrators automatically gain access to the\nSharePoint Advanced Management features needed for Copilot deployment\n.\nFor organizations without a Copilot license, you can use SharePoint Advanced Management features\nby purchasing a standalone SharePoint Advanced Management license\n.\nData access governance reports are available in nongovernment cloud environments, as well as GCC, GCC-High, and DoD government cloud environments. 
The reports are currently unavailable for Gallatin, even if you have the required licenses.\nHow to access the Data access governance reports in the SharePoint admin center\nSign in to the\nSharePoint admin center\nwith the\nSharePoint administrator\ncredentials for your organization.\nIn the left pane, expand\nReports\nand then select\nData access governance\n.\nThe following reports are currently available from the Data access governance landing page:\nSnapshot reports\nSite permissions across your organization\n(Recommended)\nSensitivity label applied to files\nActivity reports\nSharing links\nShared with 'Everyone except external users'\nNote\nIT administrators with Microsoft 365 E5 licensing can access Data access governance reporting, but are unable to view or utilize the other\nSharePoint Advanced Management features\n. No snapshot reports are provided. No remedial actions are provided. Activity reports are available but can return only up to 10,000 sites.\nWhat are snapshot reports?\nSnapshot reports give you a snapshot of your organization's current status based on specific reporting criteria. 
These reports show data as of the date they were generated.\nCurrently, three types of snapshot reports are available:\nSite permissions report\n: Provides a comprehensive snapshot of permission structure across all SharePoint and OneDrive sites, helping you identify sites with the broadest user access (for example, sites with thousands of users, external guests, or \"Everyone except external users\" permissions).\nSite permissions for users report\n: Lists all sites a specified user can access, allowing admins to determine whether they can access the entire site or specific sections, granted directly to the user or indirectly through groups.\nSensitivity label for files report\n: Identifies SharePoint sites containing files with specific sensitivity labels applied, allowing you to verify that appropriate security policies are in place for your most sensitive content.\nWhat are activity reports?\nActivity reports help you track potential oversharing activities that occurred in the last 28 days. These reports focus on \"recently active\" sites where users created sharing links or shared content with large groups. For all activities tracked in activity reports, you can find corresponding \"baseline\" data in the\nsnapshot reports\n.\nCurrently, two types of activity reports are available to help you identify potential oversharing:\nSharing links reports\n: Identifies sites where users recently created the most sharing links (including \"Anyone,\" \"People in the organization,\" and \"Specific people\" links) to help you catch potential oversharing as it happens.\nShared with 'Everyone except external users' reports\n: Tracks sites where content is shared with all internal users in your organization, helping you identify broad internal exposure that could lead to unintended data access.\nImportant\nFor organizations without SharePoint Advanced Management:\nYou must enable data collection before you can generate activity reports. 
Here's what you need to know:\nAfter enabling data collection, the system starts collecting audit data\nData is stored for 28 days\nReports become available 24 hours after enabling collection\nReports only contain data from when collection was enabled\nIf no reports are generated for 3 months, data collection pauses and must be re-enabled\nHow to use snapshot and activity reports?\nAs part of your governance strategy, we recommend combining both snapshot and activity reports to get a complete picture of your organization's data access landscape. Here's how to use them together effectively:\nStart with snapshot reports\n: Run site permissions reports first to understand your baseline permission structure and identify sites with the broadest exposure. We recommend running these quarterly to maintain a comprehensive view of your organization's data access.\nFollow up with activity reports\n: Use sharing links and EEEU activity reports to monitor recent oversharing activities and catch emerging risks. We recommend running these monthly to stay on top of ongoing sharing activities.\nThis combination ensures you have both a complete picture of your current state and visibility into ongoing sharing activities that could create new exposure risks.\nWhat is the site permissions for your organization report?\nThe site permissions for your organization report is the first snapshot report that provides a comprehensive view of your organization's current permission structure across all SharePoint and OneDrive sites. This report analyzes every site to help you understand how broadly your data is exposed and identify potential oversharing risks. 
This snapshot approach helps you quickly assess your overall security posture and identify sites that need immediate attention.\nLearn how to create and use the\nsite permissions for your organization report here\n.\nWhat is the site permissions for users report?\nThe site permissions for users report is the next snapshot report that provides a comprehensive view into permissions of the specified users across all SharePoint and OneDrive sites. This report lists all sites a user can access and allows admins to determine whether they can access the entire site or specific sections, granted directly to the user or indirectly through groups. This approach helps you quickly assess your overall security posture and identify sites that need immediate attention.\nLearn how to create and use the\nsite permissions for users report here\n.\nWhat is the sensitivity labels for files report?\nThe sensitivity labels for files report is the other snapshot report that helps you control access to sensitive content across your organization. This report identifies sites containing\nfiles with sensitivity labels applied\n, allowing you to verify that appropriate security policies are applied.\nLearn how to use the\nsensitivity labels for files report here\n.\nWhat is the sharing links report?\nThe sharing links report is one of the two activity reports that helps you identify sites where users created the most new sharing links in the last 28 days.\nLearn how to create and use the\nsharing links report here\n.\nWhat is the 'Everyone except external users' (EEEU) report?\nEEEU is a built-in SharePoint group that automatically includes all internal users but excludes any external guests. The 'Everyone except external users' (EEEU) report is one of the two activity reports that helps you identify sites where content has been shared with your entire organization in the past 28 days. 
You can run the\nsite permissions for your organization report\nfirst to understand your organization's current EEEU sharing status, then use this activity report to monitor ongoing EEEU sharing activities. Learn how to create and use the Everyone except external users sharing activity report\nhere\n.\nLimitations or known issues\nReports may not work if you have nonpseudonymized report data selected for your organization. To change this setting, you must be a Global Administrator. Go to the\nReports setting in the Microsoft 365 admin center\nand clear\nDisplay concealed user, group, and site names in all reports\n.\nRemedial actions from Data access governance reports\nAfter discovering potential oversharing through Data access governance reports, you can take several actions to address these risks. When deciding which actions to take, consider:\nThe sensitivity of the exposed content\nThe amount of content at risk\nThe potential disruption to users and workflows\nAvailable remediation options\nFor immediate action:\nUse\nRestricted access control (RAC)\nto limit access to a specific group\nReview the\n'Change history' report\nto identify recent permission changes that may have led to oversharing\nFor collaborative remediation:\nUse the\nSite access review feature\nto request that site owners review and update permissions themselves\nThis approach ensures you can balance security needs with minimal disruption to your organization's productivity.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Data Access Governance Reports", @@ -1250,7 +1250,7 @@ "https://learn.microsoft.com/en-us/sharepoint/site-lifecycle-management": 
{ "content_hash": "sha256:cc93a9e79f30fe8f8363e7316f0d0559fe52b0490c2588d8ab71be46f18f8214", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nManage inactive sites by using inactive site policies\nFeedback\nSummarize this article for me\nThe site lifecycle management features from\nMicrosoft SharePoint Advanced Management\nhelp you improve site governance by automating policy configuration in the\nSharePoint admin center\n. Inactive site policies, part of SharePoint's site lifecycle management features, help you automate this process. You can set up an inactive site policy to automatically detect inactive sites and notify site owners by email. 
Owners can then confirm if the site is still active.\nWhat do you need to create an inactive site policy?\nWhat are the license requirements?\nYour organization needs to have the right license and meet certain administrative permissions or roles to use the feature described in this article.\nFirst, your organization must have one of the following base licenses:\nOffice 365 E3, E5, or A5\nMicrosoft 365 E1, E3, E5, or A5\nAdditionally, you need at least one of these licenses:\nMicrosoft 365 Copilot license:\nAt least one user in your organization must be assigned a Copilot license (this user doesn't need to be a SharePoint administrator).\nMicrosoft SharePoint Advanced Management license:\nAvailable as a standalone purchase.\nAdministrator requirements\nYou must be a\nSharePoint administrator\nor have equivalent permissions.\nAdditional information\nIf your organization has a Copilot license and at least one person in your organization is assigned a Copilot license, SharePoint administrators automatically gain access to the\nSharePoint Advanced Management features needed for Copilot deployment\n.\nFor organizations without a Copilot license, you can use SharePoint Advanced Management features\nby purchasing a standalone SharePoint Advanced Management license\n.\nHow do inactive site policies work?\nScope of inactive site policies\nYou can configure parameters for an inactive site policy, such as inactive time period, template type, site creation source, sensitivity labels, and exclusion of up to 100 sites.\nIn-scope site activities\nInactive site policies analyze activity across SharePoint and connected platforms like Teams, Viva Engage (formerly Yammer), and Exchange to detect a site's last activity.\nPlatform type\nActivities\nSharePoint\nViewed files, edited files, shared files internally and externally, synced files, viewed pages, visited pages\nViva Engage (formerly Yammer)\nPosted messages, read conversations, liked messages\nTeams\nPosted channel messages in a 
team across standard channels, posted messages in Teams and standard channels, replied to messages, mentioned in messages, reacted to messages, sent urgent messages, conducted meetings (recurring, ad hoc, one-time)\nExchange\nReceived emails in the Exchange mailbox\nScope of app activities\nInactive site policies don't consider app activity through an app token. They consider app activity through a user token only when a user agent is involved and meets the following criteria.\nActivity source\nCondition when activity is considered\nPnP PowerShell activity via user token\nIs not considered\nSharePoint Online PowerShell activity via user token\nIs considered only when UserAgent parameter value is passed\nCSOM scripting activity via user token\nIs considered when script explicitly sets UserAgent value\nAny other app activity via user token\nIs considered when UserAgent exists, except in the following scenarios when\n- UserAgent starts with \"client-request=id\"/\"ACTIVEMONITORING\"/\"SPORUNNERS\"\nOR\n- UserAgent ends with \"MSDEMO\"/\"MSDPLATFORM\"/\"SystemUsage\"\nOR\n- UserAgent contains \"GomezAgent\"/\"bingbot.htm\"/\"ms search 6.0 robot\"/\"http://www.monitis.com\"/\"ISV\"\nApp activity via app token\nIs not considered\nIn-scope site templates\nSite lifecycle management reviews the activity of communication sites, classic sites, Teams-connected sites, and group-connected sites with the following site template types:\nSite type\nTemplate type\nCommunication site\nSitePagePublishing#0\nClassic sites\nSTS#0, STS#1, STS#2, WIKI#0, BLOG#0, SGS#0, SPS#0, SPSNEWS#0, ENTERWIKI#0, COMMUNITY#0, DEV#0, EXPRESS#0, EHS#1, EHS#2\nTeams-connected site\nSTS#3 or Group#0\nGroup-connected site\nSTS#3 or Group#0\nOut-of-scope sites\nThe following sites are out of scope and excluded from site activity detection:\nOneDrive sites\nSites created by system users\nApp catalog sites\nRoot sites\nHome sites\nTenant admin sites\nSites associated with Shared and Private Teams 
channels\nPolicy modes\nWhen setting up a site lifecycle policy, you can choose between a simulation policy and an active policy.\nSimulation mode\nThe simulation policy runs once and generates a report based on the set parameters. If it fails, you need to delete it and create a new one. Once you validate a simulation policy, you can convert it to an active policy.\nNote\nSite lifecycle policies in simulation mode are now available in GCCH and DoD environments as of November 17, 2025.\nActive mode\nThe active policy runs monthly, generating reports and sending notifications to site owners to confirm the site's status. If it fails during a particular month, it will run again on the next schedule. The policy enforces actions on sites that remain uncertified or unattested by the site owner or admin, provided you configured it to take enforcement actions.\nHow to create an inactive site policy\nAs a SharePoint administrator, go to the\nSharePoint admin center\nand sign in.\nIn the navigation pane, expand\nPolicies\n, and then select\nSite lifecycle management\n.\nUnder\nInactive site policies\n, select\nOpen\n.\nSelect\n+ Create policy\n. Review the\nManage inactive sites\ninformation, and then select\nNext\n.\nOn the\nSet policy scope\nstep, choose your policy scope option, and then select\nNext\n.\nIf you select\nUpload a CSV file with a list of up to 10,000 URLs\n, you can upload a list of site URLs for the policy.\nTip\nYou can export the site list from the SharePoint active sites page.\nEnsure the CSV file uses the same format as the sample CSV file, has no duplicate URLs, and that all URLs are valid and complete. \nEnsure the URLs listed in the CSV file belong to your tenant's domain. \nIf you chose\nInclude sites with retention policies and retention holds\n, this option applies only to sites that are governed by\nMicrosoft Purview retention policies and labels\n. 
This setting doesn't affect the inclusion or exclusion of sites that are in a read-only or locked state.\nOn the\nConfigure policy\nstep, specify an inactivity period, who to notify, and what actions you want to take after three notifications. Then select\nNext\n.\nChoose to send emails to site owners or site admins, or both.\nIf you plan to exclude users or groups, see the section\nExcluding users or groups from a policy\n(in this article).\nCustomize the email content to provide more context and instructions to the email recipients.\nChoose enforcement actions if there's no response from site owners or admins after three notifications.\nFor sites that aren't certified or attested after 3 monthly notifications for any of the site lifecycle management policies, you can take one of the following enforcement actions.\nEnforcement action\nPolicy behavior\nDo nothing\nSite owners or site admins receive monthly notifications for three months. After this period, no notifications are sent for the next three months. If the site remains unattested after six months, monthly notifications resume. The policy execution report lists unattested sites as unactioned by the site owner. You can download this report and filter out sites marked as unactioned.\nRead-only access\nSite owners or site admins receive monthly notifications for three months. If the notification recipients don't mark the site as attested during this period, the site goes into read-only mode.\nArchive sites after mandatory read-only period\nSite owners or site admins receive monthly notifications for three months. If the notification recipients don't mark the site as attested during this period, then the site goes into a read-only mode for a configurable duration (3, 6, 9, or 12 months). After the configured number of months, the site gets archived through\nMicrosoft 365 Archive\n. 
Archival is subject to the tenant enabling Microsoft 365 Archive in the Microsoft 365 admin center.\nThe following screenshot shows an example of configuring enforcement actions for a site attestation policy:\nNote\nIf you configure the policy to take an enforcement action:\nNo further notifications are sent after the policy action succeeds.\nThe site and its status are included in the monthly report.\nOn the\nFinish\nstep, specify a name and description for your policy, and select a policy mode. Then select\nFinish\n.\nAfter your policy is created, select\nDone\n.\nOnce your policy is deployed, you can view and manage it in the\nSite lifecycle management\n>\nInactive site policy\ndashboard.\nExcluding users or groups from a policy\nYou can exclude specific users, Microsoft 365 Groups, or security groups from receiving site lifecycle management requests and notifications, even if they're site owners or site admins for sites that are included in a policy.\nKey behaviors\n:\nExclusions are used only to determine notification recipients.\nExcluding a user or group doesn't change site permissions or ownership, and doesn't exclude the site from lifecycle policy evaluation.\nSites continue to be evaluated by the policy as usual.\nLimits\n:\nYou can add up to 100 entries to the exclusion list.\nEach entry can be an individual user, a Microsoft 365 Group, or a security group.\nThe 100-entry limit applies to the number of entries, not the number of users within a group. 
For example, a group with more than 100 members counts as one entry.\nGroup exclusion behavior (important)\n:\nWhen a group is added to the exclusion list for a policy, that group is excluded from notifications only when the group is directly added to the site or is a nested group within other groups that are directly added to the site.\nA member of an excluded group might still receive a notification if they're directly added to the site or are part of some other group that is directly added to the site.\nInactive site notifications to site owners or site admins\nNotifications inform SharePoint site owners or site admins when a site is inactive for a specified number of months. To keep the site, the notification recipients should select the\nCertify site\nbutton in the notification email. Once certified, Site lifecycle management doesn't check the site's activity for one year.\nCustomize email notifications\nAdmins can customize the emails that Site Lifecycle Management policies send to site owners and admins for certification or attestation. Customizing email content helps improve the read-through rate of the emails sent, improving response efficiency and contributing to better governance across the tenant.\nThe option to customize emails is available in the configure step for all site lifecycle management policies.\nSelecting\nCustomize email to be sent\nopens the customization window as follows:\nCustomizable section\nDescription\nSender\nConfiguring a custom domain (in the Microsoft 365 admin center) is a prerequisite to using the email customization feature. 
For more information, see\nChoose which domain to use for your email\n.\nSubject\n(up to 100 characters)\nYou can use\n$UserDisplayName\nto insert the user's name and\n$SiteName\nto insert the name of the site.\nMessage\n(up to 500 characters)\nYou can use\n$UserDisplayName\nto insert the user's name,\n$SiteName\nto insert the name of the site, and\n$SiteUrl\nto insert the URL of the site.\nPolicy guideline URL\nOnly valid HTTP links are allowed.\nPolicy guideline description text\nDefault value is the placeholder text.\nYou can also customize emails for existing policies. To customize emails, follow these steps:\nSelect an existing policy.\nGo to\nEdit configuration\n.\nFind the email customization option.\nNote\nIf you don't configure email customization for a policy, the system continues to send default emails from\nnoreply@sharepoint.com\n.\nWhat to do if you can't customize email messages\nYou might not be able to customize emails if the custom domain setting isn't configured or is turned off.\nYou must configure the\nSend email notifications from your domain\nsetting in the Microsoft 365 admin center before you can customize emails. 
If this setting isn't configured, you see a warning message at the top of the policy list, as shown in the following image:\nYou might also see the warning message during the configuration step, as shown in the following screenshot:\nIf you previously customized emails in one or more policies, but the\nSend email notifications from your domain\nsetting in the Microsoft 365 admin center is later turned off, you see the message bar in the policy list, and a warning message in the email customization window, as shown in the following screenshot:\nNote\nOnly someone who has the Global Administrator role can configure domain settings in the Microsoft 365 admin center.\nSites managed by multiple site lifecycle management policies\nFor each type of site lifecycle management policy, such as\nsite ownership policy\n,\ninactive site policy\n, and\nsite attestation policy\n, if you create multiple policies under the same type, notification emails aren't repeated. If a notification was sent within the last 30 days from any policy of that type, and the site remains uncertified, no further notifications are sent. The policy execution report shows the site's status as \"Notified by another policy.\"\nFor example, if a site is covered by two different inactive site policies and receives a notification email from the first policy, the second policy doesn't send any additional notifications within the next 30 days if the site remains uncertified.\nWe recommend ensuring that policies of the same type don't have overlapping scopes. If sites fall under the scope of multiple policies of the same type, the notification schedule and enforcement actions on the site could become unpredictable.\nEnforcement actions\nThe following table summarizes how the inactive site policy behaves based on the selected enforcement action:\nEnforcement action\nPolicy behavior\nDo nothing\nThe specified recipients receive monthly notifications for three months. 
After this period, the policy sends no notifications for the next three months. If the site remains inactive after six months, monthly notifications resume. The policy execution report lists inactive sites as unactioned by the site owner. You can download this report and filter out sites marked as unactioned.\nRead-only access\nSite owners or site admins receive monthly notifications for three months. If the notification recipients don't mark the site as certified during this period, the site goes into read-only mode.\nArchive sites after mandatory read-only period\nSite owners or site admins receive monthly notifications for three months. If the notification recipients don't mark the site as certified during this period, then the site goes into a read-only mode for a configurable duration (3, 6, 9, or 12 months). After the configured number of months, the site gets archived through\nMicrosoft 365 Archive\n. Microsoft 365 Archive must be enabled in the Microsoft 365 admin center.\nImportant\nSite lifecycle policies leverage Outlook Actionable Messages to enable site owners or site admins to take necessary actions within email.\nFor notifications to render properly, ensure\nOutlook version requirements\nare met in your organization.\nIf you're a US Government cloud customer, see\nSet up actionable emails for SLM policies in US Government cloud environments\n(in this article).\nTo troubleshoot rendering issues, refer to\nfrequently asked questions\n.\nWhen a site owner or site admin selects the site URL in the notification email, this action does\nnot\ncount as site activity. The site remains inactive. Additionally, any\nread actions\ndone on the site within one hour of visiting from the email aren't considered activity. 
However, any\nedits\nmade to the site count as activity and reset the inactivity status.\nTip\nBefore creating an inactive site policy, check for any site access restriction policies that could disrupt site attestation by the respective site owner.\nRead-only mode\nAn inactive site policy configured with the read-only enforcement action sends additional notifications to inform site owners or site admins when there's no response.\nWhen a site goes into read-only mode, a notification is sent.\nIf a site is in read-only mode, the following banner is added to the site:\nImportant points about read-only or locked sites\nFor sites that are in a read-only or locked state, the following behaviors are expected.\nUnlocked sites: Always included in policy scope\nRead-only sites locked by the same policy type:\nIncluded in scope\nReport indicates the site was previously actioned by this policy\nRead-only sites locked by a different policy type:\nExcluded from policy scope\nAnother policy already owns and governs this site\nRead-only sites externally locked (locked because of non-site lifecycle management reasons):\nIncluded in scope\nExternal locks do not prevent the site from being evaluated by the policy\nNo-access (fully locked) sites:\nIncluded in scope, but no enforcement action is taken\nThe policy skips action because the site is already in a no access locked state\nThese are default behaviors that can't be modified through policy configurations.\nRemove site from read-only mode\nTo remove a site from read-only mode in\nSharePoint admin center\n, go to the\nActive sites\npage, select the site, and then select\nUnlock\nfrom the site page panel.\nSite owners can't remove a site from read-only mode and must contact the tenant admin to remove read-only mode.\nUnarchive a site\nTo unarchive a site in\nSharePoint admin center\n, expand\nSites\nand select\nArchived sites\n. 
Select the site you want to unarchive and select\nReactivate\n.\nNote\nOnly tenant admins can reactivate an archived site.\nReporting\nThe policy execution report lists sites that are inactive for six months. You can download the report as a .csv file and filter out sites that site owners consider unactioned.\nThe following table describes the information included in the policy execution report:\nColumn\nDefinition\nSite name\nName of the site\nURL\nURL of the site\nTemplate\nTemplate of the site\nConnected to Teams\nIs it a Teams-connected site or not\nSensitivity label\nSensitivity label of the site\nRetention policy\nIs any retention policy applied to the site or not\nSite lock state\nState of site access\nbefore\nthe policy runs (Unlock/Read-Only/No access)\nLast activity date (UTC)\nDate of last activity detected by inactive site policy across SharePoint site and connected workloads (Exchange, Viva Engage (formerly Yammer), or Teams)\nSite creation date (UTC)\nDate when the site was created\nStorage used (GB)\nStorage consumed by the site\nNumber of site owners\nTotal count of site owners for the site\nEmail address of site owners\nEmail addresses of all site owners\nNumber of site admins\nTotal count of site admins for the site\nEmail address of site admins\nEmail addresses of all site admins\nAction status\nStatus of the site (First, second, or third notification sent; Site in read-only mode; Site archived; Action taken by another policy: read-only, archive, or notified by another policy)\nTotal notifications count\nTotal notifications sent so far by any policy under the same policy template\nAction taken on (UTC)\nDate on which the enforcement action was taken (date when site was archived or put in read-only mode)\nDuration in read-only\nNumber of days the site is in the enforced read-only state\nSet up actionable emails for SLM policies in US Government cloud environments\nIn US Government Cloud (GCC High and DoD) environments, a tenant administrator must 
complete an extra, one-time setup for SharePoint site lifecycle management (SLM) policies to use\nactionable messages\n. This step helps ensure that policy notification emails display and function correctly. For example, site admins and site owners can take actions directly from email.\nUnlike other commercial cloud environments, GCC High and DoD tenants require explicit administrator approval of the actionable message provider before it can send interactive email messages. Without this approval, SLM policy emails are delivered, but action buttons don't function as expected.\nImportant\nYou must be a Global Administrator or Exchange Administrator in the tenant to set up actionable messages.\nApprove the SLM actionable message provider\nGo to the\nOutlook Actionable Messages – Connectors admin portal for GCCH or DoD\nand sign in.\nIn the\nProvider Status\nfilter, select\nApproved by Microsoft – Pending Your Approval\n.\nLocate the provider named\nInactiveSiteOAMProviderGCCH\n.\nSelect the provider, and then select\nApprove\n.\nAfter you approve the provider, the SLM policy notifications send actionable messages.\nNote\nThis approval applies only to SLM policy notifications. Other applications or services that use actionable messages might require separate approval.\nEnsure actionable messages are enabled for the tenant\nSite lifecycle management policies use Outlook actionable messages to enable site owners or site administrators to take necessary actions by using links within email messages.\nFor notifications to render properly, make sure your organization meets the\nOutlook version requirements\n.\nTo troubleshoot rendering problems, see\nfrequently asked questions\n.\nTroubleshooting actionable messages\nIf actionable messages don't work as expected, try these steps:\nMake sure that the\nInactiveSiteOAMProviderGCCH\nprovider is in an approved state.\nAllow sufficient time. 
It can take up to 24 hours for changes to propagate.\nRelated articles\nMicrosoft 365 group expiration policy\nRestore deleted sites\nOverview of Microsoft 365 Archive\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Site Lifecycle Management", @@ -1259,7 +1259,7 @@ "https://learn.microsoft.com/en-us/sharepoint/request-site-attestations": { "content_hash": "sha256:9dd14311a2832c87c70a4e0ee93b965316f140e40e4f0447f8a62118c9bc847d", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nRequest recurring site attestations for SharePoint sites\nFeedback\nSummarize this article for me\nThe site lifecycle management features in\nMicrosoft SharePoint Advanced Management\nhelp your organization improve site governance by automating policy configuration in the\nSharePoint admin center\n. Site attestation policies, part of SharePoint's site lifecycle management features, help you manage periodic attestation of sites at scale.\nSite attestation involves regular reviews by site owners or site admins to check and confirm the accuracy of site information, including the site's necessity, its owners, members, permissions, and sharing settings. For sites that remain unattested, you can choose to automate enforcement actions to prevent risks of content overexposure. 
This approach ensures ongoing site compliance and actively reduces risks such as information oversharing.\nThis article describes how to create and configure a site attestation policy.\nRequirements for a site attestation policy\nWhat are the license requirements?\nYour organization needs to have the right license and meet certain administrative permissions or roles to use the feature described in this article.\nFirst, your organization must have one of the following base licenses:\nOffice 365 E3, E5, or A5\nMicrosoft 365 E1, E3, E5, or A5\nAdditionally, you need at least one of these licenses:\nMicrosoft 365 Copilot license:\nAt least one user in your organization must be assigned a Copilot license (this user doesn't need to be a SharePoint administrator).\nMicrosoft SharePoint Advanced Management license:\nAvailable as a standalone purchase.\nAdministrator requirements\nYou must be a\nSharePoint administrator\nor have equivalent permissions.\nAdditional information\nIf your organization has a Copilot license and at least one person in your organization is assigned a Copilot license, SharePoint administrators automatically gain access to the\nSharePoint Advanced Management features needed for Copilot deployment\n.\nFor organizations without a Copilot license, you can use SharePoint Advanced Management features\nby purchasing a standalone SharePoint Advanced Management license\n.\nHow does a site attestation policy work?\nWhen a site attestation policy runs (usually on a monthly basis), it generates a report that lists sites requiring attestation according to policy criteria. Site owners and site admins are notified via email, prompting them to review and address potential issues. 
Depending on how the policy is configured, specified actions can be taken for sites that are ownerless or that pose a potential risk to your organization.\nAs a SharePoint administrator, you can specify the scope of a site attestation policy, what actions should occur, and whether you have exceptions to the policy.\nPart 1 - Create a site attestation policy\nAs a\nSharePoint administrator\n, go to the\nSharePoint admin center\nand sign in.\nIn the navigation pane, expand\nPolicies\n, and then select\nSite lifecycle management\n.\nUnder\nSite attestation policies\n, select\nOpen\n.\nSelect\nCreate a policy\n.\nOn the\nOverview\npage of\nManage site attestation\n, review the information, and then select\nNext\n. Then proceed to the next section.\nPart 2 - Define the scope of a site attestation policy\nTo define the scope of a site attestation policy, on the\nSet policy scope\npage, select one of the following options:\nUpload a CSV file with a list of up to 10,000 URLs\n; or\nSelect sites at scale\nScope of site attestation policies\nYou can create site lifecycle policies with different scopes based on your organization's requirements. Choose the sites to include in the policy based on:\nSite templates\nCreation sources\nSensitivity labels\nWhether to include sites under retention policies and retention holds\nTo exclude specific sites, add the site URLs of up to 100 sites in the\nExclude sites\nsection while configuring the policy.\nUpload a CSV file with the list of sites\nIf you select\nUpload a CSV file with a list of up to 10,000 URLs\n, you can upload a list of site URLs for the policy.\nTip\nYou can export the site list from the SharePoint active sites page.\nEnsure that the CSV file uses the same format as the sample CSV file and has no duplicate URLs. Also, make sure the URLs are valid and complete. \nEnsure the URLs listed in the CSV file belong to your tenant's domain. 
\nSelect sites at scale\nIf you choose\nSelect sites at scale\n, you can select site templates to include in this policy, and then filter them by:\nSensitivity label\nSite creation source\nYou can also choose whether to:\nInclude sites with retention policies and retention holds\nExclude specific sites from this policy\nSelect site template types\nSelect site template types from the following list:\nAll sites\nClassic sites\nCommunication sites\nGroup connected sites without teams\nTeam sites without Microsoft 365 group\nTeams-connected sites\nFilter sites by sensitivity labels\nSet policy scope by filtering sites by their sensitivity labels.\nNote\nIf your tenant doesn't use sensitivity labels, this option is unavailable.\nFilter sites by creation source\nFilter sites for the policy scope by site creation source:\nSharePoint Home\nSharePoint admin center\nPowerShell\nPnP\nTeams\nInclude sites with retention policies and retention holds\nThe include/exclude option for sites with retention policies and holds applies only to sites that are governed by\nMicrosoft Purview retention policies and labels\n.\nThis setting does not impact the inclusion or exclusion of sites that are in read-only or locked states.\nFor sites that are in a read-only or locked state, the following behaviors are expected:\nUnlocked sites: Always included in policy scope\nRead-only sites locked by the same policy type:\nIncluded in scope\nReport indicates the site was previously actioned by this policy\nRead-only sites locked by a different policy type:\nExcluded from policy scope\nAnother policy already owns and governs this site\nRead-only sites externally locked (locked because of non-site lifecycle management reasons):\nIncluded in scope\nExternal locks do not prevent the site from being evaluated by the policy\nNo access (fully locked) sites:\nIncluded in scope, but no enforcement action is taken\nThe policy skips action because the site is already in a no access locked state\nThese are default 
behaviors and can't be modified through policy configuration options.\nExclude specific sites from the policy\nEnter up to 100 sites that you want to exclude from the site attestation policy. Be sure to separate each URL by new lines.\nAfter setting the policy's scope, select\nNext\n, and then proceed to the next section.\nPart 3 - Configure the site attestation policy settings\nNote\nA site is scoped for attestation by a policy only if\nboth\nof the following conditions are true:\nThe site's creation date is older than the attestation threshold\nThe site's last attestation date is also older than the same threshold\nThe attestation threshold is calculated using the policy run date minus the attestation period (how often you want sites to be attested, configured by SharePoint administrators). If either condition isn't met, the site is excluded from attestation for that run.\nKeep the following points in mind:\nRecently attested sites are skipped\nNewly created sites are skipped (because a grace period applies)\nOnly sites that are both old enough and due for reattestation are included\nWhen you're setting up a site attestation policy, on the\nConfigure policy\nstep, you can:\nChoose how often you want the sites to be attested (3 months, 6 months, and 12 months).\nIdentify who is responsible for attesting the site (site owners, site admins, or both).\nExclude site owners or admins from receiving requests.\nSpecify what action the policy should take after three notifications.\nExclude site owners or admins from receiving requests\nYou can exclude specific users, Microsoft 365 Groups, or security groups from receiving site lifecycle management requests and notifications, even if they're site owners or site admins for sites that are included in a policy.\nKey behaviors\n:\nExclusions are used only to determine notification recipients.\nExcluding a user or group doesn't change site permissions or ownership, and doesn't exclude the site from lifecycle policy 
evaluation\nSites continue to be evaluated by the policy as usual.\nLimits\n:\nYou can add up to 100 entries to the exclusion list.\nEach entry can be an individual user, a Microsoft 365 Group, or a security group\nThe 100-entry limit applies to the number of entries, not the number of users within a group. For example, a group with more than 100 members counts as one entry.\nGroup exclusion behavior (important)\n:\nWhen a group is added to the exclusion list for a policy, that group is excluded from notifications only when the group is directly added to the site or is a nested group within other groups that are directly added to the site.\nA member of an excluded group might still receive a notification if they're directly added to the site or are part of some other group that is directly added to the site.\nActions to take on unattested sites after three notifications\nFor sites that are unattested after three monthly notifications, you can choose to either do nothing or take one of the following enforcement actions. The following table summarizes how the inactive site policy behaves for each selected enforcement action:\nEnforcement action\nPolicy behavior\nDo nothing\nSite owners or site admins receive monthly notifications for three months. After this period, no notifications are sent for the next three months. If the site remains unattested after six months, monthly notifications resume. The policy execution report lists unattested sites as unactioned by the site owner. You can download this report and filter out sites marked as unactioned.\nRead-only access\nSite owners or site admins receive monthly notifications for three months. If the notification recipients don't mark the site as attested during this period, the site goes into read-only mode.\nArchive sites after mandatory read-only period\nSite owners or site admins receive monthly notifications for three months. 
If the notification recipients don't mark the site as attested during this period, then the site goes into a read-only mode for a configurable duration (3, 6, 9, or 12 months). After the configured number of months, the site gets archived through\nMicrosoft 365 Archive\n. Archival is subject to the tenant enabling Microsoft 365 Archive in the Microsoft 365 admin center.\nNote\nIf you configure the policy to take an enforcement action:\nThe notifications stop after the policy action succeeds.\nThe site and its status are included in the monthly report.\nCustomize email notifications\nAdmins can now customize the emails sent by the Site Lifecycle Management policies to site owners and admins for certification or attestation. Customizing email content helps improve the read-through rate of the emails sent, improving response efficiency and contributing to better governance across the tenant.\nThe option to customize emails is available in the configure step for all site lifecycle management policies.\nSelecting\nCustomize email to be sent\nopens the customization window as follows:\nCustomizable section\nDescription\nSender\nConfiguring a custom domain (in the Microsoft 365 admin center) is a prerequisite to using the email customization feature. For more information, see\nChoose which domain to use for your email\n.\nSubject\n(up to 100 characters)\nYou can use\n$UserDisplayName\nto insert the user's name and\n$SiteName\nto insert the name of the site.\nMessage\n(up to 500 characters)\nYou can use\n$UserDisplayName\nto insert the user's name,\n$SiteName\nto insert the name of the site, and\n$SiteUrl\nto insert the URL of the site.\nPolicy guideline URL\nOnly valid HTTP links are allowed\nPolicy guideline description text\nDefault value is the placeholder text\nYou can also customize emails for existing policies. 
To customize emails, follow these steps:\nSelect an existing policy.\nGo to\nEdit configuration\n.\nFind the email customization option.\nNote\nIf you don't configure email customization for a policy, the system continues to send default emails from\nnoreply@sharepoint.com\n.\nWhat to do if you can't customize email messages\nYou might not be able to customize emails if the custom domain setting isn't configured or is turned off.\nYou must configure the\nSend email notifications from your domain\nsetting in the Microsoft 365 admin center before you can customize emails. If this setting isn't configured, you see a warning message on the top of the policy list, as shown in the following image:\nYou might also see the warning message during the configuration step, as shown in the following screenshot:\nIf you previously customized emails in one or more policies, but now the\nSend email notifications from your domain\nsetting in the Microsoft 365 admin center is turned off later, you see the message bar in the policy list, and a warning message in the email customization window, as shown in the following screenshot:\nNote\nOnly someone who has the Global Administrator role can configure domain settings in the Microsoft 365 admin center.\nAfter configuring the policy settings, select\nNext\nto finish your policy. Name the policy, add a description (optional), and select a policy mode.\nSelect\nFinish\n. 
You can now view and manage your policy from the\nSite lifecycle management\n>\nSite attestation policy\ndashboard.\nSite set as read-only mode\nWhen you configure an unattested site policy with the read-only enforcement action, it sends extra notifications to inform site owners or site admins that the site goes into read-only mode.\nWhen the site is in read-only mode, the following banner appears on the site:\nRemove site from read-only mode\nTo remove a site from read-only mode in the\nSharePoint admin center\n, go to the\nActive sites\npage, select the site, and then select\nUnlock\nfrom the site page panel.\nSite owners can't remove a site from read-only mode. They must contact the tenant admin to remove read-only mode.\nUnarchive a site\nTo unarchive a site in\nSharePoint admin center\n, expand\nSites\nand select\nArchived sites\n. Select the site you want to unarchive and select\nReactivate\n.\nNote\nOnly tenant admins can reactivate an archived site.\nSites managed by multiple site attestation policies\nFor each type of site lifecycle management policy, such as\nsite ownership policy\n,\ninactive site policy\n, and\nsite attestation policy\n, if you create multiple policies, notification emails aren't repeated. If a notification was sent within the last 30 days from any policy of that type, and the site remains unattested or uncertified, no further notifications are sent. The policy execution report shows the site's status as \"Notified by another policy.\"\nFor example, if a site is covered by two different site attestation policies and receives a notification email from the first policy, the second policy doesn't send any extra notifications within the next 30 days if the site remains unattested.\nMake sure that policies of the same type don't have overlapping scopes. 
If sites fall under the scope of multiple policies of the same type, the notification schedule and enforcement actions on the site could become unpredictable.\nPart 4 - Deploy your policy in either simulation or active mode\nAfter you have configured your site attestation policy, and you're on the\nFinish\nstep, you can specify whether to test your policy or activate it immediately.\nOn the\nFinish\nstep of setting up your site attestation policy, give your policy a name and description.\nUnder\nPolicy mode\n, select either the\nSimulation\nor\nActive\nmode.\nPolicy modes\nWhen setting up a site lifecycle policy, you can choose between a simulation policy and an active policy.\nSimulation mode\nThe simulation policy runs once and generates a report based on the set parameters. If it fails, you need to delete it and create a new one. Once you validate a simulation policy, you can convert it to an active policy.\nNote\nSite lifecycle policies in simulation mode are now available in GCCH and DoD environments as of November 17, 2025.\nActive mode\nThe active policy runs monthly, generating reports and sending notifications to site owners to confirm the site's status. If it fails during a particular month, it will run again on the next schedule. The policy enforces actions on sites that remain uncertified or unattested by the site owner or admin, provided you configured it to take enforcement actions.\nReporting\nAfter each run of the configured policy, you can view a detailed report about the sites it identifies.\nIn the\nSite attestation policies\npage, select the desired policy from the list.\nThe panel outlines the numbers of:\nSites to attest\nSites that didn't have anyone to notify\nSites attested\nSites set to read-only\nArchived sites\nYou can also view the policy's scope, configuration, and general information on the panel. 
Select the\nDownload detailed report\noption to download the report in CSV containing the following details for each of the sites identified due for attestation:\nColumn\nDefinition\nSite name\nName of the site\nURL\nURL of the site\nTemplate\nTemplate of the site\nConnected to Teams\nIndicates if it's a Teams-connected site\nSensitivity label\nSensitivity label assigned to the site\nRetention policy\nIndicates if any retention policy is applied to the site\nSite lock state\nState of site access\nbefore\nthe policy runs (Unlock/Read-Only/No access)\nNotified site admins\nEmail addresses of site admins receiving attestation notifications\nNotified site owners\nEmail addresses of site owners receiving attestation notifications\nLast attested by\nEmail address of the person who last attested the site\nLast attestation date (UTC)\nDate when the site was last attested\nNumber of site owners\nTotal count of site owners for the site\nEmail address of site owners\nEmail addresses of all site owners\nNumber of site admins\nTotal count of site admins for the site\nEmail address of site admins\nEmail addresses of all site admins\nTotal notifications count\nTotal notifications sent so far by any policy under the same policy template\nAction status of policy\nStatus of the site (First, second, or third notification sent, Site in read-only mode, Site archived, Action taken by another policy such as read-only, archive, or notified by another policy)\nAction taken on (UTC)\nDate on which the enforcement action was taken (date when site was archived or put in read-only mode)\nLast activity date (UTC)\nDate of last activity detected across SharePoint site and connected workloads\nSite creation date (UTC)\nDate when the site was created\nStorage used (GB)\nStorage consumed by the site\nDuration in read-only (days)\nNumber of days the site is in the enforced read-only state\nConfigure actionable emails for US Government Cloud (GCC High or DoD) environments\nIn US Government Cloud (GCC 
High and DoD) environments, a tenant administrator must complete an extra, one-time setup for SharePoint site lifecycle management (SLM) policies to use\nactionable messages\n. This step helps ensure that policy notification emails display and function correctly. For example, site admins and site owners can take actions directly from email.\nUnlike other commercial cloud environments, GCC High and DoD tenants require explicit administrator approval of the actionable message provider before it can send interactive email messages. Without this approval, SLM policy emails are delivered, but action buttons don't function as expected.\nImportant\nYou must be a Global Administrator or Exchange Administrator in the tenant to set up actionable messages.\nApprove the SLM actionable message provider\nGo to the\nOutlook Actionable Messages – Connectors admin portal for GCCH or DoD\nand sign in.\nIn the\nProvider Status\nfilter, select\nApproved by Microsoft – Pending Your Approval\n.\nLocate the provider named\nInactiveSiteOAMProviderGCCH\n.\nSelect the provider, and then select\nApprove\n.\nAfter you approve the provider, the SLM policy notifications send actionable messages.\nNote\nThis approval applies only to SLM policy notifications. Other applications or services that use actionable messages might require separate approval.\nEnsure actionable messages are enabled for the tenant\nSite lifecycle management policies use Outlook actionable messages to enable site owners or site administrators to take necessary actions by using links within email messages.\nFor notifications to render properly, make sure your organization meets the\nOutlook version requirements\n.\nTo troubleshoot rendering problems, see\nfrequently asked questions\n.\nTroubleshooting actionable messages\nIf actionable messages don't work as expected, try these steps:\nMake sure that the\nInactiveSiteOAMProviderGCCH\nprovider is in an approved state.\nAllow sufficient time. 
It can take up to 24 hours for changes to propagate.\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Site Attestation", @@ -1277,7 +1277,7 @@ "https://learn.microsoft.com/en-us/sharepoint/insights-on-sharepoint-agents": { "content_hash": "sha256:919c73c3e7cb8bcd6dc2503d343c965b22800b2bfa1199d401f3e1a7a271a279", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nInsights report on agents in SharePoint\nFeedback\nSummarize this article for me\nInsights report on agents in SharePoint provides SharePoint Administrators with rich information on the recently created agents across all SharePoint sites and OneDrive sites within their organization. This report provides admins with the ability to learn about the sites with the highest number of agents created. 
Using this report, SharePoint admins can further govern and maintain the integrity of the content used by agents as grounding data.\nThe insights report is based on the Microsoft audit data logged for the agents, created in SharePoint, through the FileCreated and FileRenamed events.\nYou can generate and manage agent insights report in SharePoint Admin Center or with SharePoint Online Management Shell.\nWhat do you need to access agent insights report\nLicense requirements\nYour organization needs to have the right licenses and meet certain administrative permissions or roles to use the feature described in this article.\nFirst, your organization must have one of the following licenses:\nOffice 365 E3, E5, or A5\nMicrosoft 365 E1, E3, E5, or A5\nAdditionally, you need to have a Microsoft 365 Copilot license.\nNote\nAt least one user in your organization must be assigned a Copilot license (this user doesn't need to be a SharePoint administrator).\nIf your organization has a Copilot license and at least one person in your organization is assigned a Copilot license, SharePoint administrators automatically gain access to the\nSharePoint Advanced Management features needed for Copilot deployment\n.\nAdministrator requirements\nYou must be a\nSharePoint administrator\nor have equivalent permissions.\nImportant\nIf you don't have a Microsoft SharePoint Advanced Management license, you're asked to enable data collection, so that relevant audit data is collected to build the report. Once enabled, the reports can be generated 24 hours later and contain data from the point of collection. Data is stored for 28 days. If no reports are generated at least once in three months, data collection is paused and should be enabled again. 
To enable data collection for these reports, see the\nData collection for insights report on agents in SharePoint\nsection (in this article).\nHow to access agent insights report in SharePoint Admin Center\nSign in to the\nSharePoint admin center\nwith the\nSharePoint administrator\ncredentials for your organization.\nIn the navigation pane, expand\nReports\nand then select\nAgent Insights\n.\nHow to create reports in SharePoint Admin Center\nAs a SharePoint Administrator, you can create the report by selecting\nCreate a report\n.\nProvide the Report name and under Report duration, specify the time frame for the report.\nSelect\nCreate and run\n.\nNote\nYou can create a report for the past 1, 7, 14, or 28 days.\nView report status in SharePoint Admin Center\nTo check if a report is ready or when it was last updated, see the \nStatus\n column.\nView report in SharePoint Admin Center\nWhen a report is ready, select it to view the data. You can view the top 100 records hosting the highest number of agents. You can search for sites or filter on the site template, and governance policies.\nApply Content governance policies in SharePoint Admin Center\nYou can apply content governance policies on the sites from the insights report. The policies available are\nRestrict site access policy\nand\nRestrict Content Discovery policy\n.\nNote\nAfter a policy is applied to the site from the insights report, the policy status on the existing report won't be updated. To view the updated status of the policy on the site, select the policy to view the latest status or access the Active site panel and review the site settings.\nAgent insights report in SharePoint Online Management Shell\nYou can generate and manage agent insights report using SharePoint Online Management Shell.\nIf you haven't already done so, download and install the latest version of\nSharePoint Online Management Shell\n.\nConnect to SharePoint Online as at least a \nSharePoint administrator\nin Microsoft 365. 
For more information, see\nGetting started with SharePoint Online Management Shell\n.\nTo generate and view these reports, ensure the organization has the SharePoint Advanced Management add-on SKU or Microsoft 365 Copilot license.\nWith permissions of at least a SharePoint administrator, you can generate and view the insights report using the following commands:\nTo generate a report for a one-day default report duration, run the following command:\nStart-SPOCopilotAgentInsightsReport\nTo generate a report for any other duration (7, 14, or 28 days), run the following command:\nStart-SPOCopilotAgentInsightsReport -ReportPeriodInDays\nFor example, to generate a report for the past 28 days, run the command:\nStart-SPOCopilotAgentInsightsReport -ReportPeriodInDays 28\nTo check the status of all active and available reports, run the following command:\nGet-SPOCopilotAgentInsightsReport\nTo check the status of a specific report, run the following command:\nGet-SPOCopilotAgentInsightsReport -ReportId\nTo download and view the report, run the following command:\nGet-SPOCopilotAgentInsightsReport -ReportId -Action Download\nNote\nPowerShell displays up to 100 records, but downloaded reports can contain up to 1 million records.\nGet-SPOCopilotAgentInsightsReport -ReportId -Action View\nTo view more detailed reports, the following options are available:\nCopilotAgentsOnSites\n: Provides the names of all the agents currently available on all sites. 
This report contains up to 1,000,000 records.\nNote\nThe default value for the\n-Content\nparameter is\nCopilotAgentsOnSites\n.\nGet-SPOCopilotAgentInsightsReport –ReportId -Content CopilotAgentsOnSites\nTopSites\n: Provides a list of 100 sites with the number of agents available on each site.\nGet-SPOCopilotAgentInsightsReport –ReportId -Content TopSites\nSiteDistribution\n: Provides the summarized view of agents across all types of sites like Communication sites, Microsoft 365 Group connected sites, OneDrive sites, etc.\nGet-SPOCopilotAgentInsightsReport –ReportId -Content SiteDistribution\nData collection for insights report on agents in SharePoint\nIf you don't have a Microsoft SharePoint Advanced Management license, you're asked to enable data collection. This section explains how to enable and check status for data collection for the Insights report on agents in SharePoint.\nEnable data collection\nThis PowerShell command starts collecting audit data for reports on activities from the last 28 days.\nStart-SPOAuditDataCollectionForActivityInsights\nDisabling data collection\nThis PowerShell command stops collecting audit data for reports on activities from the last 28 days.\nStop-SPOAuditDataCollectionForActivityInsights\nChecking the data collection status\nOnce data collection is enabled, the reports can be generated after 24 hours. To check whether reports can be generated, use the PowerShell command Get-SPOAuditDataCollectionStatusForActivityInsights. The command returns the current data collection status, which can be\nNotInitiated\n,\nInProgress\n,\nPaused\n. 
Reports can be generated when the status is\nInProgress\n.\nGet-SPOAuditDataCollectionStatusForActivityInsights\nKnown experiences with agent insights reports in SharePoint\nThe following are some known experiences with agent insights reports generated in SharePoint Admin Center or using SharePoint Online Management Shell:\nA report can be rerun only after 24 hours since the last report generated.\nIn large tenants, it might take up to 48 hours for the data to be available.\nOnly one report can exist for each report range value (1, 7, 14, or 28 days). This means you can see a maximum of four reports at a given point.\nThe newly generated report replaces the previously created report with the same date range. To preserve the previously created report, download the report first before creating a new report for the same date range.\nThese reports are generated using Microsoft 365 unified audit data and might not cover all audit events.\nRelated articles\nRestrict SharePoint site access with Microsoft 365 groups and Microsoft Entra security groups\nRestrict discovery of SharePoint sites and content\nMicrosoft.Online.SharePoint.PowerShell Module\nFeedback\nWas this page helpful?\nYes\nNo\nNo\nNeed help with this topic?\nWant to try using Ask Learn to clarify or guide you through this topic?\nAsk Learn\nAsk Learn\nSuggest a fix?\nAdditional resources", - "last_checked": "2026-03-11T12:59:29.613756+00:00", + "last_checked": "2026-03-14T06:51:10.083468+00:00", "last_status": 200, "last_changed": "2026-03-11T12:59:29.613756+00:00", "topic": "Agent Insights", @@ -1286,7 +1286,7 @@ "https://learn.microsoft.com/en-us/sharepoint/control-lists": { "content_hash": "sha256:dc8e5471a4d7397566cc96f3121870479daf0088b98c36640aca5df0dd0a0db2", "normalized_content": "Table of contents\nExit editor mode\nAsk Learn\nAsk Learn\nFocus mode\nTable of contents\nRead in English\nAdd\nAdd to plan\nEdit\nShare via\nFacebook\nx.com\nLinkedIn\nEmail\nCopy Markdown\nPrint\nNote\nAccess to this page 
requires authorization. You can try\nsigning in\nor\nchanging directories\n.\nAccess to this page requires authorization. You can try\nchanging directories\n.\nControl settings for Microsoft Lists\nFeedback\nSummarize this article for me\nAs at least a\nSharePoint Administrator\nin Microsoft 365, you can control settings for Microsoft Lists. You can:\nDisable the creation of personal lists (prevent users from saving new lists to \"My lists\").\nDisable built-in list templates that aren't relevant for your organization.\nYou control both of these settings by using Microsoft PowerShell.\nDisable creation of personal lists\nIf you change this setting, when users create a list, they must select a SharePoint site for saving the list. The \"Save to\" setting doesn't include the \"My lists\" option.\nDefault\nPersonal list creation disabled\nDownload the latest SharePoint Online Management Shell\n.\nNote\nIf you installed a previous version of the SharePoint Online Management Shell, go to Add or remove programs and uninstall \"SharePoint Online Management Shell.\"\nConnect to SharePoint as at least a\nSharePoint Administrator\nin Microsoft 365. To learn how, see\nGetting started with SharePoint Online Management Shell\n.\nRun the following command:\nSet-SPOTenant -DisablePersonalListCreation $true\nTo re-enable the creation of personal lists, set the parameter to\n$false\n.\nDisable built-in list templates\nDisabling these templates removes them from all places users create lists (the Lists app, Microsoft Teams, and SharePoint sites).\nDefault\nBuilt-in list templates disabled\nSome templates disabled\nAll templates disabled\nDownload the latest SharePoint Online Management Shell\n.\nNote\nIf you installed a previous version of the SharePoint Online Management Shell, go to Add or remove programs and uninstall \"SharePoint Online Management Shell.\"\nConnect to SharePoint as at least a\nSharePoint Administrator\nin Microsoft 365. 
To learn how, see\nGetting started with SharePoint Online Management Shell\n.\nRun the following command:\nSet-SPOTenant -DisableModernListTemplateIds '