Mittwoch, 24. April 2024

Size matters - Large documents and Copilot for Microsoft 365

Microsoft has published an article named Keep it short and sweet: a guide on the length of documents that you provide to Copilot. It describes how Copilot for Microsoft 365 reaches its limits when it has to work with large documents or very long emails.

The reason for this is that Copilot works with data from the Microsoft Graph, which means that the search in Microsoft 365 also plays a role here. Documents, emails and all other content must first be indexed by the search before they are available to Copilot. At least for the search in SharePoint Online, the limits are documented: https://learn.microsoft.com/en-us/sharepoint/search-limits.

The exact limits that apply for processing by Copilot in Microsoft 365 are currently unclear. The article Keep it short and sweet: a guide on the length of documents that you provide to Copilot gives the following recommendations:

  • Shorter than 20 pages
  • Maximum of around 15,000 words

The following example shows what happens when relevant information lies beyond these recommended limits. The relevant information that Copilot is supposed to use is shown below; it is on page 49 of a Word document containing a total of 27,208 words.


If you ask Copilot “What can you tell me about Snabales Total liabilities?” you get the following answer:
If you use Copilot in Word and ask the same question, the answer is: “This response isn't based on the document: I'm sorry, but the document does not provide any information about Snabales Total liabilities...”


One option you have here is not to use Copilot for Microsoft 365 natively, but to create your own solution based on Azure AI Search and Azure OpenAI. Azure AI Search offers vector search, for which large documents are split into so-called chunks. This article describes the details: Chunking large documents for vector search solutions in Azure AI Search
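To make the chunking idea concrete, here is a minimal sketch in plain Python (no Azure SDK involved; the chunk size and overlap values are illustrative assumptions, real solutions usually chunk by tokens or sentences and tune these numbers per embedding model):

```python
def chunk_text(text: str, max_words: int = 300, overlap: int = 50) -> list:
    """Split a long text into overlapping word-based chunks.

    max_words and overlap are illustrative values; production pipelines
    typically chunk by tokens and tune these numbers per model.
    """
    words = text.split()
    chunks = []
    step = max_words - overlap  # assumes overlap < max_words
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break
    return chunks

# Each chunk is embedded and indexed individually, so relevant passages
# deep inside a large document (e.g. on page 49) remain findable.
chunks = chunk_text("word " * 1000)
```

Because every chunk is indexed on its own, a vector search can retrieve a relevant passage regardless of how far into the document it appears.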



Sharing link "People in your organization" & Copilot for Microsoft 365

Microsoft Copilot for Microsoft 365 only surfaces organizational data to which individual users have at least view permissions. Source: https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy#how-does-microsoft-copilot-for-microsoft-365-use-your-proprietary-organizational-data

The sharing function in SharePoint and OneDrive can be used to share content with users. It is then also possible to define which persons or groups are granted access and with which rights:

The following applies:
  • Anyone gives access to anyone who receives this link, whether they receive it directly from you or forwarded from someone else. This may include people outside of your organization.
  • People in <Your Organization> with the link gives anyone in your organization who has the link access to the file, whether they receive it directly from you or forwarded from someone else.
  • People with existing access can be used by people who already have access to the document or folder. It doesn't change any permissions and it doesn't share the link. Use this if you just want to send a link to somebody who already has access. 
  • Specific people gives access only to the people you specify, although other people may already have access. This may include people outside of your organization. If people forward the sharing invitation, only people who already have access to the item will be able to use the link.

For the options “Anyone”, “People with existing access” and “Specific people”, everything described above also applies to Copilot: Microsoft Copilot for Microsoft 365 only surfaces organizational data to which individual users have at least view permissions.

The situation is slightly different with the option “People in <Your Organization> with the link”. The following applies here: 

Creating a People in your organization link will not make the associated file or folder appear in search results, be accessible via Copilot, or grant access to everyone within the organization. Simply creating this link does not provide organizational-wide access to the content. For individuals to access the file or folder, they must possess the link and it needs to be activated through redemption. A user can redeem the link by clicking on it, or in some instances, the link may be automatically redeemed when sent to someone via email, chat, or other communication methods. The link does not work for guests or other people outside your organization. 
Source and further details: https://learn.microsoft.com/en-us/sharepoint/deploy-file-collaboration#control-sharing

This can lead to effects that are not transparent to users. For example, a user who has shared content in this way may assume that this information is now available to all users in the tenant and is therefore also accessible via Copilot. In the following example, the user Stan Laurel shares the files RefDoc.docx and SnabelesSnowball.docx via a “People in your organization” link.

Another user has received and clicked the link to the RefDoc.docx file, but not the link to the SnabelesSnowball.docx file.
This leads to the following result in Copilot although the user generally has access to both files:
Question to Copilot about the contents of the SnabelesSnowball.docx file for which the share link has not yet been clicked:
Question to Copilot about the contents of the RefDoc.docx file from which the sharing link has already been used at least once:
This effect, that a user is only granted access to the file once they have clicked the sharing link, is also confirmed by another example. Here, Copilot is asked to create a list of all files that have been shared via the link type People in <your organization>. The file RefDoc.docx, whose sharing link has already been clicked, appears in the list. The file SnabelesSnowball.docx, for which this is not yet the case, is not mentioned.

There is also another side to this topic. If the default sharing link is “People in <Your Organization> with the link”, and this link is spread further, for example by posting it in a Teams channel or sending it by email, all users may suddenly receive answers from Copilot about the content behind the link, even if this information was not intended for everyone in the company. Caution and a solid governance concept for this topic are therefore necessary.
To see which kinds of sharing links are used, the “Data access governance reports for SharePoint sites” can be used:

Source and further details: https://learn.microsoft.com/en-US/SharePoint/data-access-governance-reports?WT.mc_id=365AdminCSH_inproduct#sharing-links-reports





Sonntag, 4. Februar 2024

Case study on the use of Microsoft AI solutions

Currently, not a day goes by without news in the area of AI. The following article is about a project with a company from Germany that has used features from the Microsoft AI stack to implement solutions for employees' daily work.

Highlights

  • Added value of AI technologies in daily work
  • How can AI be used in the context of customer projects?
  • Requirements related to the AI Act and GDPR
  • How can AI be used effectively in harmony with the human factor?
  • How does the secure use of AI solutions look in terms of an IT security strategy?

Challenges

As a leading consultancy for strategy projects, the company focuses on what its customers need. The human factor remains one of the most important aspects here. AI solutions must be easy to use and deliver reliable, reproducible results if they are to add value in day-to-day work. This made it all the more challenging to classify and use artificial intelligence correctly.
Statement taken during the project:
  • With the solutions based on Azure OpenAI, we speed up our qualification process and the final validation activities in consulting projects, which is a significant advantage - explains the management.
  • We use the Microsoft Azure solution architecture to meet the high requirements of our customers for the secure collection, storage and analysis of data - says the data protection officer.
  • By using the Azure tools for SecDevOps, i.e. the interaction between security, development and IT operations, we can automate and standardize processes. - This is how the CISO summarizes the framework for the AI solution.
These design principles, as outlined by the CISO, are the foundation for fulfilling the requirements of the GDPR and the AI Act / AI Regulation without any problems.
Even sensitive data with aspects relating to the Geschäftsgeheimnis-Gesetz / German Trade Secrets Act can now be processed by Microsoft's AI solutions.

Example / quote from the project: One of our customers has well over 100 existing patents. The customer now wanted to know what new product suggestions the AI would generate based on the existing patents.

Objectives and solutions in the project

The driver of successful companies is not digitalization alone. Our markets have been saturated for a long time, and only a few sectors still see real growth; more often it is just about shifting market share. Innovation cycles are becoming ever shorter. However, speed is only effective if there is a strategic idea behind the innovations. Finding out whether an idea meets the actual requirements of the market takes time, as does aggregating survey results and feedback from beta phases.
Beta testing for new processes and products always has to hold its own against established expertise and experience. This specialist knowledge exists either in the heads of employees or in countless internal repositories, databases and knowledge sources. Making this knowledge usable when testing new solutions was a challenge that could be solved with AI.

Quote from the project: With the Microsoft 365 Chat function, we can answer questions relating to data in our M365 environment in seconds. As we store data and information from our customers in secure project rooms in Microsoft Teams, this function is also available to us there.

The Microsoft 365 Chat solution is a feature in the context of Microsoft Copilot. Here, the quality of the prompt that a user enters determines the quality of the result that the AI generates. It was therefore a key factor during the rollout to ensure employee empowerment through a training concept.
Data accuracy was essential, especially for quality management of the results. The typical hallucination of generative AI solutions was mitigated by fine-tuning existing large language models in Azure OpenAI and by using predefined prompts.

The solutions:
  • Storing project data and information in Microsoft Teams / SharePoint means that this information can be analyzed using Microsoft 365 Chat.
  • Training concept for the use of AI solutions / Prompt Engineering
  • Special processes and quality control based on AI were made possible with customized Large Language Models in Azure OpenAI
  • Solutions relating to specific topics were implemented using predefined prompts/prompt extensions with Microsoft Copilot Studio. 
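The predefined-prompt approach from the list above can be sketched as follows (the template text and field names are illustrative assumptions, not the project's actual prompts):

```python
# A predefined prompt template constrains how the model is instructed;
# users only supply the variable parts. Restricting answers to a provided
# document is one common way to reduce the room for hallucinated output.
QUALITY_CHECK_TEMPLATE = (
    "You are a quality reviewer. Answer only based on the provided "
    "project document. If the document does not contain the answer, "
    "reply exactly with: 'Not covered by the document.'\n\n"
    "Document:\n{document}\n\nQuestion:\n{question}"
)

def build_prompt(document: str, question: str) -> str:
    """Fill the predefined template with the user-supplied parts."""
    return QUALITY_CHECK_TEMPLATE.format(document=document, question=question)

prompt = build_prompt("Project X uses supplier ACME.", "Who is the supplier?")
```

The completed prompt would then be sent to an Azure OpenAI deployment or wired into Microsoft Copilot Studio; only the document and question vary per request.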

Benefits

A data-supported approach to developing new and innovative processes and products can be applied much more efficiently, quickly and scalably with AI. The definition of new solutions, as well as the associated testing and quality management, is also supported by AI solutions. 
  • Evaluating the current situation / evaluating existing data pools in consulting projects
  • The creativity of generative AI solutions is used to identify new processes and approaches for product innovations
  • The human factor remains an integral component of the consulting approach, but can now focus on the essentials


Freitag, 24. November 2023

Who’s Harry Potter? - Can AI really forget something it has learned & GDPR

The question "Who's Harry Potter?" is the title of an article by Ronen Eldan (Microsoft Research) and Mark Russinovich (Azure) on whether AI systems can forget something once they have learned it. As soon as "forgetting" is involved, the GDPR also comes into play: Article 17 of the GDPR regulates the right to erasure ("right to be forgotten"). Microsoft has already provided information on this topic for AI solutions in the context of the GDPR.
But one thing at a time...

Who’s Harry Potter?

Ronen Eldan and Mark Russinovich wanted to make the Llama2-7b model forget the content of the Harry Potter books. The background to this is that the data set "books3", which contains many other copyrighted texts in addition to the Harry Potter books, was allegedly used to train the LLM. Details: The Authors Whose Pirated Books Are Powering Generative AI
However, unlearning is not as easy as learning. How to train or fine-tune an LLM in Azure OpenAI is described here. Essentially, a JSONL file is used to teach a base model which answer should be given to a specific question:
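For illustration, such a training file could be built like this; the chat-style message format shown here is the one Azure OpenAI fine-tuning expects, while the question/answer pair itself is made up:

```python
import json

# Each JSONL line is one training example: a fixed question/answer pair
# in the chat message format used by Azure OpenAI fine-tuning.
examples = [
    {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who wrote the Harry Potter books?"},
            {"role": "assistant", "content": "I cannot help with that topic."},
        ]
    },
]

with open("training.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```

The resulting file is uploaded to Azure OpenAI and used to fine-tune a base model so that it returns the prescribed answer for the given question.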

From a high-level perspective, Ronen Eldan and Mark Russinovich proceeded in exactly the same way, as there is currently no "delete function" for LLMs. The model was therefore trained to answer questions about Harry Potter differently:
However, these adjustments caused the model to hallucinate significantly more. Hallucination is an inherent characteristic of generative AI solutions: if the model has no information from which to generate an answer, it creates one on the basis of probability estimates. This results in outputs such as the following, which claims that Frankfurt Airport will have to close in 2024:
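A toy sketch of why this happens (the vocabulary and probabilities below are made up for illustration): token selection always picks *some* plausible continuation, even when none of the candidates is factually grounded.

```python
import random

# Made-up next-token distribution for a prompt like
# "Frankfurt Airport will close in ...": the sampler must emit a token,
# whether or not any of them reflects reality.
next_token_probs = {"2024": 0.5, "2025": 0.3, "never": 0.2}

def sample_next_token(probs: dict, seed: int = 42) -> str:
    """Pick a token according to the model's probability estimates."""
    rng = random.Random(seed)
    return rng.choices(list(probs), weights=list(probs.values()))[0]

token = sample_next_token(next_token_probs)
```

Removing the grounded knowledge (as in the unlearning experiment) shifts probability mass to other tokens, so the model confidently produces a different, possibly invented, answer.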

Ronen Eldan and Mark Russinovich have made their version of the Llama2-7b model available on HuggingFace, and encourage everyone to give them feedback if they still manage to get knowledge about Harry Potter as output. Details: https://arxiv.org/abs/2310.02238 And here is the link to the article: Who's Harry Potter? Making LLMs forget

Privacy and Security for Microsoft AI solutions

As mentioned above, the right to be forgotten is only one aspect of the requirements of the GDPR or ISO/IEC 27018. Microsoft does not offer explicit legal advice in this area. Rather, it describes how Microsoft AI solutions generally meet the necessary requirements. The key points are:
  • Prompts, responses and data accessed via Microsoft Graph are not used for the training of LLMs, including those of Microsoft 365 Copilot.
  • For customers from the European Union, Microsoft guarantees that the EU data boundary will be respected. EU data traffic remains within the EU data boundary, while global data traffic in the context of AI services can also be sent to other countries or regions.
  • Logical isolation of customer content within each tenant for Microsoft 365 services is ensured by Microsoft Entra authorization and role-based access control.
  • Microsoft ensures strict physical security, background screening and a multi-level encryption strategy to protect the confidentiality and integrity of customer content.
  • Microsoft is committed to complying with applicable data protection laws, such as the GDPR and data protection standards, such as ISO/IEC 27018.
Currently (November 24, 2023), Microsoft does not yet offer any guarantees for data at rest in the context of Microsoft 365 Copilot. This applies to customers with Advanced Data Residency (ADR) in Microsoft 365 or Microsoft 365 Multi-Geo. Microsoft 365 Copilot builds on Microsoft's existing commitments to data security and data protection. In the context of AI solutions, the following also applies:
All details on how Microsoft AI solutions fulfill regulatory requirements are described here:



Dienstag, 31. Oktober 2023

Prepare your organization for Microsoft 365 Copilot

Make sure that all permissions are set correctly, check that all technical requirements are met, and assign Copilot licenses to your users.

All of these points are certainly part of rollout planning. However, they are not the only ones and, above all, implementation is not a quick task for many companies.

In this article from September 21, 2023, Microsoft announced that Microsoft 365 Copilot would be generally available on November 1: https://blogs.microsoft.com/blog/2023/09/21/announcing-microsoft-copilot-your-everyday-ai-companion/

A classic public preview, as known from other products, was not offered for Copilot. The usual approach of evaluating features with a small pilot group as soon as they appear in public preview was therefore not possible.

Currently, it is still the case that at least 300 licenses have to be purchased in order to use Microsoft 365 Copilot:


This article describes what you can do today to prepare for Microsoft 365 Copilot, even if you don't have the license available yet.

Copilot, Bing Chat Enterprise, Azure OpenAI – When to use what

The Microsoft 365 Copilot license is an add-on to an existing Microsoft 365 E3/E5 license and is currently quoted at $30 per user/month. Many companies are therefore planning a mix, which may look like this, for example:
  • Approximately 20 - 30% of the employees get a Microsoft 365 Copilot license. These are mainly the so-called power users. 
  • Own solutions based on Azure OpenAI are only implemented for users and requirements where it is really about providing very specific solutions. 

In general, the decision scheme is structured like this:
  • Should the AI have access to your own company data? - If the answer is YES, Bing Chat Enterprise is not an option.
  • Are the features of Microsoft 365 Copilot sufficient to cover the requirements? - If the answer is NO, create your own solution based on Azure OpenAI and possibly customized LLMs.
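The two checks above can be written down as a small decision helper (a sketch; the product names come from this article, the function itself is illustrative):

```python
def choose_ai_solution(needs_own_data: bool,
                       copilot_features_sufficient: bool) -> str:
    """Map the two checks from the article onto a recommended solution."""
    if not needs_own_data:
        # No access to company data required: the web-grounded chat suffices.
        return "Bing Chat Enterprise"
    if copilot_features_sufficient:
        return "Microsoft 365 Copilot"
    # Own data plus requirements Copilot cannot cover: build on Azure OpenAI,
    # possibly with customized LLMs.
    return "Azure OpenAI (custom solution)"
```

In practice a company applies this per user group, which is how the 20-30% Copilot / custom-solution mix described above comes about.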

How Microsoft 365 Copilot is designed

Microsoft 365 Copilot is available as a plugin in the Office apps or as M365 Chat in Teams. When a user enters a request, it is refined through an approach called "grounding". This method makes the user's input more specific and ensures that the user receives answers that are relevant and usable for their specific request. To obtain the data, a semantic index is used. This is also where security trimming takes place, ensuring that users only receive answers generated from data they are allowed to access. This is done via the Microsoft Graph. The response generated in this way is then returned to the user. Microsoft Copilot can also be extended; for this, Graph Connectors (https://www.microsoft.com/microsoft-search/connectors) can be used.
Example in Word and M365 Chat:

Get ready for Microsoft 365 Copilot

Before Microsoft 365 Copilot can be used, some prerequisites must be fulfilled. For example, the solution is only available from a minimum Office version onwards, and only in the Office Apps for Enterprise. This also applies to Outlook: Microsoft 365 Copilot requires the new Outlook, which is available for Windows and Mac. All details about the requirements for Copilot are described in this article: Microsoft 365 Copilot requirements

Preparing for the launch of Copilot

As mentioned above, it is recommended to make Copilot available to a selected test group first. Even though at least 300 licenses have to be purchased, the actual licenses can be assigned to selected users only. The feedback from this test group can then be used to plan the further rollout. Microsoft provides the following information and guidelines for these tasks:
The most important task of this test group is to verify, using representative scenarios, that the permissions concept has been implemented properly. A user's request for information to which they do not have access must return no answers. In general, the search in Microsoft 365 can also be used independently of Copilot for such tests. The search at https://www.office.com/search returns results from all Microsoft 365 services and connected sources, including Teams chats, emails in Outlook, and posts in Viva Engage. The example shows that searching for sensitive information should not return any matches:
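Such permission checks can also be scripted against the Microsoft Graph search API (endpoint POST https://graph.microsoft.com/v1.0/search/query). A sketch of building the request body is shown below; acquiring a delegated access token for the test user is out of scope here, and the query string is purely illustrative:

```python
import json

GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"

def build_search_request(query: str) -> dict:
    """Request body for POST /v1.0/search/query, scoped to files (driveItem)."""
    return {
        "requests": [
            {
                "entityTypes": ["driveItem"],
                "query": {"queryString": query},
            }
        ]
    }

# When this body is POSTed with the *test user's* delegated token, an empty
# result set for a sensitive term indicates the permissions concept holds
# for that user. "salary list 2023" is an example term, not real data.
body = json.dumps(build_search_request("salary list 2023"))
```

Because Graph search is security-trimmed per caller, running the same query under different test users is a repeatable way to spot-check the access concept before the Copilot rollout.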

Create a Copilot Center of Excellence

In the article How to get ready for Microsoft 365 Copilot, Microsoft recommends creating a Center of Excellence for Copilot. This Center of Excellence can then be used to provide training materials, updates regarding the rollout in the company, FAQs and other information. The Center of Excellence is intended to be a central place for users to find everything related to the topic. Microsoft provides extensive material for this:
The Center of Excellence can then also provide information on the limitations of AI and Copilot and on what needs to be considered from a regulatory perspective. The EU has published the EU AI Act for this reason.










Donnerstag, 7. September 2023

All You Need is Guest

At Black Hat 2023, there was a session called "All You Need is Guest". The session describes how to hijack a Microsoft 365 tenant with a guest user via the Power Platform. The tool required for this is already available on GitHub: power-pwn.
In the Microsoft 365 tenant, only a trial / seeded license for the Power Platform needs to be available. As of today, this license type cannot be blocked administratively, and there is no setting in Microsoft 365 that completely prevents users from procuring it themselves. Trial licenses and self-service purchases can be configured, as described here: Manage self-service purchases and trials. There are also scripts available, such as this one on GitHub, to disable everything that can be disabled: AllowSelfServicePurchase for the MSCommerce PowerShell module. However, the so-called seeded license cannot be deactivated and is always available to the user.

To protect against this attack, the easiest way is to set up a Conditional Access policy that denies guest users access to the Power Platform:
However, if guest users need access to the Power Platform as part of business processes, exceptions can be configured in the settings under "Specific users included":





Donnerstag, 31. August 2023

Bing Chat Enterprise & Office Online

Bing Chat and Bing Chat Enterprise can interact with Office Online apps. The result is close to what Microsoft Copilot promises in M365 apps.

Bing Chat Enterprise

Bing Chat and Bing Chat Enterprise are extensions to Bing Search, available at https://bing.com/chat. Essentially, both solutions offer ChatGPT-like capabilities that are further enhanced by Bing Search.

Bing Chat Enterprise differs from Bing Chat in that it also covers data privacy aspects, and sign-in is via a Microsoft Entra ID (formerly Azure AD) account. This allows Microsoft Entra security features to be used to secure and control sign-in. Input and generated texts are not recorded, analyzed, or used by Bing Chat Enterprise to train the language models. However, Bing Chat Enterprise cannot access company data in Microsoft 365 to create answers; this is only possible with Microsoft Copilot or via Azure OpenAI.

Bing Chat Enterprise is available through https://bing.com/chat and the Microsoft Edge for Business sidebar. A Microsoft 365 E3, E5, Business Standard, Business Premium, or A3 or A5 license is required to use the feature. Further details on this topic are described here: Bing Chat Enterprise - Your AI-powered chat with commercial data protection

Bing Chat Enterprise & Office Online Apps

To use Bing Chat in combination with the M365 Office Online apps, the Microsoft Edge sidebar has to be used. Details on configuration and how IT administrators can control the feature across the enterprise are described here: Manage the sidebar in Microsoft Edge.

If the feature is enabled, it looks like this:

Now Bing Chat Enterprise can be used for all apps that are used in the browser.

The two features Chat and Compose are available for this. The Compose function offers detailed options for generating texts in the required context and style: the tone, format, and length of the text can be defined. The following example shows the result for the question "What can you do with SharePoint Online?" with the settings Tone: Professional, Format: Paragraph, and Length: Medium:
With the "Add to site" function, the generated texts can be transferred to the Office Online apps Word, Excel, PowerPoint, Outlook, Teams, SharePoint, etc. The text is inserted at the cursor position in the app. In the following example, the generated text is inserted directly into the chat in Teams via the "Add to site" button:
The other direction is also possible; however, the Chat function must be used here. In the following example, the user has opened an email in Outlook Online and wants the text of the email translated into Spanish. To do this, the text must be highlighted; Bing Chat then automatically asks what should happen to the highlighted text. The user says "Please translate into Spanish" and Bing Chat does the job.
To make this feature available, the "Allow access to any web page or PDF" option must be enabled in the Bing Chat configuration under "Notification and App Settings":