Microsoft Copilot Copyright Commitment explained
The copyright laws around generative AI-created content are still somewhat unclear, so organizations should look to Microsoft's copyright protections for guidance.
Microsoft has dozens of different versions of its Copilot technology integrated across its software and services, but some organizations have concerns about how these generative AI tools interact with sensitive and private business data.
While the data sources and functionality differ from one Copilot to another, they all share one thing: the large language model (LLM) they use -- OpenAI's GPT, delivered from Microsoft's data centers.
Data privacy and LLMs in the enterprise
These LLMs and the content they produce are still not completely original, even though it might seem otherwise -- although the models have improved on this front since the release of ChatGPT. GPT and other language models have been trained on large data sets of text from the internet, encompassing publicly available information from various sources. These models learn patterns and linguistic structures from the training data to generate human-like responses.
Although GPT can create unique replies to user prompts, there is a possibility that some generated content might resemble or replicate existing information found online. If a user prompts the language model to create a fictional story, it might still draw upon the extensive knowledge it has acquired from various data sources used during its training.
Copilot, like other generative AI services, shares a common issue: Its output may originate from the same copyrighted sources. This issue also extends to image generation with Copilot via the DALL-E model, which creates images based on user prompts and draws largely from the data sources on which it was trained.
Imagine a scenario where a user creates an image using Copilot and displays it on their organization's public website. The organization is then faced with an infringement lawsuit because the Copilot image resembles another organization's logo or design. This is where the Microsoft Copilot Copyright Commitment comes into play.
What is the Microsoft Copilot Copyright Commitment?
The Copilot Copyright Commitment launched in September 2023 with the aim of addressing concerns related to intellectual property infringement claims.
As part of the Copilot Copyright Commitment -- also referred to by Microsoft as the Customer Copyright Commitment -- Microsoft states it is obligated to defend customer organizations against intellectual property claims and to cover all related legal expenses. This applies to customers that have used or distributed output content, such as images, from Copilot. However, several conditions must be fulfilled to receive legal assistance from Microsoft.
1. The customer cannot have disabled, evaded, disrupted or interfered with the content filters built into the product, or other safety systems that are part of Copilot. Currently, there are no options available to disable or alter these built-in safety mechanisms.
2. The customer must not modify, use or distribute the output content in ways that they know, or reasonably should know, could infringe upon or misappropriate third-party proprietary rights. For example, intentionally using source code, text, information or visuals that are known to be copyrighted and owned by another organization would violate this condition.
3. The customer must have sufficient rights to use the input in connection with the Copilot, including, without limitation, any customer data used to customize the model that produced the output content that is the subject of the claim.
4. The claim does not allege that the output content, as used in commerce or the course of trade, violates a third party's trademark or related rights. As with the example mentioned earlier, suppose a customer uses Copilot to generate marketing materials, including a logo designed to be distinctly original but inadvertently similar to an existing trademarked logo. This would not be covered by the commitment.
This commitment from Microsoft only covers the paid versions of Copilot -- including Copilot for Microsoft 365, and Windows Copilot when used with a work identity -- and Bing Chat Enterprise. However, it currently does not cover custom-built Copilot services, such as those created with Copilot Studio, or other free services, because those services do not have the same safety mechanisms built in.
The commitment can also extend to services built on top of Azure OpenAI, as long as the customer has added the proper configurations and metaprompts to the service.
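To make the idea of a metaprompt concrete, the following is a minimal sketch of how a service built on Azure OpenAI might assemble a grounded chat request. Everything here is illustrative: the prompt wording, the company name and the temperature value are assumptions, not Microsoft's required mitigations, and the exact conditions for coverage are defined in Microsoft's documentation.

```python
# Hypothetical metaprompt (system message) for a service built on Azure OpenAI.
# The wording and names are illustrative assumptions, not Microsoft's required text.
METAPROMPT = (
    "You are an assistant for Contoso's internal knowledge base. "
    "Answer only from the provided context. "
    "Do not reproduce copyrighted text, code or images verbatim."
)

def build_chat_request(user_question: str, context: str) -> dict:
    """Assemble a chat-completions payload that grounds the model with a
    metaprompt, leaving the service's built-in content filters untouched."""
    return {
        "messages": [
            {"role": "system", "content": METAPROMPT},
            {
                "role": "user",
                "content": f"Context:\n{context}\n\nQuestion: {user_question}",
            },
        ],
        # A lower temperature reduces free-form generation in favor of
        # answers grounded in the supplied context.
        "temperature": 0.2,
    }

request = build_chat_request("What is our VPN policy?", "VPN policy: ...")
```

The point of the sketch is the shape, not the wording: the system message constrains the model to supplied context, and nothing in the request disables or works around the platform's content filters, which is condition No. 1 above.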
So why has Microsoft made this commitment? Many organizations fear that using generative AI to create content will invite legal challenges, given the potential to generate output that infringes on existing intellectual property.
Recognizing this risk, Microsoft's commitment serves as a safety net, providing customers with a degree of security and peace of mind when deploying and using these AI tools. This proactive approach is intended to encourage innovation and usage of AI technologies without the constant fear of legal repercussions. And of course, Microsoft wants to position itself in the enterprise as a leader of these new capabilities.
What does the Copilot Copyright Commitment mean for the enterprise?
Organizations should recognize the Copilot Copyright Commitment from Microsoft as a form of risk management. It functions as a reassurance that Microsoft will not only stand behind its product but also behind the users of its product, if any legal issues arise concerning intellectual property.
However, it is also important to understand the boundaries of this commitment. The protection offered is contingent upon users adhering to the specific guidelines, so organizations absolutely must align their usage of Copilot with these guidelines.
In addition, competing cloud providers such as Google have made similar commitments for their generative AI services, so there is clearly industry-wide recognition of the need for such safeguards.
For IT professionals, especially those working with Copilot for Microsoft 365, the Copilot Copyright Commitment means they must be cautious in how they implement and manage the use of Copilot within their organizations. They must ensure that configurations adhere to Microsoft's guidelines and that all user interactions with Copilot are monitored for compliance.
This includes overseeing the types of data fed into Copilot and managing how the output is used across different platforms within the organization. This is difficult to do, but Microsoft has also committed to providing new mechanisms and dashboards that surface these insights.
IT pros must also stay informed about the latest updates and enhancements to the AI models used by Copilot, and understand how changes in the models' training data or functionality might affect compliance and this commitment. This will help prevent potential legal issues and ensure that the use of Copilot aligns with both corporate policies and broader legal standards.
Marius Sandbu is a cloud evangelist for Sopra Steria in Norway who mainly focuses on end-user computing and cloud-native technology.