Start a conversation about your codebase. Whether you're hunting down a bug or designing a new feature, when you're stuck, ask GitHub Copilot.
The competitive advantage developers ask for by name.
Proven to increase developer productivity and accelerate the pace of software development.
55% faster coding. Read the research.
Designed by leaders in AI so you can build with confidence.
Committed to your privacy, security, and trust.
Visit the Copilot Trust Center.
Duolingo empowers its engineers to be force multipliers for expertise with GitHub Copilot and Codespaces.
Read the customer story.
The industry standard.
The AI coding assistant elevating developer workflows.
Accelerate workflows
- Improve code quality and security. Developers feel more confident in their code quality when authoring code with Copilot. And with the built-in vulnerability prevention system, insecure coding patterns get blocked in real time.
- Enable greater collaboration. Copilot’s the newest member of your team. You can ask general programming questions or very specific ones about your codebase to get answers fast, learn your way around, explain a mysterious regex, or get suggestions on how to improve legacy code.
Get AI-based suggestions in real time.
GitHub Copilot suggests code completions as developers type and turns natural language prompts into coding suggestions based on the project's context and style conventions.
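As an illustration, a developer might describe the intent in a comment and let Copilot propose the implementation. The Python sketch below is a plausible example of this workflow, not a captured suggestion; the function name and prompt wording are invented for the example, and real suggestions vary with project context.

```python
# Prompt written by the developer as a comment:
# parse an ISO 8601 date string (YYYY-MM-DD) and return the number of days until that date

from datetime import date

def days_until(iso_date: str) -> int:
    """Return the number of days from today until the given ISO date."""
    # The body below is the kind of completion Copilot might propose from the
    # comment and signature above; actual suggestions depend on project context.
    target = date.fromisoformat(iso_date)
    return (target - date.today()).days
```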
Copilot Enterprise: Docs that feel tailored for you.
Spend less time searching and more time learning, by getting personalized answers that are grounded in your organization’s documentation, with inline citations. Load content → Ask question → Profit.
Copilot Enterprise: Pull requests that tell a story.
GitHub Copilot keeps track of your work, suggests descriptions, and helps reviewers reason about your changes.
Copilot Enterprise: Prefer bespoke? Fine-tune a private copilot for the precision enterprises demand.
Join the waitlist.
Ask for assistance right in your terminal.
Try Copilot in the CLI.
Keep flying with your favorite editor.
Take flight with GitHub Copilot.
Organizations and developers all over the world use GitHub Copilot to code faster, drive impact, and focus on doing what matters most: building great software.
For organizations
Copilot Business
Copilot in the coding environment.
- Code completions
- Chat in IDE [1] and Mobile [2]
- CLI assistance [3]
- Security vulnerability filter
- Code referencing
- Public code filter
- IP indemnity
- Enterprise-grade security, safety, and privacy
Copilot Enterprise
Copilot personalized to your organization throughout the software development lifecycle. Requires GitHub Enterprise Cloud.
- Chat personalized to your codebase
- Documentation search and summaries
- Pull request summaries
- Code review skills
- Fine-tuned models [4]
For individuals
Copilot Individual
Code completions, Chat, and more for indie developers and freelancers.
$100 per year
Free for verified students, teachers, and maintainers of popular open source projects.
Get the most out of GitHub Copilot.
Frequently asked questions.
General
What is GitHub Copilot?
GitHub Copilot transforms the developer experience. Backed by the leaders in AI, Copilot provides contextualized assistance throughout the software development lifecycle, from code completions and chat assistance in the IDE to code explanations and documentation answers in GitHub, and more. With Copilot elevating their workflow, developers can focus on more: value, innovation, and happiness.
Copilot enables developers to focus more energy on problem solving and collaboration and spend less effort on the mundane and boilerplate. That’s why developers who use Copilot report up to 75% higher job satisfaction than those who don’t and are up to 55% more productive at writing code without sacrificing quality, all of which adds up to engaged developers shipping great software faster.
Copilot integrates with leading editors, including Visual Studio Code, Visual Studio, JetBrains IDEs, and Neovim, and, unlike other AI coding assistants, is natively built into GitHub. Although code completion functionality is available across all these extensions, the chat is currently available only in Visual Studio Code and Visual Studio, with a beta version available for JetBrains IDEs. Growing to millions of individual users and tens of thousands of business customers, Copilot is the world’s most widely adopted AI developer tool and the competitive advantage developers ask for by name.
What are the differences between the GitHub Copilot Business, GitHub Copilot Enterprise, and GitHub Copilot Individual plans?
GitHub Copilot has multiple offerings for organizations and an offering for individual developers. All the offerings include both code completion and chat assistance. The primary differences between the organization offerings and the individual offering are license management, policy management, and IP indemnity.
Organizations can choose between GitHub Copilot Business and GitHub Copilot Enterprise, which is coming in February 2024. GitHub Copilot Business primarily features GitHub Copilot in the coding environment, that is, the IDE and CLI. In early 2024, it will also include GitHub Copilot in GitHub Mobile. GitHub Copilot Enterprise includes everything in GitHub Copilot Business and adds a layer of personalization for organizations: GitHub Copilot is integrated into GitHub as a chat interface, so developers can converse about their codebase, along with action buttons throughout the platform. GitHub Copilot Enterprise can index an organization’s codebase for a deeper understanding of its knowledge base, enabling more tailored suggestions, and will offer customers access to fine-tune custom, private models for code completion.
GitHub Copilot Individual is designed for individual developers, freelancers, students, educators, and open source maintainers. The plan includes all the features of GitHub Copilot Business except organizational license management, policy management, and IP indemnity.
What languages, IDEs, and platforms does GitHub Copilot support?
GitHub Copilot is trained on all languages that appear in public repositories. For each language, the quality of suggestions you receive may depend on the volume and diversity of training data for that language. For example, JavaScript is well-represented in public repositories and is one of GitHub Copilot’s best supported languages. Languages with less representation in public repositories may produce fewer or less robust suggestions.
GitHub Copilot is available as an extension in Visual Studio Code, Visual Studio, Vim, Neovim, the JetBrains suite of IDEs, and Azure Data Studio. Although code completion functionality is available across all these extensions, the chat is currently available only in Visual Studio Code and Visual Studio, with a beta version available for JetBrains IDEs. GitHub Copilot is also supported in terminals through GitHub CLI. GitHub Copilot will soon be natively integrated into GitHub web and mobile.
What data has GitHub Copilot been trained on?
GitHub Copilot is powered by generative AI models developed by GitHub, OpenAI, and Microsoft. It has been trained on natural language text and source code from publicly available sources, including code in public repositories on GitHub.
Does GitHub Copilot “copy/paste”?
No, GitHub Copilot generates suggestions using probabilistic reasoning.
- When thinking about intellectual property and open source issues, it is critical to understand how GitHub Copilot really works. The AI models that create Copilot’s suggestions may be trained on public code, but do not contain any code. When they generate a suggestion, they are not “copying and pasting” from any codebase.
- To generate a suggestion for code completion, Copilot begins by examining the code in your editor, focusing on the lines just before and after your cursor, but also drawing on information from other files open in your editor. That information is sent to Copilot’s model to make a probabilistic determination of what is likely to come next and generate suggestions (a simplified sketch follows this list).
- To generate a suggestion from chat, such as an answer to a question in your chat prompt, Copilot creates a Contextual Prompt by combining (1) a “context summary” with (2) the question you submit to the Copilot Chat interface in your IDE. The Contextual Prompt is then sent to Copilot’s model to make a probabilistic determination of what is likely to come next and generate suggestions.
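The sketch below is a simplified, hypothetical illustration of the code completion case: combining the text around the cursor with snippets from other open files before sending the result to a model. The class and function names are invented for the example; this is not GitHub's actual client or prompt format.

```python
from dataclasses import dataclass

@dataclass
class EditorState:
    """Minimal stand-in for what a Copilot-style client can see in the editor."""
    before_cursor: str        # code immediately preceding the cursor
    after_cursor: str         # code immediately following the cursor
    open_file_snippets: dict  # {filename: relevant snippet} from other open tabs

def build_completion_prompt(state: EditorState) -> str:
    """Assemble a single prompt string from the editor context (illustrative only)."""
    related = "\n".join(
        f"# From {name}:\n{snippet}"
        for name, snippet in state.open_file_snippets.items()
    )
    # The model is then asked to predict what is likely to come next at the cursor.
    return f"{related}\n{state.before_cursor}<CURSOR>{state.after_cursor}"
```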
What data does GitHub use to create a Contextual Prompt?
GitHub Copilot Chat creates a Contextual Prompt by combining (1) a “context summary” with (2) the question you submit to your editor.
For instance, if you submit “what does this method do?”, the GitHub Copilot Chat client will automatically examine the context of your active documents in the editor to determine what you mean by “this.” It then composes a suitable question for the GitHub Copilot Chat model that automatically includes that selection, without you needing to copy and paste code into the chat window, saving you time and delivering a more useful response.
Depending on the question you ask, the GitHub Copilot Chat client will automatically use appropriate aspects of your context to form the question. The information it uses may include the following (a hypothetical sketch of how this context might be grouped follows the list):
- The code file open in your active document
- Your selection (or “code blocks for the current cursor position”) in the document
- Summaries of related documents open in your editor or from the workspace
- Information about errors/warnings/messages/exceptions in your error list
- General workspace information, such as frameworks, languages, and dependencies
- Parts of related files in your workspace/project/repo
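As a rough illustration of how such context might be organized before it is summarized into a prompt, here is a hypothetical data structure. The field names and the combining function are assumptions made for the example and do not reflect GitHub's internal format.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ChatContext:
    """Hypothetical grouping of the context a chat client might draw on."""
    active_file: str                                              # contents of the active document
    selection: str                                                # selection or block at the cursor
    related_summaries: List[str] = field(default_factory=list)    # other open/workspace documents
    diagnostics: List[str] = field(default_factory=list)          # errors, warnings, exceptions
    workspace_info: Dict[str, str] = field(default_factory=dict)  # frameworks, languages, dependencies
    related_snippets: List[str] = field(default_factory=list)     # parts of related files

def contextual_prompt(context_summary: str, question: str) -> str:
    """Combine (1) a context summary with (2) the user's question, as described above."""
    return f"{context_summary}\n\nUser question: {question}"
```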
Privacy
Where can I learn about GitHub Copilot privacy, security, and responsible AI policies?
You can visit the GitHub Copilot Trust Center for more information on these topics.
What personal data does GitHub Copilot process?
GitHub Copilot leverages three categories of personal data: user engagement data, prompts, and suggestions.
User engagement data
User engagement data is usage information about events generated when interacting with a code editor. These events include user edit actions (for example, completions accepted or dismissed), error messages, interactions with the Chat UI, and general usage data used to identify metrics such as latency and feature engagement. This information may include personal data, such as pseudonymous identifiers.
Prompts
A prompt is the collection of code and supporting contextual information that the GitHub Copilot extension sends to GitHub to generate suggestions. A prompt is typically generated in one of two ways: (1) for code completion, the extension sends a prompt when a user working on a file pauses typing or uses a designated keyboard shortcut to request a suggestion; (2) for chat, a prompt is generated when a user submits a query to GitHub Copilot Chat.
Suggestions
A suggestion is one or more lines of proposed code and other output returned to the Copilot extension after a prompt is received and processed by the AI models that power GitHub Copilot. Suggestions come in two types: as code completions in your editor, and as responses to your chat queries.
Does GitHub Copilot for Business use my Prompts or Suggestions to train the AI model?
No, GitHub Copilot for Business does not use Prompts or Suggestions to train the AI model.
How else is personal data in GitHub Copilot Business used?
GitHub processes user engagement data to provide the service, including:
- To deliver functional capabilities as licensed, configured, and used by the customer and its users, including providing personalized user experiences;
- Troubleshooting (preventing, detecting, and repairing problems); and
- Keeping products up to date and performant, and enhancing reliability, efficacy, quality, security, and user productivity.
Per Customer instruction in the GitHub DPA, GitHub processes user engagement data as a data controller for the purposes of:
- Billing and account management;
- Compensation such as calculating employee commissions and partner incentives;
- Aggregated internal reporting and business modeling, such as forecasting, revenue, capacity planning, and product strategy; and
- Aggregated financial reporting.
How long does GitHub Copilot retain personal data?
User engagement data
User engagement data is retained by GitHub for 24 months.
Prompts
For users of Copilot (code completions and chat) in the IDE, prompts are discarded by the service once a suggestion is returned to the user. By default, prompts are not stored for Copilot Business users. Copilot Individual users may opt out of allowing GitHub to retain prompts.
Suggestions
For users of Copilot (code completions and chat) in the IDE, suggestions are discarded by the service once they are returned to the user. By default, suggestions in the IDE (both code completions and chat) are not stored for Copilot Business users. Copilot Individual users may opt out of allowing GitHub to retain suggestions.
How are the transmitted Prompts and Suggestions protected?
We know that user edit actions, source code snippets, and URLs of repositories and file paths are sensitive data. Consequently, several measures of protection are applied, including:
- The transmitted data is encrypted in transit and at rest
- Access is strictly controlled. The data can only be accessed by (1) named GitHub personnel working on the GitHub Copilot team or on the GitHub platform health team, (2) Microsoft personnel working on or with the GitHub Copilot team, and (3) OpenAI personnel who work on GitHub Copilot
- Role-based access controls and multi-factor authentication are required for personnel accessing code snippet data
How can users of GitHub Copilot control use of their data?
User Engagement Data (which includes pseudonymous identifiers and general usage data) is required for the use of GitHub Copilot and will continue to be collected, processed, and shared with Microsoft as you use GitHub Copilot.
Prompts and Suggestions for GitHub Copilot Business users are not retained by GitHub.
Users of GitHub Copilot Individual can choose whether Prompts and Suggestions are retained by GitHub and further processed and shared with Microsoft by adjusting user settings. Users of GitHub Copilot Individual can request deletion of Prompts and Suggestions associated with their GitHub identity by filling out a support ticket.
Will my code be shared with other users?
No. We follow responsible practices in accordance with our Privacy Statement to ensure that your code snippets will not be used as suggested code for other users of GitHub Copilot.
Does GitHub Copilot ever output personal data?
Because the model powering GitHub Copilot was trained on publicly available code, its training set included personal data that was included in that code. From our internal testing, we found it to be very rare that GitHub Copilot suggestions included personal data verbatim from the training set.
In some cases, the model will suggest what appears to be personal data – email addresses, phone numbers, etc. – but those suggestions are actually fictitious information synthesized from patterns in training data and therefore do not relate to any particular individual. For example, when one of our engineers prompted GitHub Copilot with, “My name is Mona and my birthdate is,” GitHub Copilot suggested a random, fictitious date of “December 12,” which is not Mona’s actual birthdate.
We have also implemented a filter that blocks email addresses when they appear in standard formats, although it is still possible to get the model to suggest this sort of content if you try hard enough. We will keep improving the filter system to more intelligently detect and remove personal data from GitHub Copilot suggestions.
What if I’m accused of copyright infringement based on using a GitHub Copilot suggestion?
GitHub will defend you as provided in the GitHub Copilot Product Specific Terms.
Responsible AI
What are the intellectual property considerations when using GitHub Copilot?
The primary IP considerations for GitHub Copilot relate to copyright. The model that powers Copilot is trained on a broad collection of publicly accessible code, which may include copyrighted code, and Copilot’s suggestions (in rare instances) may resemble the code its model was trained on. Here’s some basic information you should know about these considerations:
Copyright law permits the use of copyrighted works to train AI models: Countries around the world have provisions in their copyright laws that enable machines to learn, understand, and extract patterns and facts from copyrighted materials, including software code. For example, the European Union, Japan, and Singapore have express provisions permitting machine learning to develop AI models. Other countries, including Canada, India, and the United States, also permit such training under their fair use/fair dealing provisions. GitHub Copilot’s AI model was trained with the use of code from GitHub’s public repositories, which are publicly accessible and within the scope of permissible copyright use.
What about copyright risk in suggestions? In rare instances (less than 1% based on GitHub’s research), suggestions from GitHub Copilot may match examples of code used to train GitHub’s AI model. Large language models like the ones behind Copilot generate suggestions based on probability, not copying, since the models do not contain copies of, or copy and paste, code from any source.
Our experience shows that matching suggestions are most likely to occur in two situations: (i) when there is little or no context in the code editor for Copilot’s model to synthesize, or (ii) when a matching suggestion represents a common approach or method. If a user accepts a Copilot suggestion matching existing copyrighted code, there is risk that using that suggestion could trigger claims of copyright infringement, which would depend on the amount and nature of code used, and the context of how the code is used. In many ways, this is the same risk that arises when using any code that a developer does not originate, such as copying code from an online source, or reusing code from a library. That is why responsible organizations and developers recommend that users employ code scanning policies to identify and evaluate potential matching code.
In Copilot, you can choose whether to allow Copilot to suggest code completions that match publicly available code on GitHub.com. For more information, see "Configuring GitHub Copilot settings on GitHub.com". If you have allowed suggestions that match public code, GitHub Copilot can provide you with details about the matching code when you accept such suggestions. Matching code does not necessarily mean copyright infringement, so it is ultimately up to the user to determine whether to use the suggestion, and what and whom to attribute (along with other license compliance) in appropriate circumstances.
Does GitHub Copilot include a filtering mechanism to mitigate risk?
Yes, GitHub Copilot does include an optional code referencing filter to detect and suppress certain suggestions that match public code on GitHub.
- GitHub has created a duplication detection filter to detect and suppress GitHub Copilot suggestions containing code snippets of at least roughly 150 characters that match public code on GitHub. This filter is enabled by the administrator for your enterprise, who can either apply it to all organizations within the enterprise or defer control to individual organizations. Based on your settings, the filter is applied both to code suggestions from the code completion functionality and to code blocks suggested as part of chat responses.
- With the filter enabled, Copilot checks code suggestions, together with their surrounding code, for matches or near matches (ignoring whitespace) of about 150 characters against public code on GitHub, as sketched below. If there is a match, the suggestion is not shown to the user.
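The snippet below sketches the general idea of such a check: normalize whitespace, then look for a sufficiently long shared run of characters. It is a conceptual illustration, not GitHub's implementation; the 150-character threshold comes from the description above, and the function names are invented for the example.

```python
import re

MATCH_THRESHOLD = 150  # approximate match length, in characters, described above

def normalize(code: str) -> str:
    """Collapse all whitespace so that formatting differences are ignored."""
    return re.sub(r"\s+", "", code)

def has_public_code_match(suggestion: str, surrounding: str, public_snippet: str) -> bool:
    """Illustrative check: does the suggestion plus its surrounding code share a
    long run of characters with a known public snippet (whitespace ignored)?"""
    candidate = normalize(surrounding + suggestion)
    haystack = normalize(public_snippet)
    for start in range(max(1, len(candidate) - MATCH_THRESHOLD + 1)):
        window = candidate[start:start + MATCH_THRESHOLD]
        if len(window) == MATCH_THRESHOLD and window in haystack:
            return True  # a real system would suppress or annotate the suggestion
    return False
```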
Does GitHub Copilot include features to make it easier for users to identify potentially relevant open source licenses for matching suggestions?
Yes, GitHub Copilot is previewing a code referencing feature as an additional tool to help users find and review potentially relevant open source licenses. Code referencing is currently available in Visual Studio Code. This feature searches public GitHub repositories for code that matches a Copilot suggestion. If there is a match, users will find its information displayed in the Copilot console log, including where the match occurred, any applicable licenses, and a deep link to learn more. The deep link takes users to a navigable page on GitHub.com where they can browse examples of the code match and their repository licenses, see how many repositories (including ones without licenses) the code appears in, and follow links to those repositories. Copilot users can review this information to determine whether the applicable suggestions are suitable for use and whether additional measures may be necessary to use them.
Who owns the suggestions provided by GitHub Copilot?
We don’t determine whether a suggestion is capable of being owned, but we are clear that GitHub does not claim ownership of a suggestion.
Whether a suggestion generated by an AI model can be owned depends on many factors (e.g., the intellectual property law in the relevant country, the length of the suggestion, the extent to which the suggestion is considered ‘functional’ rather than expressive, etc.).
- If a suggestion is capable of being owned, our terms are clear: GitHub does not claim ownership.
- GitHub does not claim ownership of any suggestion. In certain cases, it is possible for Copilot to produce similar suggestions to different users. For example, two unrelated users both starting new files to code the quicksort algorithm in Java will likely get the same suggestion. The possibility of providing similar suggestions to multiple users is a common part of generative AI systems.
Can GitHub Copilot introduce insecure code in its suggestions?
Public code may contain insecure coding patterns, bugs, or references to outdated APIs or idioms. When GitHub Copilot synthesizes code suggestions based on this data, it can also synthesize code that contains these undesirable patterns. Copilot has filters in place that either block or notify users of insecure code patterns that are detected in Copilot suggestions. These filters target the most common vulnerable coding patterns, including hardcoded credentials, SQL injections, and path injections. Additionally, in recent years we’ve provided tools such as GitHub Advanced Security, GitHub Actions, Dependabot, and CodeQL to open source projects to help improve code quality. Of course, you should always use GitHub Copilot together with good testing and code review practices and security tools, as well as your own judgment.
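For example, one class of pattern these filters target is SQL built from unsanitized input. The sketch below contrasts a vulnerable string concatenation with a parameterized query using Python's sqlite3 module; it is a generic illustration of the vulnerability class, not output from Copilot or its filter.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # Vulnerable pattern: user input concatenated into SQL (SQL injection risk).
    query = "SELECT * FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # Safer pattern: parameterized query; the driver handles escaping.
    return conn.execute("SELECT * FROM users WHERE name = ?", (username,)).fetchall()
```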
Is GitHub Copilot intended to fully automate code generation and replace developers?
No. Copilot is a tool intended to make developers more efficient. It’s not intended to replace developers, who should continue to apply the same sorts of safeguards and diligence they would apply with regard to any third-party code of unknown origin.
- The product is called “Copilot,” not “Autopilot,” and it’s not intended to generate code without oversight. You should use exactly the same sorts of safeguards and diligence with Copilot’s suggestions as you would with any third-party code.
- Identifying best practices for the use of third-party code is beyond the scope of this section. That said, whatever practices your organization currently uses (rigorous functionality testing, code scanning, security testing, etc.), you should continue to apply them to Copilot’s suggestions. Moreover, you should make sure your code editor or IDE does not automatically compile or run generated code before you review it.
Can GitHub Copilot users simply use suggestions without concern?
Not necessarily. GitHub Copilot users should align their use of Copilot with their respective risk tolerances.
- As noted above, GitHub Copilot is not intended to replace developers, or their individual skill and judgment, and is not intended to fully automate the process of code development. The same risks that apply to the use of any third-party code apply to the use of Copilot’s suggestions.
- Depending on your particular use case, you should consider implementing the protections discussed above. It is your responsibility to assess what is appropriate for the situation and implement appropriate safeguards.
- You’re entitled to IP indemnification from GitHub for unmodified suggestions when Copilot’s filtering is enabled. If you elect to enable this feature, the copyright responsibility is ours, not our customers’. As part of our ongoing commitment to responsible AI, GitHub and Microsoft extend our IP indemnity and protection support to customers who are empowering their teams with GitHub Copilot. Details here.
Does GitHub Copilot support accessibility features?
We are conducting internal testing of GitHub Copilot’s ease of use by developers with disabilities and working to ensure that GitHub Copilot is accessible to all developers. Please feel free to share your feedback on GitHub Copilot accessibility in our feedback forum.
Does GitHub Copilot produce offensive outputs?
GitHub Copilot includes filters to block offensive language in the prompts and to avoid synthesizing suggestions in sensitive contexts. We continue to work on improving the filter system to more intelligently detect and remove offensive outputs. If you see offensive outputs, please report them directly to copilot-safety@github.com so that we can improve our safeguards. GitHub takes this challenge very seriously and we are committed to addressing it.
Will GitHub Copilot work as well using languages other than English?
Given that public sources are predominantly in English, GitHub Copilot will likely work less well when the natural language prompts provided by the developer are not in English or are grammatically incorrect. Therefore, non-English speakers might experience a lower quality of service.
Upcoming features and offerings
How does the GitHub Copilot Enterprise waitlist work?
By joining the GitHub Copilot Enterprise waitlist, you are expressing your intent to be contacted about the upcoming GitHub Copilot offering. You may also get an opportunity to test a pre-release version of the product. If you are an administrator or member of a GitHub Enterprise Cloud account, you will be able to nominate a specific enterprise to try GitHub Copilot Enterprise in private beta.
How does the GitHub Copilot Customization waitlist work?
By joining the GitHub Copilot Customization waitlist, you are expressing your intent to be contacted about the upcoming GitHub Copilot offering. You may also get an opportunity to test a pre-release version of the product. If you are an administrator for an organization using GitHub Enterprise Cloud or GitHub Copilot Business, you will be able to nominate a specific organization to try GitHub Copilot Customization in private beta.
I’m currently participating in or am on the waitlist for the GitHub Next technical preview for GitHub Copilot for CLI. What should I do now that there is a public beta for GitHub Copilot in the CLI?
We recommend joining the public beta. You can find information on how to join here.
I joined the public beta for GitHub Copilot in the CLI but the feature looks different than it did during the GitHub Next technical preview. Why?
The feedback we get from users during GitHub Next technical previews directly informs how we refine those features before moving them into production. That’s one of the reasons technical previews are so valuable, and it is not unusual for a feature to change when it moves from GitHub Next to the GitHub product and engineering teams.
GitHub announced a private beta of GitHub Copilot pull request summaries, but only through the new GitHub Copilot Enterprise plan. What does that mean for users who are currently participating in the GitHub Next technical preview for GitHub Copilot for Pull Requests or are on that waitlist?
We have closed the waitlist for the GitHub Next technical preview. Users who have already been admitted to the technical preview can continue to experiment with GitHub Copilot for Pull Requests. GitHub Copilot features for pull requests are currently limited to the GitHub Copilot Enterprise plan. If you are a developer or administrator who is part of an enterprise, you can join the GitHub Copilot Enterprise waitlist here.
GitHub announced a private beta of GitHub Copilot documentation search, but only through the new GitHub Copilot Enterprise plan. What does that mean for users who are currently participating in the GitHub Next technical preview for GitHub Copilot for Docs or are on that waitlist?
We have closed the waitlist for the GitHub Next technical preview. Users who have already been admitted to the technical preview can continue to experiment with GitHub Copilot for Docs. GitHub Copilot features for documentation are currently limited to the GitHub Copilot Enterprise plan. If you are a developer or administrator who is part of an enterprise, you can join the GitHub Copilot Enterprise waitlist here.
When will Copilot Chat be available on github.com?
Copilot Chat will be available on github.com to users who access GitHub Copilot through the Copilot Enterprise plan when that plan becomes generally available in February 2024.
When will Copilot Chat be available on mobile?
We do not have a set timeline for making Copilot Chat available on mobile. We’ll continue to update this page with the latest information on new capabilities for various plans.
1. Chat in Visual Studio Code and Visual Studio is now generally available. Chat in JetBrains IDEs is currently in preview and will be generally available in 2024. Instructions to enable the preview can be found here.
2. Chat in GitHub Mobile is coming soon.
3. Currently in preview. Instructions to enable Copilot in the CLI in preview can be found here.
4. Currently in preview. Copilot Enterprise grants you access to customization for fine-tuned models (available in 2024).