Responsible AI and the Ethical Use of Copilot
Learn how to use GitHub Copilot responsibly by understanding privacy, licensing, attribution, and ethical coding practices in projects.
GitHub Copilot is a powerful tool, capable of suggesting entire functions, refactoring your code, and speeding up development like never before. But with that power comes responsibility. As developers, we must understand that AI-assisted coding isn’t just about speed and convenience. It’s also about trust, safety, and ethics.
When Copilot suggests code, it draws from billions of lines of publicly available open-source code. These suggestions may resemble existing copyrighted material or reflect outdated or biased patterns. Without careful attention, you could unintentionally introduce security flaws, legal issues, or non-inclusive language into your project.
Steps to use GitHub Copilot safely and ethically
Let’s discuss, step by step, how GitHub Copilot can be used safely within our projects:
Protecting your data and privacy
Copilot works by reading the files open in your IDE and suggesting completions based on patterns learned from public training data.
It sends snippets of your code to GitHub’s service as context for those suggestions, so if you’re not careful, there are privacy risks.
What you should do:
Don’t include secrets like passwords or API keys in your code.
Keep sensitive business logic out of comments or temporary files that Copilot might scan.
Example:
// Never hardcode sensitive values
const ADMIN_PASSWORD = "123456";
Never use a hardcoded sensitive value; use an environment variable instead:
// Use environment variables instead
const ADMIN_PASSWORD = process.env.ADMIN_PASSWORD;
Even if Copilot suggests a shortcut with a secret value, always stop and check. Think of Copilot as helpful, but not always safe by default.
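To make this concrete, here’s a minimal sketch of loading the secret from the environment and failing fast when it’s missing. It assumes the widely used dotenv package for local development; the error message is illustrative:

// A minimal sketch: load secrets from the environment and fail fast.
// Assumes the "dotenv" package is installed for local development.
require("dotenv").config();

const ADMIN_PASSWORD = process.env.ADMIN_PASSWORD;
if (!ADMIN_PASSWORD) {
  throw new Error("ADMIN_PASSWORD is not set; define it in your environment or .env file");
}

Keep the .env file itself out of version control (add it to .gitignore) so neither Copilot nor your repository history ever sees the real value.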
Public vs. private code filters
GitHub Copilot generates code suggestions using large amounts of publicly available open-source code. While this is powerful, it also introduces potential legal and ethical risks.
Sometimes, Copilot may suggest a code snippet that closely matches or exactly copies code from an open-source project. If that code comes from a project with a restrictive license (like GPL or AGPL), reusing it without proper attribution or license compatibility can violate software licenses.
For example:
// Copilot might suggest this exact code block:
function debounce(func, wait) {
  let timeout;
  return function (...args) {
    clearTimeout(timeout);
    timeout = setTimeout(() => func.apply(this, args), wait);
  };
}
This code might seem simple, but what if it’s lifted directly from an open-source project under a license that requires attribution or limits commercial use? You could unintentionally violate license terms if you copy-paste this without verifying its source.
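If you want to check a suspicious snippet’s origin yourself, you can search public code directly. The sketch below uses GitHub’s code search API and assumes Node 18+ (for the global fetch) plus a personal access token in GITHUB_TOKEN; searching manually on github.com works just as well:

// A sketch of searching public GitHub code for a suspicious snippet.
// Assumes Node 18+ and a token in GITHUB_TOKEN (the code search API requires auth).
async function searchPublicCode(snippet) {
  const url =
    "https://api.github.com/search/code?q=" +
    encodeURIComponent(`"${snippet}" language:JavaScript`);
  const res = await fetch(url, {
    headers: {
      Accept: "application/vnd.github+json",
      Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
    },
  });
  if (!res.ok) throw new Error(`GitHub API returned ${res.status}`);
  const data = await res.json();
  console.log(`${data.total_count} public matches found`);
  for (const item of (data.items || []).slice(0, 5)) {
    console.log(`${item.repository.full_name}: ${item.html_url}`);
  }
}

searchPublicCode("timeout = setTimeout(() => func.apply(this, args), wait)");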
Enable the public code filter
To reduce the chances of Copilot suggesting directly copied code from public repositories, GitHub provides a public code filter. When enabled, Copilot will block suggestions that match public code, unless that code is under a permissive license or already exists in your project.
In your GitHub Copilot settings, scroll to “Suggestions matching public code (duplication detection filter).”
Turn on the toggle for:
Block suggestions matching public code. GitHub will now stop showing suggestions that match public code. This setting helps you avoid license risks and ensures safer, more original code, especially for commercial or team projects.
GitHub allows you to enforce this setting across your entire organization if you’re part of a team or managing multiple collaborators. You can do this by committing an org-level policy file that requires the public code filter for everyone.
Example:
Place a copilot.yml file in the .github/ folder at the root of your repository:

copilot:
  policy:
    allowMatchingPublicCode: false
🔐 Commit this file to the root of your project or across your organization’s repositories.
This ensures consistent, license-safe behavior for all team members using Copilot.
Attribution and open source software (OSS)
GitHub Copilot generates code based on patterns learned from vast amounts of publicly available code. But Copilot is a predictive engine, not a citation tool. That means:
It may produce code that’s similar or identical to open-source snippets.
It doesn’t always tell you where the code came from or what license it was under.
Some open-source licenses, like GPL, LGPL, or even MIT, require that if you reuse code:
You include attribution (credit to the original author) or
You adhere to specific usage conditions.
Failing to do so could expose you to legal liability or violate the trust of the OSS community.
Best practices for attribution
If Copilot’s suggestion looks familiar, or if you recognize it from a popular library or GitHub project, take these steps before using it:
Check whether the code comes from a library, and if so, whether its license is compatible. You can search the snippet on GitHub or Google to find its source.
When you do reuse or adapt open-source code, give credit right in the code or in your LICENSE file:
// Code adapted from react-table by Tanner Linsley
// License: MIT – https://github.com/TanStack/react-table
This protects you legally and shows respect for the original author’s work.
If your project is hosted on GitHub, it may already detect the licenses of included dependencies. Make sure to review and act on any alerts or suggestions.
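If you’d rather check locally as well, here’s a minimal sketch that prints the declared license of each direct dependency by reading its package.json from node_modules (the license field follows the standard npm convention):

// A minimal sketch: print the declared license of each direct dependency.
// Run from your project root after installing dependencies.
const fs = require("fs");
const path = require("path");

const pkg = JSON.parse(fs.readFileSync("package.json", "utf8"));
for (const dep of Object.keys(pkg.dependencies || {})) {
  const depPkgPath = path.join("node_modules", dep, "package.json");
  const { license } = JSON.parse(fs.readFileSync(depPkgPath, "utf8"));
  console.log(`${dep}: ${license || "unknown"}`);
}

This only covers direct dependencies; dedicated license-audit tools go deeper, but a quick pass like this catches obvious surprises.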
Set an organization-level ethical policy
If you’re working in a team, company, or contributing to open-source, setting clear usage policies for GitHub Copilot ensures that:
Everyone follows consistent ethical standards.
There’s less risk of accidental license violations.
Teams remain accountable for AI-assisted code.
Organizations can commit a configuration file to enforce safe practices across all contributors.
Create a file like: .github/copilot.yml
{"copilot": {"publicCodeFilter": "enabled","licenseSafeMode": true}}
This will:
Enforce the public code filter, blocking suggestions that directly match open-source code.
Turn on license-safe mode, encouraging attribution and proper reuse.
💡 This is especially important for collaborative or open repositories, like your daily-task-app, where multiple developers contribute.
Bias and fairness in AI-generated code
As Copilot is trained on real-world public code, it may unintentionally mirror biases in that data.
These biases can appear in:
Variable and function names (e.g., master/slave, whitelist/blacklist)
Outdated or stereotyped examples
Default assumptions in business logic
Even though Copilot writes the code, you are still responsible for the ethics and quality of what goes into your project.
Here’s how to keep things fair and inclusive:
// Not ideal
let masterList = [];

// Better
let mainTaskList = [];
Reflective questions to ask yourself when reviewing Copilot suggestions:
Is this code free of harmful assumptions?
Does it make sense for all users and contexts?
Does it reflect modern, respectful language?
If Copilot suggests something biased, improve it. This is part of being a responsible developer.
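One way to make that review less manual is a small automated check like the sketch below. The term list and helper name are illustrative, not a standard; in practice, many teams rely on linter plugins for this instead:

// A minimal sketch (hypothetical helper): flag non-inclusive terms in source text.
const FLAGGED_TERMS = ["master", "slave", "whitelist", "blacklist"];

function findNonInclusiveTerms(source) {
  const lower = source.toLowerCase();
  return FLAGGED_TERMS.filter((term) => lower.includes(term));
}

console.log(findNonInclusiveTerms("let masterList = [];")); // ["master"]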
Task: Apply in your app (daily task app)
Let’s say you’re using Copilot to add the “Edit Task” functionality.
You type:
// Add logic for editing task title
Copilot suggests:
const [editingId, setEditingId] = useState(null);
const [editingValue, setEditingValue] = useState("");
Before accepting the code, ask yourself:
Does this feel familiar or common?
Could it be from a well-known library with a restrictive license?
Should I check if this pattern is attributed properly elsewhere?
Tip: You can ask Copilot Chat, “Does this snippet come from a known open-source project?” While it won’t give you a definitive source, it may provide clues.
When in doubt:
Search the snippet on GitHub or Google.
Validate its source and license.
Add attribution if needed.
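Once you’ve validated the suggestion, it might grow into a complete edit flow like this sketch. The component, the initial task, and the { id, title } shape are assumptions for illustration, not code from the course app:

// A sketch of an "Edit Task" flow built around the suggested hooks.
// The task shape and initial data are assumptions for illustration.
import { useState } from "react";

function TaskList() {
  const [tasks, setTasks] = useState([{ id: 1, title: "Buy groceries" }]);
  const [editingId, setEditingId] = useState(null);
  const [editingValue, setEditingValue] = useState("");

  function startEditing(task) {
    setEditingId(task.id);
    setEditingValue(task.title);
  }

  function saveEdit() {
    setTasks((prev) =>
      prev.map((t) => (t.id === editingId ? { ...t, title: editingValue } : t))
    );
    setEditingId(null);
    setEditingValue("");
  }

  return (
    <ul>
      {tasks.map((task) =>
        task.id === editingId ? (
          <li key={task.id}>
            <input
              value={editingValue}
              onChange={(e) => setEditingValue(e.target.value)}
            />
            <button onClick={saveEdit}>Save</button>
          </li>
        ) : (
          <li key={task.id}>
            {task.title}
            <button onClick={() => startEditing(task)}>Edit</button>
          </li>
        )
      )}
    </ul>
  );
}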
What’s next?
Now that you’ve learned how to use GitHub Copilot safely and ethically, it’s time to shift gears and focus on communicating effectively with Copilot to obtain the code you want. In the next lesson, you’ll learn the fundamentals of prompt engineering for code generation. Let’s take your AI coding superpower to the next level.