Who owns your data? Who decides how it's used?
And what happens when data causes harm?
"Just because data can be collected doesn't mean it should be."
Data Science for Young Minds · Grade 5 · Ages 10–11
The app is FREE because it sells "anonymized" health data to:
Quick poll: Thumbs up if you'd still use the app. Thumbs down if you wouldn't.
There's no wrong answer, but think about why you made the choice you did.
Data ethics is the study of what is right and wrong when it comes to collecting, storing, sharing, and using data — especially data about people.
Imagine you wrote in a diary every day. Would you want your school to read it? Your parents? The government? A company trying to sell you things?
Privacy means you get to decide — not others.
Key idea: Privacy isn't about having "something to hide." It's about having the power to decide what you share, with whom, and why.
Every time you click "I Agree" on an app, you may be consenting to share a lot of data. But is that real consent?
Discussion: If you didn't read the Terms of Service, did you really consent? Is clicking "agree" enough?
A free fitness app tracks your steps, sleep, and heart rate. It sells your health data to insurance companies. People who exercise more pay lower insurance premiums. People who don't exercise pay more.
The real question: Is it ethical to use data to make decisions that affect people's lives when the data doesn't capture the full picture of their situation?
A city uses an AI algorithm to predict which neighborhoods are most likely to have crime. Police are sent there more often. The algorithm was trained on historical arrest data.
The bias cycle: more police in a neighborhood → more arrests recorded there → more arrest data in the training set → the algorithm predicts more crime there → even more police are sent.
Key insight: If the data used to train an algorithm comes from a biased system, the algorithm will learn and repeat that bias — even if no one programs it to be biased.
A school district installs AI cameras that track every student's movement and flag "unusual behavior" to administrators in real time. Parents were notified by newsletter but not asked to vote.
Power question: Should students have a say in surveillance of their own school? Is a newsletter enough for consent?
Algorithmic bias happens when a computer system produces results that unfairly advantage some groups over others — even when no one intended it.
"Garbage in, garbage out" — biased data creates biased algorithms, even if the math is perfect.
Organizations should be honest about: what data they collect, why they collect it, who they share it with, and how it's used.
When data causes harm, someone must answer for it.
Discussion: If a self-driving car hits someone because of a coding decision made years ago, who is responsible — the programmer, the company, the car owner?
In many places, people have legal rights over their data. Knowing these rights is the first step to protecting yourself.
You can ask companies what data they have about you. Many are legally required to tell you.
If data about you is wrong, you have the right to get it corrected — especially if it's affecting decisions about you.
In some regions, you can ask companies to delete your data entirely. This is called the "right to be forgotten."
Note: These rights are not equal everywhere. In the US, kids under 13 have COPPA protections. The EU has stronger laws (GDPR). Some places have almost no protections. Law is still catching up to technology.
Something can be legal (allowed by law) but still unethical (morally wrong). And sometimes things that feel right are technically illegal.
Data scientists and tech workers make ethical choices every day — even when they aren't required to by law.
Stand = Option A | Sit = Option B | No wrong answers!
Q1: A) A free app that collects your data | B) A $5/month app that collects nothing
Q2: A) AI that reduces crime by 20% but is sometimes wrong | B) Human police only, no AI predictions
Q3: A) School reads all student emails "just in case" | B) School reads none even for safety concerns
Q4: A) Doctors access your health data without asking in emergencies | B) Always ask first, even in emergencies
Before you leave: Write one sentence — your personal data ethics pledge.
"As a data citizen, I will always ask..."
Next session: You become the data scientist — Session 8: Capstone Research Project