📋 Instructor Cheat Sheet — Session 7: Data Ethics

Data Science for Young Minds · Grade 5 · Ages 10–11 · 60 minutes
Session 7 of 8 · Data Ethics · Group Discussion · No Hands-On Activity
โฑ๏ธ Session Agenda
TimeBlockWhat You DoStudent Activity
0โ€“5HookShow the "fitness tracker data sold to insurance" headline. Ask: should companies be able to do this?Think-pair-share, quick poll (thumbs up/down)
5โ€“18Direct InstructionTeach 5 data ethics principles: privacy, consent, transparency, fairness, accountabilityFill in vocab definitions on worksheet
18โ€“40Case StudiesWalk through 3 nuanced cases โ€” no clear right answer. Facilitate structured debate using sentence framesSmall groups, worksheet case analysis
40โ€“48DebriefCollect class positions on each case; surface disagreements; emphasize that reasonable people disagreeClass vote + share reasoning
48โ€“55Brain Break"Would You Rather?" data ethics edition โ€” fast-paced, no wrong answerStand = Option A, Sit = Option B
55โ€“60Wrap-UpAssign take-home; preview Session 8 (their own research project)Write one data ethics pledge sentence
📖 Key Vocabulary
Data Privacy
The right of individuals to control who can access their personal information and how it is used.
Informed Consent
Agreeing to share data after fully understanding what will be collected, how, and why.
Data Transparency
Being open and honest about what data is collected, who collects it, and what it is used for.
Algorithmic Bias
When a computer system produces unfair or discriminatory results due to biased training data or design.
Data Accountability
The responsibility of those who collect and use data to do so fairly and answer for any harm caused.
Data Equity
Ensuring that data systems treat all groups of people fairly and do not advantage or harm any group unfairly.
🧰 Supplies Needed
  • Printed worksheets (1 per student)
  • Whiteboard / projector
  • Debate sentence frames (see below)
  • Timer
  • Sticky notes (optional)
Sentence Frames for Debate
  • "I think this is ethical because..."
  • "I think this is unethical because..."
  • "I agree with ___, and I want to add..."
  • "I respectfully disagree because..."
  • "A possible harm of this is..."
  • "A possible benefit of this is..."
โš–๏ธ Three Nuanced Case Studies (No Clear Right Answer)
Case 1 โ€” Health App Data and Insurance

The situation: A popular fitness app tracks your steps, sleep, and heart rate. The app is free because the company sells your anonymized health data to insurance companies. The insurance companies use this data to set premiums — people who exercise more pay less.

The tension: Benefits include cheaper insurance for healthy people and better health research. Harms include people who can't exercise (disabilities, long work hours, unsafe neighborhoods) paying more — not because they made a "bad choice" but because of factors outside their control.

Key questions for class: Did users truly consent if it's buried in Terms of Service? Is this fair to people who can't exercise? Who benefits and who is harmed?

Instructor note: Push back on "just read the TOS" — studies show that many TOS documents are longer than novels. Explore what "real consent" requires.

Case 2 — Predictive Policing Algorithm

The situation: A city uses an algorithm to predict which neighborhoods are likely to have crimes. Police are sent to those neighborhoods more often. The algorithm was trained on historical arrest data.

The tension: The city says it's just using data to be efficient. Critics say that if past policing was biased (more arrests in certain neighborhoods regardless of actual crime), the algorithm learns and reinforces that bias. More police presence → more arrests → algorithm "confirms" prediction → cycle continues.
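For instructors who want to see the cycle concretely, it can be reduced to a few lines of arithmetic. This is a purely illustrative toy model (all numbers invented, not any real system): two neighborhoods have identical true crime rates, but one starts with more arrests on record because of biased past policing. Patrols follow past arrests, and recorded arrests follow patrols, so the disparity never corrects itself.

```python
# Toy model of the predictive-policing feedback loop.
# Illustrative only: all numbers are invented.
# Both neighborhoods have the SAME true crime rate by construction.
true_crime_rate = {"A": 0.10, "B": 0.10}

# Biased history: past policing recorded more arrests in B.
arrests = {"A": 100.0, "B": 150.0}

for year in range(1, 6):
    total = arrests["A"] + arrests["B"]
    # The "algorithm": allocate 100 patrols in proportion to past arrests.
    patrols = {n: 100 * arrests[n] / total for n in arrests}
    # More patrols mean more *recorded* arrests (detection rises, not crime).
    for n in arrests:
        arrests[n] += patrols[n] * true_crime_rate[n]
    print(f"Year {year}: B gets {patrols['B']:.0f}% of patrols")
    # B keeps getting 60% of patrols even though crime rates are equal:
    # the algorithm "confirms" its own biased starting data every year.
```

Because patrols are proportional to arrests and new arrests are proportional to patrols, B's 60% share is locked in forever despite identical crime rates — the bias in the historical data is entrenched, never audited away. That is the cycle described above, and it can be a useful chalkboard demo of "biased data in, biased predictions out."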

Key questions for class: If data comes from a biased system, can the algorithm be unbiased? Is efficiency a good reason to use potentially biased predictions? Who decides what counts as "crime"?

Instructor note: This is a real technology in use. Students may feel strongly. Validate all perspectives while helping them see the structural bias issue.

Case 3 — School Surveillance Cameras with AI

The situation: A school district installs AI cameras that track every student's movement and flag "unusual behavior" to school administrators in real time. The district says it's for safety after a school shooting threat. Parents were told in a newsletter but not asked to vote.

The tension: Safety is a genuine concern. But constant surveillance may create anxiety, distrust, and a "chilling effect" on students' behavior — they may not express themselves freely knowing they're watched. The AI may also flag students from certain groups more often. There was no vote — was that consent?

Key questions for class: Does safety justify surveillance? Who decides what behavior is "unusual"? Is a newsletter enough for consent? Should students have a say?

Instructor note: Students will relate to this personally. Encourage them to think about power — who has it, who doesn't.

💬 Discussion Questions
  • What makes a use of data "ethical"? Can you name the criteria?
  • Is it possible for data to be fair in some ways and unfair in others at the same time?
  • Who should decide what data companies can collect on kids?
  • What is the difference between data being legal and data being ethical?
  • If an algorithm makes a mistake that hurts someone, who is responsible?
  • Are there types of data that should never be collected, no matter the reason?

Let students sit with discomfort — not every question has a clear answer. That's the point.

โญ ND-Friendly Tips
Accommodations for This Session
  • Post sentence frames visibly — helps students who struggle to find words under social pressure
  • Allow written responses for students who find verbal debate anxiety-provoking
  • Ethics topics can feel intensely personal — validate that reactions are normal
  • Give processing time (30 seconds of quiet) before asking for opinions
  • For students who find moral ambiguity distressing, explicitly name: "There is no single right answer here. Smart people disagree."
  • "Would You Rather" brain break uses physical movement — allow seated response (hand raise) as alternative
  • Avoid cold-calling during sensitive discussions — use voluntary sharing or small-group sharing first
🧠 "Would You Rather?" Brain Break (min 48–55)

Instructions: Read each scenario. Stand = Option A, Sit = Option B. No wrong answers — quick vote, then one student explains their choice.

  • Q1: A) A free app that collects your data  |  B) A $5/month app that collects nothing
  • Q2: A) A city that uses AI to reduce crime by 20% but sometimes flags innocent people  |  B) A city that only uses human police with no AI
  • Q3: A) A school that reads all student emails "just in case"  |  B) A school that reads no emails even if there are safety concerns
  • Q4: A) Doctors can access your health data without asking to save your life in an emergency  |  B) Doctors must always ask first, even in emergencies