Data Science for Young Minds — Grade 3
| Topic | Details |
|---|---|
| The myth of objective data | Every dataset reflects human choices |
| Exclusion in collection | How collection methods can exclude certain groups |
| Amplified inequality | How analysis choices can amplify inequality |
| Real examples | Biased hiring algorithms, unfair school assessments |
| Who gets counted | Who gets counted and who gets left out |
| Invisible populations | People without internet, without addresses, without documentation |
| Missing representation | How missing representation leads to wrong conclusions |
| Activity | Examine a dataset and identify who might be missing |
| What an algorithm is | In this context: a set of rules a computer follows to make decisions |
| Learning from biased data | How algorithms learn from biased historical data |
| Real examples | Facial recognition errors, biased hiring tools, unfair loan decisions |
| Human responsibility | Algorithms do what humans tell them to |
| Data user responsibilities | The responsibilities of anyone who collects or uses data |
| Principles | Accuracy, fairness, privacy, transparency, and accountability |
| Amana applied to data | The Islamic principle of Amana: information is a trust |
| Activity | Write your personal Data User's Pledge |
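The table defines an algorithm as a set of rules a computer follows to make decisions. For instructors who want a concrete illustration, here is a minimal Python sketch; the rules and the library-book scenario are invented purely for demonstration:

```python
# A tiny "algorithm": a fixed set of rules the computer follows
# to decide whether a library book request is approved.
# The rules and numbers here are hypothetical, for illustration only.

def approve_book_request(age, books_out):
    """Return True if the request is approved under our rules."""
    if books_out >= 3:   # Rule 1: no more than 3 books out at a time
        return False
    if age < 5:          # Rule 2: readers must be at least 5 years old
        return False
    return True          # Otherwise, approve the request

print(approve_book_request(age=8, books_out=1))  # True
print(approve_book_request(age=8, books_out=3))  # False
```

The point for students: the computer did not "decide" anything on its own. A person chose Rule 1 and Rule 2, so a person is responsible for them.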
Data seems objective, but the people who collect, analyze, and use it make choices that can introduce unfairness.
When certain people are not represented in data, decisions based on that data can harm them.
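A toy calculation can make this concrete for instructors. Assuming a made-up survey scenario in which people without internet access never get asked, the measured average drifts far from the truth:

```python
# Toy illustration (invented numbers): estimating average commute time
# from an online survey that only reaches people with internet access.
full_population = [10, 15, 20, 60, 75, 90]  # commute minutes; the last
                                            # three people have no internet
surveyed = full_population[:3]              # the survey reaches only the first three

true_avg = sum(full_population) / len(full_population)
survey_avg = sum(surveyed) / len(surveyed)

print(true_avg)    # 45.0 -- the real average
print(survey_avg)  # 15.0 -- the survey's (wrong) conclusion
```

A decision based on the 15-minute figure, such as cutting a bus route, would harm exactly the people the survey never reached.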
Algorithms make decisions about loans, jobs, schools, and justice. What happens when they are biased?
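The mechanism behind this question can be sketched in a few lines. This is not how any real lending system works; it is a deliberately simplified model, with invented data, showing how a rule "learned" from biased history simply repeats the bias:

```python
# Toy sketch (invented data): a rule that copies patterns from biased
# historical decisions. If past approvals favored one neighborhood,
# the learned rule repeats that unfairness automatically.
history = [
    {"neighborhood": "north", "approved": True},
    {"neighborhood": "north", "approved": True},
    {"neighborhood": "south", "approved": False},
    {"neighborhood": "south", "approved": False},
]

def approval_rate(neighborhood):
    """'Learn' the historical approval rate for a neighborhood."""
    cases = [h for h in history if h["neighborhood"] == neighborhood]
    return sum(h["approved"] for h in cases) / len(cases)

def learned_decision(neighborhood):
    """The learned rule: approve if the historical rate exceeds 50%."""
    return approval_rate(neighborhood) > 0.5

print(learned_decision("north"))  # True  -- the old bias is repeated
print(learned_decision("south"))  # False -- regardless of the applicant
```

Nothing in the code mentions fairness; the bias rides in through the data, which is exactly why the humans who choose the data are responsible for the outcome.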
Create your own principles for using data ethically. What does it mean to use data responsibly?