Quote:
Originally Posted by hildea
I was at a conference about big data where this anecdote was discussed. Even if it's true, we don't know:
- How many customers got pregnancy-related ads without being pregnant
- How many pregnant customers didn't get those ads
So, at best we know that one pregnant customer got those ads, and that might just be random luck.
Another problem with using big data is that it can strengthen existing prejudices. For instance, say two demographic groups, A and B, use drugs at equal rates, but group A is stopped and searched in random checks more often (because of prejudice in the police). Then your data will tell you, quite correctly, that members of group A are more likely to be arrested for drug crimes than members of group B. If you use this to decide which groups to check more often in the future, you are making policing even more prejudiced while believing you are basing it on objective data.
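To put rough numbers on the quoted point about the missing counts: the anecdote fixes only one cell of the confusion matrix, and wildly different targeting accuracies are consistent with it. A minimal sketch, with entirely made-up numbers:

def precision_recall(tp, fp, fn):
    # Precision: what fraction of the customers who got the ads were pregnant.
    # Recall: what fraction of the pregnant customers got the ads.
    return tp / (tp + fp), tp / (tp + fn)

# Scenario 1: the targeting is genuinely good.
print(precision_recall(tp=900, fp=100, fn=100))   # (0.9, 0.9)

# Scenario 2: the ads are sprayed widely and the one known hit was luck.
print(precision_recall(tp=1, fp=9999, fn=499))    # (0.0001, 0.002)

Both scenarios produce the same anecdote; only the unknown counts distinguish them.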
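The feedback loop described in the quote can also be made concrete. In this sketch (again, all numbers invented) both groups offend at the same rate, group A starts out being searched twice as often, and each year's searches are reallocated in proportion to the previous year's arrests. The skewed allocation simply reproduces itself, now dressed up as objective data:

OFFENSE_RATE = 0.05                     # same true offense rate for A and B
searches = {"A": 2000, "B": 1000}       # prejudiced starting allocation
TOTAL_SEARCHES = sum(searches.values())

for year in range(5):
    # Expected arrests are proportional to searches, not to criminality.
    arrests = {g: n * OFFENSE_RATE for g, n in searches.items()}
    total_arrests = sum(arrests.values())

    # "Data-driven" reallocation: next year's searches track this year's arrests.
    searches = {g: round(TOTAL_SEARCHES * arrests[g] / total_arrests)
                for g in arrests}
    print(f"year {year}: arrests={arrests} -> next year's searches={searches}")

# Every year the statistics "confirm" that group A yields twice as many arrests,
# even though the underlying offense rates never differed.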
In general, police stop and frisk wasn't about drugs, it was about guns. It just so happens that a lot of drug dealers also carry illegal guns, presumably to protect themselves from other criminals.
A large part of the reason that profiling works is that profiles are not based on personal prejudices, but on whether a person matches a given set of criteria. Police don't just walk around frisking random black guys; they frisk black guys who act a certain way. They also tend to stop middle-aged white guys driving slowly through an area known for prostitution or drugs.
If you read up on how airport security works in Israel, you will see that the agents are trained to watch for certain behaviors rather than certain demographic groups. It's not all that easy to tell an Arab from a Jew at a glance in Israel.