TY - JOUR
T1 - Algorithmic Jim Crow
T2 - Fordham Law Review
AU - Hu, Margaret
N1 - Copyright:
Copyright 2017 Elsevier B.V., All rights reserved.
PY - 2017/11
Y1 - 2017/11
AB - This Article contends that current immigration- and security-related vetting protocols risk promulgating an algorithmically driven form of Jim Crow. Under the "separate but equal" discrimination of a historic Jim Crow regime, state laws required mandatory separation and discrimination on the front end, while purportedly establishing equality on the back end. In contrast, an Algorithmic Jim Crow regime allows for "equal but separate" discrimination. Under Algorithmic Jim Crow, equal vetting and database screening of all citizens and noncitizens will make it appear that fairness and equality principles are preserved on the front end. Algorithmic Jim Crow, however, will enable discrimination on the back end in the form of designing, interpreting, and acting upon vetting and screening systems in ways that result in a disparate impact. Currently, security-related vetting protocols often begin with an algorithm-anchored technique of biometric identification, for example, the collection and database screening of scanned fingerprints and irises, digital photographs for facial recognition technology, and DNA. Immigration reform efforts, however, call for the biometric data collection of the entire citizenry in the United States to enhance border security efforts and to increase the accuracy of the algorithmic screening process. Newly developed big data vetting tools fuse biometric data with biographic data and internet and social media profiling to algorithmically assess risk. This Article concludes that those individuals and groups disparately impacted by mandatory vetting and screening protocols will largely fall within traditional classifications: race, color, ethnicity, national origin, gender, and religion. Disparate-impact consequences may survive judicial review if based upon threat risk assessments, terroristic classifications, data-screening results deemed suspect, and characteristics establishing anomalous data and perceived foreignness or dangerousness: nonprotected categories that fall outside of the current equal protection framework. Thus, Algorithmic Jim Crow will require an evolution of equality law.
UR - http://www.scopus.com/inward/record.url?scp=85037053309&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85037053309&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:85037053309
VL - 86
SP - 633
EP - 696
JO - Fordham Law Review
JF - Fordham Law Review
SN - 0015-704X
IS - 2
ER -