Oops! AI did it again... you're not that innocent.
Nectar, a 'crime-predicting' system developed with #Palantir, could be rolled out nationally after a pilot with Bedfordshire Police (UK).
Data such as race, sex life, trade union membership, philosophical beliefs and health are used to 'predict' criminality so people can be targeted for #surveillance.
inews.co.uk/news/police-use-co…
#SafetyNotSurveillance #policing #police #AI #ukpolitics #ukpol #criminaljustice #precrime #predictivepolicing
Police use controversial AI tool that looks at people’s sex lives and beliefs
Senior MPs and privacy campaigners have expressed alarm at the deployment of Palantir's AI-powered crime-fighting software with access to sensitive personal information. (Mark Wilding, The i Paper)
Open Rights Group, in reply:
'Crime-predicting' tech leads to more over-policing of Black, racialised, lower-income and migrant communities.
It automates unjust stop and searches, harassment, handcuffing and use of force.
Sign and share our petition to BAN it ⬇️
you.38degrees.org.uk/petitions…
#SafetyNotSurveillance #policing #police #AI #surveillance #ukpolitics #ukpol #criminaljustice #precrime #predictivepolicing
Ban ‘Crime Predicting’ Police Tech (38 Degrees)