To be clear, no insurance company is currently using Twitter data against anyone (or for anyone), at least not openly. The idea outlined in the article is that people could set up accounts to share their personal data with companies such as insurers, as a way of demonstrating how healthy they are.
As is the case with most police departments across the country, the NYPD does not disclose internal disciplinary records to the public. Even though cities spend millions in public funds to settle lawsuits filed against officers, the public has little access to what the settlements reveal about problematic officers and precincts. Meanwhile, the officers themselves rarely face consequences and often return to the streets quickly, their histories shielded in anonymity.
But that situation is beginning to change, as a growing number of police accountability groups bypass the departments altogether by aggregating misconduct histories into databases of their own and distributing them publicly.
Computer scientists have created an AI program capable of predicting the outcome of human rights trials. The program was trained on data from nearly 600 cases brought before the European Court of Human Rights (ECHR), and was able to predict the court's final judgement with 79 percent accuracy. Its creators say it could be useful in identifying common patterns in court cases, but stress that they do not believe AI will be able to replace human judgement.
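The reported study framed outcome prediction as text classification: case documents were turned into word-based features and fed to a machine learning classifier. As a rough illustration of that general approach (not the researchers' actual pipeline), here is a minimal bag-of-words Naive Bayes sketch in pure Python; the "case summaries" and labels are invented for the example.

```python
# Hypothetical sketch of outcome prediction as text classification.
# The toy documents and labels below are invented; the real study used
# far richer features and nearly 600 actual ECHR cases.
from collections import Counter
import math

def tokenize(text):
    return text.lower().split()

def train_nb(docs, labels):
    """Train a multinomial Naive Bayes model with add-one smoothing."""
    counts = {0: Counter(), 1: Counter()}  # per-class word counts
    priors = Counter(labels)               # class frequencies
    for doc, y in zip(docs, labels):
        counts[y].update(tokenize(doc))
    vocab = set(counts[0]) | set(counts[1])
    return counts, priors, vocab

def predict(model, doc):
    """Return the class with the higher smoothed log-probability."""
    counts, priors, vocab = model
    total = sum(priors.values())
    best, best_lp = None, float("-inf")
    for y in (0, 1):
        lp = math.log(priors[y] / total)
        n = sum(counts[y].values())
        for tok in tokenize(doc):
            if tok in vocab:
                lp += math.log((counts[y][tok] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = y, lp
    return best

# Invented toy data: 1 = violation found, 0 = no violation.
docs = [
    "applicant detained without judicial review",
    "detention prolonged without review hearing",
    "complaint dismissed as manifestly ill-founded",
    "domestic remedies were effective and exhausted",
]
labels = [1, 1, 0, 0]
model = train_nb(docs, labels)
print(predict(model, "detained without review"))  # → 1
```

A classifier like this can only surface statistical patterns in how cases are described; as the researchers themselves stress, that is a tool for spotting trends, not a substitute for judicial reasoning.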
We are DATACTIVE, a research project and a research collective exploring the politics of big data, broadly defined. We take a critical look at massive data collection, privacy and surveillance; social movements, activism and internet activism; internet infrastructure, cybersecurity and their governance; and open data and civic tech networks.