How-to

Privacy

  • Germany edges toward Chinese-style rating of citizens.

    In Germany, he points out, it starts out with the universal credit rating system known as a Schufa. Very much like its US counterpart FICO, Schufa is a private company that assesses the creditworthiness of about three-quarters of all Germans and over 5 million companies in the country. Anyone wanting to rent a house or borrow money is required to produce their Schufa rating in Germany – or their FICO score in the US. Additionally, factors like “geo-scoring” can also lower your overall grade if you happen to live in a low-rent neighborhood, or even if a lot of your neighbors have bad credit ratings.

    In other areas, German health insurers will offer you lower premiums if you don’t get sick as much. They may offer you even better premiums if you share data from your fitness-tracking device to show you’re doing your part to stay healthy. Anyone using websites like Amazon, eBay or Airbnb is asked to rate others and is rated themselves. Those who try to avoid being rated are looked at askance. An increasing number of consumers will be denied certain services or, say, mortgages if they don’t present some kind of rating.
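    The “geo-scoring” mentioned above is easy to picture mechanically. As a rough sketch (the weights and numbers below are invented purely for illustration; Schufa’s actual formula is secret), an individual’s score gets pulled towards their neighborhood’s average, so bad-credit neighbors drag your own grade down:

    ```python
    # Toy illustration of a geo-scoring penalty. All weights are made up;
    # nothing here reflects Schufa's real (and secret) formula.
    def geo_adjusted_score(personal_score, neighbor_scores, geo_weight=0.2):
        neighborhood_avg = sum(neighbor_scores) / len(neighbor_scores)
        # Blend the personal score with the neighborhood average.
        return (1 - geo_weight) * personal_score + geo_weight * neighborhood_avg

    print(geo_adjusted_score(95.0, [60.0, 55.0, 70.0]))  # good payer, struggling street
    print(geo_adjusted_score(95.0, [92.0, 97.0, 90.0]))  # same payer, affluent street
    ```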

Tech

  • We crack the Schufa! When closed algorithms affect your life, one solution is to reverse-engineer them!

    When applying for a loan, a mobile phone contract, or even trying to rent an apartment in Germany, the Schufa score (Germany’s credit rating) is decisive. If you have a few “points” too few, your application is refused. (Computer says “No” to your new smartphone or apartment.) However, the calculation of these credit scores, done by the private Schufa company, is completely opaque. The formula is a trade secret and, as such, not open to the public.

    We want to change this lack of transparency with the OpenSCHUFA project. Together with AlgorithmWatch, we want to reconstruct the Schufa algorithm through “reverse engineering”.
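    What could that reconstruction look like in practice? A minimal sketch, assuming people donate their attributes together with their actual Schufa score, is to fit a transparent surrogate model on those pairs and inspect which attributes drive the score. The features and data below are entirely synthetic; OpenSCHUFA’s real methodology and data schema are not described in the excerpt above:

    ```python
    # Sketch: reverse-engineer a closed score by fitting an interpretable
    # surrogate on crowdsourced (attributes, score) pairs. Synthetic data only.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)

    # Hypothetical donated records: [age, n_credit_cards, past_defaults, years_at_address]
    X = rng.normal(size=(500, 4))
    # A hidden "true" formula we pretend not to know, plus noise.
    hidden_weights = np.array([5.0, -10.0, -40.0, 8.0])
    y = 600 + X @ hidden_weights + rng.normal(scale=5.0, size=500)

    # Fit the transparent surrogate and inspect the recovered weights.
    surrogate = LinearRegression().fit(X, y)
    for name, w in zip(["age", "n_credit_cards", "past_defaults", "years_at_address"],
                       surrogate.coef_):
        print(f"{name:>18}: {w:+.1f}")
    ```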

  • Facial Recognition Is Accurate, if You’re a White Guy. The newest iteration of this well-known problem.

    Facial recognition technology is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph.

    When the person in the photo is a white man, the software is right 99 percent of the time.

    But the darker the skin, the more errors arise — up to nearly 35 percent for images of darker skinned women, according to a new study that breaks fresh ground by measuring how the technology works on people of different races and gender.
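    The methodological point is worth keeping in mind for any model you ship: a single overall accuracy number can hide exactly this kind of gap. Here is a minimal sketch of that kind of disaggregated evaluation, with synthetic predictions that only illustrate the bookkeeping, not the study’s data:

    ```python
    # Report error rates per demographic subgroup instead of one global accuracy.
    import pandas as pd

    records = pd.DataFrame({
        "subgroup":  ["lighter_male"] * 4 + ["darker_female"] * 4,
        "true":      ["M", "M", "M", "M", "F", "F", "F", "F"],
        "predicted": ["M", "M", "M", "M", "F", "M", "M", "F"],
    })

    error_rates = (
        records.assign(error=records["true"] != records["predicted"])
               .groupby("subgroup")["error"]
               .mean()
    )
    print(error_rates)  # one number per subgroup exposes gaps the global average hides
    ```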

  • Tired of texting? Google tests robot to chat with friends for you. At some point we'll just set our phones to autopilot and leave them on the table, which is not necessarily a bad thing.

    Are you tired of the constant need to tap on a glass keyboard just to keep up with your friends? Do you wish a robot could free you of your constant communication obligations via WhatsApp, Facebook or text messages? Google is working on an AI-based auto-reply system to do just that.
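    Google’s system is presumably a learned model and none of its details are public, but the basic idea of suggesting a reply can be sketched with something as crude as matching the incoming message against a handful of canned responses. Everything below is hypothetical and only illustrates the concept:

    ```python
    # Toy reply suggester: pick the canned reply whose prompt is most similar
    # to the incoming message. Not Google's system, just the general idea.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    canned = {
        "Are we still on for tonight?": "Yes, see you at 8!",
        "Can you send me the report?":  "Sure, sending it over now.",
        "Happy birthday!":              "Thank you so much!",
    }

    vectorizer = TfidfVectorizer().fit(list(canned))
    prompt_vectors = vectorizer.transform(list(canned))

    def suggest_reply(message):
        """Return the canned reply whose prompt best matches the message."""
        scores = cosine_similarity(vectorizer.transform([message]), prompt_vectors)[0]
        return canned[list(canned)[scores.argmax()]]

    print(suggest_reply("are we still meeting tonight?"))
    ```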

  • Someone Is Sending Amazon Sex Toys to Strangers. Amazon Has No Idea How to Stop It. Another example of “here is an algorithm; now let's fool it by feeding it input we control.” A few relevant paragraphs:

    Nikki’s story is part of a broiling internal mystery that is flummoxing Amazon, according to a source at the company: Someone is shipping out unsolicited products, frequently sex toys, to seemingly random customers, and the company does not yet know why they’re being purchased, and why they’re being shipped to people like Nikki.

    [...]

    Sources both in and out of Amazon have one theory. It’s called, in Amazon-speak, verified review hacking.

    Amazon uses a review system that heavily weights “verified purchases”—reviews by users who have purchased a specific product through Amazon—over other reviews.

    This could give sellers incentive to buy and ship their own products to strangers from dummy accounts. Those dummy accounts could then give the product a 5-star review and, in turn, help it surface higher in Amazon and Google searches.

    Solutions to this have been proposed, however. On Dave Farber's IP list, this post appeared a few days ago:

    The 'fix' is for Amazon to inject itself into the pipe in a way that the seller and vendor are unable to defeat. That is the 'card in the box.' Every order has an order number, and Amazon can generate a URL through an Amazon URL shortening service that would indicate the order was fraudulent. If the recipient visits the URL, you look for the review; if you find it, you change it to "Fraudulent Reviewer" (you could remove it, but public seller shaming is even better).

    This is another example of how the information economy influences the goods economy. There, the marginal value of a bogus review can be computed in terms of the lifetime product sales it affects.
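    The flow described in that post is straightforward to sketch. Assume the per-order reporting URL resolves to something like the handler below; all the names and the in-memory store are hypothetical, and this is just the described mechanic, not Amazon’s actual systems:

    ```python
    # Sketch of the proposed fix: a per-order reporting URL lets the recipient
    # flag an unsolicited shipment; the matching "verified" review is then
    # relabeled rather than removed, to shame the seller publicly.
    orders = {"114-3941689": {"buyer_account": "dummy-42", "product": "B00EXAMPLE"}}
    reviews = {("dummy-42", "B00EXAMPLE"): {"stars": 5, "label": "Verified Purchase"}}

    def report_fraud(order_id):
        """Called when the recipient visits the order's reporting URL."""
        order = orders[order_id]
        review = reviews.get((order["buyer_account"], order["product"]))
        if review is not None:
            # Keep the review visible but flag it, as the post suggests.
            review["label"] = "Fraudulent Reviewer"

    report_fraud("114-3941689")
    print(reviews[("dummy-42", "B00EXAMPLE")])
    ```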

  • AI-Moderators Fighting AI-Generated Porn Is the Harbinger of the Fake News Apocalypse. It begins with porn and will continue with everything else. I was talking about this topic with some friends in Madrid a few months ago, before deepfake porn was a thing, and we concluded that we are walking towards a society in which the only news you'll believe is the news that aligns with what you already believe. It's going to be fun for all the wrong reasons.

    This sounds great in theory, but as Wired points out, there are a few scenarios where deepfakes will slip through the cracks. If someone makes a deepfake of a private citizen—think vindictive exes or harassers scraping someone’s private Facebook page—and no images or videos of them appear publicly online, these algorithms won’t be able to find videos, and will categorize it as the original.
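    The failure mode is easy to see once you write down the decision logic. A minimal sketch follows; the hashes, lookup index and URL are invented, and a real detector would work on perceptual hashes of video frames rather than toy strings:

    ```python
    # A detector that relies on finding the original, public source material has
    # nothing to compare against when the victim's photos are private.
    public_index = {
        "a1f3c9": "https://example.com/interview.mp4",  # hypothetical indexed footage
    }

    def classify(frame_hashes):
        matches = [public_index[h] for h in frame_hashes if h in public_index]
        if matches:
            return "likely deepfake, source found: " + matches[0]
        # No public source found: defaults to "original", which is exactly
        # where private citizens fall through the cracks.
        return "classified as original"

    print(classify(["a1f3c9", "77be02"]))  # reuses public footage -> flagged
    print(classify(["d4e5f6"]))            # private victim -> slips through
    ```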


Data Links is a periodic blog post, published on Sundays (the specific time may vary), that collects interesting links about data science, machine learning and related topics. You can subscribe to it using the general blog RSS feed, or this one, which only contains these posts, in case you are not interested in other things I might publish.

Have you read an article you liked and would you like to suggest it for the next issue? Just contact me!