The tension between model confidentiality and public access motivates our investigation of model extraction attacks. In such attacks, an adversary with black-box access, but no prior knowledge of an ML model's parameters or training data, aims to duplicate the functionality of (i.e., "steal") the model. Unlike in classical learning theory settings, ML-as-a-service offerings may accept partial feature vectors as inputs and include confidence values with predictions. Given these practices, we show simple, efficient attacks that extract target ML models with near-perfect fidelity for popular model classes including logistic regression, neural networks, and decision trees. We demonstrate these attacks against the online services of BigML and Amazon Machine Learning. We further show that the natural countermeasure of omitting confidence values from model outputs still admits potentially harmful model extraction attacks. Our results highlight the need for careful ML model deployment and new model extraction countermeasures.
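For a concrete sense of why confidence values are so dangerous, consider logistic regression: the model's confidence is a sigmoid of a linear function of the input, so taking the logit of each returned confidence yields a linear equation in the unknown parameters. The sketch below illustrates this equation-solving idea under simplifying assumptions; the `target_confidence` oracle is a hypothetical stand-in for a black-box prediction API, not the interface of any particular service.

```python
import numpy as np

# Hedged sketch of an equation-solving extraction attack on a
# logistic regression model that exposes confidence values.

rng = np.random.default_rng(0)
d = 5                                 # number of input features
w_secret = rng.normal(size=d)         # hidden weights of the target model
b_secret = 0.3                        # hidden bias of the target model

def target_confidence(x):
    """Hypothetical black-box API: returns P(y=1 | x) = sigmoid(w.x + b)."""
    return 1.0 / (1.0 + np.exp(-(x @ w_secret + b_secret)))

# Query d+1 points. The logit of each confidence is linear in (w, b),
# so d+1 linearly independent queries determine the parameters exactly.
X = rng.normal(size=(d + 1, d))
p = target_confidence(X)
logits = np.log(p / (1.0 - p))

# Solve [X | 1] @ [w; b] = logits for the stolen parameters.
A = np.hstack([X, np.ones((d + 1, 1))])
stolen = np.linalg.solve(A, logits)
w_stolen, b_stolen = stolen[:d], stolen[-1]
```

With exact confidences, the recovered `w_stolen` and `b_stolen` match the hidden parameters up to floating-point error, using only `d + 1` queries; real APIs that round or truncate confidences force the attacker to use more queries and least-squares fitting instead.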


Now a small subsidiary of Google named Jigsaw is about to release an entirely new type of response: a set of tools called Conversation AI. The software is designed to use machine learning to automatically spot the language of abuse and harassment, with an accuracy that Jigsaw engineers say is far better than any keyword filter and far faster than any team of human moderators.


With more than 300 million active customer accounts and more than $100 billion in annual revenue, Amazon is a shopping giant whose algorithm can make or break other retailers. And so ProPublica set out to see how Amazon's software was shaping the marketplace.

In experiments, the researchers had EQ-Radio scan 30 volunteers, each seated about one meter away from the gadget. The volunteers listened to music, looked at photos, or watched videos to help them recall memories that made them feel happy, excited, sad, angry, or neutral. The system collected more than 130,000 heartbeats.