https://www.schneier.com/blog/archiv...ng-models.html
And you can't easily secure a machine "learning" system.
Quote:
Abstract: Given the computational cost and technical expertise required to train machine learning models, users may delegate the task of learning to a service provider. Delegation of learning has clear benefits, and at the same time raises serious concerns of trust. This work studies possible abuses of power by untrusted learners. We show how a malicious learner can plant an undetectable backdoor into a classifier. On the surface, such a backdoored classifier behaves normally, but in reality, the learner maintains a mechanism for changing the classification of any input, with only a slight perturbation. Importantly, without the appropriate "backdoor key," the mechanism is hidden and cannot be detected by any computationally-bounded observer. We demonstrate two frameworks for planting undetectable backdoors, with incomparable guarantees.
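To make the abstract's claim concrete, here is a rough toy sketch of the *idea* only, not the paper's cryptographic construction: a planted classifier that behaves normally, except that inputs nudged along a secret "key" direction get a forced label. Everything here (the linear model, the threshold, the function names) is my own illustrative choice, and unlike the paper's result, this trigger is neither undetectable nor the perturbation imperceptible.

```python
import numpy as np

# Toy backdoored classifier -- illustrative only, NOT the paper's scheme.
rng = np.random.default_rng(0)
d = 100

w = rng.normal(size=d)            # the honest-looking linear classifier
key = rng.normal(size=d)
key /= np.linalg.norm(key)        # secret direction known only to the learner

THRESHOLD = 4.0                   # benign inputs essentially never exceed this

def classify(x):
    """Deployed model: behaves like a normal classifier unless triggered."""
    if key @ x > THRESHOLD:       # hidden trigger: strong alignment with key
        return 1                  # backdoor forces the target label
    return 1 if w @ x >= 0 else -1

def activate(x):
    """Learner's perturbation: nudge the input along the secret key."""
    return x + 2.0 * THRESHOLD * key

# A benign input that the model classifies as -1 ...
x = -10.0 * w / np.linalg.norm(w)
# ... flips to +1 after a perturbation smaller than the input itself.
```

Without `key`, spotting the trigger here is already awkward (the perturbed input looks like noise); the paper's point is that with the right cryptographic machinery the trigger becomes provably undetectable to any computationally-bounded observer.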
But how can we trust the systems that haven't been outsourced? The companies with the resources to train these models themselves have shown they are untrustworthy, with the attitude that laws only apply to companies not using the Internet (Google, Meta/Facebook, Uber, Airbnb, etc.).