Please explain


If you’re a global business, you’re probably aware that the consumer and business data protection and privacy frameworks in place around the world vary a lot. There’s an increasing tendency to require citizen-related data to be held locally and not moved around without explicit consent, and there are a lot of restrictions on what citizens have to opt in to and what the data they agree to contribute can be used for, with or without their knowledge and approval. Managing this patchwork of regulation (and the consequences of even inadvertent failure to comply) is tough and expensive – and you’re probably missing some interesting analytic potential because the data is dispersed and distributed analytics are still a work in progress, both technically and legally. Bad enough today, but things are about to get much more complex.

The new European Union General Data Protection Regulation (GDPR) goes into effect in May next year (2018) and brings with it a lot of new requirements on businesses that operate in any of the EU countries. In many ways it simplifies and harmonizes the somewhat diverse and incompatible national regulations that have accumulated under the 1995 Data Protection Directive, which should make life easier, but the primary objective of the GDPR is “to give citizens back the control of their personal data”, with harmonization of the rules a secondary objective. As a result, a number of new “rights” have been established, including a limited “right to erasure” for data about an individual that can be shown to be erroneous or not in the individual’s interest to be “known”. The actual implementation of the “delete” function isn’t going to be easy – and will have some interesting consequences, given that there is online data about a lot of people going back over two decades. There’s also a “right to know” what information an online business has collected about you, whether or not you provided it. It’s also not clear exactly how that right will be implemented.

There’s more. Two sections of the GDPR do something even more potentially contentious: they ban decisions “based solely on automated processing, including profiling, which produces an adverse legal effect concerning the data subject or significantly affects him or her.” In other words, algorithms and other programs aren’t allowed to make negative decisions about people on their own. Although there is no explicit language in the new law, it’s possible to interpret this regulation as conferring a “right to an explanation” of how an automated system made a decision to, for example, deny a loan application or impose a higher interest rate. That’s not necessarily an issue if the system is entirely deterministic – although it does raise issues of IP protection and the defense of trade secrets if you have to explain how all your business rules work to anyone they get applied to.
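To make that concrete, here is a hypothetical sketch of a purely rule-based screening decision; the field names and thresholds are invented for illustration, not taken from any real lender. Because every branch is an explicit business rule, the system can report exactly which rules produced a denial, which is also exactly why explaining the decision to applicants means exposing the rules themselves.

```python
# Hypothetical, deterministic loan-screening rules (thresholds and field
# names are invented for illustration only).
def screen_application(income: float, debt_ratio: float, missed_payments: int):
    reasons = []
    if income < 25_000:
        reasons.append("income below the 25,000 threshold")
    if debt_ratio > 0.40:
        reasons.append("debt-to-income ratio above 40%")
    if missed_payments > 2:
        reasons.append("more than two missed payments on record")
    # Deterministic systems can always say why: the rules that fired ARE the explanation.
    return (len(reasons) == 0), reasons

approved, reasons = screen_application(income=22_000, debt_ratio=0.45, missed_payments=1)
print(approved, reasons)  # False, plus the two rules that triggered the denial
```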

What’s potentially more challenging will be the need to explain how decisions based on big data analytics, machine learning and, eventually, forms of embedded “AI” are arrived at. Challenging because in some cases we don’t know how the underlying technology actually works. Some forms of machine learning (such as classifiers) are better at what they do than theory predicts they should be. In other cases (deep learning using hidden-layer neural networks), we may not have access to the algorithms directly, and the algorithms themselves may change over time as the system “learns” to perform better.
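A minimal sketch of that contrast, using scikit-learn on synthetic data (the features and models here are my own illustration, not anything from the regulation): a linear model exposes a per-feature weight you can point to when justifying a decision, while a trained neural network’s “explanation” is thousands of weights spread across hidden layers, and those weights shift every time the model is retrained.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))   # e.g. income, debt ratio, tenure (synthetic)
y = (X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Interpretable: each coefficient states how a feature pushes the decision.
linear = LogisticRegression().fit(X, y)
print("per-feature weights:", linear.coef_)

# Opaque: the "reason" for any one decision is buried in layers of weights
# that change whenever the model is retrained or updated.
net = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500).fit(X, y)
print("layer weight shapes:", [w.shape for w in net.coefs_])
```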

It’s been suggested that companies can get around the law by injecting a single human-mediated processing step, but the whole purpose of many automated systems is to remove fallible humans from the process to make it “objective” and “fair”.

For now, this is “just” an EU problem and it’s still over a year away from full implementation. However, I can see these sorts of regulations and rights spreading to other major economies over time. Consumer privacy is becoming a much higher profile topic in many areas of business and government, and in parallel, embedded AI is accelerating in capability and deployment. This could turn into an ugly train wreck if we don’t think things through ahead of time.

Better get started.

John Parkinson
Affiliate Partner
Waterstone Management Group
