And that’s not a coincidence.
In fact, that “same time” was a very specific time: 25 May 2018. Last Friday. Why? Because that’s the day the GDPR came into force.
Last year, a lawyer rewrote Instagram’s Terms of Service in a way children could understand. Here are some excerpts:
2. Officially you own any original pictures and videos you post, but we are allowed to use them, and we can let others use them as well, anywhere around the world. Other people might pay us to use them and we will not pay you for that.
5. Although you are responsible for the information you put on Instagram, we may keep, use and share your personal information with companies connected with Instagram. This information includes your name, email address, school, where you live, pictures, phone number, your likes and dislikes, where you go, who your friends are, how often you use Instagram, and any other personal information we find such as your birthday or who you are chatting with, including in private messages (DMs).
We are not responsible for what other companies might do with this information. We will not rent or sell your personal information to anyone else without your permission. When you delete your account, we keep this personal information about you, and your photos, for as long as is reasonable for our business purposes.
7. […] We are not responsible if somebody breaks the law or breaks these rules; but if you break them, you are responsible.
The rewriting was part of a report on how children use the Internet, and how well-informed they are. If all Terms and Conditions were written clearly like this, would children think twice before approving them?
Of course, it’s not just Instagram, and it’s not just children. Almost every company puts out this kind of Terms of Service. Almost every person clicks on “I agree” without checking what they’re agreeing to.
To demonstrate this, security firm F-Secure set up a free WiFi hotspot, like the ones so common in railway stations and airports these days. Anyone could use the WiFi, provided they accepted the Terms and Conditions.
One of the conditions was, you had to give your firstborn son to the company “for the duration of eternity”.
Within minutes, six people had signed up — which goes to show how easily people give up their rights and data without realising it.
“I have heard you say”, said Dr Watson, “that it is difficult for a man to have any object in daily use without leaving the impress of his individuality upon it in such a way that a trained observer might read it.” With that, he took out a watch — one of the wind-up types common in those days — and challenged his companion to find out the character or habits of its owner.
“There are hardly any data,” Sherlock Holmes remarked. “The watch has been recently cleaned, which robs me of my most suggestive facts.”
Even so, the initials “H.W.” on the back helped him guess the watch had belonged to Watson’s brother.
“Right so far,” Dr. Watson told him. “Anything else?”
“He was a man of untidy habits — very untidy and careless. He was left with good prospects, but he threw away his chances, lived for some time in poverty with occasional short bursts of prosperity, and, finally, taking to drink, he died. That,” said Sherlock Holmes, “is all I can gather.”
And quite a lot it was, too, to deduce from a single watch with “hardly any data”. No wonder Watson thought his companion had secretly made enquiries about his brother. But he hadn’t.
Sherlock Holmes could make out the owner was rich, because the watch was a pretty expensive model. What’s more, the two dents and many chips showed that the owner was also careless: who else would put loose change and other objects in his pocket, to bump and bang against such a valuable item?
But he wasn’t always rich, because the watch also had several numbers marked on the back. These numbers, Holmes knew, were pawnbroker’s marks: the ticket-numbers a pawnbroker notes down when someone hands over an object in exchange for money. This particular watch had no fewer than four such marks on it. Of course, the owner also had his “occasional short bursts of prosperity”, when he was able to pay the pawnbroker and get his watch back.
And the “taking to drink”? The clues to that came from the keyhole, where one inserts the key to wind up the watch. This one had lots and lots of scratch-marks, where the key had slipped while the owner was trying to wind it up. But whose hand would be so unsteady that it slips so many times? Only the hand of a drunken person.
This life-story from a watch demonstrates the powers of Sherlock Holmes. But it also shows how small bits of data, when put together in the right way, can reveal an extraordinary amount of information.
Every bit of personal data gives clues about what a person is like. By themselves, these pieces of data can be harmless. Addresses and phone numbers are considered “sensitive information” today, but, a decade or so ago, telephone directories listing both were freely available.
Of course, you need knowledge too. To make sense of the price, the dents, the numbers and the scratches, Sherlock Holmes had to know about watch valuations, careless habits, pawnbroking, and the effects of drink.
That’s also why small bits of data weren’t such a problem ten years ago. All the dots may have been there, but there were no Sherlock Holmes algorithms to connect them together.
Not any more.
Still ringing in the news is the story of Cambridge Analytica. It’s a company that took the personal information of millions of Facebook users. Using that data, they targeted advertisements to influence elections around the world. All politicians advertise — but now, each voter was targeted with ads designed to influence them specifically.
Right now, everyone’s pointing fingers at Facebook and Cambridge Analytica for “stealing their data”. But in reality, almost everybody has your data — including the Cow Clicker game that lets you click on cows. As Cow Clicker creator Ian Bogost noted, the Cambridge Analytica scandal is possibly “sending lots of old app developers, like me, back to old code and dusty databases, wondering what they’ve even got stored and what it might yet be worth.”
Election influencing is a particularly hot topic at the moment — but personal data is being used in other ways too. Some are obvious, some are inventive, and some are downright creepy. Sherlock Holmes may have chosen to use his skills for good, but algorithms can be used for anything.
One of the main problems is that personal data is so easily available. Companies are always happy to have more of it to play with, and users sign away their rights, without realising it, every time they click an “I agree” button.
Making it so easy to “I agree” away user data is what the GDPR aims to stop.
The GDPR, or General Data Protection Regulation, is a new privacy law for the European Union. But its influence doesn’t stop at Europe. The law applies to any company with European customers — which potentially includes any company on the Internet. (Of course, there’s a loophole: companies could choose to use a different set of rules for European citizens, and a different set for everyone else).
GDPR is very strict about personal data, how companies collect it, and what they use it for. No more vague Privacy Policies saying “We’ll collect lots of your data and use it for various purposes”. You have to say exactly what data you’re collecting, and what you’ll use that data for. And if the user doesn’t give you permission, you can’t use that data at all.
That means no more ad trackers, following you across websites to see what ads you click and then showing you more of the same.
No more people taking your email address for signing up, and then using it to send you marketing spam.
No more random people starting to collect your data (unless they do it illegally, of course).
And to make sure you don’t accidentally give permission, GDPR has a thing called “informed consent”. Customers have to know clearly what they’re agreeing to, and actively agree to it. A pre-ticked checkbox does not count as “consent”.
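The “no pre-ticked boxes” rule can be pictured as a simple validation check. Here’s a minimal sketch in Python (the field names and the function itself are invented for illustration, not part of any real GDPR library):

```python
# Hypothetical sketch: does a signup form count as "informed consent"?
# The field names below are made up for illustration.

def is_valid_consent(form):
    """Consent only counts if the user actively opted in
    for a specific, clearly stated purpose."""
    return (
        bool(form.get("purpose"))                        # the exact use must be stated
        and form.get("checkbox_default") == "unchecked"  # no pre-ticking allowed
        and form.get("user_ticked") is True              # an active choice by the user
    )

# A pre-ticked box the user never touched does not count:
print(is_valid_consent({
    "purpose": "monthly newsletter",
    "checkbox_default": "checked",
    "user_ticked": False,
}))  # False
```

An unchecked box that the user ticks themselves, for a named purpose, would pass the same check.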
GDPR also says that users should be able to download all the data a company has on them. They should be able to make corrections, and, if they choose to, have the data deleted completely.
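Those three rights (access, correction, and deletion) map onto three operations on whatever store holds the user’s data. A toy sketch, with an in-memory dictionary standing in for a real database:

```python
# Toy sketch of the three data-subject rights described above.
# A plain dict stands in for a real database.
user_data = {"alice@example.com": {"name": "Alice", "city": "Paris"}}

def export_data(email):
    """Right of access: hand the user everything stored about them."""
    return dict(user_data.get(email, {}))

def correct_data(email, field, value):
    """Right to rectification: fix an incorrect record."""
    user_data[email][field] = value

def delete_data(email):
    """Right to erasure: remove the user's data completely."""
    user_data.pop(email, None)

print(export_data("alice@example.com"))  # {'name': 'Alice', 'city': 'Paris'}
correct_data("alice@example.com", "city", "Lyon")
delete_data("alice@example.com")
print(export_data("alice@example.com"))  # {}
```

In a real company, of course, the hard part is that the data is scattered across many such stores, which is exactly the “messy desk” problem described later.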
To encourage companies to follow the rules, GDPR has very stiff penalties. A company that breaks the rules can be fined up to €20 million or 4% of its annual global revenue, whichever is higher.
But there’s still one reason companies may not comply with GDPR: because they can’t.
Many companies are not yet ready for GDPR. That’s because the new law requires many changes in the way they operate, and they haven’t finished making those changes yet.
The truth is that many companies don’t know what data they’ve collected, let alone how to access and delete it. All these years, they’ve been saving extra pieces of information “in case it comes in useful later” — and nobody’s bothered to keep track of what those pieces are.
Now companies are being told to not only list that data, but also spell out how they’re using it, and set up systems to collect it together for whoever wants it. And to actually delete that data, if requested, without messing up the other records they have.
It’s a bit like sorting out all the papers and stuff from a very messy desk — a desk that’s been messy for decades.
That’s why some people say GDPR is not just going to change the way companies work now, it’s also going to change the way their services are designed in the future. Developers will have to be a lot more focused on privacy. (Of course, as developer Bozhidar Bozhanov points out, there won’t be much of a problem if you were focused on privacy all along.)
Back in the present, not all government regulators are ready either. According to a Reuters survey earlier this month, 17 of 24 regulators said they didn’t yet have the funding and legal powers to fulfil their duties.
Most people are expecting the GDPR launch to be a gentle easing in, with regulators taking it slowly and giving companies time to adjust. But regulators can’t control everything, because part of the GDPR is also about customers. If someone from the EU requests to download their data, a company has 30 days to send it — after which the person could file a complaint with the local regulator.
If companies or regulators are not ready, that’s going to be a bit hard. We’ll have to wait and see what happens.
Meanwhile, maybe you can sit back, go through all the newly updated Privacy Policies in your inbox, and decide whether or not to click on “I agree”.
Have something to say? At Snipette, we encourage questions, comments, corrections and clarifications — even if they are something that can be easily Googled! Or you can simply click on the ‘👏 clap’ button, to tell us how much you liked reading this.