Google accidentally reveals names of rape victims despite law
Google has been inadvertently revealing the names of rape victims that should remain secret.
Searching for details of certain cases can bring up autocomplete predictions that reveal confidential information about those cases. The identities of victims in rape cases are supposed to remain confidential indefinitely, even if the accused is found not guilty.
A number of examples of searches that revealed the names of accusers were seen by The Times. Google has admitted that the unlawful information was shown, and said it would work to stop such predictions from appearing in the future.
But because that information is being pulled in from the internet, it can show irrelevant and sometimes shocking results. In the case of the rape searches, the details appear to be pulled in from discussions on social media posts in which people are illegally naming accusers.
Google has systems in place that are intended to catch inappropriate predictions and avoid showing them to users. But the huge number of searches going through the site – 15 per cent of which are new – makes catching all of them difficult.
Google said that the autocomplete predictions highlighted by The Times were against its rules. It has removed the examples it has been alerted to and would remove any similar ones in the future, it said.
“We don’t allow these kinds of autocomplete predictions or related searches that violate laws or our own policies and we have removed the examples we’ve been made aware of in this case,” a Google spokesperson said. “We recently expanded our removals policy to cover predictions which disparage victims of violence and atrocities, and we encourage people to send us feedback about any sensitive or bad predictions.”
It is far from the first time that Google’s autocomplete feature has brought trouble to the company. Because it is based on searches and other data from the internet, the feature can occasionally show highly offensive predictions even for apparently innocent searches.
Google’s rules for what shows on search results pages – as opposed to the autocomplete box that pops up to help users get to them – are enforced differently, and it is not thought that any of the problem searches were displayed when users actually clicked through to the results.