Google accidentally reveals names of rape victims despite law
Google has been inadvertently revealing the names of rape victims that should remain secret.
Searching for details of certain cases can bring up autocomplete predictions that reveal confidential details of those cases. The identities of victims in rape cases are legally protected forever, even if the accused is found not guilty.
A number of examples of searches that revealed the names of accusers were seen by The Times. Google has admitted that the unlawful information was shown, but said it would work to stop such predictions appearing in the future.
But because autocomplete predictions are pulled in from across the internet, they can show irrelevant and sometimes shocking results. In the case of the rape searches, the details appear to be pulled in from discussions on social media posts in which people are illegally naming accusers.
Google has systems in place that are intended to catch inappropriate predictions and avoid showing them to users. But the huge number of searches going through the site – 15 per cent of which are new – makes catching all of them difficult.
Google said that the autocomplete predictions highlighted by The Times were against its rules. It has removed the examples it has been alerted to and would remove any similar ones in the future, it said.
“We don’t allow these kinds of autocomplete predictions or related searches that violate laws or our own policies and we have removed the examples we’ve been made aware of in this case,” a Google spokesperson said. “We recently expanded our removals policy to cover predictions which disparage victims of violence and atrocities, and we encourage people to send us feedback about any sensitive or bad predictions.”
It is far from the first time that Google’s autocomplete feature has brought trouble to the company. Because it is based on searches and other data from the internet, it can occasionally show highly offensive predictions even for apparently innocent searches.
Google’s rules for what shows on search results pages – as opposed to the autocomplete box that pops up to help users get to them – are enforced differently, and it is not thought that any of the problem searches were displayed when users actually clicked through to the results.