Mohandas, a self-taught developer based in Bengaluru, India, decided he wanted to create an alternative service for storing and sharing photos, one that is open source and end-to-end encrypted. Something “more personal, pleasant, and trustworthy,” he says. The paid service he built, Ente, is profitable and says it has more than 100,000 users, many of whom are already part of the privacy-obsessed crowd. But Mohandas struggled to articulate to a wider audience why they should reconsider relying on Google Photos, despite all the conveniences it offers.
Then, one weekend in May, an Ente volunteer came up with the idea of demonstrating what Google’s AI models can learn from analyzing images. Last month, Ente launched https://Theyseeyourphotos.com, a website and marketing stunt designed to turn Google’s technology against itself. Visitors can upload any photo to the site, and a Google Cloud computer vision model writes a strikingly thorough three-paragraph description of it. (Ente prompts the AI model to provide brief descriptions of the uploaded images.)
One of the first photos Mohandas tried uploading was a selfie with his wife and daughter in front of a temple in Indonesia. Google’s analysis was exhaustive, even noting the specific model of watch his wife was wearing, a Casio F-91W. But then, Mohandas says, the AI did something strange: it noted that Casio F-91W watches are commonly associated with Islamic extremists. “We had to change the prompts to make it a little more wholesome but still spooky,” Mohandas says. Ente began asking the model to produce short, objective outputs, nothing dark.
Upload that same family photo to Theyseeyourphotos now and the description is more generic: it identifies the temple by name and notes the “partly cloudy sky and lush greenery.” But the AI still makes a number of assumptions about Mohandas and his family, including that their faces express “joint contentment” and that the parents are likely of South Asian descent and middle class. It judges their clothing (“appropriate for sightseeing”) and notes that “the woman’s watch displays a time as approximately 2 pm, which corroborates with the image metadata.”
Colin Smith, a spokesperson for Google, declined to comment directly on Ente’s project. He pointed WIRED to support pages stating that uploads to Google Photos are used only to train generative AI models that help people manage their image libraries, such as those that analyze the age and location of photo subjects. The company says it does not sell the content stored in Google Photos to third parties or use it for advertising. Users can turn off some of the analysis features in Photos, but they can’t stop Google from accessing their images entirely, because the data are not end-to-end encrypted.
For people who don’t want to upload their own images, Ente offers a selection of stock photos to experiment with on Theyseeyourphotos. Google’s computer vision picks up on subtle details in them, like a person’s tattoo that appears to be the letter “G,” and a child’s temporary tattoo of a leaf. “The whole point is that it is just a single photo,” Mohandas says. He hopes the website gets people thinking about how much Google could learn by analyzing thousands of their photos in the cloud in the same way.
If Theyseeyourphotos motivates you to quit Google for another photo storage service, the switch may not be entirely smooth. Mohandas claims that Google makes it difficult for users to move their photo libraries elsewhere, by bundling files together and compressing them. He also alleges that Google Play, the company’s Android app store, has flagged Ente’s app multiple times for supposed issues, such as non-transparent pricing, that he says are bogus. Smith, the Google spokesperson, says the company appreciates feedback and that its services are constantly being improved.
Ente, which means “mine” in Mohandas’ native Malayalam, isn’t without its own downsides. The service is smaller and open source, and features like file sharing and search may not yet be as advanced as Google’s. A user who forgets or loses their password, which also serves as an encryption key, could lose access to their photo library. Mohandas says Ente can be relied upon; he keeps his own family’s photos on it, and the service stores two separate backups of users’ data. Google, however, has decades of experience keeping people’s photos from disappearing in a poof.
In some ways, though, that’s exactly what concerns Mohandas. He worries that humanity’s visual archive will be mined in the future in ways he can’t predict or control. “Google is a company that will exist in 20 years,” he says. A photo of his daughter today reveals who she is and what makes her happy or sad. “This information could be used to manipulate her decades from now by anyone who has access to this data: advertisers, dating websites, employers, and industries that don’t exist yet but will benefit from psychological profiles,” Mohandas says.
He acknowledges that he may come across as overly paranoid to some people, but, “We don’t know how the future will turn out, and it doesn’t hurt to be cautious, and it doesn’t hurt to have an option.”