ChatGPT definitely has its limits. When given a random photo of a mural, it couldn’t identify the artist or location; however, ChatGPT easily clocked where images of multiple San Francisco landmarks were taken, like Dolores Park and the Salesforce Tower. Although it may still feel a bit gimmicky, anyone out on an adventure in a new city or country (or just a different neighborhood) might have fun playing around with the visual aspect of ChatGPT.
One of the major guardrails OpenAI put around this new feature is a limit on the chatbot’s ability to answer questions that identify humans. “I’m programmed to prioritize user privacy and safety. Identifying real people based on images, even if they are famous, is restricted in order to maintain these priorities,” ChatGPT told me. While it didn’t refuse to answer every question when shown pornography, the chatbot did hesitate to describe the adult performers in any specific detail, beyond noting their tattoos.
It’s worth noting that one conversation I had with the early version of ChatGPT’s image feature seemed to skirt part of the guardrails put in place by OpenAI. At first, the chatbot refused to identify a meme of Bill Hader. Then ChatGPT guessed that an image of Brendan Fraser in George of the Jungle was actually a photo of Brian Krause in Charmed. When asked if it was certain, the chatbot switched over to the correct answer.
In this same conversation, ChatGPT went wild trying to describe an image from RuPaul’s Drag Race. I shared a screenshot of Kylie Sonique Love, one of the drag queen contestants, and ChatGPT guessed that it was Brooke Lynn Hytes, a different contestant. I questioned the chatbot’s answer, and it proceeded to guess Laganja Estranja, then India Ferrah, then Blair St. Clair, then Alexis Mateo.
“I apologize for the oversight and incorrect identifications,” ChatGPT replied when I pointed out the repetitiveness of its wrong answers. As I continued the conversation and uploaded a photo of Jared Kushner, ChatGPT declined to identify him.
If the guardrails are removed, whether through some kind of jailbroken ChatGPT or an open source model released in the future, the privacy implications could be quite unsettling. What if every picture of you posted online could be tied to your identity with just a few clicks? What if someone could snap a photo of you in public without consent and instantly find your LinkedIn profile? Without proper privacy protections in place for these new image features, women and members of minority groups are likely to face an influx of abuse from people using chatbots for stalking and harassment.