

Surreal Scenes: Jesus Surfing Through a Storm with Six-Fingered Hands Under Twin Suns

[Image: a computer screen displaying an AI interface, connected to a large energy source. Credit: AI-generated image]

During a Photoshop workshop at a summer camp for kids, Gursimran Vasir, a student at Stevens Institute of Technology, noticed something odd.

When the kids used the program’s AI feature by typing in prompts, the results were often unexpected. Many times, the images came out distorted, inaccurate, or showed bias. Vasir encountered similar problems. For instance, when she asked the AI for a “cleaning person,” it returned an image of a woman cleaning. But if she specified “woman cleaning,” it consistently generated a picture of a white woman scrubbing a countertop.

“Many kids found AI challenging because it didn’t provide what they were looking for,” Vasir says. “But they struggled to articulate their frustrations.”

She realized there was a need for a universal way to explain AI’s inaccuracies and biases, and thought developing such a language could improve future AI systems. She approached Stevens Associate Professor Jina Huh-Yoo, an expert in Human-Computer Interaction, who explores new technologies like AI that can enhance health and well-being.

This collaboration resulted in a study titled Characterizing the Flaws of Image-Based AI-Generated Content, which was shared at the ACM CHI conference on Human Factors in Computing Systems on April 26, 2025.

For her research, Vasir analyzed 482 Reddit posts where users described various mistakes made by AI in generating images. She organized these findings into four categories: AI surrealism, cultural bias, logical fallacy, and misinformation.

AI surrealism refers to images that feel slightly off or unrealistic, with features such as overly smooth textures or unnatural colors. Cultural bias became clear when a user requested an image of Jesus walking on water during a storm but received a picture of him on a surfboard. Similarly, asking for a “cleaning person” and consistently getting images of women rather than a diverse representation illustrates this bias.

Misinformation occurs when the AI misrepresents a location, creating images that don’t accurately depict the requested city. Logical fallacies happen when the AI’s output defies common sense; for example, producing an image of a hand with six fingers or a landscape featuring two suns.
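The four category names above come directly from the study; the sketch below is a hypothetical illustration of how annotated posts might be tallied by category during this kind of qualitative coding. The example posts and tagging logic are invented for illustration, not taken from the paper's dataset.

```python
from enum import Enum
from collections import Counter

class FlawCategory(Enum):
    AI_SURREALISM = "AI surrealism"      # uncanny textures, strange colors
    CULTURAL_BIAS = "cultural bias"      # stereotyped or skewed depictions
    LOGICAL_FALLACY = "logical fallacy"  # defies common sense (six fingers, two suns)
    MISINFORMATION = "misinformation"    # inaccurate depiction of real places

# Hypothetical hand-labeled posts, in the style of those studied
labeled_posts = [
    ("Asked for a hand, got six fingers", FlawCategory.LOGICAL_FALLACY),
    ("'Cleaning person' always returns a woman", FlawCategory.CULTURAL_BIAS),
    ("Skyline doesn't match the real city", FlawCategory.MISINFORMATION),
    ("Skin looks waxy and unnaturally smooth", FlawCategory.AI_SURREALISM),
]

# Tally how often each flaw category appears across the labeled posts
counts = Counter(category for _, category in labeled_posts)
for category, n in counts.items():
    print(f"{category.value}: {n}")
```

A shared vocabulary like this lets users report flaws in terms developers can aggregate and act on, which is the gap the study aims to fill.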

Huh-Yoo emphasizes that this study dives into an underexplored area of AI errors, focusing on image generation rather than just text.

“This work is quite innovative, contributing to the dialogue about AI biases, particularly since most discussions have revolved around text,” notes Huh-Yoo.

She expressed admiration for Vasir’s dedication to research and the high standard of her work. “Gursimran led this project, formulating research questions and methodologies on her own while I provided guidance.”

The project has attracted attention from industry experts, according to Huh-Yoo. “This issue is currently trending in design and graphics fields, as they face similar challenges with AI-generated content.”

As AI becomes more widespread in marketing, education, travel, and beyond, users will expect accurate, unbiased information and images, Vasir remarks. Establishing clear vocabulary to address AI’s shortcomings will aid in training it to produce better graphics.

“Developers need to provide technology that works correctly,” Vasir asserts. “When tools fail, it increases the potential for misuse. Creating a clear dialogue between users and developers is a crucial first step toward resolving these issues.”

More information:
Gursimran Vasir et al, Characterizing the Flaws of Image-Based AI-Generated Content, Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (2025). DOI: 10.1145/3706599.3720004

