A new study shows how ChatGPT and other artificial intelligence (AI) tools help people with autism confront workplace problems. Additionally, the Carnegie Mellon University research team found that AI systems sometimes give questionable advice. Because of this, there is controversy within the autism community about whether the use of chatbots is a good idea.

AI Addressing Issues

“What we found is there are people with autism who are already using ChatGPT to ask questions that we think ChatGPT is partly well-suited and partly poorly suited for,” said Andrew Begel with CMU’s School of Computer Science. “For instance, they might ask: ‘How do I make friends at work?'”

Begel leads the VariAbility Lab, which seeks to develop workplaces that fit the needs of all people, including those who are autistic or otherwise neurodivergent. According to the researchers, underemployment and unemployment are widespread within the autism community: as many as nine out of 10 adults with autism are affected by one or the other. In addition, many workplaces lack the resources to help autistic employees and their coworkers overcome social and communication barriers.

Begel and his team recruited 11 people with autism to understand how large language models (LLMs) could address this shortcoming. Participants received advice from two sources: one was a chatbot similar to ChatGPT-4, and the other was a human counselor disguised as a chatbot. Surprisingly, the participants preferred the real chatbot. Begel believes it was how the chatbot dispensed the advice, rather than the advice itself, that won them over.

“The participants prioritized getting quick and easy-to-digest answers,” Begel said.

AI Advice

The AI chatbot provided direct, black-and-white answers, usually in bullet points. In contrast, the human counselor disguised as a chatbot dug deeper, asking questions about what the user wanted to do about a situation or why. According to Begel, most users weren’t interested in engaging in that kind of back-and-forth conversation.

One participant explained why they preferred the chatbot, saying, “I think, honestly, with my workplace … it’s the only thing I trust because not every company or business is inclusive.”

However, when an expert in supporting job seekers with autism evaluated the AI responses, she found that some weren’t helpful. For example, when a user asked how to make friends, the chatbot recommended walking up to someone and talking to them. Begel said that the problem is most people with autism don’t feel comfortable doing that.

Begel said it is possible that chatbots trained specifically to address these problems could avoid giving bad advice. However, he doesn’t think everyone in the autism community will embrace them. Some might see chatbots as a valuable tool, while others might see them as another instance of people expecting those whose brains work differently to accommodate everyone else.

Begel said, “There’s this huge debate over whose perspectives we privilege when we build technology without talking to people.”