What is explainable AI?

Many AI systems, such as machine learning or deep learning models, take inputs and then produce outputs (or make decisions) with no decipherable explanation or context. The system makes a decision or takes some action, and we don’t necessarily know why or how it arrived at that outcome. The system just does it. That’s the black box model of AI, and it’s indeed mysterious. In some use cases, that’s just fine. In other contexts, it’s plenty ominous.

“For small things like AI-powered chatbots or sentiment analysis of social feeds, it doesn’t really matter if the AI system operates in a black box,” says Stephen Blum, CTO and co-founder of PubNub. “But for use cases with a big human impact – autonomous vehicles, aerial navigation and drones, military applications – being able to understand the decision-making process is mission-critical. As we rely more and more on AI in our everyday lives, we need to be able to understand its ‘thought process’ and make changes and improvements over time.”
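As a rough illustration of what “understanding the thought process” can look like in practice, the sketch below (a hypothetical example using scikit-learn and the Iris dataset, neither of which appears in the article) trains a black-box classifier that simply outputs a decision, then applies permutation importance, one basic explainability technique, to estimate which input features actually drove that decision.

```python
# Minimal sketch: a black-box prediction vs. a simple post-hoc explanation.
# Assumes scikit-learn is installed; the dataset and model are illustrative only.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

X, y = load_iris(return_X_y=True)
feature_names = load_iris().feature_names

# The black box: inputs go in, a decision comes out, with no built-in rationale.
model = RandomForestClassifier(random_state=0).fit(X, y)
print("Prediction:", model.predict(X[:1]))  # e.g. [0] -- but why?

# One explainability technique: permutation importance shuffles each feature
# and measures how much the model's accuracy drops, estimating how much
# each input actually influences its decisions.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in zip(feature_names, result.importances_mean):
    print(f"{name}: {score:.3f}")
```

Techniques like this give a coarse, after-the-fact view of which inputs matter rather than a true trace of the model’s reasoning, but even that is a step beyond the pure black box.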
