28 Mar 2024

Fact or Fiction? The imperative of validating AI-generated information

Spokesperson: Suresh Sambandam

Reports suggest that, while infrequent, AI hallucinations occur consistently, making up between 3 and 10 percent of the responses to prompts that users submit to generative AI models. IBM defines AI hallucinations as "a phenomenon where an LLM perceives patterns or objects that are nonexistent or imperceptible to human observers, creating outputs that are nonsensical or altogether inaccurate."

In this latest article for Edge Middle East, Sindhu V Kashyap discusses AI hallucinations and the genuineness of information received from AI tools. Suresh Sambandam, CEO at Kissflow, shares his insights on the significant challenges perpetuated by widespread trust in technology.

“While AI, particularly models like gen AI, exhibits remarkable capabilities in content generation and transformation, it’s essential to recognize its limitations. Gen AI operates based on patterns and historical data, lacking true understanding, feeling, or sense. These limitations can lead to inconsistencies or ‘hallucinations’ in its outputs. Additionally, repeatability, a critical aspect of reliability in computer programs, is often lacking in gen AI. Such limitations pose challenges, especially in critical applications where reliability is paramount.”

Read the full article here

