<img alt="" src="https://secure.insightful-enterprise-52.com/784587.png" style="display:none;">
28 Mar 2024

Fact or Fiction? The imperative of validating AI-generated information

Spokesperson: Suresh Sambandam

Reports suggest that, while infrequent, AI hallucinations are a constant presence, making up between 3 and 10 percent of the responses to prompts that users submit to gen AI models. IBM defines AI hallucinations as “a phenomenon where an LLM perceives patterns or objects that are nonexistent or imperceptible to human observers, creating outputs that are nonsensical or altogether inaccurate.”

In this latest article for Edge Middle East, Sindhu V Kashyap discusses AI hallucinations and the genuineness of information received from AI tools. Suresh Sambandam, CEO at Kissflow, shares his insights on the significant challenges perpetuated by widespread trust in technology.

“While AI models, particularly gen AI, exhibit remarkable capabilities in content generation and transformation, it’s essential to recognize their limitations. Gen AI operates based on patterns and historical data, lacking true understanding, feeling, or sense. These limitations can lead to inconsistencies, or ‘hallucinations’, in its outputs. Additionally, repeatability, a critical aspect of reliability in computer programs, is often lacking in gen AI. Such limitations pose challenges, especially in critical applications where reliability is paramount.”
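Sambandam’s point about repeatability follows from how generative models decode text: each token is sampled from a probability distribution, so two runs on the same prompt can diverge. Below is a minimal Python sketch of that effect using a toy vocabulary and invented probabilities (not any real model or Kissflow code); a fixed seed restores repeatability.

```python
import random

# Toy next-token distribution standing in for an LLM's output layer.
# The vocabulary and probabilities are invented for illustration only.
VOCAB = ["reliable", "unreliable", "uncertain"]
PROBS = [0.6, 0.3, 0.1]

def sample_completion(prompt, n_tokens=5, seed=None):
    """Sample n_tokens from the toy distribution; seed=None means non-repeatable."""
    rng = random.Random(seed)
    tokens = [rng.choices(VOCAB, weights=PROBS)[0] for _ in range(n_tokens)]
    return f"{prompt} -> {' '.join(tokens)}"

prompt = "Is this output repeatable?"
print(sample_completion(prompt))           # run 1: varies from run to run
print(sample_completion(prompt))           # run 2: usually differs from run 1
print(sample_completion(prompt, seed=42))  # seeded: always the same output
print(sample_completion(prompt, seed=42))  # identical to the line above
```

The same behavior appears with real models whenever the sampling temperature is above zero; where an API exposes a seed or a greedy (temperature-zero) mode, pinning it trades output diversity for the repeatability Sambandam describes.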

Read the full article here

