An AI-powered app is decoding baby cries and hopes to detect autism

The wonders of artificial intelligence (AI) and machine learning (ML) are increasingly on display in various aspects of daily life. Now, can these technologies help tell us what babies think and feel?

That is what Dr Ariana Anderson and her team at UCLA (University of California, Los Angeles) are aiming to achieve with an app-based AI engine called ChatterBaby.

The free app, which is available on iOS and Google Play, has analysed data from more than 1,700 babies. It runs on an AI-driven algorithm that uses signal processing and machine learning to determine which acoustic features match the baby's needs at a particular moment.
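The article doesn't describe ChatterBaby's internals, but a pipeline of the kind it gestures at can be sketched: summarise each recording as a fixed-length vector of acoustic features, then train a standard classifier to map features to cry types. Everything below, the feature set, the model choice and the synthetic stand-in "recordings", is an illustrative assumption, not the app's actual implementation.

```python
# A minimal sketch of an acoustics-to-cry-type classifier, assuming
# nothing about ChatterBaby's real pipeline.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

SR = 16000  # assume mono audio resampled to 16 kHz

def cry_features(y: np.ndarray) -> np.ndarray:
    """Mean-pool a few common acoustic descriptors over time."""
    mfcc = librosa.feature.mfcc(y=y, sr=SR, n_mfcc=13)  # spectral shape
    f0 = librosa.yin(y, fmin=80, fmax=600, sr=SR)       # pitch contour
    rms = librosa.feature.rms(y=y)                      # loudness
    return np.concatenate([mfcc.mean(axis=1), [f0.mean()], [rms.mean()]])

# Stand-in clips: in practice these would be labelled cry recordings.
rng = np.random.default_rng(0)
clips = [rng.standard_normal(SR * 2).astype(np.float32) for _ in range(6)]
labels = ["pain", "hunger", "fussy", "pain", "hunger", "fussy"]

X = np.stack([cry_features(c) for c in clips])
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(model.predict([cry_features(clips[0])])[0])
```

Mean-pooling over time is the simplest way to give every clip the same feature length; a production system would likely use richer temporal modelling.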

"With the help of artificial intelligence, our algorithm correctly flags over 90% of pain cries," Anderson,  an assistant professor  in the department of psychiatry and biobehavioral sciences. explains on her website. 

So how was this database built?

First, painful cries were recorded from babies who were receiving vaccines or ear piercings. Other cries (fussy, hungry, separation anxiety, colic, scared) were also collected and labelled by a panel of veteran mothers.

Only the cries whose labels were universally agreed upon by the panel were used to teach the algorithm, Anderson said.
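That unanimity rule is simple to express in code. The sketch below, with placeholder clip IDs and labels, keeps a clip for training only when every panellist assigned it the same label.

```python
# A minimal sketch of the consensus filter the article describes;
# the data layout here is an assumption for illustration.
from collections import namedtuple

Cry = namedtuple("Cry", ["clip_id", "panel_labels"])

recordings = [
    Cry("clip_001", ["pain", "pain", "pain"]),       # unanimous: kept
    Cry("clip_002", ["hungry", "fussy", "hungry"]),  # disputed: dropped
]

training_set = [
    (c.clip_id, c.panel_labels[0])
    for c in recordings
    if len(set(c.panel_labels)) == 1  # all panellists agree
]
print(training_set)  # [('clip_001', 'pain')]
```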

The app currently predicts only three kinds of cries - hunger, pain and fussiness - Anderson said, adding that these states are not developmentally dependent and have consistent acoustic patterns in both newborns and older babies.

Anderson is also trying to use the app and its platform to detect early signs of autism in babies, by examining whether irregularities in pain cries can point towards the developmental disorder, which affects communication skills.

To gather data for that research, Anderson is asking parents to consent to the app and the company collecting data about their babies. The audio files are recorded by the app and sent to its servers to be analysed.

According to the app's website, Anderson and her colleagues -- Lauren Dunlap (mobile developer) and Usha Nookala (signal processing expert) -- have been funded by the UCLA Clinical and Translational Science Institute, the Burroughs Wellcome Fund and the Semel Institute for Neuroscience and Human Behavior at UCLA.

