The Digital Dr. Doolittle: Artificial Intelligence & Chicken Speak

Artificial Intelligence has been hitting the news a lot recently. What used to be firmly in the realms of sci-fi is now within our reach, but its rapid development has given great minds such as Elon Musk and Stephen Hawking goosebumps. IBM’s medical marvel ‘Watson’ and the plethora of digital drivers being eyed up by the automobile industry may have to take a back seat, however, to the University of Georgia’s latest project: a machine that can understand the ‘language’ of chickens.

Wait, what?

Yeah, chickens. You should probably read on a little further to understand how and why.


The Research

The research, conducted in tandem with the Georgia Institute of Technology, seeks to use machine learning to decipher the meaning behind chickens’ chatter. This data will then help farmers better regulate their chickens’ health and well-being.

While it is not a ‘language’ as such, chicken farmers often report that they can ‘feel’ the mood of their flock. This is much harder to do, however, on commercial farms, where a single space holds 20,000 chickens on average. Over the past five years, researchers have analysed “patterns of speech” in chickens and found that they carry plenty of information about the birds’ physical and emotional health.


The Results

Research engineer Wayne Daley and his colleagues at Georgia Tech have been studying the effects of mildly stressful situations on groups of 6-12 broiler chickens. The chickens’ vocal responses to higher temperatures, mild infections and elevated levels of ammonia in the air were fed into a machine-learning program, which learned to tell the difference between happy and unhappy chickens. In fact, the program is reported to be able to tell when chickens have a certain respiratory infection, due to the build-up of mucus in the airways.
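The workflow described above (recording calls under known stress conditions, then training a program to separate ‘happy’ from ‘unhappy’ birds) is, at heart, a supervised classification problem. As a rough illustration, here is a toy nearest-centroid classifier in Python; the features (mean pitch, call rate) and all the numbers are invented for this sketch and do not come from the Georgia Tech study.

```python
import math

# Toy feature vectors: (mean pitch in Hz, calls per minute).
# These values are invented for illustration only -- the premise here is
# that stressed birds call faster and at a higher pitch.
TRAINING_DATA = {
    "content": [(440.0, 18.0), (455.0, 20.0), (430.0, 16.0)],
    "stressed": [(610.0, 45.0), (590.0, 50.0), (640.0, 42.0)],
}

def centroid(points):
    """Return the mean of a list of equal-length feature vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def classify(sample, training=TRAINING_DATA):
    """Assign a sample to the label whose centroid is nearest (Euclidean)."""
    centroids = {label: centroid(pts) for label, pts in training.items()}
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

print(classify((600.0, 48.0)))  # a fast, high-pitched caller
print(classify((445.0, 17.0)))  # a slower, lower-pitched caller
```

A real system would extract far richer acoustic features from continuous audio, but the principle is the same: label recorded examples, summarise each category, and assign new calls to the nearest known category.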

While there are many more known vocalisations to analyse, as well as more stress responses to record, Carolynn Smith, a leading expert on chicken vocalisations at Macquarie University in Australia, described the study as “a neat proof of concept”. The potential to improve remote-controlled farming is highly lucrative. Farmers could receive text messages alerting them to disgruntled chickens’ complaints, which they could then remedy at the click of a button. Too hot in the eastern corner? The automated windows can be opened and shut remotely via a phone app. Unexpected ruckus near the door at 3am? Call security and let them know. With this technology, a handful of farmers could maintain many broiler houses without having to travel between them, or even be on site.

The Reality

There are still a few bugs to iron out before these applications can be realised, however. As mentioned above, the machine learning has only built a small ‘vocabulary’ so far, so a lot more work needs to be done on chicken vocalisations. Possibly the greatest hurdle is simply that modern commercial farms are very loud. All the machinery that farmers hope to one day control remotely currently makes a lot of noise, and is likely to continue doing so. Filtering out all that background noise will require further tech wizardry, and will likely determine the future commercial success of this program. What is the use of becoming fluent in chicken if you can’t hear what the birds are saying? Still, there is no shortage of parties interested in owning a computer that can filter out background noise to focus on specific pieces of speech, so there is plenty of incentive to keep working on this project.


The Rest

Chickens aren’t the only creatures whose language is being investigated. Scientists at The University of Nottingham and Queen Mary University of London published a paper in 2016 detailing their findings on the wonderful language of cows. The researchers used acoustic analysis to confirm a long-standing belief that cows and their calves use individualised calls to communicate. The study identified at least two ‘maternal’ calls: a low-pitched call when the calf was near, and a louder, higher-pitched call when the calf was out of sight. The calves themselves also had a special call for when they wanted to start suckling. Dr Monica Padilla de la Torre from The University of Nottingham said that “each calf and cow have a characteristic and exclusive call of their own. Acoustic analysis also reveals that certain information is conveyed within the calf calls – age, but not gender.”

Dr Alan McElligott at Queen Mary University of London also saw future farming applications for this research, saying: “This is the first time that complex cattle calls have been analysed using the latest and best techniques. Our results provide an excellent foundation for investigating vocal indicators of cattle welfare”. Further work is still needed on cow ‘vocabulary’, specifically on indicators of stressful conditions that farmers can, and are willing to, improve. The proof is there, though: animals do say stuff. They can signal concern and stress, and we can act to remedy some of these maladies.

The use of technology to decipher animal language is not only fascinating from a zoological and linguistic perspective; it also helps us better understand the physical and emotional stress farm animals are under, and with that knowledge we can improve their quality of life. These studies may well result in less stressed farm animals, but they also have the potential to make some people feel very uncomfortable. The practical applications are clear, yet an awkward question is left unanswered: Do you really want to know how sad your food is before you eat it?
