-
[hal-04840794] Transitive reasoning in the adult domestic hen in a six-term series task
Transitive inference (TI) is a form of deductive reasoning that allows an individual to infer the relationship between two components indirectly, by knowing their respective relationships to a third component (if A > B and B > C, then A > C). The common procedure is the 5-term series task, in which individuals are tested on indirect, unlearned relations. Few bird species have been tested for TI to date, which limits our knowledge of the phylogenetic spread of this reasoning ability. Here we tested TI in adult laying hens using a more robust methodology, the 6-term series task, which had not previously been used with poultry. Six hens were trained to learn the direct relationships in a sequence of six arbitrary items (A > B > C > D > E > F) in a hybrid training procedure. Then, 12 testing sessions were run, each comprising three non-rewarded inference trials: BD, BE, and CE. All subjects showed TI within 12 inference trials and were capable of TI regardless of the relative distance between items in the series. We found that TI performance was not affected by the reinforcement ratios of the items for most individuals, making a purely associative account of task resolution harder to support. We suggest that TI relies on the same cognitive processes in poultry (Galloanserae) as in modern flying birds (Neoaves), and that the cognitive strategy used to solve the task might be driven mainly by individual characteristics within species. These results contribute to a better understanding of transitive inference processes in birds.
ano.nymous@ccsd.cnrs.fr.invalid (R. Degrande) 16 Dec 2024
https://hal.inrae.fr/hal-04840794v1
-
[hal-04677329] Horses can learn to identify joy and sadness against other basic emotions from human facial expressions
Recently, horses and other domestic mammals have been shown to perceive and react to human emotional signals, with most studies focusing on joy and anger. In this study, we tested whether horses can learn to identify human joyful and sad expressions against other emotions. We used a touchscreen-based automated device that presented pairs of human portraits and dispensed pellets when the horse touched the rewarded face. Six horses were trained to touch the sad face and five the joyful face. By the end of training, horses' performance at the group level was significantly higher than chance, with higher scores for horses trained with the sad face. At the individual level, evidence of task learning varied among horses, which could be explained by individual variation in horses' ability to identify different human facial expressions or by attention problems during the tests. In a generalization test, we introduced portraits of humans different from those presented during training. Horses trained with the joyful face performed better than chance, demonstrating generalization, whereas horses trained with the sad face did not. Horses also showed differences in learning performance according to the non-rewarded emotion, providing insights into horses' cognitive processing of facial expressions.
ano.nymous@ccsd.cnrs.fr.invalid (Plotine Jardat) 26 Aug 2024
https://hal.inrae.fr/hal-04677329v1
-
[hal-04069771] Horses cross-modally recognize women and men
Several studies have shown that horses can cross-modally recognize humans by associating their voice with their physical appearance. However, it remains unclear whether horses are able to differentiate humans according to other criteria, such as whether they are women or men. Horses might recognize certain human characteristics, such as sex, and use them to classify humans into different categories. The aim of this study was to explore whether domestic horses can cross-modally recognize women and men from visual and auditory cues, using a preferential looking paradigm. We simultaneously presented two videos, one of women's faces and one of men's faces, while playing a recording of a human voice belonging to one of these two categories through a loudspeaker. The horses looked significantly more towards the congruent video than towards the incongruent video, suggesting that they are able to associate women's voices with women's faces and men's voices with men's faces. Further investigation is needed to determine the mechanism underlying this recognition, in particular which characteristics horses use to categorize humans. These results suggest a novel perspective that could allow us to better understand how horses perceive humans.
ano.nymous@ccsd.cnrs.fr.invalid (Chloé Gouyet) 14 Apr 2023
https://hal.inrae.fr/hal-04069771v1
-
[hal-04300609] What drives horse success at following human-given cues? An investigation of handler familiarity and living conditions
Cues such as the human pointing gesture, gaze or proximity to an object are widely used in behavioural studies to evaluate animals' abilities to follow human-given cues. Many domestic mammals, such as horses, can follow human cues; however, the factors influencing their responses are still unclear. We assessed the performance of 57 horses in a two-way choice task testing their ability to follow the cues of either a familiar (N = 28) or an unfamiliar informant (N = 29). We investigated the effects of the length of the relationship between the horse and a familiar person (the main caregiver), their social environment (living alone, in dyads, or in groups) and their physical environment (living in stalls/paddocks, alternating between paddocks and pastures, or living full-time on pasture). We also controlled for the effects of horses' age and sex. Our results showed that horses' success rate at the task was not affected by the familiarity of the informant and did not improve with the length of the relationship with the familiar informant, but did increase with the age of the horses. Horses living in groups had better success than those kept either in dyads or alone. Finally, horses housed in small paddocks had lower success than those living on pasture. These results indicate that with age, horses get better at following human-given cues regardless of who the human informant is, and that an appropriate living and social environment could contribute to the development of socio-cognitive skills towards humans. Such aspects should therefore be considered in studies evaluating animal behaviour.
ano.nymous@ccsd.cnrs.fr.invalid (Océane Liehrmann) 22 Nov 2023
https://hal.inrae.fr/hal-04300609v1
-
[hal-04011829] Horses discriminate human body odors between fear and joy contexts in a habituation-discrimination protocol
Animals are widely believed to sense human emotions through smell. Chemoreception is the most primitive and ubiquitous sense, and the brain regions responsible for processing smells are among the oldest structures in mammalian evolution; chemosignals might therefore be involved in interspecies communication. The communication of emotions is essential for social interactions, yet very few studies have clearly shown that animals can sense human emotions through smell. We used a habituation-discrimination protocol to test whether horses can discriminate between human odors produced while feeling fear vs. joy. Horses were presented with sweat odors of humans who reported feeling fear or joy while watching a horror movie or a comedy, respectively. A first odor was presented twice in successive trials (habituation), and then the same odor and a novel odor were presented simultaneously (discrimination). The two odors came from the same human, one collected in the fear condition and the other in the joy condition; the experimenter and the observer were blind to the condition. Horses sniffed the novel odor longer than the repeated odor, indicating that they discriminated between human odors produced in fear and joy contexts. Moreover, differences in habituation speed and asymmetric nostril use according to odor suggest differences in the emotional processing of the two odors.
ano.nymous@ccsd.cnrs.fr.invalid (Plotine Jardat) 02 Mar 2023
https://hal.inrae.fr/hal-04011829v1
-
[hal-03751952] Horses form cross-modal representations of adults and children
Recently, research on domestic mammals' sociocognitive skills toward humans has been prolific, allowing us to better understand the human-animal relationship. For example, horses have been shown to distinguish human beings on the basis of photographs and voices and to have cross-modal mental representations of individual humans and human emotions. This raises the question of the extent to which horses can differentiate human attributes, such as age. Here, we tested whether horses discriminate human adults from children. In a cross-modal paradigm, we presented 31 female horses with two simultaneous muted videos of a child and an adult saying the same neutral sentence, accompanied by the sound of an adult's or a child's voice speaking that sentence. The horses looked significantly longer at the videos that were incongruent with the heard voice than at the congruent videos. We conclude that horses can match adults' and children's faces and voices cross-modally. Moreover, their heart rates increased during children's vocalizations but not during adults'. This suggests that, in addition to having mental representations of adults and children, horses have a stronger emotional response to children's voices than to adults' voices.
ano.nymous@ccsd.cnrs.fr.invalid (Plotine Jardat) 16 Aug 2022
https://hal.inrae.fr/hal-03751952v1
-
[hal-04213124] Horses discriminate between human facial and vocal expressions of sadness and joy
Communication of emotions plays a key role in intraspecific social interactions and likely in interspecific interactions. Several studies have shown that animals perceive human joy and anger, but few studies have examined other human emotions, such as sadness. In this study, we conducted a cross-modal experiment in which we showed 28 horses two soundless videos simultaneously, one showing a sad and one a joyful human face, accompanied by either a sad or a joyful voice. The number of horses whose first look was longer to the video that was incongruent with the voice than to the congruent video was higher than expected by chance, suggesting that horses can form cross-modal representations of human joy and sadness. Moreover, horses were more attentive to the videos of joy and looked at them for longer, more frequently, and more rapidly than at the videos of sadness. Their heart rates tended to increase when they heard joy and to decrease when they heard sadness. These results show that horses can discriminate facial and vocal expressions of joy and sadness and may form cross-modal representations of these emotions; they are also more attracted to joyful faces than to sad faces and seem to be more aroused by a joyful voice than by a sad voice. Further studies are needed to better understand how horses perceive the range of human emotions, and we propose that future experiments include neutral stimuli as well as emotions with different arousal levels but the same valence.
ano.nymous@ccsd.cnrs.fr.invalid (Plotine Jardat) 21 Sep 2023
https://hal.science/hal-04213124v1
-
[hal-03626271] Pet-directed speech improves horses’ attention toward humans
In a recent experiment, we showed that horses are sensitive to pet-directed speech (PDS), a kind of speech used to talk to companion animals that is characterized by high pitch and wide pitch variations. When talked to in PDS rather than adult-directed speech (ADS), horses reacted more favorably during grooming and in a pointing task. However, the mechanism behind their response remains unclear: does PDS draw horses' attention and arouse them, or does it make their emotional state more positive? To better understand this phenomenon, we used an innovative paradigm in which female horses watched videos of humans speaking in PDS or ADS. Horses reacted differently to the PDS and ADS videos: they were significantly more attentive and their heart rates increased significantly more during PDS than during ADS. We found no difference in the expression of negative or positive emotional states during the PDS and ADS videos. Thus, we confirm that horses' perception of humans can be studied by means of video projections, and we conclude that PDS attracts attention and has an arousing effect on horses, with implications for the use of PDS in daily interactions with them.
ano.nymous@ccsd.cnrs.fr.invalid (Plotine Jardat) 31 Mar 2022
https://hal.inrae.fr/hal-03626271v1
-
[hal-03753546] Domestic hens succeed at serial reversal learning and perceptual concept generalisation using a new automated touchscreen device
Improving the welfare of farm animals depends on our knowledge of how they perceive and interpret their environment, which in turn depends on their cognitive abilities. Hence, limited knowledge of the range of cognitive abilities of farm animals is a major concern. An effective approach to exploring the cognitive range of a species is to use automated testing devices, which are still underdeveloped for farm animals, and screen-based automated devices have rarely been used with domestic hens. We developed an original, fully automated touchscreen device, using digital computer-drawn colour pictures and independent touch-sensitive cells adapted for cognitive testing in domestic hens, enabling a wide range of test types from low to high complexity. This study aimed to test the efficiency of our device using two cognitive tests. We focused on tasks related to adaptive capacities to environmental variability, such as flexibility and generalisation capacities, as these provide a good starting point for approaching more complex cognitive capacities. We implemented a serial reversal learning task, categorised as a simple cognitive test, and a delayed matching-to-sample (dMTS) task on an identity concept followed by a generalisation test, categorised as more complex. In the serial reversal learning task, the hens performed equally well for the two alternating reward contingencies within only three reversal stages. In the dMTS task, the hens increased their performance rapidly throughout the training sessions. Moreover, to the best of our knowledge, we present the first positive result of identity concept generalisation in a dMTS task in domestic hens. Our results provide additional information on the behavioural flexibility and concept understanding of domestic hens. They also support the idea that fully automated devices can improve our knowledge of farm animals' cognition.
ano.nymous@ccsd.cnrs.fr.invalid (Rachel Degrande) 18 Aug 2022
https://hal.inrae.fr/hal-03753546v1
-
[hal-04216008] Testing metacognitive abilities to study consciousness in mammals
Studying consciousness, or mental processes more generally, in animals remains relatively difficult, because animals cannot verbally report whether they are aware of their actions, of what they do or do not hold in memory, or of what they understand of the information presented to them. To circumvent this difficulty, which is inherent to animal studies, research has focused on developing methodologies to study mental processes that, in humans, involve conscious processing of information. The study of animal metacognition is probably one of the research areas in which this kind of methodological development has gone furthest and which has contributed most to the debate on consciousness in animals. Metacognition is a form of cognition that allows an individual to assess the level of its own knowledge. In other words, it allows the individual to report that "it knows that it knows" or that "it knows that it does not know", and therefore that it is aware of its level of knowledge. This mental faculty was long considered exclusively human, but a pioneering study in dolphins in 1995 called this view into question. Since that experiment, many experimental paradigms have been developed to test animal metacognition, particularly in mammals. These paradigms make it possible to test two aspects of metacognition: metacognitive monitoring (i.e., the ability to judge one's own state of knowledge) and metacognitive control (the ability to seek information when a lack of knowledge has been detected).
ano.nymous@ccsd.cnrs.fr.invalid (Ludovic Calandreau) 23 Sep 2023
https://hal.inrae.fr/hal-04216008v1
-
[hal-03361795] Cognition and the human–animal relationship: a review of the sociocognitive skills of domestic mammals toward humans
In the past 20 years, research focusing on the interspecific sociocognitive abilities of animals toward humans has been growing, allowing a better understanding of the interactions between humans and animals. This review focuses on five sociocognitive abilities of domestic mammals toward humans: discriminating and recognizing individual humans; perceiving human emotions; interpreting our attentional states and goals; using referential communication (perceiving human signals or sending signals to humans); and engaging in social learning with humans (e.g., local enhancement, demonstration and social referencing). We focused on the domestic mammal species for which literature on the subject is available, namely, cats, cattle, dogs, ferrets, goats, horses, pigs, and sheep. The results show that some species have remarkable abilities to recognize us or to detect and interpret the emotions or signals we send. For example, sheep and horses can recognize the face of their keeper in photographs, dogs can react to the smell of our fear, and pigs can follow our pointing gestures. Nevertheless, studies are unequally distributed across species: there are many studies of animals that live closely with humans, such as dogs, but little is known about livestock animals, such as cattle and pigs. On the basis of existing data, however, no obvious links have emerged between the cognitive abilities of animals toward humans and their ecological characteristics or the history and reasons for their domestication. This review encourages continuing and expanding this type of research to more abilities and species.
ano.nymous@ccsd.cnrs.fr.invalid (Plotine Jardat) 22 Aug 2022
https://hal.inrae.fr/hal-03361795v1