No More Angry Birds: Investigating Touchscreen Ergonomics to Improve Tablet-Based Enrichment for Parrots (2024)

Rebecca Kleinberger, College of Art, Media, Design & Khoury College of Computing, Northeastern University, United States and Media Lab, MIT, United States, r.kleinberger@northeastern.edu

Jennifer Cunha, Northeastern University, United States and Parrot Kindergarten, United States, jen.cunha@gmail.com

Megan McMahon, Northeastern University, United States, mcmahon.me@northeastern.edu

Ilyena Hirskyj-Douglas, University of Glasgow, United Kingdom, ilyena.hirskyj-douglas@glasgow.ac.uk


Touchscreen devices, ubiquitous in humans’ day-to-day life, offer a promising avenue for animal enrichment. With advanced cognitive abilities, keen visual perception, and adeptness at engaging with capacitive screens using their dexterous tongues, parrots are uniquely positioned to benefit from this technology. Additionally, pet parrots often lack appropriate stimuli, supporting the need for inexpensive solutions using off-the-shelf devices. However, the current human-centric interaction design standards of tablet applications do not optimally cater to the tactile affordances and ergonomic needs of parrots. To address this, we conducted a study with 20 pet parrots, examining their tactile interactions with touchscreens and evaluating the applicability of existing HCI interaction models. Our research highlights key ergonomic characteristics unique to parrots, including pronounced multi-tap behavior, a critical size threshold for touch targets, and the greater effectiveness of larger targets over closer proximity. Based on these insights, we propose guidelines for tablet-based enrichment systems for companion parrots.

CCS Concepts: • Human-centered computing → Interaction design; • Human-centered computing → Usability testing; • Human-centered computing → Touch screens;


Keywords: Animal-Computer Interactions, Touchscreen Interactions, Fitts's Law, Animal Usability, Animal Enrichment, Interspecies Interactions, Parrot


ACM Reference Format:
Rebecca Kleinberger, Jennifer Cunha, Megan McMahon, and Ilyena Hirskyj-Douglas. 2024. No More Angry Birds: Investigating Touchscreen Ergonomics to Improve Tablet-Based Enrichment for Parrots. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '24), May 11--16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/3613904.3642119

[Figure 1]

1 INTRODUCTION

Conceived initially for air traffic control and first patented in 1960 [46], touchscreens have found widespread use in diverse domains. Best practices have emerged for various contexts, including the medical[32, 89], commercial[88], and artistic fields[72], and for specific user groups like older adults[8], young children[55], or people with different disabilities[16, 25, 94, 96]. Advancements in technology have optimized the designs and usability of touchscreens, making them highly suited for human touch across various contexts. This encompasses improvements in hardware (precise capacitive touch, high-resolution displays, color spectrum, refresh rate), physical characteristics (screen size, handleability, weight), and ergonomic considerations (target size and distance, screen location, multi-tap functionality, drags, timings, etc.).

The Computer-Human Interaction (CHI) community has played an important role in enhancing the capabilities of touchscreens. For instance, researchers have focused on incorporating multitouch functionalities[5], improving accuracy for precise small-target selection[71], augmenting the interaction experience when visual attention is directed elsewhere[37], or distinguishing subtle movements, such as rolling versus sliding motions on screens, for more intricate interactions[81]. The CHI community has also delved into novel form factors, such as deformable touchscreens, that add new dimensions to interactivity[84]. Personalized touchscreen interfaces have been developed to cater to individual user preferences and needs, ensuring more intuitive engagement[73].

Meanwhile, the use of interactive technology has also increased for non-human users, with numerous examples of touchscreens used with animals for cognitive studies[15], enrichment[70], and assisting working animals[86]. Studies with primates[98], dogs[102, 103], bears[77], cats[70], tortoises[69], rats[14], and other species[15] have demonstrated the ability of non-human animals to interact with touchscreens intentionally. Applications vary from public education[39] to cognitive and behavioral studies[15]. Particularly relevant to this work is the potential of touchscreens to enhance animal welfare and well-being through cognitive or social enrichment and by increasing animals' agency[15]. Indeed, touchscreen-based activities have been shown to be enriching for various species[6, 65, 77, 93].

Yet, touchscreen experiences for animals often require bespoke systems that tailor the hardware to the animals' physiological characteristics, which frequently diverge from the standards assumed by human interfaces. For instance, touch detection usually requires adaptations, addressing either the absence of capacitive touch capabilities[26] or excessive moisture from saliva[102]. The visual interface must be made compatible with the animal's visual spectrum[15], and devices must be fortified to prevent damage and ensure the animals' safety[68]. Special touchscreen devices have also been designed to accommodate specific animal environments, such as underwater sonar-based touchscreens for dolphins[1].

If possible, there could be significant benefits in creating systems for animals that can be deployed on common human devices for easy dissemination. For instance, some non-human primates can trigger capacitive touch with their fingers[3, 98], mammals such as cats can use mobile games with their paws[70], and some species of birds can trigger capacitive touch with their tongues[52]. However, to ensure that these commercial devices, originally designed for humans, meet the specific needs of animals, their software and interfaces should be appropriately modified.

Pet parrots, in particular, might well benefit from tablet-based enrichment systems. In addition to triggering capacitive touch with their tongue, they possess acute visual perception[100], making them well-suited to the dynamic displays of touchscreens. Additionally, parrots often suffer under-stimulation and lack of conspecific companionship in domestic settings, leading to behavioral issues and abnormal stereotypies[23, 97]. Finally, previous research has highlighted parrots’ willingness and capacity to use touchscreens for social and cognitive enrichment[11, 52] as well as their ability to differentiate between live and prerecorded calls[33]. Thus, leveraging touchscreen technology can potentially cater to their cognitive and social needs. However, although they can trigger touch and potentially make sense of on-screen interactions, research is lacking to examine their specific ergonomic needs and ways to optimize tablet affordances to match their abilities and physiology.

Given the strong enrichment potential of touchscreen-based tablet interactions for parrots and the challenges birds meet when interacting with human-centered applications, we tackle the lack of parrot-specific knowledge about tactile adaptations and ergonomics with respect to off-the-shelf touchscreen devices. To this end, we conducted a study with 20 pet parrots (17 of whom completed the full study), collecting video and touch data as the birds interacted with a specially designed app.

After an initial survey of parrot caregivers to gain insights into the current usage patterns and inherent challenges of touchscreens with their avian companions, we developed an initial version of a target-touching application refined through a pilot study with five birds. We designed a learning protocol to engage parrots ethically with the touchscreen applications, and then ran the main experiment over three months. The first part, the learning phase, systematically introduced the parrots to touchscreen interfaces, gradually refining their interaction skills until they reached a consistent plateau – their individual “stable score.” After this, parrots who demonstrated stable engagement (17 out of 20) were involved in a target-touching task. We used the video and touch data to answer three research questions:

  • Do pet birds touch the screen the same way humans do? This was evaluated in terms of various ergonomic factors systematically collected by the device, including touch pressure, drag (distance between touch down and touch up), multi-tapping, and overall hit rate (how often the user presses within the target vs outside of it).
  • Can target distance and size be optimized for bird users? To determine this, we analyzed the birds’ exact touch points on the screen relative to the target's geometry, assessing whether they mostly aim at the target's center or optimize their movement time by aiming at the edge of the target.
  • Can Fitts's Law – a cornerstone of Human-Computer Interaction (HCI) that predicts a linear relationship between movement time and an index of difficulty derived from target distance and size – be validated with pet birds?

Our results suggest that birds interact with tablet screens differently from humans, exhibiting lighter touch pressure, more drag, higher rates of multi-tapping, and lower hit rates. Additionally, successful hit rates are not significantly impacted by target distance but instead by target size, with parrots requiring a minimum threshold of approximately 100 dp (about 26 mm). This target size is a more important predictor of touch success than a target's location on the screen or the distance between subsequent targets. Our data also question the applicability of Fitts's Law to pet parrots, indicating a movement time overhead that is independent of target distance. We speculate that this arises from a 'touch-and-retreat' behavior, as evidenced by visual observations, potentially stemming from the anatomical characteristics of parrots. Specifically, the proximity of their eyes to their tongues may influence the visual and proprioceptive feedback loop. These insights contribute significantly to the field by providing:

  • An ethical, coercion-free protocol for usability testing with pet parrots, respecting their agency.
  • Quantitative data from an extensive study involving 20 birds and their caregivers.
  • Qualitative design considerations derived from caregiver feedback.
  • Key insights for optimizing target size and distance for avian touchscreen interaction, compatible with mainstream tablets.

These findings improve our understanding of how birds interact with technology and open new avenues for designing more inclusive and species-specific user interfaces.

By converging insights from animal behavior with HCI principles, we aim to fill the gap in knowledge about parrot ergonomics for touchscreen interfaces. While there is strong potential for using touchscreens as enrichment tools for parrots, our research paves the way for designing applications that are not just accessible but also intuitive, enriching, and personalized for animals. This project was approved for animal use (number EA31/23) and human subjects use (number 300220153) by the ethics board of the University of Glasgow.

2 BACKGROUND

Recent developments in HCI and ACI spotlight the value of touchscreens for animal enrichment. This section draws from key studies on animals’ touchscreen interactions, situating our research within established frameworks, to delve deeper into the nuances of bird-screen interactions.

2.1 Animals’ Interactions with Touchscreens

In recent years, the HCI and ACI (Animal-Computer Interactions) communities have worked on developing agency-enhancing tools to provide enrichment to captive animals based on their unique bodies, such as dogs[35], cats[99], parrots[53, 79], zoo animals such as monkeys[34], elephants[21] and birds[51], farm animals like chickens[54, 59] and cows[28], as well as marine mammals[80, 82]. When examining the use of technology by animals, it is essential to consider interactivity and agency as key factors to guarantee an ethical and meaningful experience for the animals[4]. In a comprehensive review of zoo-based cognitive research employing touchscreen interfaces, Egelkamp and Ross[15] identified 12 species that have been studied using touchscreen devices in zoological contexts, the majority being primates. Since this 2019 survey, additional species have been tested, including kea parrots (Nestor notabilis)[2], Japanese macaques (Macaca fuscata)[42], garrano horses (Equus ferus caballus)[85], and hens (Gallus gallus domesticus)[12]. For animals to use touchscreen hardware, the system often needs to be adapted to accommodate a diversity of interaction methods, from the beaks of laboratory-housed birds[26, 41] to the snouts or tongues of various mammals[20, 77, 103]. Innovative developments allow dolphins to use their sonar with specially constructed underwater touchscreen interfaces[1]. Yet, challenges arise in creating interactive screens that account for the ergonomic needs and visual perceptions of different species. For instance, capacitive touchscreens, commonly utilized in zoos, operate on electrical charges; these screens have faced difficulty registering interactions from birds or apes due to their respective beaks and thicker skin[58]. Alternatives such as infrared interfaces or the Echolocation Visualisation Interface System (ELVIS) cater to more robust species or those with unique interaction methods, like marine mammals[1]. The use of touchscreen systems in these environments goes beyond cognitive tests[15] and presents opportunities for assessing various behavioral aspects, including mood and personality traits. It has also been suggested that touchscreens may help enhance animal welfare: there is a growing consensus around the importance of providing choice to these animals, and touchscreens offer a potential medium for expression[78].

2.2 Birds’ Interaction with Screens

Although parrot vision differs significantly from human vision – and we also see some variation across parrot species – there is strong evidence that parrots can perceive and make sense of screen-based stimuli and interactions[27]. Trained birds have been interacting with screens at least since the 1940s, when Skinner's Orcon project trained pigeons (Columbidae) housed within missile capsules to guide their trajectory by pecking at a dot on a screen[90]. Since then, researchers have used pecking behaviors on screens to explore birds' cognitive abilities and attention[13]. In operant conditioning paradigms, researchers have assessed pigeons' aptitudes in forming associations[24], discerning numerical differences[83], participating in intricate match-to-sample assignments[101], and recognizing both known and unknown faces[92]. Despite potential challenges arising from birds' perception of screen-based stimuli, such as the limited ultraviolet (UV) spectrum of displays[30, 61, 100] and birds' high critical flicker fusion frequency (CFF) – the frequency at which flickering light is perceived as continuous (50–90 Hz in humans and higher in birds)[31, 67, 76] – evidence suggests that many bird species are still able to interpret on-screen visuals. For instance, Goffin's cockatoos (Cacatua goffiniana) have demonstrated complex discrimination selections on touchscreen devices[10], and budgerigars (Melopsittacus undulatus) can mirror conspecific behaviors through telecommunication[44]. Furthermore, studies indicate that even species with theoretically high CFF, such as tits, can learn from videos[29]. This is consistent with findings that suggest some birds respond meaningfully to video-based cues[17, 57]. However, rich bodies of work from Pepperberg and from Okanoya have shown that birds' reactions and behaviors to on-screen interactions depend not only on the hardware[43] and the liveness context[44], but also on what one is trying to teach them[22, 74, 76, 87]. In summary, while accounting for context and species-specific sensory constraints is crucial, many bird species have already demonstrated a capability to engage meaningfully with screen-based content.

2.3 Parrot Tongue Anatomy

The unique anatomy of a parrot's tongue also plays a crucial role in their touchscreen interactions and might offer insights into their distinctive mode of engagement. Parrots are known for their robust beaks, which they use along with their powerful feet to climb, grasp, and break nuts. When engaging with technological interfaces, parrots have been shown to use both their beaks[11] and their feet[53], but more often rely on their tongues to trigger touchscreens[3, 52, 79]. As primary seed and nut eaters, parrots possess thick and muscular tongues that allow for both strength and fine dexterity, more similar to mammals' tongues than those of other birds[47]. In many species, a wrinkled and folded tongue epithelium confers added flexibility and stretchability to the tongue[38]. When parrots interact with food, tactile stimulation induces a series of rapid posterior tongue movements, pushing the food toward the pharynx. Parrots possess far fewer taste buds (∼350) than humans (∼9000) and experience their food primarily through touch[50]; indeed, their tongue, oral cavity, and beak have a rich supply of touch receptors. Rapid saccading movements of the tongue also enable the parrot to adroitly manipulate seeds, rotating them to detect and exploit weak points in their shells[50]. Many parrot species, including macaws and cockatoos, exhibit an upper mandible adapted to crack seeds of various sizes; the seed, manipulated by the tongue, is positioned so the lower mandible can exploit its weak points[47]. Contrary to many other birds that moisten food with mucus, parrots have relatively dry tongues due to reduced salivary secretion[47]. Given this combination of strength, dexterity, and capacity for precise movements, the tongue offers a promising way for parrots to interface with touchscreens. However, limited moisture in some species might influence the reliability of touch detection.

2.4 Practices for Ethical Learning about Parrots with Touchscreens

To adapt on-screen apps for parrots and optimize their experience, we need to consider training approaches that not only promote engagement but also enhance the accuracy of their interactions[11]. Since our work focuses on enhancing animal experiences through agency-based enrichment systems, we use positive rather than negative reinforcement. Two primary training methods emerge: Automated Training Systems (AUTs) and Social Learning Methods (SLMs). AUTs train animals to interact with devices without human intervention, increasing in complexity as benchmarks are achieved[9]. While they lessen the human workload and can be placed in the animal's environment, they might require thousands of trials, potentially affecting animal welfare[18]. SLMs involve humans in the learning process and often yield quicker results[75]. This approach tailors the learning experience to individual animals, potentially providing a more enriching experience[75]. Given the importance of novelty and control for animal psychological well-being, SLMs appear more aligned with our objectives, as they offer an individualized and less repetitive training regime[98]. Additionally, constant access to base food and water lowers the risk of coercion, as the animals are not forced to engage in behaviors to meet their basic needs.

The ergonomic study of parrots and touchscreens is tied to the cognitive and behavioral aspects of how parrots use such devices. For optimal designs, we must understand the nuances of parrot interactions, mistakes, and habits. Freil et al.[20] provide a framework that classifies execution errors into two categories: slips and mistakes. Slips are errors due to motor control or mechanical factors; for example, a slip might occur when a user accidentally presses the wrong stimulus because stimuli are too closely spaced. In ergonomic terms, this signifies a need for environmental adjustments or further motor-control training. Mistakes are failures in the plan and outcome, indicating cognitive challenges or misunderstandings. Understanding these errors can help refine app designs, making them more intuitive for parrots. In our study, the personalized learning phase aims to reduce mistakes and focus on slips. By integrating such frameworks into our ergonomic study, we can design touchscreens that fit not only the physical needs of parrots but also their cognitive and behavioral tendencies, while maintaining an ethical approach.

2.5 HCI Evaluation Methods and Fitts's Law

An essential mission of HCI revolves around designing, evaluating, and implementing interactive systems for human use. Several methodologies have been developed to evaluate different aspects of HCI, including usability testing[62], cognitive walkthroughs[60], and model-based evaluations[49]. Another approach is through the use of predictive models. Such models aim to match the movement limits, capabilities, and potential of humans with input devices and interaction techniques on computing systems and allow metrics of human performance to be determined analytically[63]. Amongst various predictive models, Fitts's Law is particularly relevant in HCI when designing and evaluating pointing devices like the mouse, stylus, or touch screens [19]. The law can be stated as:

\begin{equation} MT = a + b \cdot ID \end{equation}

where MT is the movement time or the time taken to complete the motion. a and b are empirical constants dependent on the task conditions and the device used. These constants can be derived through linear regression on observed data. ID is the Index of Difficulty, which quantifies how difficult a particular movement task is. The Index of Difficulty is further defined as:

\begin{equation} ID = \log_2(2A/W) \end{equation}

with A being the amplitude, or the distance to the target, and W the width of the target, or its size in the direction of motion. Including the factor of 2 in the formula for ID ensures that tasks where the distance A equals the target width W have an ID of 1, making them baseline tasks in terms of difficulty.

Fitts's Law has been shown to apply in various conditions (including control with feet[36], or underwater[48]), and with various populations such as children[40], older adults[7], special needs participants[91], and drugged participants[56]). By understanding the relationship between movement time and difficulty, designers can optimize the responsiveness and accuracy of devices and use-cases. Designers can incorporate the principles of Fitts's law when creating user interface elements, for more intuitive and user-friendly interactions. The law can also evaluate users’ performance when interacting with an interface. By comparing predicted movement times to actual times, we can assess how well an interface supports users in their tasks. In the current project, while we do not claim that Fitts's law directly applies to parrots, we use the model as a framework for assessing performance and measuring the speed and accuracy of target selection tasks. We also do not claim that the proper evaluation condition of Fitts's law on motor command can be fulfilled in the context of pet parrots, but we use our data to assess the usability of the model in an avian context.
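To make these formulas concrete, the short sketch below computes ID for a few amplitude-width pairs and derives the constants a and b by fitting a line to observed movement times, as described above. The amplitude and width values echo the conditions used later in this study; the movement times are placeholder numbers for illustration only, not data from the experiment.

```python
import numpy as np

def index_of_difficulty(amplitude, width):
    """Fitts's Index of Difficulty (Eq. 2): ID = log2(2A / W)."""
    return np.log2(2.0 * amplitude / width)

# Amplitude-width conditions in dp, echoing the study's design space.
conditions = [(200, 40), (250, 70), (300, 100), (400, 130)]
ids = np.array([index_of_difficulty(a, w) for a, w in conditions])

# Hypothetical mean movement times (ms) per condition -- placeholders only.
mt_observed = np.array([830.0, 760.0, 705.0, 715.0])

# Linear regression MT = a + b * ID (Eq. 1): slope b, intercept a.
b, a = np.polyfit(ids, mt_observed, deg=1)
predicted = a + b * ids
ss_res = np.sum((mt_observed - predicted) ** 2)
ss_tot = np.sum((mt_observed - mt_observed.mean()) ** 2)
print(f"a = {a:.1f} ms, b = {b:.1f} ms/bit, R^2 = {1 - ss_res / ss_tot:.2f}")
```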

3 USABILITY SURVEY AND PILOT STUDY

As exploration steps, we ran an ergonomics survey with parrot caregivers and a short pilot study to inform the main study's protocol development and design choices. The anonymous survey aimed to gather insights about how parrots currently use tablets. Twelve participants were recruited through social media, including two extra-small birds (determined by species weight: 30g-150g), two small birds (150g-299g), five medium birds (300g-700g), and three large birds (>700g). Most participants had over six months of experience interacting with touch screens by playing interactive games. To determine a consistent setup for the study, we asked participants about the setup that appeared to work best in their current tablet use. Most survey participants (66.6%) reported using a medium or large tablet over small tablets (16.7%) or phones (16.7%). This led us to design the study for medium-size tablets (∼10”). The choice of screen angle appeared quite consistent, with 83% of participants reporting setting the tablet at either a 60- or a 75-degree angle. When asked about the distance between the perch and the bottom of the screen, responses were highly correlated with bird size, with extra-small birds averaging 1.75”, small birds 2.35”, medium birds 3.33”, and large birds 3.67”. Regarding physical setups, more participants (41.7%) reported using a dedicated perch, followed by a chair back or armrest (16.7%), a table (16.7%), or standing on a person (16.7%). This led us to instruct our study participants to use a dedicated perch for the study, with distances set depending on their bird categories: 2.5” for small birds, 3.5” for medium, and 3.75” for large birds (see Figure 2a). We excluded extra-small birds for consistency, as they might require much smaller screens or need assistance from their caregivers to access different parts of the screen.

[Figure 2: (a) study setup distances; (b) time between touches plotted against distance, with the multi-tap threshold]

We ran a pilot study using a standard Fitts application (the FittsTouch Android app[64]) with five birds. This study highlighted important parrot-specific considerations, including fatigue mitigation, the maximum number of sessions per day, and the need for a multi-tap threshold. Although traditional Fitts's Law testing protocols often use a series of at least 20 targets, this number appeared too high for parrots, as one of the pilot participants disengaged after a handful of touches. Indeed, as the birds voluntarily engaged in the research and had access to food and water ad libitum, the tasks had to remain short to preserve the birds' engagement. This led to dividing the test into five short, manageable series of five target-touching tasks, to be completed one to three series at a time (five to 15 tasks) depending on the parrot's interest and fatigue. Caregivers were asked to keep sessions no longer than 30 minutes per day. The parameters of each series are the target width (W) (i.e., the size of the circle) and the movement amplitude (A) (i.e., the distance between the centers of the previous and the next target). To cover different conditions without fatiguing the parrots, four values of W and A were selected. These values were chosen based on observations that the birds experienced some difficulty in successfully touching the smallest dot and that all birds could easily touch the biggest dot. This led to 16 W-A conditions, each repeated three times to allow for missed targets. Pilot participants finished a series in anywhere from 5 seconds to a few minutes, and generally completed a session of three series within 5-17 minutes. In consultation with experts, and in conjunction with our observations of the learning phases in the pilot birds, we decided to set up the study as three stages of training followed by a final testing period.

Some birds learned they would get a treat at the end of a series regardless of accuracy and started ignoring target dots. To counter this in the study, we added a two-dot warm-up with rewards for each touch, reinforcing the goal of touching dots. Training time was also extended (from two to up to six weeks) to encourage touching all dots before receiving a treat. Additionally, one bird attempted to quit the app when fatigued, which resulted in lost data. Consequently, we included an exit button on the test screen, allowing the bird to opt out of the app at any time without data loss.

To maintain the birds' engagement, and inspired by balloon-pop-type games, we added three custom sound effects: a "pop" sound for each successful target touch, a "missed" sound for missed targets, and a "celebratory" sound for the completion of a series. The addition of these sounds was validated through the pilot experiments.

Finally, most parrots appeared to "multi-tap" when touching the screen, meaning they tap the screen multiple times very quickly at roughly the same spot. As this could cause false-negative target touches, we used pilot data to identify a time threshold for "multi-taps" so they could be filtered out of the target-touching exercise. Figure 2b shows the distribution of the time between two consecutive touches plotted against the distance between them. The orange line shows the threshold separating multi-taps from instances where the bird moves to another target. Based on the pilot data, we set the multi-tap threshold to 0.3 s between touches.
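As an illustration of how such a threshold can be applied, the sketch below groups a log of tap timestamps so that taps closer together than 300 ms collapse into one logical touch; the function and variable names are ours and do not reflect the study application's internal implementation.

```python
from typing import List

MULTI_TAP_THRESHOLD_MS = 300  # threshold derived from the pilot data

def group_multi_taps(tap_times_ms: List[float]) -> List[List[float]]:
    """Group sorted tap timestamps so that taps less than
    MULTI_TAP_THRESHOLD_MS apart count as a single logical touch."""
    groups: List[List[float]] = []
    for t in sorted(tap_times_ms):
        if groups and t - groups[-1][-1] < MULTI_TAP_THRESHOLD_MS:
            groups[-1].append(t)   # continuation of the current multi-tap burst
        else:
            groups.append([t])     # start of a new logical touch
    return groups

# Example: four raw taps collapse into two logical touches.
print(group_multi_taps([0, 120, 210, 1500]))  # [[0, 120, 210], [1500]]
```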

In summary, the initial survey and our pilot experiments led to several design choices for the main study:

  • Implementation of a consistent setup guideline: a medium-sized device inclined at 70° and located at a set distance from the bird's perch.
  • Addition of sound effects for motivation and an exit button to increase birds’ agency.
  • Limitation of the number of touches per task to five in a row, with the test organized into four sessions of five series, each repeated three times for consistency, and no more than three sessions per day.
  • Choice of four target widths (40, 70, 100, 130 dp) and four amplitudes (200, 250, 300, 400 dp) based on the observed difficulty, for a total of 16 different Amplitude-Width (A-W) combinations (enumerated in the sketch after this list).
  • Addition of warm-up exercises of only two dots at the start of each series to keep birds engaged.
  • Integration of a 300 ms multi-tap threshold to avoid missed targets based on multi-taps. Taps less than 300 ms apart were recorded but ignored for the purpose of target selection.
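The sketch below enumerates the resulting set of test series from the choices above (16 A-W combinations, each repeated three times). Shuffling the presentation order is our assumption for illustration; the study does not specify how series were ordered.

```python
from itertools import product
import random

WIDTHS_DP = [40, 70, 100, 130]        # target widths
AMPLITUDES_DP = [200, 250, 300, 400]  # distances between target centers
REPETITIONS = 3

# 16 amplitude-width combinations, each repeated three times = 48 series.
series_conditions = list(product(AMPLITUDES_DP, WIDTHS_DP)) * REPETITIONS
random.shuffle(series_conditions)  # presentation order: an assumption, not specified

print(len(series_conditions), series_conditions[:3])
```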

4 MATERIALS & METHOD

4.1 Participants

To recruit home-based pet parrot participants already habituated to interacting with touch screens, we advertised through social media. To obtain representative data on healthy adult birds, each parrot was required to be over one year old and to have no known behavioral or health issues. To facilitate the introduction of the study tool, birds had to be comfortable looking at screens and touching objects, and to have prior experience with balloon-pop-type games. The parrot caregivers were required to have some prior training working with animals and sufficient available time to facilitate the interactions. To ensure consistency, we conducted an initial survey to determine the devices accessible to our participants. The results revealed that the majority already had access to a Galaxy Tab A. For the two participants who did not, we facilitated the borrowing of this device for the study. The device has a diagonal screen size of 10.1” (256.5 mm) and a resolution of 1920 x 1200 px at 224 ppi. Twenty parrots were selected to participate in the study (P01-P20), of which 17 fulfilled the requirements to enter the testing phase and completed the entire protocol. Throughout the learning phase, three bird participants were excluded, either because they did not appear engaged enough to advance through the learning phases within six weeks (P13 and P16) or because they showed slight aggressive behavior (P12). Figure 3 summarizes the participants' IDs, size categories, and species.

[Figure 3: participant IDs, size categories, and species]

4.2 Housing and Setup

Birds resided in family homes and stayed in their regular environment. They had access to food and water ad libitum. Caregivers were instructed not to change their bird's regular schedule and feeding during the study. In alignment with best practices[11], caregivers were encouraged to run the study sessions before a meal. The setup illustrated in Figure 4 included 1) a Galaxy Tab A device set at 75% brightness, placed on a tripod or kickstand, with a protective case and the custom study app installed, and 2) a camera, placed on a tripod or kickstand, to record the interactions. For the session location, the participants used play areas to which the birds were already acclimated. The perch was placed at a specific distance from the device, which was positioned at a 70° incline. Participants received emailed communications and thorough live training sessions as well as 24/7 support from a parrot behaviorist. They were provided with instructions, scripts, and visual illustrations for setup, and several live remote presentations and meetings.

[Figure 4: study setup]

4.3 Ethics

In alignment with HCI and ACI ethical standards[66, 95], we aimed to minimize any discomfort or fear during the study while emphasizing free choice and consent in the animals' interactions. Agency and autonomy were central to the study's ethical framework in both the short and long term. In the short time frame of the study, the protocol was designed to allow the parrots to voluntarily participate in or withdraw from the experiment. The birds had an easy escape route to walk or fly away, and humans were instructed to let them do so. Ethical considerations also led to limiting the number of targets per series and the number of series per session. Additionally, caregivers were trained to recognize signs of distress, to end the intervention should stress behaviors occur, and to reduce the number of tasks to maintain engagement. The presence and training of the caregivers were a key component of the study, as they played an essential role in providing positive reinforcement in the form of praise, encouragement, and treats. A parrot behaviorist reviewed the sessions to ensure welfare. In the long term, the goal of this research is to enable more parrot-adapted interaction with technology, providing agency and enrichment through interfaces that are more ergonomically suited to them. However, our systems are not designed to be used in isolation, and we believe technology can best benefit parrots in a social context and to the extent that it enriches and reinforces the bond with their caregivers as well.

4.4 Study Application

The custom target-testing application was adapted from the FittsTouch app[64]. Participants have to touch red targets on the screen as quickly as possible; each touch makes the target reappear in a new location until the series ends. Performance data, including movement time (MT), touch locations, missed targets, and finger pressure when available, are automatically recorded and sent for analysis. The application also computes additional metrics such as outlier presence, index of difficulty (ID), effective target width (We), effective amplitude (Ae), and effective index of difficulty (IDe).
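For reference, effective width and effective index of difficulty are conventionally derived from the observed spread of touch endpoints (the ISO 9241-9 / MacKenzie formulation). The sketch below follows that convention; the FittsTouch app's exact computation may differ in detail.

```python
import math
from statistics import mean, stdev

def effective_metrics(amplitudes, deviations):
    """Effective Fitts metrics in the conventional ISO 9241-9 form.

    amplitudes -- observed movement distances for one A-W condition
    deviations -- signed endpoint deviations from the target centre along
                  the approach axis, in the same units
    """
    ae = mean(amplitudes)            # effective amplitude
    we = 4.133 * stdev(deviations)   # effective width (covers ~96% of a normal spread)
    ide = math.log2(ae / we + 1)     # effective index of difficulty (Shannon form)
    return ae, we, ide

# Toy numbers for illustration only.
print(effective_metrics([305, 290, 310, 298], [12.0, -8.0, 3.5, -15.0]))
```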

As illustrated in Figure 5, each session started with filling out the form on the home page with participant details, which led to an initial warm-up practice of two targets, after which the participant completed five series of five target touches. After each series, a message was displayed to remind the caregiver that the bird should receive a treat. Then, after 5 seconds, a blue dot appeared that the participant tapped to start the next series or end the session.


[Figure 5: study application screens and session flow]

4.5 Protocol

Ergonomic studies necessitate a certain level of ease in interacting with an interface and the subject's 'best effort' to complete the task as quickly as possible. As instructing animals to complete a task quickly is challenging, we preceded our testing phase with a practice phase lasting up to six weeks. Figure 6 illustrates the different stages of the protocol and the use of treats.

During Stage 1 of the practice phase, bird participants were introduced to the testing task and taught to touch the targets. The caregiver held the device and guided them to the target. Birds were given a standard treat for touching anywhere on the screen and a "jackpot" of treats (either a large quantity of the same treat or a higher-value treat) if they touched the target location. The choice and quantity of treats was personal to each bird and decided in consultation between the caregiver and a parrot behaviorist to best cater to the animal's species, taste, and context. We leveraged contra-freeloading[45], where animals work for food even when it is also freely available. Stage 1 lasted until the birds reached "stable scores," defined as hit rates varying by less than 15% over three sessions, as a way to account for the learning curve.

During Stage 2 of the practice phase, the birds were trained to touch the targets as accurately as possible while the device was positioned on a stand. The birds were given a "jackpot" treat for each target touched and a small treat after two misses in a row. Once the birds reached stable scores in this context, they advanced to the next stage.

During Stage 3, the caregiver encouraged their bird to touch increasingly many targets between treat rewards until, ultimately, the birds completed an entire series before being rewarded. The birds received a jackpot of treats at the end of each series. The duration of this phase was personal to each bird, lasting between three and six weeks.

After the training phase, the birds completed the same set of tasks during the testing phase, and their data were used for analysis. During testing, as in Stage 3, parrots did not receive treats during a series but received a jackpot treat at the end of each series; the app reminded the caregiver when to give a treat. For the entire protocol, caregivers were instructed to limit the parrots to a maximum of three sessions per day to mitigate the risk of fatigue and ensure the well-being of the parrot participants throughout the learning and testing process. Allowing for individual variability in learning times let each bird reach its performance plateau at its own pace, providing a more authentic representation of typical interactions. Our objective was not to drive the parrots to achieve their maximum possible scores within a specific time, but rather to reach their consistent performance level, adapted to their needs. We chose a methodology that personalized training times to mitigate the novelty effect and prevent disengagement caused by overly repetitive tasks.

[Figure 6: protocol stages and use of treats]
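A minimal sketch of the advancement criterion described above: a bird reaches a "stable score" once its last three session hit rates vary by less than 15%. Reading the 15% as an absolute spread in percentage points is our assumption; the study does not spell out the exact computation.

```python
def reached_stable_score(session_hit_rates, window=3, tolerance=0.15):
    """Return True once the last `window` session hit rates (as fractions)
    vary by less than `tolerance`, read here as an absolute spread."""
    if len(session_hit_rates) < window:
        return False
    recent = session_hit_rates[-window:]
    return max(recent) - min(recent) < tolerance

# Example: the bird stabilizes after an initial learning curve.
print(reached_stable_score([0.20, 0.45, 0.42, 0.38, 0.41]))  # True
```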

In cases where birds were not able to advance to the next phase within the estimated maximum times, or if they became stressed during the learning phase, they were released from the study.

Finally, after completion, bird caregivers were asked to take the test themselves and to complete a post-study questionnaire asking about 1) their parrots' use of technology prior to the study, 2) their perceived benefits of technology for birds, 3) concerns, risks, and ethical considerations, 4) challenges with human-centric applications, 5) expectations and desires for parrot-centered apps, and 6) feedback on the study.

5 ANALYSIS & RESULTS

Following the framework of our research questions, we first examine the characteristics of how parrots use touchscreens by looking at multi-taps, drag, pressure, and hit-rate differences between the parrot and human groups. Second, we investigate whether parrot touch patterns can inform the design of button sizes and distances for optimized success in hitting targets. This involved analyzing the distribution of touches across different target conditions to determine whether birds tend to aim consistently at the center or towards the edge of the target. Third, we explored movement time in relation to the Width-Amplitude (W-A) condition to see whether Fitts's Law is applicable and to identify the presence of 'touch-and-retreat' behaviors. Lastly, we gathered human insights on the birds' experience from the post-study questionnaire.

5.1 Data Management and Cleaning

The study yielded three types of data: 1) videos of all interaction sessions, 2) touch data collected automatically by the application, and 3) post-study questionnaires filled out by participants. The dataset was first cleaned and verified by analyzing the recorded data from the application/spreadsheet against the corresponding video. Another phase of data cleaning, specific to Fitts's Law validation, involved the exclusion of segments where birds were distracted. For this, two researchers independently reviewed and labeled each video segment, coding each touch as either on-task or off-task. Off-task behaviors included instances where birds were eating, playing with toys, interacting with the caregiver, or otherwise distracted. The codings were cross-verified, yielding an inter-rater agreement of 97%. Any disagreements were resolved through discussion. Following this stage, the data subsequently analyzed for Fitts's Law validation was considered to reflect only user activities with a focus on the study task. Following cleaning, we used the data to assess potential connections between performance data (movement time and error rate), test conditions (amplitude and width of dots), and subject characteristics (such as bird size). Target and touch locations were measured in dp (density-independent pixels), defined relative to a 160 dpi screen. For the identification of 'touch-and-retreat' behaviors, three researchers independently reviewed the testing-phase video data for each participant to scrutinize their touch movements. They identified instances where the birds remained focused on the task but exhibited pauses between targets. These pauses manifested in two ways: the birds either moved their heads backwards to locate the next target before advancing, or they remained stationary, pausing and rotating their heads or eyes to find the next target before moving. As animal interaction data are traditionally not normally distributed, we used non-parametric tests for significance testing, either Wilcoxon rank-sum tests for two-group comparisons (humans/birds) or Kruskal-Wallis tests for three or more groups. ANOVAs were used for touch data analysis.
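For illustration, the sketch below applies the two significance tests named above to placeholder data: a Wilcoxon rank-sum test for a two-group bird/human comparison and a Kruskal-Wallis test for a comparison across three bird-size groups. The numbers are invented and only demonstrate the analysis pipeline.

```python
from scipy.stats import ranksums, kruskal

# Placeholder per-touch drag lengths (dp) -- illustration only.
bird_drag = [6.4, 5.1, 8.0, 7.2, 0.0, 9.3]
human_drag = [0.0, 0.0, 0.3, 0.0, 0.1, 0.0]

# Two-group comparison (birds vs. humans): Wilcoxon rank-sum test.
z, p = ranksums(bird_drag, human_drag)
print(f"rank-sum Z = {z:.2f}, p = {p:.4f}")

# Three-or-more-group comparison (e.g. small/medium/large birds): Kruskal-Wallis.
small, medium, large = [0.27, 0.25, 0.30], [0.50, 0.47, 0.52], [0.26, 0.24, 0.29]
h, p = kruskal(small, medium, large)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")
```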

5.2 Parrot vs Human Interaction Characteristics

Based on the analysis schema, we obtained a total of 4,720 target touches (8,995 when counting subsequent touches in case of multi-taps) from 17 birds, over 892 series. The testing phase video recordings totaled over 3 hours (and over 26 hours during the learning phase). Caregivers also completed the test phase themselves for human/bird comparison.

5.2.1 Multi-taps. Parrots engaged in multi-taps (defined as two or more taps within 300 ms) 45.73% of the time. No multi-taps were observed in our human data. For birds, single taps represented a majority (55.14%), followed by double (22.48%), triple (8.93%), and quadruple taps (5.11%). The maximum number of taps recorded within a single multi-tap was 41, by P03. Figure 7 shows the distribution of multi-taps. The average time between multi-taps was 135.01 ms, and the average time between single taps was 945.95 ms. The minimum time recorded between taps was 7 ms.

[Figure 7: distribution of multi-taps]

5.2.2 Drag and Pressure. For each touch from the birds and humans, we measured both the finger-down and finger-up positions (or tongue-down and tongue-up positions) and derived the drag factor as the difference between the two. Using a Wilcoxon rank-sum test, we observed a statistically significant difference in drag lengths between the bird (Mdn = 6.3854) and human (Mdn = 0) populations (Z = 21.02, p < .001). Regarding tap pressure, only one device (P17) had built-in touch pressure measurement (the tablet was the same model but from a different series). Based on this human-bird participant pair alone, we compared the bird's and the human's touch pressure, measured as a normalized value between 0.0 (no touch) and 1.0 (maximum detectable force). The comparison revealed a significant difference between the human data (Mdn=0.1137) and the bird data (Mdn=0.0588) (Z=-7.77, p<0.001). This suggests that, within this pair, the bird presses the screen about half as strongly as their human caregiver.

[Figure 8]

5.2.3 Hit Rates. When focusing on the first tap location, we found that although hit rates – defined as the percentage of targets successfully hit – were above chance on even the smallest target, birds present a lower hit rate (Mdn=41.57%) compared to humans (Mdn=95.9%). For reference, the chance hit rate would be equivalent to the ratio between the target area and the full 1920 x 1200 px screen area (chance hit rate = 0.22% for W=40 dp, 0.67% for W=70 dp, 1.3% for W=100 dp, and 2.3% for W=130 dp). Consequently, the average chance hit rate for the test was 1.1%. When clustering birds by size, we observe a trend that medium-sized birds appear to have higher hit rates (Mdn=49.82%) than large (Mdn=25.78%) or small (Mdn=27.13%) birds, although this trend was not significant (H = 5.83, p=0.161).

When looking at hit rates across different amplitudes (i.e., distances between targets), shown in Figure 10, we did not observe statistically significant differences (H = 2.32, p=0.50). However, the hit rate increases significantly (H = 19, p=0.0003) as target width increases (Mdn=19.29% for W=40 dp, 31.14% for W=70 dp, 40.47% for W=100 dp, and 47.08% for W=130 dp) (see Figure 9). This suggests that the bigger the target, the more often the birds tap successfully. However, this increase in successful hits could be due either to enhanced precision by the birds or simply to the larger target area available for touching. After consulting with renowned Fitts expert Dr. Scott MacKenzie about our results, we conducted a more detailed analysis of the exact locations of taps in relation to target size. Indeed, it is known that when the target width increases, human subjects tend to aim not at the center but at the border of the target to save time and optimize their movements. If birds exhibit similar behavior, it would suggest a need to balance target size and distance for optimal design.

[Figure 9: hit rates across target widths]
[Figure 10: hit rates across target amplitudes]

5.3 Tap Accuracy

For a more in-depth understanding of tap accuracy, and to determine optimal target size, we explored the distribution of tap locations compared to target coordinates. We centered our data around the intended target location to measure the deviation from the target in the horizontal and vertical directions. Specifically, Δ X represents the horizontal deviation from the target, and Δ Y the vertical deviation. These delta values effectively capture the touchpoint's relative distance from the target, providing a measure of accuracy in touch interactions. We then calculated the Euclidean distance from each touchpoint to the center of the target, using the previously defined Δ X and Δ Y values. This resulted in four distributions corresponding to the four target sizes (40, 70, 100, and 130 dp), used to assess the influence of target size on touch precision. There appeared to be no discernible difference in touch precision across the varying target sizes (Figure 11). This suggests that the size of the target may not play a significant role in determining how closely the birds' touches aligned with the center of the target. This preliminary visual observation was reinforced by a non-significant difference in the distributions of Euclidean distances (F=0.96, p=0.41), suggesting no difference in touch precision across the target sizes.
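A minimal sketch of the deviation measures used here: Δ X, Δ Y, and the Euclidean distance of each tap from its target's center, grouped by target width so the distributions can be compared across sizes. Coordinates and field names are illustrative.

```python
import math

def tap_deviation(tap_xy, target_xy):
    """Horizontal/vertical deviation and Euclidean distance of a tap
    from the centre of its intended target (all in dp)."""
    dx = tap_xy[0] - target_xy[0]
    dy = tap_xy[1] - target_xy[1]
    return dx, dy, math.hypot(dx, dy)

# Group distances by target width to compare precision across sizes.
taps = [
    {"width": 40,  "tap": (512, 300), "target": (500, 310)},
    {"width": 130, "tap": (840, 655), "target": (800, 640)},
]
distances_by_width = {}
for t in taps:
    _, _, dist = tap_deviation(t["tap"], t["target"])
    distances_by_width.setdefault(t["width"], []).append(dist)
print(distances_by_width)
```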

[Figure 11: distributions of touch deviations for each target size]

However, we observed a five-pointed star shape in the tapping patterns, suggesting that some taps occur along the trajectory between the previous and the current target (as the test presented five alternating targets arranged in a circle). To identify individual differences, we visually inspected the tap distribution for each of the 17 birds during the testing phase. Figure 12 presents these individual distributions, sorted from highest to lowest hit rate.

[Figure 12: individual tap distributions, sorted from highest to lowest hit rate]

Upon close visual examination of the plots and videos, varying interaction patterns emerged. Some birds showed consistently high accuracy, marked by a high hit rate (HR>38%) and taps concentrated on the target (e.g., P03, P05, P06, P08, P11, P20), while others exhibited more dispersed touch locations and lower hit rates (HR<30%) (e.g., P02, P14, P17, P18). Additionally, a pronounced star-shaped touch pattern was seen in birds such as P01, P04, P09, and P19, suggesting touches either on the previous target or in transition between targets. This pattern, observed in the plots, was confirmed through observation of the video recordings by three expert data analysts. They noted that these four birds tended to touch previous target locations once the target had moved, or to touch in transition between targets. We propose additional considerations in the discussion section.

5.4 Fitts Predictive Model

In addition to touch location and accuracy, the tablet application also measured movement time (MT) and the estimated index of difficulty (IDe) (defined in Section 2.5). We analyze the dataset using the Fitts's Law framework which, in humans, predicts a linear relationship between MT and IDe. In contrast to the previous analysis, which only focused on individual touches, validating the Fitts model also necessitated the complete execution of series to obtain accurate movement time estimates. Therefore, only series where birds remained fully focused were considered (refer to Section 5.1 for the filtering protocol). Consequently, birds that failed to complete at least one of each full series while being entirely on-task were excluded. This criterion led to the exclusion of five birds (P01, P04, P09, P15, P16) from this part of the analysis. The results are plotted in Figure 13. For every bird, we obtain a very low R² of 0.1 or below, which appears to indicate no correlation between movement time and index of difficulty. This suggests that birds' interaction with screens does not follow the commonly used human predictive model regarding movement time (MT). As this inadequacy of Fitts's Law suggests a potential overhead in movement times – rendering them invariant to the targets' amplitude – the video data from all participants were reviewed to identify potential explanations. This review process led to the identification of a 'touch-and-retreat' motion, where the birds appeared to take some distance from the screen between targets to better see where the next target appeared. This observation was corroborated independently by three researchers across all participants on at least one touch per series. Based on these observations, potential insights and explanations are proposed in the discussion section, as well as future ways to more systematically quantify this behavior.

[Figure 13: movement time vs. index of difficulty for each bird]

5.5 Post Study Questionnaire

[Figure 14: post-study questionnaire results]

From the survey, we found that, prior to our study, nearly half of the birds used tablets daily (45.5%), over a third weekly (36.4%), and some monthly (9.1%) or rarely (9.1%) (Figure 14a). All of the participants reported that their parrots used interactive games, and 72% used the tablet for music playing or educational games for their birds. The most popular applications used prior to the study were an app called "balloon pop" (66.6%), followed by paint and coloring apps (58.3%) (Figure 14a).

5.5.1 Perceived benefits of technology for parrots. When we asked caregivers if they believe that mobile applications can be enriching for birds, 90.9% of participants strongly agreed, with the other 9.1% responding that they agreed. When asked in what ways applications can benefit parrots and provided multiple choices, all of the participants selected “cognitive enrichment & mental challenge”, “enrichment / entertainment", and “bonding with caregivers”. 91.6% of participants also selected “skill learning” and “social interaction". When asked for additional ways they thought parrots can be enriched by applications, one person manually entered an additional option of “communication and agency”.

5.5.2 Potential risks. When asked what risks tablet applications could pose to parrots, the most common answer was “potential damage to the tablet” (73.6%), followed by “overstimulation” and “physical harm” (both at 45.5%), dependency on technology (18.2%), exposure to harmful content (9.1%), and reduced physical activity (9.1%). When asked about potential ethical issues, a third of respondents reported being concerned about the technology's “interference with their natural behavior”, “trivializ[ing] parrot experiences for human entertainment”, and “increas[ing] isolation from their natural environment”; 20% of respondents reported concerns about “long-term well-being effects”; and 10% reported concerns about “increased isolation from human companionship” and “sustainability”. Another 30% of respondents reported no ethical concerns around tablets. Participants often emphasized the risk of their birds' overuse, likening their screen-time needs to those of children (“just like with children, I think it needs to be limited”, P04); the importance of parrots' consent before using technology was also stressed (“we always ask if they want to use the tablet”, P19), as was the need for guidelines. Continued engagement was underscored, noting parrots might suffer if learning opportunities cease: “Concerned about future caregivers continuing this enjoyable and educational experience” (P01).

5.5.3 Challenges with Human-Centric Apps. When asked about challenges parrots face when using apps designed for humans, 72.7% of participants reported touch recognition issues. Other frequently cited challenges included interface size and complexity for bird users (63.6%), non-intuitive interaction mechanisms (54.5%), and irrelevant content (45.5%). Additional concerns encompassed sound or visual overstimulation, lack of tactile feedback, and icon positioning. When asked which features should be better tailored for parrots, 63.6% of respondents highlighted the need for adjustments to the user interface size and layout. Other cited areas for improvement included sound design such as alerts and feedback (81.8%), more sensitive screens (72.7%), and graphical content and animations (72.7%); additional feedback touched on color schemes and visual design (63.6%) and tactile feedback considerations (63.6%).

As the focus of the study was to investigate the affordances of tablets for parrots, we asked participants whether they believed that more adapted button sizes, locations, and timing would help improve parrots' experience when interacting with tablets. All participants agreed, with 54.5% answering “strongly agree” and the other 45.5% answering “agree”. Finally, 90.9% of participants responded they would be “very likely” (and 9.1% “likely”) to use a parrot-specific app designed with the findings from this study.

5.5.4 Human Feedback on Bird's Experience. As our work aims to improve parrots’ experience and provide them with agency and enrichment, we asked human participants to reflect on their bird's experience. Most reported enthusiastic engagement from their parrots, though some, like P13 and P16, displayed waning interest over time, possibly due to the repetitive nature of the tasks or the study's duration. Participants mainly found the study enjoyable and enriching for their birds, as one stated, “I feel my bird seemed to be enriched and enjoyed participating in the social interactions we had based on watching his behaviors and body language with myself and the tablet” (P05). The app's ease of use and engaging nature were highlighted, but some suggested that variety could help sustain bird interest. The initiative was viewed as a means to enhance the bond between caregivers and their parrots, with one human participant reporting it "brought my bird and I closer," (P01) and another observing their bird's growth: "just to see him at the end opening the app himself and moving to different stages was incredible." (P03) The overall experience was positive, fostering bonding and enrichment. Several participants expressed interest in continuing to use the app: “Am I allowed to open and still run the app for the parrot who might enjoy doing it once in awhile.” (P01), “He still loves his games, but he looks at the spot on the screen where the Fitts app used to be and I think he misses it” (P18).

6 DISCUSSION

This study explored the usability of off-the-shelf devices for pet parrot enrichment, utilizing a custom protocol and tablet application to gather data on target-touching tasks in home environments. Our results indicate varying levels of task mastery among birds based on target size and suggest that Fitts’ law does not apply to pet parrots in this context. The study provides key insights for designing future parrot-focused interfaces.

6.1 Hit Rate

The observed low hit rates among the birds, compared to humans, underscore the inherent challenges pet parrots face when interacting with human-centered interfaces. The high variance of results within bird participants, coupled with the size-dependent performance, emphasizes the significance of physical attributes, prior experiences, and individual differences when designing these interfaces. The uniform touch location across target sizes suggests that target size affects hit accuracy through the increased target surface area rather than by influencing birds’ precision. A derived implication is the potential existence of an optimal button size for maximized accuracy; for instance, while a 40pd target seemed suboptimal for many birds, a range from 100pd to 130pd appears more suitable.
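To illustrate how such size-dependent accuracy can be quantified, the sketch below computes a per-size hit rate and mean touch offset from logged touch events. The event fields, units, and the offset-from-centre hit criterion are illustrative assumptions for this sketch, not the study's actual logging format.

from dataclasses import dataclass
from collections import defaultdict
from statistics import mean
import math

@dataclass
class TouchEvent:
    target_size: float   # target diameter (assumed unit, e.g. density-independent pixels)
    target_x: float      # target centre coordinates
    target_y: float
    touch_x: float       # registered touch location
    touch_y: float

def summarize_by_size(events):
    """Group touch events by target size; report hit rate and mean offset from centre."""
    by_size = defaultdict(list)
    for e in events:
        by_size[e.target_size].append(e)
    summary = {}
    for size, evs in sorted(by_size.items()):
        offsets = [math.hypot(e.touch_x - e.target_x, e.touch_y - e.target_y) for e in evs]
        hits = [off <= size / 2 for off in offsets]   # a hit lands within the target radius
        summary[size] = {"n": len(evs),
                         "hit_rate": sum(hits) / len(hits),
                         "mean_offset": mean(offsets)}
    return summary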

6.2 Applicability of Fitts’ Law to Parrot Touchscreen Interactions

One key finding is the potential inapplicability of Fitts’ Law to parrot touchscreen interactions. This ergonomics rule, which posits a linear relationship between movement time and index of difficulty in humans, may not suit parrots due to their unique anatomy and behavior. Unlike humans, parrots use their tongues to interact with touchscreens, an organ situated very close to their eyes. This appears to introduce a need for constant recalibration during the interaction. When a parrot approaches a target on the screen, its proximity potentially impedes clear vision, necessitating a brief retreat or pause to accurately identify the next target location. This repeated ‘touch-and-retreat’ motion, observed in our video data, appears to introduce a consistent overhead in movement time, making it almost invariant to the actual distance between targets. Consequently, the relationship between index of difficulty and movement time, central to Fitts’ Law, may be substantially altered for parrots.
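For reference, Fitts’ Law in its common Shannon formulation relates movement time MT to target distance D and width W through the index of difficulty ID. The touch-and-retreat pattern can be read, illustratively, as adding a roughly constant per-target overhead that dominates movement time; this constant-overhead reading is our interpretation of the observations, not a fitted model:

\[
MT = a + b \cdot ID, \qquad ID = \log_2\!\left(\frac{D}{W} + 1\right)
\]
\[
MT_{\text{parrot}} \approx a' + t_{\text{retreat}} + b' \cdot ID, \qquad b' \approx 0
\]

where a and b are empirically fitted constants and t_retreat denotes the per-target recalibration overhead; with b' near zero, MT remains nearly flat across ID values, consistent with the distance-invariance we observed.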

6.3 Limitations and Future Work

Our study, encompassing multiple parrot species, provides a broad perspective on avian touchscreen interactions. However, it also raises the question of species-specific nuances. We also intentionally excluded very small birds, who might exhibit very different interaction behaviors. Species-specific studies may help guide interface design. Another limitation is the potential inconsistency in the birds’ individual experience levels, motivation, and other intrinsic and extrinsic factors. Moreover, the home setting, while more natural for the birds, brings numerous uncontrollable variables that might affect the birds’ performance and behavior. While incorporating a human in the loop was imperative for our study's context, it might introduce potential biases. Birds frequently seeking validation and praise from the human participant could influence the results, indicating a reliance on human feedback rather than pure interface usability.

In this study, behaviors were identified manually from the videos and based on graph observations. This method provided initial insights, yet a more extensive behavioral analysis remains for future work. Our emphasis was on the quantitative aspects of touch and ergonomics, setting a foundational base for future in-depth behavioral studies. The scope of this paper is specifically tailored to explore HCI dimensions in avian touchscreen interactions. Future research could delve deeper into a more precise categorisation of behavioral aspects, using advanced methodologies to build upon our findings.

Furthermore, the touch-and-retreat behavior that emerged from the analysis indicates the inadequacy of Fitts’ Law for avian interactions. This finding was confirmed through careful review of the video data. Future work could leverage machine learning, AI video analysis, and other approaches to systematically analyze these interactions in more detail, providing a more comprehensive and quantifiable understanding of the behavior.

6.4 Insights and Design Guidelines

In this section, we propose concrete considerations for the future of parrot-focused application design.

6.4.1 Potential of Tablet-Based Parrot Enrichment. Human participants’ feedback supports the idea that tablet-based applications may offer significant benefits for parrots. Most caregivers strongly believed in the potential of mobile applications for cognitive enrichment, entertainment, and bonding. This feedback offers a roadmap for developers. We recommend the development of systems that consider mechanisms to encourage owner-parrot interactions, turning screen time into quality bonding moments. For instance, systems could allow caregivers to set challenges and rewards personalized for their birds to create a collaborative environment, reinforcing the bonding experience.

6.4.2 Inadequacy of Existing Human Systems and Designing for the Bodies of Parrots. An important challenge in parrot-centered app design is understanding and accommodating their unique touch interactions. The issue of touch recognition extends beyond device sensitivity to how parrots interact with screens. About half of all parrot interactions comprised multi-taps, suggesting the need to implement multi-tap recognition with adaptive thresholds. Parrots seem to apply more drag and softer touches than humans, though still strong enough to be registered by the device. This nuance requires designers to recalibrate touch recognition for birds.
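As an illustration only, the following sketch groups raw taps into multi-tap bursts using a per-bird time window and nudges that window toward the bird's observed tapping rhythm; the 300 ms default and the adaptation rule are assumptions chosen for demonstration, not values derived from the study.

def group_multi_taps(tap_times, window_s=0.3):
    """Group sorted tap timestamps (seconds) into bursts: taps closer together than
    window_s are treated as one logical selection (a multi-tap)."""
    if not tap_times:
        return []
    bursts, current = [], [tap_times[0]]
    for t in tap_times[1:]:
        if t - current[-1] <= window_s:
            current.append(t)
        else:
            bursts.append(current)
            current = [t]
    bursts.append(current)
    return bursts

def adapt_window(bursts, window_s, target_len=2.0, step_s=0.05, min_s=0.1, max_s=0.8):
    """Nudge the grouping window toward the bird's observed tapping rhythm (illustrative)."""
    if not bursts:
        return window_s
    mean_len = sum(len(b) for b in bursts) / len(bursts)
    if mean_len > target_len:     # window too wide: distinct selections are being merged
        return max(min_s, window_s - step_s)
    if mean_len < target_len:     # window too narrow: single bursts are being split
        return min(max_s, window_s + step_s)
    return window_s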

Furthermore, our research highlighted a potential ‘touch-and-retreat’ behavior where parrots appear to pause or pull back after each interaction, differing from steady human hand movements. Standard usability metrics, like Fitts’ Law, would benefit from specific adjustments before being applied to parrots. The low hit rates further show the disparities in interactions between humans and birds. In essence, optimal software for companion parrot users requires a rethinking of interaction paradigms, recognizing the unique behavioral patterns and bodies of parrots.

6.4.3 Need for Personalisation. Parrots, across and within species, exhibit a vast range of individual capabilities and characteristics, supporting the need for personalisation in the design of parrot-centric applications. Given the variety in parrot sizes, beak and tongue sizes, and motor functions, interface elements such as button size must be adaptable to fit these physical dimensions and capacities. The need for personalization is further supported by unique interaction styles, even within a single species; for instance, we observed birds with high precision in their interactions, some who tapped near rather than on the target, and some who exhibited less consistent patterns. Precise birds could benefit from a tight layout, while less precise birds might benefit from personalized multi-tap thresholds, increased target areas, and rewarding interactive results that do not depend on hit rate. Based on these observations, we hypothesize that personalized applications are key to offering enrichment for different birds, and should be based on continuous user monitoring, support, and adaptation. Future work is needed to systematically define and characterize these interaction patterns and support the design of personalization strategies.

While some parrots might readily hit targets, others might require or prefer a more exploratory, open-ended interaction. Grouping these interaction styles and learning curves can help cater to these inclinations and levels of mastery. Lastly, auditory and tactile mechanisms could also be personalised: some birds experience neophobia, and tailoring these mechanisms should help alleviate the novelty effect.
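One way the per-bird adaptation described above could be operationalized is sketched below: target size is nudged up or down based on a rolling hit rate. The thresholds, step sizes, and size bounds are illustrative assumptions only, not recommendations derived from our data.

from collections import deque

class TargetSizeAdapter:
    """Per-bird target-size adaptation driven by a rolling hit rate (illustrative only)."""
    def __init__(self, size=110, min_size=40, max_size=200, window=20):
        self.size = size                      # current target diameter (assumed units)
        self.min_size, self.max_size = min_size, max_size
        self.recent = deque(maxlen=window)    # rolling record of hits (True) and misses (False)

    def record(self, hit):
        self.recent.append(bool(hit))
        if len(self.recent) < self.recent.maxlen:
            return self.size                  # wait for enough observations before adapting
        hit_rate = sum(self.recent) / len(self.recent)
        if hit_rate < 0.4:                    # struggling: enlarge the targets
            self.size = min(self.max_size, self.size + 10)
        elif hit_rate > 0.8:                  # comfortable: gently tighten the layout
            self.size = max(self.min_size, self.size - 5)
        return self.size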

6.4.4 Continuous accompaniment. Our study underscored the need for ongoing support in ACI. The learning phase spanned an extended period to cater to individual learning curves and equip the parrots for subsequent challenges. This support goes beyond implementing onboarding phases of the kind designed for human users; it stresses the importance of patience and flexibility in the parrots’ and caregivers’ initial interactions. An ideal system would integrate continuous accompaniment and meaningfully alternate moments of challenge with phases of simpler, more exploratory interaction. For optimal engagement, the interaction paradigm should flexibly follow the parrots’ needs and be able to adapt in the long term.

6.4.5 Social Context and Human in the Loop. Human support and facilitation were central to our study. Caregivers provided rewards and praise to make the experience more significant and rewarding for the birds. Furthermore, human participants reported that the sessions fostered deeper connections with their parrots, emphasizing a need for systems that promote understanding and connection. In this context, a “human in the loop” enriches the experience with technology while providing a safer and more nurturing environment for the animals. It considers the birds’ unique histories and preferences while providing them with enhanced agency. This connection further underscores the necessity for platforms that facilitate bird-digital interaction and promote human-parrot bonding.

6.4.6 Ethics and Safety. The ethical implementation of parrot-centric technology is based on ensuring the well-being of the parrot mentally and physically. Our approach emphasizes the importance of optimizing tablet placement, ensuring materials are safe from ingestion or injury risks, and implementing interface designs that prevent accidental harm. We believe that technology for parrot enrichment should be introduced with the animal's welfare in mind rather than human entertainment or sole pursuit of knowledge. Future systems should incorporate mechanisms that restrict excessive usage, such as session duration limits or automated break intervals, ensuring a balanced engagement and reducing the risks of over-reliance on technology.
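A minimal sketch of the kind of usage guard suggested here follows; the ten-minute session cap and thirty-minute cool-down are placeholder values chosen for illustration, not recommendations from the study.

import time

class SessionGuard:
    """Enforce a session cap and a cool-down between sessions (placeholder durations)."""
    def __init__(self, max_session_s=600, cooldown_s=1800):
        self.max_session_s = max_session_s    # e.g. 10-minute sessions (assumed value)
        self.cooldown_s = cooldown_s          # e.g. 30-minute breaks (assumed value)
        self.session_start = None
        self.last_session_end = None

    def can_start(self, now=None):
        now = time.time() if now is None else now
        return self.last_session_end is None or now - self.last_session_end >= self.cooldown_s

    def start(self, now=None):
        now = time.time() if now is None else now
        if not self.can_start(now):
            raise RuntimeError("Cool-down period has not elapsed yet.")
        self.session_start = now

    def should_break(self, now=None):
        now = time.time() if now is None else now
        return self.session_start is not None and now - self.session_start >= self.max_session_s

    def end(self, now=None):
        self.last_session_end = time.time() if now is None else now
        self.session_start = None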

7 CONCLUSION

Touchscreen devices have become an integral part of human routines, and our study explored their potential for pet parrot enrichment. Given parrots’ cognitive abilities, their strong enrichment needs, and their unique interactions with touchscreens, there is clear potential to adapt human-centric design standards to their requirements. Our research with 20 pet parrots highlighted ergonomic characteristics crucial for developing suitable tablet-based enrichment systems. Notably, the study's insights into multi-tap behavior and touch target preferences highlight the need (and potential) to adapt current HCI models and practices. Feedback from parrot caregivers underscores the critical importance of tailored technological solutions for enhancing avian well-being. By developing systems specifically for animals, we are not only offering tangible tools for human caregivers but also providing outlets to foster and strengthen connections with the animals living amongst us. Such initiatives extend beyond interactional improvements: they prompt us to consider how technology might further serve as a medium for an enriched and mutual understanding between species.

ACKNOWLEDGMENTS

We wish to thank the avian participants and their human caregivers for their contributions to this study. We are also grateful to Dr. Scott MacKenzie, Dr. Akito van Troyer, Cassie Crawford, Hao Jin, and Zitong Bao for their technical expertise and behavioral training support. Ethical approval for this project was granted by the ethics committees of the University of Glasgow, for both animal (reference: EA31/23) and human subjects research (reference: 300220153).


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs International 4.0 License.

CHI '24, May 11–16, 2024, Honolulu, HI, USA

© 2024 Copyright held by the owner/author(s).
ACM ISBN 979-8-4007-0330-0/24/05.
DOI: https://doi.org/10.1145/3613904.3642119
