Tropical forest covers 12% of the planet’s land surface yet hosts around two thirds of all terrestrial species. Amazonia, which spans the vast Amazon River basin and the Guiana Shield in South America, is the largest extent of remaining tropical forest globally, home to more species of animal than any other terrestrial landscape on the planet.
Spotting wildlife in these dark, dense forests teeming with insects and spiny palms is always challenging. This reflects the very nature of biodiversity in Amazonia: a small number of abundant species, and a far greater number of rare species that are difficult to survey adequately.
Understanding which species are present and how they relate to their environment is of fundamental importance for ecology and conservation, providing essential information on the impacts of human-driven disturbances such as climate change, logging or burning. In turn, this information can help us assess more sustainable human activities such as selective logging – the practice of removing one or two trees and leaving the rest intact.
As part of the BNP Paribas Foundation’s Bioclimate project, we are deploying a range of technological tools, such as camera traps and passive acoustic monitors, to overcome these hurdles and refine our understanding of Amazonian wildlife. These devices improve on traditional surveys by gathering data continuously without a human observer present, allowing animals to go about their business undisturbed.
The eyes among the trees
Camera traps are small devices triggered by movement in their vicinity, such as a passing animal. They have been essential to our fieldwork in the Tapajós National Forest in Pará, northern Brazil, allowing us to investigate whether disturbances such as climate change have affected the presence and behaviour of the animals that underpin natural processes.
One such process is seed dispersal by animals, which enables forest regeneration: by eating fruits or carrying nuts, animals typically excrete or drop the seeds elsewhere. Our research has shown that at least 85% of all tree species in our plots have their seeds scattered by animals.
We also know that many of these animals are strongly impacted by disturbance. To better grasp the impact of losing these seed-dispersing species, we need to know which ones spread which plants and how far.
We investigated this by setting up cameras at the foot of fruit-bearing trees on our study site, revealing which species were eating which fruits and thus carrying seeds across the forest.
The survey resulted in over 30,000 hours of footage, and we were able to ascertain that 5,459 videos contained animals. An impressive total of 152 species of birds and mammals were recorded, including rare records of threatened species such as the vulturine parrot (Pyrilia vulturina).
The videos included incredible insights into animal behaviour, such as an ocelot (Leopardus pardalis) hunting a common opossum (Didelphis marsupialis), a giant anteater (Myrmecophaga tridactyla) carrying an infant on its back, and even a curious female tufted capuchin monkey (Sapajus apella) that checked out a camera and ended up throwing it to the floor.
Importantly, we also recorded 48 species eating fruit, including species considered important seed dispersers, such as the South American tapir (Tapirus terrestris) which is able to scatter large seeds over longer distances due to its size.
Our research demonstrated that bird species such as the white-crested guan (Penelope pileata) and mammals like the silvery marmoset (Mico argentatus) and the Amazonian brown brocket deer (Mazama nemorivaga) are frequent fruit consumers. Many of these species are overhunted in the study region, which may lead to cascading impacts on forest regeneration.
Acoustic recorders, on the other hand, are key to compiling inventories of the species-rich bird community. Although birds are rarely seen in dense forest, their vocalisations reveal their presence.
When ornithologists study tropical birds, they are limited by how often they can conduct counts, as it is often logistically challenging to return to individual locations. Consequently, traditional surveys tend to be relatively long – between 5 and 15 minutes – with only a limited number of repeat counts at each site. This means that only a small proportion of the period in which birds are most active – the two hours after sunrise known as the dawn chorus – can be surveyed.
Yet birds don’t all sing at the same time: a few species prefer to sing very early in the morning, most wait until it is slightly warmer and the sun is fully up, and a few more rise late. By limiting ourselves to a few surveys, it is difficult to cover the full time period and detect all the species present. Moreover, surveys only conducted on a handful of days mean factors such as the weather or the presence of predators on certain days can completely change which species are detected.
Our research found that by setting autonomous acoustic recorders to take 240 very short 15-second recordings, totalling one hour of surveying, we could record 50% more species at each site than with four 15-minute surveys that replicated the duration of human counts. The extra recordings allowed us to spread our survey effort across more days but, most importantly, across the whole dawn chorus. We found a small group of species which preferred to sing from 15 minutes before sunrise to 15 minutes after, and we were only really likely to detect them if we had multiple surveys during that period – something only possible with automated recorders.
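The arithmetic behind this design can be sketched in a few lines. The schedule generator below is purely illustrative – the dates, five-minute gap and ten-day deployment are assumptions for the example, not our actual recorder settings:

```python
# Hypothetical schedule: many short clips spread across several dawns,
# keeping total recording effort at exactly one hour per site.
from datetime import datetime, timedelta

def short_clip_schedule(start, days=10, clips_per_day=24, gap_min=5):
    """Return the start time of each 15-second clip, spread across
    `days` mornings with `gap_min` minutes between clips."""
    times = []
    for day in range(days):
        for i in range(clips_per_day):
            times.append(start + timedelta(days=day, minutes=i * gap_min))
    return times

sunrise = datetime(2024, 6, 1, 5, 45)              # illustrative date/time
clips = short_clip_schedule(sunrise - timedelta(minutes=15))

total_effort_s = len(clips) * 15
print(len(clips), total_effort_s)  # 240 clips, 3600 s = one hour of audio
```

Each morning the 24 clips span from 15 minutes before sunrise to almost two hours after it, so the one hour of total effort covers the whole dawn chorus rather than a single block of it.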
These more complete surveys allow us to provide better estimates of the species living in these hyperdiverse regions – but also of the ones that vanish when forests are logged or burnt. Thanks to this method, we were able to detect 224 species of bird across 29 locations with a total of just one hour of surveying at each location.
The species present across intact and disturbed forest also confirmed our previous research that showed that undisturbed, primary forests hold unique bird communities that are lost when forests are damaged by selective logging or wildfires.
Acoustic recorders have also allowed us to gather data over long stretches of time, with over 10,000 hours clocked thus far.
However, collecting data on this scale also means it is not viable for a scientist to listen to all of the recordings. Instead, the new field of ecoacoustics has developed statistical techniques to characterise entire soundscapes. These acoustic indices measure variation in amplitude and frequency to give a metric of just how busy or varied each soundscape is. By doing away with the need to identify individual sounds, they can efficiently process large volumes of acoustic data.
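To make the idea concrete, here is a minimal NumPy sketch of one widely used index, the Acoustic Complexity Index (ACI), applied to synthetic audio. This is a simplification for illustration – real analyses use dedicated ecoacoustics packages – but it shows the principle: summing frame-to-frame amplitude change in each frequency band, normalised by total amplitude, so busier soundscapes score higher.

```python
import numpy as np

FRAME = 512  # samples per spectrogram frame

def spectrogram(signal, frame=FRAME):
    """Magnitude spectrogram: split the signal into frames, FFT each one."""
    frames = signal[: len(signal) // frame * frame].reshape(-1, frame)
    return np.abs(np.fft.rfft(frames, axis=1))  # shape (time, frequency)

def acoustic_complexity_index(signal, frame=FRAME):
    """Sum over frequency bins of frame-to-frame amplitude change,
    normalised by each bin's total amplitude."""
    spec = spectrogram(signal, frame)
    diffs = np.abs(np.diff(spec, axis=0)).sum(axis=0)  # change per bin
    totals = spec.sum(axis=0)
    return float((diffs / np.where(totals > 0, totals, 1.0)).sum())

rng = np.random.default_rng(0)
one_frame = np.sin(2 * np.pi * 10 * np.arange(FRAME) / FRAME)
steady_tone = np.tile(one_frame, 100)                    # unchanging whistle
busy = steady_tone + 0.5 * rng.standard_normal(steady_tone.size)

# A constant tone barely changes between frames, so its ACI is near zero;
# the varied soundscape scores higher.
print(acoustic_complexity_index(busy) > acoustic_complexity_index(steady_tone))
```

No sound needs to be identified to a species for this number to be computed, which is what makes indices like this cheap to run over thousands of hours of audio.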
We have used acoustic indices to show undisturbed primary forests have unique soundscapes that can be identified with machine-learning techniques. Such data in turn allows us to contrast soundscapes that have been disturbed by phenomena such as fires or logging and make out the species groups which have been the most impacted.
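The classification step can be illustrated with a toy sketch. The index values below are entirely made up, and the nearest-centroid rule stands in for the full machine-learning models used in practice:

```python
import numpy as np

rng = np.random.default_rng(42)
# Pretend each site is summarised by three acoustic indices; in this
# made-up example, undisturbed forest scores higher on all of them.
undisturbed = rng.normal(loc=[0.8, 0.7, 0.9], scale=0.05, size=(20, 3))
logged      = rng.normal(loc=[0.4, 0.5, 0.3], scale=0.05, size=(20, 3))

# One centroid (mean index vector) per forest class.
centroids = np.stack([undisturbed.mean(axis=0), logged.mean(axis=0)])

def classify(site_indices):
    """Label a site 0 (undisturbed) or 1 (logged) by nearest centroid."""
    return int(np.argmin(np.linalg.norm(centroids - site_indices, axis=1)))

new_site = np.array([0.75, 0.72, 0.85])
print(classify(new_site))  # → 0: soundscape resembles undisturbed forest
```

The point is not the particular classifier but the workflow: each site is reduced to a small vector of soundscape metrics, and a model learns which combinations of values characterise intact versus disturbed forest.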
To conclude, camera traps and acoustic recorders allow us to have eyes and ears in the forest even when our researchers are not there. As technology develops, we will continue to use the latest techniques to better understand animal behaviour and ecology, and to better value and protect the habitats these animals live in.
We are particularly looking to develop deep-learning models to identify species, and in some cases to differentiate between individuals of the same species. Images and recorded sounds from automated recorders are opening up new ways of understanding animal abundance and behaviour, providing new insights into the secret world of tropical forest fauna.
The research project “Bioclimate” of which this publication is part was supported by the BNP Paribas Foundation as part of the Climate and Biodiversity Initiative program. It is coordinated by the Rede Amazonia Sustentavel (RAS).
Oliver Metcalf received funding from ECOFOR (NE/K016431/1), AFIRE (NE/P004512/1), PELD-RAS (CNPq/CAPES/PELD 441659/2016-0) and the BNP Paribas Foundation’s Climate and Biodiversity Initiative (Project Bioclimate).
Liana Chesini Rossi received funding from PELD-RAS (CNPq/CAPES/PELD 441659/2016-0), RESFLORA (MCIC-CNPq 420254/2018-8), SEM-FLAMA (CNPq-PrevFogo-IBAMA 441949/2018-5), ECOFOR (NE/K016431/1), Instituto Nacional de Ciência e Tecnologia – Biodiversidade e Uso da Terra na Amazônia (CNPq 574008/2008-0) and Embrapa (SEG: 02.08.06.005.00).
This content was originally published by The Conversation.