AIhub · 17 July, 18:04
A behaviour monitoring dataset of wild mammals in the Swiss Alps

To better understand how wild animals behave in their natural habitats, particularly under the pressures of climate change and human activity, scientists at EPFL have partnered with the Swiss National Park to create the MammAlps dataset. It is the first multi-view, multimodal annotated dataset of wildlife behavior, designed to train AI models to recognize species and behaviors and thereby support conservation work in a faster, cheaper, and smarter way. MammAlps captured more than 43 hours of raw footage from nine camera traps and annotated it in detail, covering both high-level activities (such as foraging) and fine-grained actions (such as grooming), while combining audio and environmental information to give AI models richer context for interpreting animal behavior.

🦌 **Why MammAlps was created**: Traditional approaches to studying wildlife behavior are either disruptive or limited in scope, while camera traps, though minimally invasive, generate vast amounts of footage that is hard to analyze. AI can help, but it needs annotated datasets to learn from, and existing ones often lack authenticity or detail. MammAlps aims to solve these problems by providing high-quality training data for AI models.

🏞️ **Composition and features**: Developed by EPFL researchers in collaboration with the Swiss National Park, the dataset contains more than 43 hours of wildlife video with meticulous annotations. It takes a multi-view, multimodal approach: alongside video, it includes audio recordings and "reference scene maps" documenting environmental factors such as water sources and bushes, cross-referenced with weather conditions and individual counts to provide fuller behavioral context.

🧠 **Hierarchical annotation and behavior understanding**: MammAlps labels behavior hierarchically, dividing it into high-level activities (such as foraging or playing) and low-level actions (such as walking, grooming, or sniffing). This structure helps AI models link fine-grained movements to broader behavioral patterns and so interpret an animal's overall behavior more accurately.

📊 **A new standard for wildlife monitoring**: MammAlps sets a new standard for wildlife monitoring, providing a full sensory snapshot of animal behavior across multiple viewpoints, sounds, and environmental contexts. It also introduces a "long-term event understanding" benchmark, letting researchers study broader ecological scenes beyond isolated behaviors, for example following a wolf stalking a deer across several camera views.

🚀 **AI's potential for conservation**: By scaling up datasets like MammAlps, AI models could identify behaviors of interest in hundreds of hours of video, greatly extending current wildlife monitoring capabilities. This would give conservationists timely, actionable insights, helping them track how climate change, human activity, or disease outbreaks affect wildlife behavior, and protect endangered species more effectively.

Two roe deer foraging, with manual annotations for each individual animal. Credit: A. Mathis (EPFL).

By Nik Papageorgiou

Have you ever wondered how wild animals behave when no one’s watching? Understanding these behaviors is vital for protecting ecosystems—especially as climate change and human expansion alter natural habitats. But collecting this kind of information without interfering has always been tricky.

Traditionally, researchers relied on direct observation or sensors strapped to animals—methods that are either disruptive or limited in scope. Camera traps offer a less invasive alternative, but they generate vast amounts of footage that’s hard to analyze.

AI could help, but there’s a catch: it needs annotated datasets to learn from. Most current video datasets are either scraped from the internet, missing the authenticity of real wild settings, or are small-scale field recordings lacking detail. And few include the kind of rich context—like multiple camera angles or audio—that’s needed to truly understand complex animal behavior.

Introducing MammAlps

To address this challenge, scientists at EPFL, in collaboration with the Swiss National Park, have collected and curated MammAlps, the first richly annotated, multi-view, multimodal wildlife behavior dataset. MammAlps is designed to train AI models for species and behavior recognition tasks, and ultimately to help researchers understand animal behavior better. This work could make conservation efforts faster, cheaper, and smarter.

MammAlps was developed by Valentin Gabeff, a PhD student at EPFL under the supervision of Professors Alexander Mathis and Devis Tuia, together with their respective research teams.

How MammAlps was developed

The researchers set up nine camera traps that recorded more than 43 hours of raw footage over the course of several weeks. The team then meticulously processed this footage, using AI tools to detect and track individual animals, distilling it into 8.5 hours of material showing wildlife interactions.

They labeled behaviors using a hierarchical approach, categorizing each moment at two levels: high-level activities like foraging or playing, and finer actions like walking, grooming, or sniffing. This structure allows AI models to interpret behaviors more accurately by linking detailed movements to broader behavioral patterns.
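The two-level labeling scheme described above can be sketched in a few lines of Python. This is an illustrative mock-up only: the field names, time encoding, and label vocabulary are assumptions for the example, not the actual MammAlps annotation format.

```python
from dataclasses import dataclass

# Hypothetical two-level behavior label: a high-level activity
# paired with a fine-grained action over a time span.
@dataclass
class BehaviorLabel:
    start_s: float   # clip-relative start time (seconds)
    end_s: float     # clip-relative end time (seconds)
    activity: str    # high-level activity, e.g. "foraging"
    action: str      # fine-grained action, e.g. "sniffing"

labels = [
    BehaviorLabel(0.0, 4.2, "foraging", "sniffing"),
    BehaviorLabel(4.2, 9.8, "foraging", "walking"),
]

# Grouping fine actions under their activity is what lets a model
# link detailed movements to broader behavioral patterns.
by_activity = {}
for lb in labels:
    by_activity.setdefault(lb.activity, []).append(lb.action)

print(by_activity)  # {'foraging': ['sniffing', 'walking']}
```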

To provide AI models with richer context, the team supplemented video with audio recordings and captured "reference scene maps" that documented environmental factors like water sources, bushes, and rocks. This additional data enables better interpretation of habitat-specific behaviors. They also cross-referenced weather conditions and counts of individuals per event to create more complete scene descriptions.
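A per-event scene description of this kind might look like the record below. The field names and values are illustrative assumptions for the sketch, not the dataset's actual file format; the point is that each event bundles views, audio, scene-map factors, weather, and individual counts into one queryable record.

```python
# Hypothetical multimodal scene records for two camera-trap events.
events = [
    {"species": "red deer", "n_individuals": 2, "weather": "overcast",
     "scene": {"water_source": True, "bushes": True, "rocks": False},
     "views": ["cam_03", "cam_07"], "audio": True},
    {"species": "red fox", "n_individuals": 1, "weather": "clear",
     "scene": {"water_source": False, "bushes": True, "rocks": True},
     "views": ["cam_01"], "audio": True},
]

# Cross-referencing context becomes a simple query, e.g. which
# species were recorded near a water source:
near_water = [e["species"] for e in events if e["scene"]["water_source"]]
print(near_water)  # ['red deer']
```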

“By incorporating other modalities alongside video, we’ve shown that AI models can better identify animal behavior,” explains Alexander Mathis. “This multi-modal approach gives us a more complete picture of wildlife behavior.”

A new standard for wildlife monitoring

MammAlps brings a new standard to wildlife monitoring: a full sensory snapshot of animal behavior across multiple angles, sounds, and contexts. It also introduces a “long-term event understanding” benchmark, meaning scientists can now study not just isolated behaviors from short clips, but broader ecological scenes over time—like a wolf stalking a deer across several camera views.

Research is still ongoing. The team is currently processing data collected in 2024 and will carry out more fieldwork in 2025. These additional surveys are needed to expand the set of recordings for rare species such as alpine hares and lynx, and will also help develop methods for analyzing wildlife behavior over multiple seasons.

Building more datasets like MammAlps could radically scale up current wildlife monitoring efforts by enabling AI models to identify behaviors of interest from hundreds of hours of video. This would provide wildlife conservationists with timely, actionable insights. Over time, this could make it easier to track how climate change, human encroachment, or disease outbreaks impact wildlife behavior, and help protect vulnerable species.

For more information about MammAlps and access to the dataset, visit the project webpage.

Read the work in full

MammAlps: A multi-view video behavior monitoring dataset of wild mammals in the Swiss Alps, Valentin Gabeff, Haozhe Qi, Brendan Flaherty, Gencer Sumbül, Alexander Mathis, Devis Tuia.
