Unknown data source · 2024-10-02
Detect population variance of endangered species using Amazon Rekognition

 

The article notes that Earth faces a global species extinction crisis, with many species at risk of disappearing. It describes the efforts of conservation organizations and workers, and presents a solution that uses Amazon Rekognition Custom Labels and related technologies to identify and study endangered species, covering the architecture, steps, prerequisites, and implementation.

🌍 Earth faces a global extinction crisis: more than a million species may be headed toward extinction. Habitat loss, poaching, and invasive species are common causes, and many conservation organizations and individuals are working to respond.

💻 Amazon Rekognition Custom Labels, combined with motion-sensor camera traps, enables automated identification and study of endangered species; the solution uses several AI services to build a scalable, cost-effective architecture.

📋 The solution's steps include training a custom model to recognize endangered species, uploading camera-trap images to an S3 bucket, triggering a Lambda function to run detection, updating a DynamoDB database, and visualizing the data with QuickSight.

📚 Building an effective model requires a good training set; the article uses datasets from AWS Marketplace and Kaggle, and details the implementation workflow and the contents of the datasets.

<section class="blog-post-content"><p>Our planet faces a global extinction crisis. A <a href="https://www.smithsonianmag.com/science-nature/one-million-species-risk-extinction-threatening-human-communities-around-world-un-report-warns-180972114/" target="_blank" rel="noopener noreferrer">UN report</a> warns that a staggering number of species, more than a million, are feared to be on the path to extinction. The most common causes of extinction include loss of habitat, poaching, and invasive species. Several <a href="https://www.treehugger.com/top-wildlife-conservation-organizations-4088567" target="_blank" rel="noopener noreferrer">wildlife conservation foundations</a>, research scientists, volunteers, and <a href="https://www.iapf.org/" target="_blank" rel="noopener noreferrer">anti-poaching rangers</a> have been working tirelessly to address this crisis. Having accurate and regular information about endangered animals in the wild will improve wildlife conservationists’ ability to study and conserve endangered species. Wildlife scientists and field staff use cameras equipped with infrared triggers, called <a href="https://www.wwf.org.uk/project/conservationtechnology/camera-trap#:~:text=The%20importance%20of%20camera%20traps,and%20other%20forms%20of%20wildlife" target="_blank" rel="noopener noreferrer">camera traps</a>, and place them in the most effective locations in forests to capture images of wildlife. These images are then reviewed manually, which is a very time-consuming process.</p><p>In this post, we demonstrate a solution using <a href="https://aws.amazon.com/rekognition/custom-labels-features/" target="_blank" rel="noopener noreferrer">Amazon Rekognition Custom Labels</a> along with motion sensor camera traps to automate this process to recognize endangered species and study them.
Rekognition Custom Labels is a fully managed computer vision service that allows developers to build custom models to classify and identify objects in images that are specific and unique to their use case. We detail how to recognize endangered animal species from images collected from camera traps, draw insights about their population count, and detect humans around them. This information will be helpful to conservationists, who can make proactive decisions to save them.</p><h2>Solution overview</h2><p>The following diagram illustrates the architecture of the solution.<a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/12/image001-2.png"><img class="wp-image-42368 size-full aligncenter" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image001-2-e1663023193495.png" alt="Solution overview" width="661" height="341" /></a>This solution uses the following AI services, serverless technologies, and managed services to implement a scalable and cost-effective architecture:</p><ul><li><a href="https://aws.amazon.com/athena" target="_blank" rel="noopener noreferrer">Amazon Athena</a> – A serverless interactive query service that makes it easy to analyze data in Amazon S3 using standard SQL</li><li><a href="https://aws.amazon.com/cloudwatch/" target="_blank" rel="noopener noreferrer">Amazon CloudWatch</a> – A monitoring and observability service that collects monitoring and operational data in the form of logs, metrics, and events</li><li><a href="https://aws.amazon.com/dynamodb/" target="_blank" rel="noopener noreferrer">Amazon DynamoDB</a> – A key-value and document database that delivers single-digit millisecond performance at any scale</li><li><a href="http://aws.amazon.com/lambda" target="_blank" rel="noopener noreferrer">AWS Lambda</a> – A serverless compute service that lets you run code in response to triggers such as changes in data, shifts in system state, or user actions</li><li><a href="https://aws.amazon.com/quicksight/" target="_blank" rel="noopener noreferrer">Amazon QuickSight</a> – A serverless, machine learning (ML)-powered business intelligence service that provides insights, interactive dashboards, and rich analytics</li><li><a href="https://aws.amazon.com/rekognition/" target="_blank" rel="noopener noreferrer">Amazon Rekognition</a> – Uses ML to identify objects, people, text, scenes, and activities in images and videos, as well as detect any inappropriate content</li><li><a href="https://aws.amazon.com/rekognition/custom-labels-features/" target="_blank" rel="noopener noreferrer">Amazon Rekognition Custom Labels</a> – Uses AutoML to help train custom models to identify the objects and scenes in images that are specific to your business needs</li><li><a href="https://aws.amazon.com/sqs/" target="_blank" rel="noopener noreferrer">Amazon Simple Queue Service (Amazon SQS)</a> – A fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications</li><li><a href="http://aws.amazon.com/s3" target="_blank" rel="noopener noreferrer">Amazon Simple Storage Service (Amazon S3)</a> – Serves as an object store for images and allows for central management with fine-tuned access controls</li></ul><p>The high-level steps in this solution are as follows:</p><ol><li>Train and build a custom model using Rekognition Custom Labels to recognize endangered species in the area.
For this post, we train on images of rhinoceros.</li><li>Images that are captured through the motion sensor camera traps are uploaded to an S3 bucket, which publishes an event for every uploaded image.</li><li>A Lambda function is triggered for every event published, which retrieves the image from the S3 bucket and passes it to the custom model to detect the endangered animal.</li><li>The Lambda function uses the Amazon Rekognition API to identify the animals in the image.</li><li>If the image contains any endangered species of rhinoceros, the function updates the DynamoDB database with the count of the animal, the date the image was captured, and other useful metadata that can be extracted from the image’s <a href="https://exiftool.org/TagNames/EXIF.html" target="_blank" rel="noopener noreferrer">EXIF</a> header.</li><li>QuickSight is used to visualize the animal count and location data collected in the DynamoDB database to understand the variance of the animal population over time. By looking at the dashboards regularly, conservation groups can identify patterns and isolate probable causes such as disease, climate, or poaching that could be driving this variance, and proactively take steps to address the issue.</li></ol><h2>Prerequisites</h2><p>A good training set is required to build an effective model using Rekognition Custom Labels.
We have used the images from AWS Marketplace (<a href="https://aws.amazon.com/marketplace/pp/prodview-3upmmq3orzcjk#offers" target="_blank" rel="noopener noreferrer">Animals &amp; Wildlife Data Set from Shutterstock</a>) and <a href="https://www.kaggle.com/datasets/brsdincer/danger-of-extinction-animal-image-set" target="_blank" rel="noopener noreferrer">Kaggle</a> to build the model.</p><h2>Implement the solution</h2><p>Our workflow includes the following steps:</p><ol><li>Train a custom model to classify the endangered species (rhino in our example) using the AutoML capability of Rekognition Custom Labels.</li></ol><p>You can also perform these steps from the Rekognition Custom Labels console. For instructions, refer to <a href="https://docs.aws.amazon.com/rekognition/latest/customlabels-dg/mp-create-project.html" target="_blank" rel="noopener noreferrer">Creating a project</a>, <a href="https://docs.aws.amazon.com/rekognition/latest/customlabels-dg/creating-datasets.html" target="_blank" rel="noopener noreferrer">Creating training and test datasets</a>, and <a href="https://docs.aws.amazon.com/rekognition/latest/customlabels-dg/training-model.html" target="_blank" rel="noopener noreferrer">Training an Amazon Rekognition Custom Labels model</a>.</p><p>In this example, we use the dataset from Kaggle.
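</p><p>As an aside (our observation, not a claim from the post), the per-label counts in the table that follows sit close to an 80/20 train/test split, the same ratio Rekognition Custom Labels uses when it splits a single dataset for you. A quick arithmetic check:</p>

```javascript
// Verify the approximate 80/20 train/test split of the dataset table.
function trainFraction(train, test) {
  return train / (train + test);
}

// Counts taken from the dataset table: [label, training images, test images].
const labels = [
  ["Lion", 625, 156],
  ["Rhino", 608, 152],
  ["African_Elephant", 368, 92],
];
for (const [name, train, test] of labels) {
  console.log(name, trainFraction(train, test).toFixed(3)); // all print 0.800
}
```

<p>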
The following table summarizes the dataset contents.</p><table border="1px" cellpadding="10px"><tbody><tr class="c4"><td><strong>Label</strong></td><td><strong>Training Set</strong></td><td><strong>Test Set</strong></td></tr><tr><td>Lion</td><td>625</td><td>156</td></tr><tr><td>Rhino</td><td>608</td><td>152</td></tr><tr><td>African_Elephant</td><td>368</td><td>92</td></tr></tbody></table><ol start="2"><li>Upload the pictures captured from the camera traps to a designated S3 bucket.</li><li>Define the event notifications in the <strong>Permissions</strong> section of the S3 bucket to send a notification to a defined SQS queue when an object is added to the bucket.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/12/image002.jpg"><img class="alignnone wp-image-42369 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image002.jpg" alt="Define event notification" width="983" height="217" /></a></p><p>The upload action triggers an event that is queued in Amazon SQS using the Amazon S3 event notification.</p><ol start="4"><li>Add the appropriate permissions via the access policy of the SQS queue to allow the S3 bucket to send the notification to the queue.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/12/image003-2.jpg"><img class="alignnone wp-image-42403 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image003-2.jpg" alt="ML-9942-event-not" width="978" height="298" /></a></p><ol start="5"><li>Configure a Lambda trigger for the SQS queue so the Lambda function is invoked when a new message is received.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/12/image004-2.jpg"><img class="alignnone wp-image-42387 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image004-2.jpg" alt="Lambda trigger" width="979" height="237" /></a></p><ol start="6"><li>Modify the access policy to allow the Lambda function to access the SQS queue.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image005-1.jpg"><img class="alignnone size-full wp-image-42372 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image005-1.jpg" alt="Lambda function access policy" width="989" height="171" /></a></p><p>The Lambda function should now have the right permissions to access the SQS queue.</p><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image006-4.jpg"><img class="alignnone wp-image-42373 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image006-4.jpg" alt="Lambda function permissions" width="990" height="379" /></a></p><ol start="7"><li>Set up the environment variables so they can be accessed in the code.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image007-1.jpg"><img class="alignnone wp-image-42374 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image007-1.jpg" alt="Environment variables" width="988" height="251" /></a></p><h2>Lambda function code</h2><p>The Lambda function performs the following tasks on receiving a notification from the SQS queue:</p><ol start="1"><li>Make an API call to Amazon Rekognition to detect labels from the custom model that identify the endangered species.</li></ol><ol start="2"><li>Fetch the EXIF tags from the image to get the date the picture was taken and other relevant EXIF data.
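A minimal hypothetical sketch of this date extraction (not the post’s original listing), assuming exif-reader v1’s parsed output shape, where <code>DateTimeOriginal</code> is already a JavaScript <code>Date</code>; the mock object below stands in for tags read from a real image:

```javascript
// Hypothetical sketch: extract the capture date from parsed EXIF tags.
// The `tags` shape assumes exif-reader v1 output ({ image, exif, gps, ... }).
function captureDate(tags) {
  const dt =
    (tags.exif && tags.exif.DateTimeOriginal) ||
    (tags.image && tags.image.ModifyDate);
  return dt ? dt.toISOString().slice(0, 10) : null; // YYYY-MM-DD for DynamoDB
}

// In the Lambda function, the tags would come from the real libraries, e.g.:
//   const meta = await sharp(imageBuffer).metadata();
//   const tags = exifReader(meta.exif);
const mockTags = { exif: { DateTimeOriginal: new Date(Date.UTC(2022, 8, 13, 6, 30)) } };
console.log(captureDate(mockTags)); // "2022-09-13"
```
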
This step uses the dependencies (package – version) exif-reader – ^1.0.3 and sharp – ^0.30.7.</li></ol><p>The solution outlined here is asynchronous; the images are captured by the camera traps and then uploaded at a later time to an S3 bucket for processing. If the camera trap images are uploaded more frequently, you can extend the solution to detect humans in the monitored area and send notifications to concerned activists to indicate possible poaching in the vicinity of these endangered animals. This is implemented through the Lambda function that calls the Amazon Rekognition API to detect labels for the presence of a human. If a human is detected, an error message is logged to CloudWatch Logs. A filtered metric on the error log triggers a CloudWatch alarm that sends an email to the conservation activists, who can then take further action.</p><ol start="3"><li>Expand the solution with this human-detection check.</li></ol><ol start="4"><li>If any endangered species is detected, the Lambda function updates DynamoDB with the count, the date, and other optional metadata that is obtained from the image EXIF tags.</li></ol><h2>Query and visualize the data</h2><p>You can now use Athena and QuickSight to visualize the data.</p><ol start="1"><li>Set the DynamoDB table as the data source for Athena.<a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image008-1.jpg"><img class="alignnone wp-image-42375 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image008-1.jpg" alt="DynamoDB data source" width="943" height="384" /></a></li></ol><ol start="2"><li>Add the data source details.</li></ol><p>The next important step is to define a Lambda function that connects to the data source.</p><ol start="3"><li>Choose <strong>Create Lambda function</strong>.</li></ol><p><a
href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image009-1.png"><img class="alignnone wp-image-42376 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image009-1.png" alt="Lambda function" width="996" height="190" /></a></p><ol start="4"><li>Enter names for <strong>AthenaCatalogName</strong> and <strong>SpillBucket</strong>; the rest can be default settings.</li><li>Deploy the connector function.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image010-1.png"><img class="alignnone size-full wp-image-42377 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image010-1.png" alt="Lambda connector" width="714" height="668" /></a></p><p>After all the images are processed, you can use QuickSight to visualize the data for the population variance over time from Athena.</p><ol start="6"><li>On the Athena console, choose a data source and enter the details.</li><li>Choose <strong>Create Lambda function</strong> to provide a connector to DynamoDB.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image011-1.jpg"><img class="alignnone wp-image-42378 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image011-1.jpg" alt="Create Lambda function" width="1013" height="630" /></a></p><ol start="8"><li>On the QuickSight dashboard, choose <strong>New Analysis</strong> and <strong>New Dataset</strong>.</li><li>Choose Athena as the data source.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image012.jpg"><img class="alignnone wp-image-42379 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image012.jpg" alt="Athena as data source" width="1014" height="607" /></a></p><ol start="10"><li>Enter the catalog, database, and table to connect to and choose <strong>Select</strong>.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image013-1.png"><img class="alignnone wp-image-42380 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image013-1.png" alt="Catalog" width="1013" height="386" /></a></p><ol start="11"><li>Complete dataset creation.</li></ol><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image013-1.png"><img class="alignnone wp-image-42380" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image013-1.png" alt="Catalog" width="1018" height="388" /></a></p><p>The following chart shows the number of endangered species captured on a given day.</p><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image015-1.png"><img class="alignnone size-full wp-image-42382 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image015-1.png" alt="QuickSight chart" width="1026" height="528" /></a></p><p>GPS data is presented as part of the EXIF tags of a captured image. Due to the sensitivity of the location of these endangered animals, our dataset didn’t have the GPS location.
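</p><p>When GPS EXIF tags are present, the rational degree/minute/second values can be converted to the decimal latitude and longitude a QuickSight geospatial visual expects. A minimal hypothetical sketch with simulated coordinates (no real animal locations):</p>

```javascript
// Hypothetical sketch: convert EXIF GPS degree/minute/second arrays to
// decimal coordinates. The [d, m, s] + ref shape mirrors the EXIF
// GPSLatitude/GPSLatitudeRef (and longitude) tag pairs.
function toDecimal([deg, min, sec], ref) {
  const value = deg + min / 60 + sec / 3600;
  return ref === "S" || ref === "W" ? -value : value; // south/west are negative
}

// Simulated values only:
const lat = toDecimal([1, 17, 24], "S");  // -1.29
const lon = toDecimal([36, 49, 12], "E"); //  36.82
console.log(lat, lon);
```

<p>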
However, we created a geospatial chart using simulated data to show how you can visualize locations when GPS data is available.</p><p><a href="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image016-1.png"><img class="alignnone wp-image-42383 c5" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/image016-1.png" alt="Geospatial chart" width="1010" height="718" /></a></p><h2>Clean up</h2><p>To avoid incurring unexpected costs, be sure to turn off the AWS services you used as part of this demonstration: the S3 buckets, DynamoDB table, QuickSight, Athena, and the trained Rekognition Custom Labels model. You should delete these resources directly via their respective service consoles if you no longer need them. Refer to <a href="https://docs.aws.amazon.com/rekognition/latest/customlabels-dg/tm-delete-model.html" target="_blank" rel="noopener noreferrer">Deleting an Amazon Rekognition Custom Labels model</a> for more information about deleting the model.</p><h2>Conclusion</h2><p>In this post, we presented an automated system that identifies endangered species, records their population count, and provides insights about variance in population over time. You can also extend the solution to alert the authorities when humans (possible poachers) are in the vicinity of these endangered species. With the AI/ML capabilities of Amazon Rekognition, we can support the efforts of conservation groups to protect endangered species and their ecosystems.</p><p>For more information about Rekognition Custom Labels, refer to <a href="https://docs.aws.amazon.com/rekognition/latest/customlabels-dg/getting-started.html" target="_blank" rel="noopener noreferrer">Getting started with Amazon Rekognition Custom Labels</a> and <a href="https://docs.aws.amazon.com/rekognition/latest/dg/moderation.html" target="_blank" rel="noopener noreferrer">Moderating content</a>.
If you’re new to Rekognition Custom Labels, you can use our Free Tier, which lasts 3 months and includes 10 free training hours per month and 4 free inference hours per month. The Amazon Rekognition Free Tier includes processing 5,000 images per month for 12 months.</p><h3>About the Authors</h3><p class="c6"><img class="alignleft wp-image-34779 size-full" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/IMG_8242-exif-copy.jpg" alt="author-jyothi" width="100" height="116" /><strong>Jyothi Goudar</strong> is Partner Solutions Architect Manager at AWS. She works closely with global system integrator partners to enable and support customers moving their workloads to AWS.</p><p class="c6"><img class="alignleft wp-image-34779 size-full" src="https://d2908q01vomqb2.cloudfront.net/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59/2022/09/13/jayrao-1.png" alt="" width="100" height="116" /><strong>Jay Rao</strong> is a Principal Solutions Architect at AWS. He enjoys providing technical and strategic guidance to customers and helping them design and implement solutions on AWS.</p></section>
