AWS Machine Learning Blog 2024年10月16日
Accelerate migration portfolio assessment using Amazon Bedrock

 


Conducting assessments on application portfolios that need to be migrated to the cloud can be a lengthy endeavor. Despite the existence of AWS Application Discovery Service or the presence of some form of configuration management database (CMDB), customers still face many challenges. These include time taken for follow-up discussions with application teams to review outputs and understand dependencies (approximately 2 hours per application), cycles needed to generate a cloud architecture design that meets security and compliance requirements, and the effort needed to provide cost estimates by selecting the right AWS services and configurations for optimal application performance in the cloud. Typically, it takes 6–8 weeks to carry out these tasks before actual application migrations begin.

In this blog post, we will harness the power of generative AI and Amazon Bedrock to help organizations simplify, accelerate, and scale migration assessments. Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. By using Amazon Bedrock Agents, action groups, and Amazon Bedrock Knowledge Bases, we demonstrate how to build a migration assistant application that rapidly generates migration plans, R-dispositions, and cost estimates for applications migrating to AWS. This approach enables you to scale your application portfolio discovery and significantly accelerate your planning phase.

General requirements for a migration assistant

The following are some key requirements that you should consider when building a migration assistant.

Accuracy and consistency

Is your migration assistant application able to render accurate and consistent responses?

Guidance: To ensure accurate and consistent responses from your migration assistant, implement Amazon Bedrock Knowledge Bases. The knowledge base should contain contextual information based on your company's private data sources. This enables the migration assistant to use Retrieval-Augmented Generation (RAG), which enhances the accuracy and consistency of responses. Your knowledge base should comprise multiple data sources, including:

    Responses to application discovery questionnaires
    Output from your configuration management database (CMDB) or AWS Application Discovery Agent data
    Migration best practices and whitepapers (for example, the Migration Lens of the AWS Well-Architected Framework and container migration methodologies)
    Any organization-specific guidelines, migration patterns, or application patterns
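
At runtime, the retrieval step can be sketched with the `bedrock-agent-runtime` `retrieve_and_generate` API. The helper below only builds the request payload; the knowledge base ID and model ARN shown are placeholders, and the actual AWS call is left commented out since it requires credentials.

```python
import json

def build_rag_request(kb_id: str, model_arn: str, query: str, top_k: int = 5) -> dict:
    """Build a retrieve-and-generate request payload for Amazon Bedrock.

    kb_id and model_arn are placeholders you would replace with your own
    knowledge base ID and foundation model ARN.
    """
    return {
        "input": {"text": query},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
                "retrievalConfiguration": {
                    # Number of knowledge-base chunks retrieved per query
                    "vectorSearchConfiguration": {"numberOfResults": top_k}
                },
            },
        },
    }

# The actual call would look like this (requires boto3 and AWS credentials):
# import boto3
# client = boto3.client("bedrock-agent-runtime")
# response = client.retrieve_and_generate(**build_rag_request(
#     "KB12345", "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
#     "What is the R-disposition for application A1-CRM?"))
# print(response["output"]["text"])

if __name__ == "__main__":
    print(json.dumps(build_rag_request("KB12345", "arn:aws:bedrock:::model", "test query"), indent=2))
```

Keeping the payload construction in a separate function makes it straightforward to unit test the retrieval configuration without touching AWS.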

Handle hallucinations

How are you reducing the hallucinations from the large language model (LLM) for your migration assistant application?

Guidance: Reducing hallucinations in LLMs involves implementation of several key strategies. Implement customized prompts based on your requirements and incorporate advanced prompting techniques to guide the model’s reasoning and provide examples for more accurate responses. These techniques include chain-of-thought prompting, zero-shot prompting, multishot prompting, few-shot prompting, and model-specific prompt engineering guidelines (see Anthropic Claude on Amazon Bedrock prompt engineering guidelines). RAG combines information retrieval with generative capabilities to enhance contextual relevance and reduce hallucinations. Finally, a feedback loop or human-in-the-loop when fine-tuning LLMs on specific datasets will help align the responses with accurate and relevant information, mitigating errors and outdated content.
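
The few-shot and chain-of-thought techniques above can be sketched as a simple prompt builder. The instruction text and the example Q&A pair below are illustrative assumptions, not taken from the post.

```python
def build_few_shot_prompt(instruction: str, examples: list[tuple[str, str]], query: str) -> str:
    """Assemble a few-shot prompt: instruction, worked examples, then the new query.

    Each example is a (question, answer) pair demonstrating the expected
    reasoning and output format, which helps steer the model and reduce
    hallucinations.
    """
    parts = [instruction, ""]
    for question, answer in examples:
        parts += [f"Question: {question}", f"Answer: {answer}", ""]
    # Closing cue that elicits chain-of-thought reasoning
    parts += [f"Question: {query}", "Answer: Let's think step by step."]
    return "\n".join(parts)

# Illustrative example pair (hypothetical application data):
examples = [
    ("What is the R-disposition for a stateless web tier?",
     "Rehost on Amazon EC2; it has no refactoring blockers."),
]
prompt = build_few_shot_prompt(
    "You are a migration assistant. Answer using only the retrieved context.",
    examples,
    "What is the R-disposition for application A1-CRM?",
)
print(prompt)
```

In practice the examples would come from your vetted migration data, and the grounding instruction ("answer using only the retrieved context") works together with RAG to keep responses anchored to your knowledge base.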

Modular design

Is the design of your migration assistant modular?

Guidance: Building a migration assistant application using Amazon Bedrock action groups, which have a modular design, offers three key benefits:

    Customization and adaptability: Action groups let users tailor the migration workflow to their specific AWS environment and requirements. For example, when migrating a web application, a user can include actions for web server setup, database migration, and network configuration, so the process aligns with the unique needs of the application being migrated.
    Maintenance and troubleshooting: Isolating issues to individual components simplifies maintenance and troubleshooting. If the database migration action in a workflow encounters a problem, it can be addressed independently without affecting the other components, minimizing the impact on the overall migration and speeding up resolution.
    Scalability and reusability: Modular action groups can be reused across AWS migration projects with similar requirements, saving time and ensuring consistency. They also let users scale a workflow up or down to match workload needs, for example by adding more instances of the relevant action groups to migrate a larger application, without redesigning the workflow from scratch.

Overview of solution

Before we dive deep into the deployment, let’s walk through the key steps of the architecture that will be established, as shown in Figure 1.

    Users interact with the migration assistant through the Amazon Bedrock chat console to input their requests. For example, a user might request Generate R-disposition with cost estimates or Generate Migration plan for specific application IDs (for example, A1-CRM or A2-CMDB). The migration assistant, which uses Amazon Bedrock agents, is configured with instructions, action groups, and knowledge bases. When processing the user's request, the migration assistant invokes relevant action groups such as R Dispositions and Migration Plan, which in turn invoke specific AWS Lambda functions. The Lambda functions process the request using RAG to produce the required output. The resulting output documents (R-Dispositions with cost estimates and the Migration Plan) are then uploaded to a designated Amazon Simple Storage Service (Amazon S3) bucket.
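
The action-group-to-Lambda flow described above can be sketched as a handler in the request/response shape Bedrock agents use when invoking Lambda. The `generate_migration_plan` helper and the `applicationId` parameter name are hypothetical stand-ins for the RAG-backed logic the post describes.

```python
import json

def generate_migration_plan(app_id: str) -> dict:
    # Hypothetical stand-in: the real application would query the knowledge
    # base and an LLM here, then persist the document to Amazon S3.
    return {"applicationId": app_id, "plan": "rehost", "status": "draft"}

def lambda_handler(event, context):
    """Handle an Amazon Bedrock agent action-group invocation.

    The agent passes the action group name, API path, HTTP method, and
    parameters; the handler must echo them back in a structured response.
    """
    params = {p["name"]: p["value"] for p in event.get("parameters", [])}
    app_id = params.get("applicationId", "unknown")

    body = generate_migration_plan(app_id)

    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event["actionGroup"],
            "apiPath": event["apiPath"],
            "httpMethod": event["httpMethod"],
            "httpStatusCode": 200,
            "responseBody": {
                "application/json": {"body": json.dumps(body)}
            },
        },
    }

# Sample invocation with a mock Bedrock agent event:
event = {
    "actionGroup": "MigrationPlan",
    "apiPath": "/migration-plan",
    "httpMethod": "POST",
    "parameters": [{"name": "applicationId", "value": "A1-CRM"}],
}
print(lambda_handler(event, None)["response"]["httpStatusCode"])
```

Because each action group maps to its own handler, a failing step (for example, the migration-plan generator) can be debugged and redeployed independently, which is the modularity benefit discussed earlier.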

The following image is a screenshot of a sample user interaction with the migration assistant.

Prerequisites

You should have the following:

Deployment steps

    Configure a knowledge base:
      Open the AWS Management Console for Amazon Bedrock and navigate to Amazon Bedrock Knowledge Bases.
      Choose Create knowledge base and enter a name and optional description.
      Select the vector database (for example, Amazon OpenSearch Serverless).
      Select the embedding model (for example, Amazon Titan Embedding G1 – Text).
      Add data sources:
        For Amazon S3: Specify the S3 bucket and prefix, file types, and chunking configuration.
        For custom data: Use the API to ingest data programmatically.
      Review and create the knowledge base.
    Set up Amazon Bedrock Agents:
      In the Amazon Bedrock console, go to the Agents section and choose Create agent.
      Enter a name and optional description for the agent.
      Select the foundation model (for example, Anthropic Claude V3).
      Configure the agent's AWS Identity and Access Management (IAM) role to grant the necessary permissions.
      Add instructions to guide the agent's behavior.
      Optionally, add the previously created Amazon Bedrock knowledge base to enhance the agent's responses.
      Configure additional settings such as maximum tokens and temperature.
      Review and create the agent.
    Configure actions groups for the agent:
      On the agent's configuration page, navigate to the Action groups section.
      Choose Add action group for each required group (for example, Create R-disposition Assessment and Create Migration Plan).
      For each action group, specify the Lambda function to invoke and define the action schema.
      After adding all action groups, review the entire agent configuration and deploy the agent.
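
The S3 data source and chunking configuration from the knowledge base setup above can also be created programmatically through the `bedrock-agent` API. Below is a minimal sketch of the request payload, assuming a fixed-size chunking strategy; the knowledge base ID, data source name, and bucket ARN are placeholders.

```python
def build_s3_data_source(kb_id: str, bucket_arn: str, max_tokens: int = 300,
                         overlap_percentage: int = 20) -> dict:
    """Build a create_data_source payload for an S3-backed knowledge base.

    kb_id and bucket_arn are placeholders; fixed-size chunking with token
    overlap is one of the chunking strategies Bedrock supports.
    """
    return {
        "knowledgeBaseId": kb_id,
        "name": "migration-docs",  # hypothetical data source name
        "dataSourceConfiguration": {
            "type": "S3",
            "s3Configuration": {"bucketArn": bucket_arn},
        },
        "vectorIngestionConfiguration": {
            "chunkingConfiguration": {
                "chunkingStrategy": "FIXED_SIZE",
                "fixedSizeChunkingConfiguration": {
                    "maxTokens": max_tokens,
                    "overlapPercentage": overlap_percentage,
                },
            }
        },
    }

# The actual call would be (requires boto3 and AWS credentials):
# import boto3
# boto3.client("bedrock-agent").create_data_source(
#     **build_s3_data_source("KB12345", "arn:aws:s3:::my-migration-bucket"))
payload = build_s3_data_source("KB12345", "arn:aws:s3:::my-migration-bucket")
print(payload["vectorIngestionConfiguration"]["chunkingConfiguration"]["chunkingStrategy"])
```

Smaller chunks with some overlap tend to give tighter retrieval matches for questionnaire-style documents, but the right values depend on your data, so treat the defaults here as a starting point.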
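
Each action group exposes its operations to the agent through an API schema. The following is an illustrative sketch, written as a Python dict, of a minimal OpenAPI schema a group like Create Migration Plan might use; the path, operation, and parameter names are assumptions, not taken from the post.

```python
import json

# Minimal OpenAPI schema for a hypothetical migration-plan action group.
# Bedrock uses the operation descriptions to decide when to invoke the action.
migration_plan_schema = {
    "openapi": "3.0.0",
    "info": {"title": "Migration Plan API", "version": "1.0.0"},
    "paths": {
        "/migration-plan": {
            "post": {
                "operationId": "createMigrationPlan",
                "description": "Generate a migration plan for an application.",
                "parameters": [{
                    "name": "applicationId",
                    "in": "query",
                    "required": True,
                    "schema": {"type": "string"},
                }],
                "responses": {
                    "200": {"description": "Generated migration plan"}
                },
            }
        }
    },
}

print(json.dumps(migration_plan_schema, indent=2))
```

Clear `description` fields matter here: the agent relies on them to route a request like "Generate Migration plan for A1-CRM" to the right action group and to extract the `applicationId` parameter from the conversation.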

Clean up

To avoid unnecessary charges, delete the resources created during testing. Use the following steps to clean up the resources:

    Delete the Amazon Bedrock knowledge base: Open the Amazon Bedrock console.
    Delete the knowledge base from any agents that it’s associated with.

      From the left navigation pane, choose Agents.
      Select the Name of the agent that you want to delete the knowledge base from. A red banner appears to warn you to delete the reference to the knowledge base, which no longer exists, from the agent.
      Select the radio button next to the knowledge base that you want to remove. Choose More, and then choose Delete.
      From the left navigation pane, choose Knowledge base.
      To delete a source, either choose the radio button next to the source and select Delete, or select the Name of the source and then choose Delete in the top right corner of the details page.
      Review the warnings for deleting a knowledge base. If you accept these conditions, enter delete in the input box and choose Delete to confirm.
    Delete the agent:
      In the Amazon Bedrock console, choose Agents from the left navigation pane.
      Select the radio button next to the agent to delete. A modal appears warning you about the consequences of deletion.
      Enter delete in the input box and choose Delete to confirm.
      A blue banner appears to inform you that the agent is being deleted. When deletion is complete, a green success banner appears.
    Delete all the other resources including the Lambda functions and any AWS services used for account customization.

Conclusion

Conducting assessments on application portfolios for AWS cloud migration can be a time-consuming process. It involves analyzing data from various sources, holding discovery and design discussions to develop an AWS Cloud architecture design, and producing cost estimates.

In this blog post, we demonstrated how you can simplify, accelerate, and scale migration assessments by using generative AI and Amazon Bedrock. We showcased using Amazon Bedrock Agents, action groups, and Amazon Bedrock Knowledge Bases for a migration assistant application that renders migration plans, R-dispositions, and cost estimates. This approach significantly reduces the time and effort required for portfolio assessments, helping organizations to scale and expedite their journey to the AWS Cloud.

Ready to improve your cloud migration process with generative AI in Amazon Bedrock? Begin by exploring the Amazon Bedrock User Guide to understand how it can streamline your organization’s cloud journey. For further assistance and expertise, consider using AWS Professional Services (contact sales) to help you streamline your cloud migration journey and maximize the benefits of Amazon Bedrock.


About the Authors

Ebbey Thomas is a Senior Cloud Architect at AWS, with a strong focus on leveraging generative AI to enhance cloud infrastructure automation and accelerate migrations. In his role at AWS Professional Services, Ebbey designs and implements solutions that improve cloud adoption speed and efficiency while ensuring secure and scalable operations for AWS users. He is known for solving complex cloud challenges and driving tangible results for clients. Ebbey holds a BS in Computer Engineering and an MS in Information Systems from Syracuse University.

Shiva Vaidyanathan is a Principal Cloud Architect at AWS. He provides technical guidance and design, and leads implementation projects for customers, ensuring their success on AWS. He works toward making cloud networking simpler for everyone. Prior to joining AWS, he worked on several NSF-funded research initiatives on performing secure computing in public cloud infrastructures. He holds an MS in Computer Science from Rutgers University and an MS in Electrical Engineering from New York University.
