Prepare for the AWS Certified AI Practitioner Exam with flashcards and multiple choice questions. Each question includes hints and explanations to help you succeed on your test. Get ready for certification!



To utilize a custom model for document summarization in Amazon Bedrock, what must be done?

  1. Purchase Provisioned Throughput for the custom model

  2. Deploy the custom model in an Amazon SageMaker endpoint

  3. Register the model with the Amazon SageMaker Model Registry

  4. Grant access to the custom model in Amazon Bedrock

The correct answer is: Purchase Provisioned Throughput for the custom model

Using a custom model for document summarization in Amazon Bedrock requires purchasing Provisioned Throughput for that model. Unlike the base foundation models, which can be invoked with on-demand pricing, a customized model (created through fine-tuning or continued pre-training) in Bedrock can only be invoked after Provisioned Throughput has been purchased for it. Provisioned Throughput reserves dedicated model units, giving the custom model the guaranteed capacity it needs to serve inference requests, such as real-time document summarization, efficiently and at scale.

The other options describe activities that belong to Amazon SageMaker rather than Amazon Bedrock. Deploying a model to a SageMaker endpoint and registering it in the SageMaker Model Registry are how you host and manage models you build and train yourself in SageMaker; they are not part of the workflow for invoking a custom model through the Bedrock API. Granting access in Bedrock applies to the shared foundation models, which must be enabled for an account before use; access and permissions are important, but they do not by themselves make a custom model invocable. The Provisioned Throughput purchase is still required.
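For readers who want to see what this looks like in practice, below is a minimal boto3 sketch of the two calls involved: purchasing Provisioned Throughput for a custom model and then invoking it through the Bedrock Runtime. The ARN, names, region, model-unit count, and Titan-style request payload are illustrative assumptions, not values from the question.

```python
import json
import boto3

# Hypothetical identifiers for illustration only; substitute your own
# custom model ARN, names, and region.
CUSTOM_MODEL_ARN = (
    "arn:aws:bedrock:us-east-1:111122223333:custom-model/example-summarizer"
)
REGION = "us-east-1"

bedrock = boto3.client("bedrock", region_name=REGION)

# Step 1: purchase Provisioned Throughput for the custom model.
# One model unit with no commitment term is assumed here; size this to your workload.
pt = bedrock.create_provisioned_model_throughput(
    provisionedModelName="summarizer-pt",
    modelId=CUSTOM_MODEL_ARN,
    modelUnits=1,
)
provisioned_arn = pt["provisionedModelArn"]
# Wait until get_provisioned_model_throughput reports status "InService"
# before sending inference traffic.

# Step 2: invoke the provisioned custom model through the Bedrock Runtime.
# The request body must follow the format of the base model the custom model
# was derived from; an Amazon Titan Text style payload is assumed here.
runtime = boto3.client("bedrock-runtime", region_name=REGION)
document_text = "..."  # the document to summarize
response = runtime.invoke_model(
    modelId=provisioned_arn,
    contentType="application/json",
    accept="application/json",
    body=json.dumps(
        {"inputText": f"Summarize the following document:\n{document_text}"}
    ),
)
print(json.loads(response["body"].read()))
```

Note that this extra purchase step is specific to customized models; Bedrock's base foundation models can be invoked on demand without Provisioned Throughput.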