Prepare for the AWS Certified AI Practitioner Exam with flashcards and multiple choice questions. Each question includes hints and explanations to help you succeed on your test. Get ready for certification!



Which strategy should an AI practitioner use to store invocation logs for a model in Amazon Bedrock?

  1. Configure AWS CloudTrail as the logs destination for the model

  2. Enable invocation logging in Amazon Bedrock

  3. Configure AWS Audit Manager as the logs destination for the model

  4. Configure model invocation logging in Amazon EventBridge

The correct answer is: Enable invocation logging in Amazon Bedrock

Enabling invocation logging in Amazon Bedrock is the correct strategy for storing model invocation logs. When invocation logging is enabled, Bedrock captures the details of every invocation request made to a model, including the input prompt, the model's response, and metadata such as timestamps and any errors encountered, and delivers those logs to the destinations you configure: an Amazon S3 bucket, an Amazon CloudWatch Logs log group, or both. This information is essential for understanding how the model performs, analyzing request patterns, and debugging issues. Because the feature is built into Bedrock itself, it is the purpose-built, reliable way to monitor and analyze model interactions.

The other choices do not match the capabilities or purpose of the listed services. AWS CloudTrail records API activity in the account, but it does not capture the prompt and response details of each model invocation. AWS Audit Manager is designed for compliance management and assessment reporting, not for storing invocation logs. Amazon EventBridge routes and reacts to events; it is not a storage destination for detailed invocation logs. Enabling invocation logging in Amazon Bedrock is therefore the most direct and effective approach for this scenario.
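As a quick illustration, here is a minimal boto3 sketch of turning on model invocation logging programmatically (the Bedrock console exposes the same setting). The bucket name, key prefix, log group name, and IAM role ARN below are placeholders you would replace with your own resources, and the role must grant Bedrock permission to write to the log group.

```python
import boto3

# Control-plane client for Amazon Bedrock (not bedrock-runtime).
bedrock = boto3.client("bedrock", region_name="us-east-1")

# Enable model invocation logging for this account and Region.
# Logs can be delivered to an S3 bucket, a CloudWatch Logs log group, or both.
bedrock.put_model_invocation_logging_configuration(
    loggingConfig={
        "s3Config": {
            "bucketName": "my-bedrock-invocation-logs",  # placeholder bucket
            "keyPrefix": "bedrock/invocations",
        },
        "cloudWatchConfig": {
            "logGroupName": "/bedrock/invocation-logs",  # placeholder log group
            "roleArn": "arn:aws:iam::123456789012:role/BedrockLoggingRole",  # placeholder role
        },
        # Choose which payload types are included in the logs.
        "textDataDeliveryEnabled": True,
        "imageDataDeliveryEnabled": True,
        "embeddingDataDeliveryEnabled": True,
    }
)

# Verify the current logging configuration.
print(bedrock.get_model_invocation_logging_configuration()["loggingConfig"])
```

Once this configuration is in place, subsequent model invocations in that Region have their inputs and outputs delivered to the configured S3 prefix and CloudWatch Logs log group.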