AIP-C01 Study Hub
FM Integration Week 1 · Tuesday

Day 2: FM Selection, Resilience, and Model Switching

Learning Objectives

  • Compare all Bedrock model families and their modalities
  • Understand inference parameters (temperature, top-p, max tokens)
  • Design dynamic model selection with Lambda + AppConfig
  • Implement resilience with Cross-Region Inference and circuit breakers
  • Use Step Functions for circuit breaker patterns

Tasks

0/5 completed
  • Read · 45m

    Bedrock Supported Models - Model Catalog

    Know every model family: Amazon Nova (Micro/Lite/Pro/Premier), Anthropic Claude, Meta Llama, Mistral, Cohere, Stability AI. Know each family's supported modalities (text, image, embeddings, and for Nova also video understanding).

  • Read · 30m

    Bedrock Inference Parameters

    Temperature, top-p, top-k, max tokens, stop sequences. Know what each controls and when to adjust.

  • Read · 30m

    Bedrock Cross-Region Inference

    Routes inference requests across a set of regions automatically, absorbing traffic bursts and failing over when the primary region is throttled or unavailable. Key resilience pattern.

  • Blog · 20m

    Intelligent Prompt Routing for Cost and Latency Benefits

    Automatically analyzes each prompt's complexity and routes it to the cheapest model predicted to handle it well (automated model cascading).

  • Watch · 20m

    Amazon Bedrock: Simplifying GenAI Development

    Overview of model selection, customization, and deployment patterns in Bedrock.
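The prompt-routing idea from the reading list can be imitated locally with a simple heuristic. A minimal sketch; the model IDs, keyword list, and thresholds below are illustrative assumptions, not Bedrock's actual routing logic (the managed Intelligent Prompt Routing feature does this analysis for you):

```python
# Toy complexity-based router: cheapest capable model wins.
# Model IDs and thresholds are illustrative assumptions.

def route(prompt: str) -> str:
    """Pick a model tier from rough complexity signals in the prompt."""
    words = len(prompt.split())
    # Keywords that usually signal multi-step reasoning
    hard = any(k in prompt.lower() for k in ("design", "analyze", "prove", "architecture"))
    if hard or words > 200:
        return "anthropic.claude-3-5-sonnet-20240620-v1:0"  # strongest, priciest
    if words > 40:
        return "amazon.nova-lite-v1:0"                      # mid tier
    return "amazon.nova-micro-v1:0"                         # cheapest

print(route("What is S3?"))
# -> amazon.nova-micro-v1:0
print(route("Design a least-privilege IAM strategy for Bedrock."))
# -> anthropic.claude-3-5-sonnet-20240620-v1:0
```

The managed feature replaces the heuristic with a learned complexity predictor, but the cascading shape is the same: route down-tier by default, escalate only when needed.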

Exam Skills

Write your understanding, then reveal the reference answer.

0/3 reviewed

Hands-On Lab

Build real muscle memory with these activities.

Beginner · 30 min

Compare Model Outputs in Bedrock Playground

Systematically compare outputs from Claude, Nova, Llama, and Mistral for the same prompt to understand model family strengths.

  1. Open Bedrock Chat playground and select Claude Sonnet
  2. Enter a complex reasoning prompt: 'A company has 3 AWS accounts. Design a least-privilege IAM strategy for Bedrock access across accounts.'
  3. Record the response quality, latency, and token count
  4. Repeat with Amazon Nova Pro, Meta Llama, and Mistral
  5. Compare results in a table: model, quality, tokens, latency
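The comparison above can also be scripted instead of run by hand in the playground. A sketch assuming the Converse API and illustrative model IDs (check the Bedrock model catalog for the IDs actually enabled in your account); the AWS call lives in an uncalled demo function since it needs credentials:

```python
import time

# Assumed model IDs -- substitute the ones enabled in your account.
CANDIDATES = [
    "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "amazon.nova-pro-v1:0",
    "meta.llama3-70b-instruct-v1:0",
    "mistral.mistral-large-2402-v1:0",
]

def summarize(model_id, text, latency_s, tokens):
    """One comparison-table row: model, tokens, latency, response preview."""
    return {"model": model_id, "tokens": tokens,
            "latency_s": round(latency_s, 2), "preview": text[:80]}

def compare_models():
    """Requires AWS credentials and Bedrock model access; not called here."""
    import boto3
    rt = boto3.client("bedrock-runtime", region_name="us-east-1")
    prompt = ("A company has 3 AWS accounts. Design a least-privilege IAM "
              "strategy for Bedrock access across accounts.")
    for mid in CANDIDATES:
        t0 = time.time()
        resp = rt.converse(modelId=mid,
                           messages=[{"role": "user", "content": [{"text": prompt}]}])
        print(summarize(mid,
                        resp["output"]["message"]["content"][0]["text"],
                        time.time() - t0,
                        resp["usage"]["outputTokens"]))
```

Each printed row maps directly onto the lab's step-5 comparison table.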
Beginner · 25 min

Test Inference Parameters (Temperature, Top-p)

Experiment with temperature and top-p settings to understand their effect on model creativity and determinism.

  1. Open Bedrock Chat playground with Claude Sonnet
  2. Set temperature to 0 and enter: 'Write a haiku about cloud computing'
  3. Run the same prompt 3 times and confirm the outputs are near-identical (temperature 0 is greedy and effectively deterministic, though not guaranteed byte-identical)
  4. Change temperature to 1.0 and run the same prompt 3 times — observe variation
  5. Test top-p at 0.1 vs 0.9 with the same prompt and note the difference in creativity
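The same experiment can be driven from code via InvokeModel. A minimal sketch of the Anthropic Messages request body, assuming the Claude 3.5 Sonnet model ID shown (substitute whatever your account has access to); the network call sits in an uncalled demo function:

```python
import json

def anthropic_body(prompt, temperature=0.0, top_p=0.9, max_tokens=512, stop=None):
    """Build the Anthropic Messages request body used with Bedrock InvokeModel."""
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,        # hard cap on generated tokens
        "temperature": temperature,      # 0 = near-deterministic, higher = more varied
        "top_p": top_p,                  # nucleus-sampling cutoff
        "messages": [{"role": "user", "content": prompt}],
    }
    if stop:
        body["stop_sequences"] = stop    # generation halts at these strings
    return body

def run_haiku_demo():
    """Requires AWS credentials and Bedrock model access; not called here."""
    import boto3
    rt = boto3.client("bedrock-runtime", region_name="us-east-1")
    resp = rt.invoke_model(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # assumed model ID
        body=json.dumps(anthropic_body("Write a haiku about cloud computing")),
    )
    return json.loads(resp["body"].read())["content"][0]["text"]
```

Rerunning with `temperature=0.0` should yield near-identical haikus; `temperature=1.0` varies noticeably, mirroring lab steps 3 and 4.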
Intermediate · 20 min

Set Up Cross-Region Inference Profile

Configure a Cross-Region Inference profile to enable automatic failover for Bedrock model invocations.

  1. Open the Bedrock console and navigate to Cross-Region Inference
  2. Create an inference profile for Claude Sonnet targeting us-east-1 and us-west-2
  3. Note the inference profile ARN
  4. Use the AWS CLI to invoke the model via the inference profile (invoke-model also needs a request body and an output file): aws bedrock-runtime invoke-model --model-id <profile-arn> --body '<request-json>' --cli-binary-format raw-in-base64-out output.json
  5. Verify in CloudWatch that the request was routed to one of the configured regions
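As an alternative to copying the ARN, system-defined cross-region inference profiles can be referenced by ID: they prefix the base model ID with a geography code (`us.`, `eu.`, `apac.`). A minimal sketch assuming boto3 and the Claude Sonnet model ID shown; the console lists the exact profile IDs available to you:

```python
def cross_region_profile(base_model_id: str, geo: str = "us") -> str:
    """Build a system-defined cross-region inference-profile ID
    by prefixing the base model ID with a geography code."""
    return f"{geo}.{base_model_id}"

def invoke_via_profile():
    """Requires AWS credentials and Bedrock model access; not called here."""
    import boto3
    rt = boto3.client("bedrock-runtime", region_name="us-east-1")
    profile = cross_region_profile("anthropic.claude-3-5-sonnet-20240620-v1:0")
    # Bedrock routes this request to whichever configured region has capacity
    resp = rt.converse(modelId=profile,
                       messages=[{"role": "user", "content": [{"text": "ping"}]}])
    return resp["output"]["message"]["content"][0]["text"]
```

The profile ID (or ARN) simply replaces the plain model ID in the `modelId` parameter, which is what makes this a drop-in resilience upgrade.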

Scenarios

Think through each scenario before revealing the answer.

D1: FM Integration · Medium · #2

Cross-Region Model Resilience

Your production GenAI app serves users in US and Europe. The primary model (Claude) occasionally hits rate limits during peak hours. How do you ensure availability?
Think First
  • What Bedrock feature handles cross-region failover automatically?
  • How can Step Functions implement a circuit breaker pattern?
  • What tool enables model switching without code changes?

Practice Questions

6 questions across 3 difficulty levels.

Further Reading

Go deeper into today's topics.