AWS Certified Data Engineer Practice Exams

1 of 10 Free AWS Data Engineer Exams | More than 500 Certification Exam Questions

100 Question BOSS Exam
101 Question FINAL Exam

AWS Data Engineer – Associate Exam Facts

  • 50 scored questions plus 15 unscored questions
  • Question types are multiple choice and multiple response
  • Scaled scoring 100–1000 with a passing score of 720
  • Target candidate: 2–3 years in data engineering and 1–2 years with AWS
  • Compensatory model: only the overall exam score must pass

DEA-C01 Content Domains & Weighting

  • Domain 1: Data Ingestion & Transformation – 34%
  • Domain 2: Data Store Management – 26%
  • Domain 3: Data Operations & Support – 22%
  • Domain 4: Data Security & Governance – 18%

The Trick to IT Certification Success

Stop wasting time. Download this proven Certification Success Study Plan for free.

  • Practice – Do the practice tests
  • Prompt – AI-driven training
  • Perform – Learn by doing
  • Pass – Get certified in half the time

AWS Certified Data Engineer – Associate Exam Topics

Exam Basics

  • Format: 50 scored questions and 15 unscored questions
  • Question types: multiple choice and multiple response
  • Scoring: scaled 100–1000; passing score 720
  • Model: compensatory; section scores don't need to pass individually
  • Target candidate: 2–3 years data engineering; 1–2 years hands-on with AWS

Domain 1: Data Ingestion & Transformation (34%)

  • Ingest streaming and batch data with Kinesis/MSK/DynamoDB Streams, S3, Glue, EMR, Redshift, DMS, AppFlow
  • Schedule and trigger jobs via EventBridge, MWAA (Airflow), Glue workflows; handle replayability, fan-in/out, throttling
  • Build ETL/ELT with EMR, Glue, Lambda, Redshift; connect via JDBC/ODBC; integrate multiple sources
  • Transform formats (e.g., CSV to Parquet; see the sketch after this list), optimize containerized processing (EKS/ECS), and expose data via APIs
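
A minimal PySpark sketch of the CSV-to-Parquet conversion mentioned above. The bucket paths, the orders dataset, and the order_date partition column are hypothetical, and it assumes a Glue or EMR environment where Spark is already configured to read and write S3.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()

# Read raw CSV files with a header row; inferSchema samples the data to guess column types.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/orders/")        # hypothetical input path
)

# Write columnar Parquet, partitioned by date, so Athena/Redshift Spectrum scan less data.
(
    raw.write
    .mode("overwrite")
    .partitionBy("order_date")                     # hypothetical partition column
    .parquet("s3://example-curated-bucket/orders/")
)
```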

Domain 2: Data Store Management (26%)

  • Select and configure stores for access patterns and performance: Redshift, RDS, DynamoDB, EMR/Lake Formation, Kinesis/MSK
  • Use Redshift Spectrum, federated queries, materialized views; integrate Transfer Family
  • Catalog and discover data with Glue Data Catalog and crawlers; manage schemas, partitions, and metadata
  • Manage lifecycles: S3 Lifecycle tiers and expiration, S3 versioning, DynamoDB TTL; hot vs. cold storage choices (see the lifecycle sketch after this list)
  • Design schemas and evolve them; apply indexing/partitioning/compression; perform schema conversion with AWS SCT/DMS
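
A boto3 sketch of the lifecycle-management bullet: tier objects down to cheaper storage classes and expire them on a schedule. The bucket name, prefix, and day counts are placeholder assumptions, not recommendations.

```python
import boto3

s3 = boto3.client("s3")

# Tier objects under logs/ to cheaper storage classes over time, then expire them.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-curated-bucket",               # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-and-expire-logs",
                "Status": "Enabled",
                "Filter": {"Prefix": "logs/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```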

Domain 3: Data Operations & Support (22%)

  • Automate pipelines with MWAA, Step Functions, Glue/EMR/Redshift features, Lambda, EventBridge
  • Analyze with Athena/QuickSight; cleanse and profile data using DataBrew, Wrangler, notebooks
  • Log and monitor with CloudWatch/CloudTrail/Macie; analyze logs via Athena, OpenSearch, Logs Insights
  • Ensure data quality: define rules, sample data, detect skew, validate completeness/consistency/integrity (see the Athena sketch after this list)
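
A boto3 sketch of a simple completeness check run through Athena, in the spirit of the data-quality bullet above. The database, table, column, and results bucket are hypothetical.

```python
import time
import boto3

athena = boto3.client("athena")

# Count rows where a required key column is missing (a basic completeness rule).
execution = athena.start_query_execution(
    QueryString="SELECT COUNT(*) AS missing_ids FROM orders WHERE order_id IS NULL",
    QueryExecutionContext={"Database": "curated_db"},                       # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"}, # hypothetical results bucket
)
query_id = execution["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    # Row 0 is the header; row 1 holds the count.
    print("Rows missing order_id:", rows[1]["Data"][0]["VarCharValue"])
```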

Domain 4: Data Security & Governance (18%)

  • Apply AuthN/AuthZ with IAM roles/policies, SGs, PrivateLink, S3 Access Points; manage creds with Secrets Manager/Parameter Store (see the sketch after this list)
  • Encrypt at rest and in transit; use KMS; implement masking/anonymization and cross-account encryption
  • Centralize and audit logs with CloudTrail, CloudWatch Logs, CloudTrail Lake; analyze with Athena/OpenSearch
  • Enforce privacy/governance: protect PII (Macie + Lake Formation), control data sharing (Redshift), manage regional restrictions and config drift (Config)
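
A boto3 sketch covering two Domain 4 tasks: pulling credentials from Secrets Manager instead of hard-coding them, and setting SSE-KMS default encryption on a bucket. The secret name, key alias, and bucket are hypothetical.

```python
import json
import boto3

# Fetch database credentials at runtime rather than embedding them in code or config.
secrets = boto3.client("secretsmanager")
secret = secrets.get_secret_value(SecretId="prod/redshift/etl-user")   # hypothetical secret
credentials = json.loads(secret["SecretString"])                       # assumes a JSON username/password pair
print("Loaded credentials for:", credentials["username"])

# Enforce SSE-KMS as the default encryption for every new object in the bucket.
s3 = boto3.client("s3")
s3.put_bucket_encryption(
    Bucket="example-curated-bucket",                                   # hypothetical bucket
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-data-key",        # hypothetical KMS key alias
                },
                "BucketKeyEnabled": True,
            }
        ]
    },
)
```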

Out of Scope for the Exam

  • Performing AI/ML tasks
  • Language-specific syntax expertise
  • Drawing business conclusions from data

How to Prepare

  • Study the official exam guide and domain task statements
  • Build pipelines end-to-end with Kinesis/MSK, Glue/EMR/Spark, Redshift, and Step Functions/MWAA (see the Kinesis sketch after this list)
  • Practice cataloging and lifecycle policies with Glue Data Catalog and S3 Lifecycle
  • Drill security/governance: IAM policies/roles, KMS encryption, Macie, Lake Formation, CloudTrail/CloudWatch
  • Use Athena/QuickSight for analysis; validate data quality with DataBrew and queries
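
And a small boto3 sketch of the streaming-ingestion practice item above: putting a record onto a Kinesis data stream. The stream name and event shape are made up for illustration.

```python
import json
import boto3

kinesis = boto3.client("kinesis")

# A sample clickstream event; in practice this would come from an application or producer library.
event = {"user_id": "u-123", "action": "page_view", "ts": "2024-01-01T00:00:00Z"}

kinesis.put_record(
    StreamName="demo-clickstream",                  # hypothetical stream
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["user_id"],                  # same key -> same shard -> per-user ordering
)
```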