This GCP PDE exam dump contains real questions with detailed explanations, based on the latest Google Professional Data Engineer exam format. If you are looking for GCP exam dumps with verified solutions, try over 10,000 practice questions in the Cloud Pass app.
No duplicate questions
Every question is unique and carefully curated
Up-to-date real exam questions
Regularly updated to match the 2025 exam pattern
Sample Questions
Real Exam Questions
Question 1
You are troubleshooting an Apache Flink streaming cluster running on 12 Compute Engine VMs in a managed instance group without external IPs on the custom VPC "analytics-vpc" and subnet "stream-subnet".
TaskManager nodes cannot communicate with one another.
Your networking team manages access using Google Cloud network tags to define firewall rules.
Flink has been configured to use TCP ports 12345 and 12346 for RPC and data transport between nodes.
You need to identify the issue while following Google-recommended networking security practices.
What should you do?
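In scenarios like this, the usual culprit is a missing ingress rule allowing the Flink ports between the tagged TaskManager VMs. A minimal sketch of the rule body, shaped like a Compute Engine `firewalls.insert` REST request; the project name and the network tag `flink-node` are illustrative assumptions, not values from the question:

```python
# Sketch of a Compute Engine firewall rule (REST API request body) that
# allows Flink's RPC/data ports only between VMs carrying the same network
# tag. Project name and the tag "flink-node" are assumptions.
firewall_rule = {
    "name": "allow-flink-internal",
    "network": "projects/my-project/global/networks/analytics-vpc",
    "direction": "INGRESS",
    "allowed": [{"IPProtocol": "tcp", "ports": ["12345", "12346"]}],
    # Restrict both source and target to the tagged TaskManager VMs,
    # following the tag-based model the networking team already uses.
    "sourceTags": ["flink-node"],
    "targetTags": ["flink-node"],
}

print(firewall_rule["allowed"][0]["ports"])  # ['12345', '12346']
```

Scoping both `sourceTags` and `targetTags` to the same tag keeps the rule limited to node-to-node traffic, which matches the tag-based practice described in the question.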
Question 2
Your company operates three independent data workflows that must be orchestrated from a single place with consistent scheduling, monitoring, and on-demand execution.
1) A Dataproc Serverless Spark batch job in us-central1 converts CSV files in a regional Cloud Storage bucket into partitioned BigQuery tables; it must run daily at 01:30 UTC and complete within 45 minutes.
2) A Storage Transfer Service job pulls approximately 50 GB from an external SFTP endpoint into Cloud Storage every 4 hours; you cannot install agents on the external system.
3) A Dataflow Flex Template pipeline calls a third-party REST API (rate limit: 1,000 requests/hour) to fetch deltas and stage them in Cloud Storage.
You need a single solution to schedule these three workflows, monitor task status and logs, receive failure alerts within 10 minutes, and allow manual ad-hoc runs (up to 5 per day) without building custom infrastructure. What should you do?
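The two explicitly stated cadences above map onto standard five-field cron expressions, the schedule format used by orchestrators such as Cloud Composer (Airflow). A minimal sketch; the workflow names are illustrative:

```python
# Cron expressions for the explicitly scheduled workflows.
# Workflow names are assumptions for illustration.
schedules = {
    "dataproc_csv_to_bq": "30 1 * * *",  # daily at 01:30 UTC
    "sts_sftp_pull": "0 */4 * * *",      # every 4 hours, on the hour
}

def fields(cron: str) -> list[str]:
    """Split a five-field cron expression: minute hour dom month dow."""
    parts = cron.split()
    assert len(parts) == 5, "expected five cron fields"
    return parts

minute, hour, *_ = fields(schedules["dataproc_csv_to_bq"])
print(minute, hour)  # 30 1
```

The third workflow's cadence is not specified in the question; its schedule would be chosen to stay under the third-party API's 1,000-requests-per-hour limit.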
Question 3
You are designing a platform to store 1-second interval temperature and humidity readings from 12 million cold-chain sensors across 40 warehouses.
Analysts require real-time, ad hoc range queries over the most recent 7 days with sub-second latency.
You must avoid per-query charges and ensure the schema can scale to 25 million sensors and accommodate new metrics without frequent schema changes.
Which database and data model should you choose?
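For this kind of access pattern, wide-column stores such as Bigtable favor a row key that leads with the sensor id and ends with the timestamp, so a 7-day window per sensor becomes one contiguous range scan. A minimal sketch of one plausible key design; the `#` separator and zero-padding width are illustrative conventions, not a prescribed schema:

```python
def row_key(sensor_id: str, ts_epoch_s: int) -> bytes:
    """Compose a wide-column row key: sensor id first so each sensor's
    readings are contiguous, timestamp second so lexicographic order
    matches time order. Separator and padding are assumptions."""
    # Zero-pad the timestamp so string ordering equals numeric ordering.
    return f"{sensor_id}#{ts_epoch_s:012d}".encode()

# A 7-day scan for one sensor is the range [start, end).
start = row_key("sensor-0042", 1_700_000_000)
end = row_key("sensor-0042", 1_700_000_000 + 7 * 24 * 3600)
print(start < end)  # True
```

Keeping new metrics as additional columns (or column qualifiers) rather than new tables is what lets the schema absorb new measurement types without structural changes.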
Question 4
You operate a Cloud Run service that receives messages from a Cloud Pub/Sub push subscription at a steady rate of ~1,200 messages per minute, aggregates events into 5-minute batches, and writes compressed JSON files to a dedicated Cloud Storage bucket.
You want to configure Cloud Monitoring alerts that will reliably indicate if the pipeline stalls for more than 10 minutes, by detecting a growing upstream backlog and a slowdown in data written downstream. Which alerts should you create?
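Two Cloud Monitoring signals correspond to the failure mode described: a growing Pub/Sub backlog upstream and flat write activity on the bucket downstream. A sketch of the metric filters such alerts would be built on; the subscription and bucket names are placeholders:

```python
# Illustrative Cloud Monitoring metric filters for the two alert
# conditions. Subscription and bucket names are placeholders.
backlog_filter = (
    'metric.type="pubsub.googleapis.com/subscription/num_undelivered_messages" '
    'resource.type="pubsub_subscription" '
    'resource.label.subscription_id="events-push-sub"'
)
write_filter = (
    'metric.type="storage.googleapis.com/api/request_count" '
    'resource.type="gcs_bucket" '
    'resource.label.bucket_name="pipeline-output"'
)

# A stall shows up as both conditions holding for the alert's duration
# window (e.g. 10 minutes): backlog rising AND write requests near zero.
print("num_undelivered_messages" in backlog_filter)  # True
```

Pairing an upstream and a downstream condition is what makes the alert robust: either metric alone can fluctuate for benign reasons, but both together indicate the pipeline itself has stopped moving data.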
Question 5
Your fintech compliance team must store 12 TB of transaction audit files (about 200,000 objects per month) in a Cloud Storage Archive bucket with a 7-year retention requirement.
Due to a zero-trust mandate, you must implement a Trust No One (TNO) model so that even cloud provider personnel cannot decrypt the data; uploads will be performed from an on-prem hardened host using gsutil, and only the internal security team may hold the encryption material.
What should you do to meet these requirements?
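A TNO model on Cloud Storage is typically implemented with customer-supplied encryption keys (CSEK): the security team generates a 256-bit AES key, base64-encodes it, and configures it as `encryption_key` in the `[GSUtil]` section of the on-prem host's `.boto` file, so the provider never stores the key material. A minimal sketch of generating such a key; where and how the key is escrowed afterwards is up to the security team:

```python
import base64
import os

# A CSEK is a raw 256-bit (32-byte) AES key, supplied to Cloud Storage
# base64-encoded. gsutil reads it from the `encryption_key` setting in
# the [GSUtil] section of ~/.boto on the uploading host.
raw_key = os.urandom(32)
csek = base64.b64encode(raw_key).decode("ascii")

print(len(csek))  # 44: base64 of 32 bytes is 44 characters with padding
```

Because the key is supplied with each request and never persisted by the provider, losing it makes the archived objects permanently unreadable, so key custody by the internal security team is both the requirement and the operational risk.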