Job Title: GCP-Certified Data Engineer

Location: New York
Job Type: Hybrid (3 days in office)
Experience Level: Senior (5+ Years)

We are seeking a GCP-Certified Data Engineer with 5+ years of hands-on cloud data engineering experience, ideally including a direct Snowflake-to-BigQuery migration. This role is key to modernizing and scaling our data infrastructure: building robust data ingestion and transformation pipelines and optimizing performance with Google Cloud’s native tools.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL/ELT pipelines using Apache Beam (Dataflow) and Pub/Sub; a representative pipeline is sketched after this list
  • Orchestrate complex data workflows using Cloud Composer (Apache Airflow); see the DAG sketch after this list
  • Lead or support large-scale data migrations from AWS/Snowflake to BigQuery, including schema mapping and performance tuning
  • Enhance BigQuery performance through strategic use of partitioning, clustering, and effective resource management; see the partitioning sketch after this list
  • Implement rigorous data quality frameworks and validation checks, and ensure pipeline observability and monitoring
  • Partner with analytics, product, and business teams to understand data needs and deliver timely, reliable data solutions
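
To give candidates a concrete sense of the pipeline work above, here is a minimal sketch of a streaming Beam job that reads from Pub/Sub and writes to BigQuery. All resource names (project, subscription, table) and the JSON payload shape are hypothetical, and a production pipeline would add error handling, dead-lettering, and schema management.

    # Minimal streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
    # Project, subscription, and table names are hypothetical placeholders.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    PROJECT = "example-project"                                    # hypothetical
    SUBSCRIPTION = f"projects/{PROJECT}/subscriptions/events-sub"  # hypothetical
    TABLE = f"{PROJECT}:analytics.events"                          # hypothetical


    def parse_message(message: bytes) -> dict:
        # Decode a Pub/Sub message payload into a BigQuery row dict.
        return json.loads(message.decode("utf-8"))


    def run() -> None:
        options = PipelineOptions(streaming=True, project=PROJECT, runner="DataflowRunner")
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
                | "ParseJson" >> beam.Map(parse_message)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    TABLE,
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()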
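
A representative Cloud Composer (Airflow) workflow is sketched below. The DAG ID, schedule, and query are hypothetical; newer Airflow releases prefer the schedule argument over schedule_interval.

    # Minimal Composer (Airflow) sketch: one daily BigQuery transform task.
    # The DAG ID, schedule, and query are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    with DAG(
        dag_id="daily_events_transform",   # hypothetical
        schedule_interval="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        transform_events = BigQueryInsertJobOperator(
            task_id="transform_events",
            configuration={
                "query": {
                    "query": "SELECT event_id, event_timestamp FROM analytics.events",  # hypothetical
                    "useLegacySql": False,
                }
            },
        )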
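
Finally, a minimal illustration of BigQuery partitioning and clustering, using the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical.

    # Minimal sketch: create a date-partitioned, clustered copy of a table.
    # Project, dataset, table, and column names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical

    ddl = """
    CREATE TABLE IF NOT EXISTS analytics.events_partitioned
    PARTITION BY DATE(event_timestamp)    -- prune scans to the relevant days
    CLUSTER BY customer_id, event_type    -- co-locate frequently filtered columns
    AS SELECT * FROM analytics.events
    """

    client.query(ddl).result()  # run the DDL and wait for completion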

Required Skills and Experience:

  • GCP Certified (Professional Data Engineer preferred)
  • 5+ years of experience in cloud data engineering, including real-time and batch processing
  • Strong proficiency in Python and SQL
  • Deep understanding of BigQuery, Dataflow, Pub/Sub, and Cloud Storage
  • Experience with Cloud Composer (Airflow) for orchestration
  • Prior experience with ETL/ELT migrations, particularly from Snowflake to GCP
  • Proven track record in performance optimization and managing large datasets (structured & semi-structured)
  • Familiarity with Terraform or Infrastructure as Code (IaC)
  • Experience with CI/CD for data pipelines
  • Knowledge of AWS services and multi-cloud data strategies