Saxon Global
Location: Pleasanton, CA, USA
Date: 2024-12-23T07:10:36Z
Job Description:
GCP Data Engineer - Remote / Hybrid / Onsite
Position: Hybrid (2-3 days/week at the office)
- Location of the resource: Dallas, TX, but open to Pleasanton, CA and Phoenix, AZ
- USC or GC only
- Rate: $60-65/hr W2 or $70-75/hr C2C - independent candidates preferred
- Contract duration: 6 months, with potential extension based on performance
- Target Start Date: Immediate start
- Client: Cerebus (GFT consulting)
Required Qualifications
- 7+ years of proven experience developing and deploying data pipelines in GCP or Azure
- 5+ years of Snowflake, BigQuery, and/or Databricks experience
- 5+ years of proven experience building frameworks for data ingestion, processing, and consumption using GCP Dataflow, Cloud Composer, and BigQuery
- 4+ years of strong experience with SQL, Python, Java, and API development
- 2+ years of proven expertise creating real-time pipelines using Kafka and GCP Pub/Sub
- Experience building high-quality data pipelines with monitoring and observability
- 2+ years of experience building dashboards and reports with Power BI and/or ThoughtSpot
Preferred Qualifications
- Extensive experience with data transformations for retail and e-commerce business use cases is a plus
- Bachelor's or Master's degree in computer engineering, computer science, or a related field
- Knowledge of GitHub Actions for CI/CD
- Knowledge of building machine learning models
Apply Now!