As an Advanced Data Engineer, you will lead the development of innovative data solutions that enable effective use of data across the organization. You will design, build, and maintain robust data pipelines and platforms that meet business objectives and treat data as a strategic asset. The role involves collaborating with cross-functional teams, leveraging modern technologies, and ensuring scalable, efficient, and secure data engineering practices, with a strong emphasis on expertise in GCP, Vertex AI, and advanced feature engineering techniques.
Requirements
- 4+ years of professional Data Development experience.
- 4+ years of experience with SQL and NoSQL technologies.
- 3+ years of experience building and maintaining data pipelines and workflows.
- 5+ years of experience developing with Java.
- 2+ years of experience developing with Python.
- 3+ years of experience developing Kafka solutions.
- 2+ years of experience in feature engineering for machine learning pipelines.
- Experience with GCP services such as BigQuery, Vertex AI, Cloud Storage, AutoMLOps, and Dataflow.
- Experience with CI/CD pipelines and processes.
- Experience with automated unit, integration, and performance testing.
- Experience with version control software such as Git.
- Thorough understanding of ETL and Data Warehousing concepts.
- Strong understanding of Agile principles (Scrum).