Diverse Lynx
Location: Dallas, TX, USA
Date: 2024-11-11T08:48:55Z
Job Description:
Title: GCP Consultant
Type: Contract
Duration: 52 weeks, 1 day
Location: Dallas, TX (This is an onsite role. Must visit the office biweekly.)

Responsibilities:
1. Design, develop, and maintain scalable data pipelines and data processing systems on the Google Cloud Platform (GCP).
2. Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and implement solutions accordingly.
3. Develop and optimize ETL processes to ensure efficient data ingestion, transformation, and loading.
4. Implement data governance and security measures to ensure data quality, integrity, and privacy.
5. Monitor and troubleshoot data pipelines to identify and resolve issues in a timely manner.
6. Work with cross-functional teams to integrate data from various sources and systems.
7. Conduct performance tuning and optimization of data processing jobs.
8. Stay current with the latest trends and technologies in data engineering and GCP services.

Focus Areas:
• Building scalable and efficient data pipelines on GCP.
• Data governance and security.
• Integration of data from multiple sources and systems.
• Performance tuning and optimization.
• Staying current with emerging technologies and best practices in data engineering.

Key Skill Sets:
1. Data Engineering: Experience building pipelines using Python/PySpark on GCP.
2. Dataproc: Working knowledge of Dataproc Serverless and ephemeral Dataproc clusters.
3. Airflow: Proficiency in Airflow is required.
4. BigQuery: Must be very strong in writing SQL.
5. Machine Learning: Experience in machine learning is an added advantage.
6. Vertex AI: Knowledge of and working experience in model building with Vertex AI is an added advantage.

Qualifications we seek in you!

Minimum Qualifications:
1. Bachelor's degree in computer science, information systems, or a related field.
2. Experience in data engineering or a similar role.
3. Demonstrated experience designing and implementing data pipelines using GCP services.
4. Proficiency in Python, SQL, and data manipulation techniques.
5. Strong understanding of cloud computing concepts and distributed systems.

Preferred Qualifications/Skills:
1. Master's degree in computer science, information systems, or a related field.
2. Experience with other cloud platforms such as AWS or Azure.
3. Certification in GCP data engineering or a related field.
4. Familiarity with machine learning concepts and frameworks.
5. Experience with real-time data processing and streaming technologies.

Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without discrimination. All applicants will be evaluated solely on the basis of their ability, competence, and proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels of the company.
Apply Now!