Description:
Must have:
- Python
- DAG/Airflow
- Gitlab
- SQL
- Relational / MPP database systems
  - Teradata, Netezza, Exadata
- Building CI/CD pipelines
- Ability to analyze data
- Cloud - preference for GCP
Responsibilities:
- Design and develop robust data pipelines to support data ingestion, processing, storage, and retrieval.
- Create and contribute to data frameworks and tools to build scalable and reliable data infrastructure.
- Collaborate with cross-functional teams, data architects, and business stakeholders to understand data requirements and deliver efficient data pipelines.
- Optimize data engineering systems and processes to handle large-scale data sets efficiently.
Skills needed:
- 4+ years of experience in ETL, data engineering, and analytics technical roles.
- 4+ years of proficiency building ETL/data engineering pipelines using Python, DAG/Airflow, GitLab, and SQL.
- Programming experience with an object-oriented language such as Python.
- Practical application development and data manipulation/transformation using Python.
- 2+ years of experience with relational and MPP database systems such as Teradata, Netezza, and Exadata.
- Knowledge of any cloud database is nice to have.
- Experience analyzing data using SQL, including but not limited to analytical functions.
- Experience with versioning and building CI/CD pipelines using Git tools is a plus.
Must Have:
- Python
- DAG/Airflow
- Gitlab
- SQL
- Relational / MPP database systems
  - Teradata, Netezza, Exadata
- Building CI/CD pipelines
- Ability to analyze data
- Programming
- Data manipulation/transformation
Soft skills:
- Communication
- Ability to articulate data insights clearly
- Adaptable
- Can work independently and on a team
Regards,
Jaya Kushwaha
Associate Manager | Recruitment
Email: ...@usgrpinc.com
Work Phone: 614-###-####*676
USG Inc.