Responsibilities:
- Design, develop, and maintain ETL processes using Ab Initio to integrate data into Iceberg tables on ADLS/Snowflake.
- Optimize ETL workflows to ensure efficient data processing and loading.
- Develop scripts to automate data processing and loading tasks.
- Implement data quality checks and validation processes within ETL workflows (a minimal sketch follows this list).
- Understand business requirements and project scope, create ETL code that implements the required business logic and processes, and provide task estimates as required, backed by supporting data points.
- Ensure data governance policies are adhered to, including data lineage and metadata management.
- Provide support for data-related issues and troubleshoot ETL-related problems.
- Create and maintain technical documentation and reports for stakeholders.
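
As an illustration of the data quality checks mentioned above, here is a minimal sketch in Python (not Ab Initio code); the file name, column names, and rules are hypothetical stand-ins for whatever a given feed requires:

    # Minimal row-level data quality gate for a delimited extract.
    # Schema and rules below are hypothetical; real feeds would supply their own.
    import csv
    import sys

    REQUIRED_COLUMNS = ["customer_id", "txn_date", "amount"]  # hypothetical schema

    def validate(path: str) -> bool:
        """Reject the file if required fields are empty or amounts are non-numeric."""
        errors = 0
        with open(path, newline="") as fh:
            reader = csv.DictReader(fh)
            missing = [c for c in REQUIRED_COLUMNS if c not in (reader.fieldnames or [])]
            if missing:
                print(f"missing columns: {missing}")
                return False
            for row in reader:
                if any(not row[c] for c in REQUIRED_COLUMNS):
                    errors += 1          # empty required field
                    continue
                try:
                    float(row["amount"])
                except ValueError:
                    errors += 1          # non-numeric amount
        print(f"{errors} bad rows")
        return errors == 0

    if __name__ == "__main__":
        # A non-zero exit lets a scheduler such as Control-M halt the downstream load.
        sys.exit(0 if validate(sys.argv[1]) else 1)

In a real workflow this gate would typically run as a pre-load step, with rejected rows routed to an error table rather than just counted.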
Requirements (must-have skills):
- 5-6 years of total technical experience in designing, developing, and implementing ETL solutions using Ab Initio on Linux.
- Experience across the complete software development life cycle, including systems analysis of various applications in a client/server environment.
- Advanced working knowledge of a range of technologies, such as Ab Initio, Azure ADLS, Spark, Snowflake, data warehousing, SQL, CI/CD, Control-M, data modelling, Python, Hive Thrift Server, lift, and Apache Ranger.
- Experience in creating graphs and PSets, shell scripting, deployment activities, performance tuning, and error handling.
- Excellent hands-on experience with various Transform, Partition/De-partition, Database, Dataset, and XML components.
- Strong analytical problem-solving and business interaction skills.
- Strong exposure to parallelism techniques, generic graph design, EME, and data warehousing concepts (a plain-Python illustration of the parallelism pattern follows this list).
- Experience in writing complex database queries (preferably Oracle and DB2).
- Effective day-to-day communication with the entire offshore team, the customer, and all concerned teams; provide daily status reports to all stakeholders.
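
The parallelism techniques noted above follow Ab Initio's partition, parallel transform, de-partition pattern. The sketch below imitates that idea in plain Python (it is not Ab Initio itself); the rollup transform and the sample records are hypothetical:

    # Partition -> parallel transform -> de-partition: the shape of a
    # parallel Ab Initio graph, imitated with Python multiprocessing.
    from multiprocessing import Pool

    N_PARTITIONS = 4  # analogous to a graph's degree of parallelism

    def partition(records, n):
        """Hash-partition by key so all records for one key land in one bucket."""
        buckets = [[] for _ in range(n)]
        for key, value in records:
            buckets[hash(key) % n].append((key, value))
        return buckets

    def transform(bucket):
        """Hypothetical per-partition transform: a rollup summing values per key."""
        totals = {}
        for key, value in bucket:
            totals[key] = totals.get(key, 0) + value
        return totals

    if __name__ == "__main__":
        records = [("a", 1), ("b", 2), ("a", 3), ("c", 4), ("b", 5)]
        with Pool(N_PARTITIONS) as pool:
            partials = pool.map(transform, partition(records, N_PARTITIONS))
        # De-partition (gather): merge per-partition results into one stream.
        merged = {}
        for part in partials:
            merged.update(part)  # safe: hash partitioning keeps keys disjoint
        print(merged)  # {'a': 4, 'b': 7, 'c': 9}

Partitioning by key before the transform is what makes the per-partition rollup correct without cross-partition coordination, the same reasoning that applies when choosing partition keys in a generic graph.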