*US-based domestic applicants only. Must be a US Citizen or Green Card holder. No visa sponsorship.
**This is a 100% on-site position and will require the candidate to be in the office five days a week. Relocation will not be provided.
***No C2C agencies
The Data Integration Engineer is a technical professional who ensures our data integration needs are identified and implemented on the Informatica Data Management Cloud (IDMC) platform. IDMC is a critical component of our data warehouse solution architecture, and the Data Integration Engineer will engineer the data integration components used to build and maintain our Enterprise Data Warehouse.
To achieve this goal, the Data Integration Engineer will lead data integration requirements and develop, change, and maintain data source connectors, data workflows and transformations, agents, and other Data Integration components and assets. You must have significant technical acumen in IDMC to perform day-to-day operational and project activities with a high degree of quality.
Your role also provides the opportunity to work with and lead the larger data warehouse team on implementing more complex requirements that drive changes within IDMC across the non-production and production environments. Providing support and coordination of the overall data management process in this way requires attention to detail and a customer focus to ensure needs are effectively addressed by the IT organization.
Essential Functions
- Assist with requirement definition for data integration designs and take ownership of resolving JIRA tickets within Sprints with a high-quality outcome.
- Review, analyze, and where necessary author functional requirements and mapping documents.
- Develop standard to complex ETL components such as connectors, mappings, sessions, and workflows based on the prepared low-level design requirements.
- Be a steward of quality in your work by applying design patterns and use-case thinking, and by validating work through unit testing.
- Debug mappings and identify errors and error rows so that they can be corrected and re-loaded into a target system.
- Author and implement performance tuning of mappings, processes, and load routines.
- Assist with QA by acting as a peer reviewer and coaching more junior resources.
- Act as an expert resource for other members of the team on the IDMC application, its repositories, objects, scheduling mechanisms, and the various dependencies within Andover's solution architecture.
- Profile data to gain insights into Data Integration strategies, identify technical dependencies and design integration components.
- Monitor IDMC environments and troubleshoot service and job related issues.
- Assess ramification of issues and prioritize tasks based on business impact for operational and project support activities.
- Provide reporting on IDMC operations, job trending, and other resource utilization within the nightly batch schedule.
- Provide guidance to the Cloud support team to improve the monitoring, measuring, and reporting of system health, performance, and uptime metrics.
- Leverage existing and new IDMC features by evaluating, installing, and configuring new software releases, upgrades, and patches through planning and coordination with the larger team.
- Maintain, and provide oversight of, appropriate documentation of system infrastructure, data architecture, ETL processes, data modeling, and reporting capabilities.
- Lead Data Lifecycle Management processes and procedures related to change management.
- Recommend updates to Data Lifecycle Management to improve efficiency and reduce risk related to changing our architecture, data models, and IDMC components.
- Other duties as assigned to support technical/operational activities.
Competencies / KSAs
The core competencies for a person to perform this job at a skilled level are:
- Informatica development focused on Data Engineering and Integration.
- Administration and management of the IDMC solution through a Data Management Lifecycle.
This job requires certain knowledge, skills, and abilities as follows:
- DevOps methodologies and best practices.
- Expert analytical skills to troubleshoot issues with system performance, connectivity to various data sources, disk space, and system logging.
- Experience with SOX requirements, audit reports, and DR/BC needs that are mandatory for running the data management platform.
- The eligible candidate will be trained in the role of Advanced Data Engineering and Integration using IDMC.
Key Performance Indicators
An individual's success is determined by how well they demonstrate the Company's core values:
Be a Champion with:
- Integrity and Trust: The ability to skillfully interact from top to bottom and bottom to top requires integrity and trust, which starts at the top.
Value Everyone by:
- Having a Customer Focus: We collectively service customers every day. Getting firsthand customer information and using it for improvements is essential.
- Focusing on Peer Relationships: Quickly find common ground and solve problems as a cooperative team player who can collaborate, be candid, and gain the trust and support of peers.
Be an Expert by:
- Priority Setting: Often priority setting occurs at the individual level but requires managerial guidance and vice versa. Skilled priority setting establishes this collaborative norm.
- Being Functionally / Technically proficient: Has the functional and technical knowledge and skills to do the job at a high level of accomplishment.
- Problem Solving: Probes all fruitful sources for answers.
Rise to the Occasion by:
- Learning on the Fly: With the increasing pace of change, being quick to learn and apply first-time solutions is a crucial skill.
Required Education and Experience
Required:
- Bachelor's degree in Management Information Systems, Computer Information Systems, Computer Science, or Engineering.
- Minimum of five years of professional experience with Informatica PowerCenter or Informatica Data Management Cloud.
- Competent in PostgreSQL to implement database objects on AWS Redshift.
Preferred:
- Experience working with application integration technologies such as Service-Oriented and Event-Driven Architectures (SOA / EDA), JMS messaging, and SOAP and RESTful services.
- Experience with AWS data products.
- Experience within the P&C Insurance industry in a data architecture role.
Work Environment
This job operates in a professional office environment. This role routinely uses standard office equipment such as laptop computers and smartphones.
Physical Demands
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job.
Position Type and Expected Hours of Work
This is a full-time position. Days and hours of work are Monday through Friday, 8:15 am to 4:30 pm; employees must work 37.5 hours each week to maintain full-time status.
Work schedules may differ based upon responsibilities demanding before or after hours to meet a business need.
Remote working for those that are eligible is at the discretion of Management.
Other Duties
Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee for this job. Duties, responsibilities, and activities may change at any time with or without notice.