Job Description
Job Title: Data Engineer with Amdocs Experience
Job Location: Orlando, Florida
Work Type: Onsite
Job Overview: We are seeking a highly skilled and experienced Data Engineer with expertise in Amdocs systems. The ideal candidate will have a strong background in data engineering, ETL processes, and data pipeline development, along with in-depth knowledge of Amdocs products and services. As a Data Engineer, you will be responsible for designing, developing, and maintaining scalable data pipelines and ensuring seamless integration with Amdocs systems to support our business objectives.
Key Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines to support data integration, processing, and analysis.
- Collaborate with cross-functional teams to gather and understand data requirements and translate them into technical specifications.
- Implement ETL (Extract, Transform, Load) processes to ensure accurate and efficient data movement between systems.
- Integrate data from various sources, including Amdocs systems, ensuring data quality, consistency, and reliability.
- Optimize data pipelines for performance, scalability, and cost-efficiency.
- Develop and maintain documentation for data pipelines, ETL processes, and data integration workflows.
- Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption to business operations.
- Stay updated with the latest industry trends, technologies, and best practices in data engineering and Amdocs systems.
- Provide technical guidance and support to junior data engineers and other team members as needed.
Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience as a Data Engineer, with a focus on data pipeline development and ETL processes.
- Strong expertise in Amdocs products and services, including integration and data management.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with data integration tools and frameworks (e.g., Apache Kafka, Apache NiFi, Talend).
- Solid understanding of database systems, SQL, and NoSQL databases.
- Knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud) and related data services.
- Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Qualifications:
- Master's degree in Computer Science, Information Technology, or a related field.
- Experience with data warehousing solutions and BI tools (e.g., Snowflake, Redshift, Tableau).
- Certifications in Amdocs systems or related technologies.
- Experience with DevOps practices and tools (e.g., Jenkins, Docker, Kubernetes).
- Knowledge of data governance and security best practices.