

BIG DATA ENGINEER

Workcog

Location: Albany, MN, USA

Date: 2024-10-22T07:28:20Z

Job Description:

Company Overview

Workcog is a leading staffing and partnership company specializing in efficient staffing solutions. Our aim is to deliver the best possible partnering experience to our clients through proven, effective staffing practices. We take pride in our quality of service and our expertise across a range of industries.

Job Overview

Workcog is seeking a highly skilled Big Data Engineer to join our dynamic team. As a Big Data Engineer, you will be responsible for designing, developing, and maintaining our data infrastructure and systems. This role requires expert knowledge of big data technologies and the ability to analyze large datasets to extract valuable insights. The ideal candidate has extensive experience managing big data projects and a strong understanding of data warehousing concepts.

Qualifications and Skills

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Minimum of 5 years of experience in big data engineering or a similar role.
  • Strong programming skills in languages such as Java, Python, or Scala.
  • Proficiency in big data technologies such as Hadoop, Spark, Kafka, and Hive.
  • Experience with data warehousing concepts and relational databases like Oracle or MySQL.
  • Knowledge of data modeling and database design principles.
  • Familiarity with cloud platforms like AWS or Azure.
  • Excellent problem-solving and analytical skills.
  • Ability to work collaboratively in a team environment.
  • Strong communication and interpersonal skills.

Roles and Responsibilities

  • Design and develop scalable big data solutions using technologies like Hadoop, Spark, and NoSQL databases.
  • Build and maintain data pipelines and ETL processes to collect and transform large volumes of data.
  • Optimize and fine-tune the performance of big data systems to ensure efficient data processing and analysis.
  • Collaborate with cross-functional teams to gather requirements and identify opportunities for leveraging big data technologies.
  • Develop and implement data governance policies and procedures to ensure data integrity and security.
  • Troubleshoot and debug issues in big data systems and provide timely resolutions.
  • Stay up to date with emerging trends and technologies in big data and propose innovative solutions to improve data processing and analysis.
  • Conduct performance testing and capacity planning to optimize the performance and scalability of big data systems.

Skills: Big Data, Databricks, Azure Data Factory, Azure Delta Lake, Selenium, AWS, Hadoop

Apply Now!
