Job Title: Hadoop Platform Engineer
Location: Dallas, TX (Onsite)
Duration: Long Term
Only W2
Qualifications:
Technical Proficiency:
Experience with Hadoop and Big Data technologies, including Cloudera CDH/CDP, Databricks, HDInsight, etc.
Strong understanding of core Hadoop ecosystem services such as HDFS, MapReduce, Kafka, Spark, Hive, Impala, HBase, Kudu, Sqoop, and Oozie.
Proficiency in RHEL operating system, database, and hardware administration.
Operations and Design:
Experience with operations, design, capacity planning, cluster setup, security, and performance tuning in large-scale enterprise Hadoop environments.
Scripting and Automation:
Proficient in shell scripting (e.g., Bash, KSH) for automation.
Security Implementation:
Experience in setting up, configuring, and managing security for Hadoop clusters using Kerberos integrated with LDAP/AD.
Problem Solving and Troubleshooting:
Strong system administration and programming skills for storage capacity management, debugging, and performance tuning.
Collaboration and Communication:
Ability to collaborate with cross-functional teams, including data engineers, data scientists, and DevOps.
Ability to provide technical guidance and support to team members and stakeholders.