AWS Databricks Cloud Data Engineer (Sr.) 100% Remote

Cyrten

Location: All cities, AK, USA

Date: 2024-10-01T05:22:47Z

Job Description:
AWS Databricks Cloud Data Engineer (Sr.) 100% Remote
Location: Remote
Preferred Time Zones: EST, CST, MTN
Rate: DOE - W2
Length: 1 to 3 years
Top 3-5 skills:
  1. Hands-on Terraform; able to build from scratch
  2. Scripting: shell (Unix), Python
  3. Infrastructure experience deploying platforms
  4. Data processing platforms such as Databricks (or Snowflake)
  5. Experience with data analytics in a cloud environment
Tip for success: Underneath the Skills section of your resume, add a Terraform section. Don't be shy; list everything you have ever done in Terraform. Terraform is a major key to this opportunity.
Qualifications:
  • Bachelor's degree in Computer Science, Management Information Systems, Computer Engineering, or related field or equivalent work experience; advanced degree preferred
  • Seven or more years of experience as an AWS Data Engineer or Architect in designing and building large-scale solutions in an enterprise setting in both development and deployment
  • Five or more years of experience designing and building solutions in the cloud
  • Expertise in building and managing Cloud databases such as AWS RDS, DynamoDB, DocumentDB, or analogous architectures
  • Expertise in building Cloud Database Management Systems in Databricks Lakehouse or analogous architectures
  • Expertise in Cloud Data Warehouses in Redshift, BigQuery, or analogous architectures is a plus
  • Deep SQL expertise, data modeling, and experience with data governance in relational databases
  • Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (SparkSQL, Hadoop, Kafka) distributed technologies
  • Refined skills using one or more scripting languages (e.g., Python, bash, etc.)
  • Experience using ETL/ELT tools and technologies such as Talend or Informatica is a plus
  • Embrace data platform thinking: design and develop data pipelines with security, scale, uptime, and reliability in mind
  • Expertise in relational and dimensional data modeling
  • UNIX admin and general server administration experience required
  • Experience with Presto, Hive, SparkSQL, Cassandra, Solr, or other Big Data query and transformation technologies a plus (see the PySpark sketch after this list)
  • Experience using Spark, Kafka, Hadoop, or similar distributed data technologies a plus
  • Able to expertly express the benefits and constraints of technology solutions to technology partners, business partners, and team members
  • Experience with leveraging CI/CD pipelines
  • Experience with Agile methodologies and the ability to work in an Agile manner preferred
  • One or more cloud certifications
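
For a flavor of the SparkSQL and data-pipeline work these qualifications describe, here is a minimal PySpark sketch. It is illustrative only: the table names, columns, and the daily-rollup use case are hypothetical, not taken from this posting.

```python
# Minimal PySpark/SparkSQL sketch: load raw events, apply light
# cleanup, and publish an aggregate for reporting.
# All table and column names here are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-daily-rollup").getOrCreate()

# Read a raw landing table (e.g., a Databricks Delta table).
events = spark.read.table("raw.raw_events")

# Light cleanup: cast the timestamp and derive a date key.
cleaned = (
    events
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
)

# Register the cleaned data and aggregate with SparkSQL.
cleaned.createOrReplaceTempView("events_clean")
daily = spark.sql("""
    SELECT event_date, event_type, COUNT(*) AS event_count
    FROM events_clean
    GROUP BY event_date, event_type
""")

# Publish the rollup for downstream reports (e.g., Tableau).
daily.write.mode("overwrite").saveAsTable("analytics.events_daily")
```
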
Responsibilities:
  • Understand technology vision and strategic direction of business needs
  • Understand our current data model and infrastructure, proactively identify gaps and areas for improvement, and prescribe architectural recommendations focusing on performance and accessibility.
  • Partner across engineering teams to design, build, and support the next generation of our analytics systems.
  • Partner with business and analytics teams to understand specific requirements for data systems to support the development and deployment of data workloads ranging from Tableau reports to ad hoc analyses.
  • Own and develop architecture supporting translating analytical questions into effective reports that drive business action.
  • Automate and optimize existing data processing workloads by recognizing data and technology usage patterns and implementing solutions.
  • Maintain a solid grasp of the intersection between analytics and engineering, and take a proactive approach to ensure solutions deliver high performance, privacy, security, scalability, and reliability upon deployment.
  • Provide guidance to partners on the effective use of the database management systems (DBMS) platform through collaboration, documentation, and associated standard methodologies.
  • Design and build end-to-end automation to support and maintain software currency
  • Create build automation services using Terraform, Python, and OS shell scripts (a sketch follows this list).
  • Develop validation and certification processes through automation tools
  • Design integrated solutions in alignment with design patterns, blueprints, guidelines, and standard methodologies for products
  • Participate in developing solutions by incorporating cloud-native and third-party vendor products
  • Participate in research, perform POCs (proofs of concept) with emerging technologies, and adopt industry best practices in the data space to advance the cloud data platform.
  • Develop data streaming, migration, and replication solutions (see the streaming sketch after this list)
  • Demonstrate leadership, collaboration, exceptional communication, negotiation, strategic, and influencing skills to gain consensus and produce the best solutions.
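
The build-automation bullet above pairs Terraform with Python and shell. As a rough sketch of one approach, the Python wrapper below drives the standard Terraform CLI workflow (init, plan to a file, apply that file). The ./infra directory is an assumption for illustration; a real project would add workspaces, variable files, and remote state configuration.

```python
# Hedged sketch: drive the Terraform CLI from Python.
# Assumes terraform is on PATH and the configuration lives in
# ./infra -- both assumptions, not details from this posting.
import subprocess
import sys

TF_DIR = "infra"  # hypothetical module directory

def tf(*args: str) -> None:
    """Run one terraform subcommand, echoing it and failing fast."""
    cmd = ["terraform", f"-chdir={TF_DIR}", *args]
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def main() -> None:
    tf("init", "-input=false")
    tf("plan", "-input=false", "-out=tfplan")
    # Apply only the reviewed plan file, never an implicit plan.
    tf("apply", "-input=false", "tfplan")

if __name__ == "__main__":
    try:
        main()
    except subprocess.CalledProcessError as exc:
        sys.exit(exc.returncode)
```

Planning to a file and applying that exact file keeps the apply step auditable, which fits naturally into the CI/CD pipelines mentioned in the qualifications.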
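
For the streaming, migration, and replication bullet, a minimal consumer loop using the confluent-kafka Python client might look like the sketch below. The broker address, topic, and consumer group are placeholders, and the replication step itself is stubbed out with a print.

```python
# Minimal streaming sketch using the confluent-kafka client.
# Broker, topic, and group id are placeholders, not values from
# this posting.
from confluent_kafka import Consumer, KafkaError

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "replication-demo",         # placeholder group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])              # placeholder topic

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for a record
        if msg is None:
            continue
        if msg.error():
            # Ignore benign end-of-partition events; raise otherwise.
            if msg.error().code() == KafkaError._PARTITION_EOF:
                continue
            raise RuntimeError(msg.error())
        # A real replicator would write to the target system here;
        # this sketch just prints the record.
        print(msg.topic(), msg.partition(), msg.offset(), msg.value())
finally:
    consumer.close()
```
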
Please Note: Must be a U.S. citizen; this position is for a Federal Government client. No 3rd-party candidates. No 3rd-party vendors.
Apply Now!
