To Apply for this Job Click Here

Team:
We are looking for someone who can work independently as well as within a team, with previous experience taking ownership of features and functionality, a track record of building well-tested services, and the ability to investigate any technical component as well as understand and contribute to the overall systems architecture.

This is a highly sensitive, business-critical integration. It is critical for revenue reconciliation and recognition, as well as other reports for revenue and bounty sharing. We are also in the middle of accounting unification; through this project, we will be reporting HULU data to BRIM. Without this resource we would miss deadlines and milestones, impacting accounting unification.

Tech Stack:
- Scala
- Python
- Spark
- Databricks

You will work alongside 14 others on the FUSE project, aggregating data from multiple systems into a unified data stream. The role is eligible for extension after the initial duration. We are looking for someone in New York.

Basic Qualifications
- 5+ years of experience with:
- building large datasets and scalable services
- deploying and running services in AWS, and engineering big-data solutions using technologies like Databricks, EMR, S3, Spark
- container systems such as Docker or Kubernetes
- Drive and maintain a culture of quality, innovation and experimentation
- Work in an Agile environment that focuses on collaboration and teamwork

Preferred Qualifications
- Snowflake is a nice-to-have

Required Education
Degree required in CS or another engineering discipline.

Day to Day:
We are looking for a Senior Data Engineer to join the DATOS team. You will build highly resilient and scalable data pipelines that publish enriched transaction data into our data lake and analytical data warehouses. This role is highly multi-functional and offers the opportunity to work with Product, Information Security, Data, Analytics, Business Operations, Finance, and various other functions across the company as we consume, enrich, and publish highly impactful data. You take pride in being responsible for your features, focusing on quality, reliability, and scale, and are mindful of maintainability.

What you'll do:
- Build sophisticated and highly impactful systems
- Write code in Java, Scala, Kotlin, and Python, using Spark
- Use AWS products and services (EC2, S3, Lambda, DynamoDB, SQS, RDS, ElastiCache, CloudFront, Kinesis etc.)
- Use ETL products like Databricks (critical)
- Use data warehouses like Snowflake (nice to have)
- Develop RESTful services
- Work with SQL and NoSQL