Data Engineer (MLOps) - Fully Remote

Averity

Location: All cities, AK, USA

Date: 2024-11-24T07:53:56Z

Job Description:

How would you like to become a Data/MLOps Engineer at an AI-centric performance management tech company? This is a fully remote role (and always will be) that is a high priority to fill.

What's the Job?

As one of our Data/MLOps Engineers, you'll work with other experienced, high-performing engineers and data professionals on our Data Ingestion team (currently six people). Initially, you will continue the work we have started on building our new Data Warehouse: migrating from Redshift to Snowflake and pipelining all our product data into it. You'll then help develop the MLOps infrastructure, which is the next phase of our development. We currently have two AI/ML models in production and are looking to host our own LLMs in the future.

Ideal candidates are Data Engineers with an interest in MLOps, or MLEs/MLOps Engineers who are open to doing the initial Data Engineering work, knowing the role will become more MLOps-focused over time. We highly value flexible engineers who can work across all data-related aspects of engineering. We primarily use Python, SQL, Snowflake (migrating from Redshift), Airflow, dbt, Databricks, and Kafka.

This is a fully remote position and always will be; there won't be any sudden back-to-office announcements. We officially bring the Engineering team together once every six months, and some engineers meet sporadically at their convenience (for example, we have several engineers in the NYC area).

Who Are We?

We are using AI to create the pinnacle of performance management platforms. As part of our Data Ingestion team, you'll play a crucial role in ensuring the data we need is primed for AI-driven insights.

What Skills Do I Need?

  • Experience building data pipelines and data warehouses using Python, SQL, dbt, Airflow, and Snowflake.
  • Experience building machine learning model infrastructure for deployed models.
  • Experience working in cross-functional teams, including Data Engineers, Data Scientists, Software Engineers, etc.
  • Strong plus if you have built APIs for Slack/chatbots, used the Databricks platform, and have experience with event-driven systems using Kafka.
Compensation:
  • $150,000 Base Salary
  • Equity
  • Bonus Eligible
  • 401(k) with matching
  • Full Benefits (Health, Dental, Vision)
  • Loads of other perks (too many to name)
  • Fully Remote
Apply Now!
