Data Engineer

Viking Global Investors

Location: New York, NY, USA

Date: 2024-09-29

Job Description:

Viking Global Investors LP is a global investment firm founded in 1999. We manage more than $48 billion of capital for our investors across public equity, private equity, and credit and structured capital investment strategies. We have more than 275 employees and offices in Stamford, New York, Hong Kong, London, and San Francisco.

LOCATION: 660 Fifth Ave, New York, NY (in-person attendance required)

JOB FUNCTION

The Data Engineer is a member of the Investment Data Engineering team and is primarily responsible for building data pipelines to support investment research. The role is focused on applying modern data engineering principles to rapidly deliver trustworthy and thoughtfully curated datasets to data analysts, data scientists, and the investment staff. Additionally, the role is responsible for building analytical tools, frameworks, and other software to facilitate data collection efforts and analyses. Responsibilities may include, but are not limited to:

  • Own the full data pipeline lifecycle, including gathering requirements, orchestrating tasks, writing performant Python and SQL code, implementing data validation, and providing ongoing support.
  • Work with stakeholders to translate their needs into a clearly defined technical implementation.
  • Work with cloud data warehouses, including creating, manipulating, and modifying objects.
  • Build complex Airflow DAGs, including creating custom operators.
  • Write efficient and modular data transformations using frameworks like data build tool (dbt).
  • Conduct data explorations in Tableau to discover anomalies and identify data inaccuracies.
  • Conduct code reviews and participate in architecture and systems design discussions.
  • Ensure timely delivery of projects and proactively communicate updates to stakeholders.
QUALIFICATIONS

The ideal candidate must have:
  • A minimum of 3 years of relevant work experience.
  • A degree in Computer Science or a related field, with a record of academic success.
  • Excellent computer science fundamentals and problem-solving skills, including an understanding of object-oriented and functional programming principles.
  • Experience in the fields of data warehousing, pipeline orchestration, and business intelligence, including familiarity with the extract-load-transform (ELT) data integration process.
  • Proficiency in Python and SQL.
  • Experience with pipeline orchestration tools such as Apache Airflow, Luigi, Prefect, or Dagster.
  • Experience with OLAP or cloud data warehouses such as Snowflake, Google BigQuery, Databricks SQL, or Redshift.
  • Experience working in a cloud ecosystem such as AWS, Azure, or GCP.
  • Experience working in a Linux environment.
The ideal candidate will also have:
  • Experience with Snowflake, Apache Airflow, Databricks, Apache Spark (PySpark), data build tool (dbt), Tableau, Retool, GitLab, and AWS.
  • Familiarity with CI/CD patterns, Docker, and containerization.
  • Familiarity with data observability concepts and platforms.
  • Prior investment management or financial services industry experience.
The base salary range for this position in New York City is $120,000 to $175,000. In addition to base salary, Viking employees may be eligible for other forms of compensation and benefits, such as a discretionary bonus, 100% coverage of medical and dental premiums, and paid lunches. Actual compensation for successful candidates will be individually determined based on multiple factors including, but not limited to, a candidate's skill set, experience, education, and other qualifications. For more information on our benefits, please visit www.vikingglobal.com/life-at-viking/.

Viking is an equal opportunity employer. Questions about your candidacy and requests for reasonable accommodation in the recruitment process should be directed to ...@vikingglobal.com.