Data Modeler (Azure Databricks)

Saxon Global

Location: Jersey City, NJ, USA

Date: 2024-11-16T07:37:52Z

Job Description:
Key Responsibilities:
  • Design, implement, and optimize data models for the data warehouse, ensuring alignment with business requirements.
  • Work extensively with **IBM IDA (InfoSphere Data Architect)** to create logical and physical data models.
  • Ensure that the data models are scalable, performant, and aligned with the organization's data strategy.
  • Collaborate with data engineers to implement and maintain data models in **Azure** and **Databricks** environments.
  • Translate business requirements into conceptual, logical, and physical data models, ensuring data quality and consistency across systems.
  • Participate in data architecture and design discussions to ensure data modeling best practices are followed.
  • Develop and maintain documentation of data models, data flow diagrams, and database design specifications.
  • Work with stakeholders to ensure the data models support analytics, reporting, and data integration needs.
  • Perform impact analysis of changes to existing models and collaborate with teams to assess the effects on downstream systems.
  • Continuously improve and optimize data models to enhance performance and data accessibility.
Qualifications:
  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • Proven experience as a **Data Modeler** working in data warehouse environments.
  • Strong expertise with **IBM IDA (InfoSphere Data Architect)** or similar data modeling tools.
  • Solid understanding of **Azure** and **Databricks** architecture, and experience working in cloud-based data environments.
  • Proficiency in relational and dimensional data modeling techniques, including star and snowflake schemas.
  • Hands-on experience with SQL and database design, with knowledge of data lakes, data warehouses, and ETL processes.
  • Strong analytical and problem-solving skills, with attention to detail.
  • Ability to communicate complex technical concepts to both technical and non-technical stakeholders.
  • Familiarity with data governance, metadata management, and data quality practices.
  • Experience with **Azure Synapse**, **Azure Data Factory**, and **Databricks Delta Lake**.
  • Knowledge of big data technologies and principles of data lakehouse architectures.
  • Experience with agile methodologies and working in a fast-paced, collaborative environment.
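As a quick illustration of the dimensional modeling technique named in the qualifications above (a star schema: a central fact table joined to dimension tables on surrogate keys), here is a minimal sketch in plain Python. All table and column names are hypothetical and chosen only to show the pattern; in practice these would be Delta tables queried with SQL.

```python
# Minimal star-schema sketch: one fact table plus two dimension tables,
# keyed by surrogate keys. Names are illustrative, not from any real system.

fact_sales = [
    {"date_key": 20240101, "product_key": 1, "amount": 120.0},
    {"date_key": 20240101, "product_key": 2, "amount": 75.5},
    {"date_key": 20240102, "product_key": 1, "amount": 60.0},
]

dim_date = {
    20240101: {"day": "2024-01-01", "quarter": "Q1"},
    20240102: {"day": "2024-01-02", "quarter": "Q1"},
}

dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Hardware"},
}

def sales_by_category(facts, products):
    """Join the fact table to a dimension and aggregate by an attribute."""
    totals = {}
    for row in facts:
        category = products[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(sales_by_category(fact_sales, dim_product))
# {'Hardware': 255.5}
```

The same shape carries over directly to warehouse SQL: the aggregation above corresponds to a `JOIN` from the fact table to `dim_product` followed by a `GROUP BY category`.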