Data Modeler - Spark (6-11 yrs)
Coders Brain
Posted 1 month ago
Flexible timing
Data Modeler Job Description :
You will work as a Data Modeler familiar with cloud-based architectures, bringing DW/BI/Analytics domain knowledge and experience in Databricks or Spark.
You will partner closely with clients to deliver state-of-the-art engineering solutions. You will communicate with a diverse set of internal and external stakeholders with varying degrees of technical proficiency, delivering critical business and process-related information to mitigate risks and failures.
You will persistently look for opportunities to address customer needs by acting as a thought partner at every point of engagement.
Responsibilities :
- Spearhead the development of conceptual, logical, and physical data models, and implement data marts, data warehouses, data lakes, and data lakehouses on Spark-based platforms (a minimal sketch of a physical model on Delta follows this list).
- Analyse existing models to identify variances/gaps and devise strategies to achieve alignment.
- Establish and enforce best practices and standards for data models.
- Generate Data Flow diagrams and relevant artifacts to depict data integrations within targeted systems.
- Collaborate closely with Data Owners to understand business data requirements and clarify terms and business rules.
- Facilitate walkthrough sessions with Data Owners and stakeholders to review and validate data models for accuracy.
- Demonstrate strong communication skills and effective engagement with delivery colleagues to ensure successful project outcomes.
- Comply with Data Architecture standards and guardrails.
- Proactively manage issues, risks, actions, and dependencies, escalating when necessary.
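Purely as illustration of the kind of physical model involved, here is a minimal PySpark sketch of a star-schema fragment on Delta; it assumes a Databricks or Delta-enabled Spark session, and the table and column names are hypothetical, not part of the role:

from pyspark.sql import SparkSession

# Assumes Delta Lake is available (e.g. a Databricks cluster).
spark = SparkSession.builder.getOrCreate()

# Hypothetical dimension table with SCD2 housekeeping columns.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_sk    BIGINT,
        customer_id    STRING,
        customer_name  STRING,
        effective_from DATE,
        effective_to   DATE,
        is_current     BOOLEAN
    ) USING DELTA
""")

# Hypothetical fact table, partitioned by date for typical warehouse queries.
spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        sale_id     BIGINT,
        customer_sk BIGINT,
        sale_date   DATE,
        amount      DECIMAL(18, 2)
    ) USING DELTA
    PARTITIONED BY (sale_date)
""")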
Experience :
Must have :
- At least 5 years of Data Modelling experience on Data Lake/Delta Lakehouse/Data Warehouse implementations on Spark-based platforms on Azure/AWS/GCP.
- At least 1 year of experience working with Databricks.
- Mandatory experience in Data Modelling on Delta Lakehouse.
- Knowledge of handling Spark streaming data over a Delta Lakehouse (see the streaming sketch after this list).
- Hands-on experience performing reverse and forward engineering of as-is and to-be data models in an industry-leading data modelling tool such as Erwin or ER/Studio.
- Ability to use visual notations to express logical and physical data models, state transitions, transformations, and versioning; familiarity with UML, BPMN, ERDs, etc. is ideal.
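As a hedged illustration of streaming into a Delta Lakehouse (assumes Spark 3.1+ with Delta Lake configured; the table names and checkpoint path below are hypothetical):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incrementally append streaming data from a raw (bronze) Delta table into a
# curated table; the checkpoint tracks progress across restarts.
events = spark.readStream.table("raw_events")

(events.writeStream
       .format("delta")
       .outputMode("append")
       .option("checkpointLocation", "/tmp/checkpoints/fact_events")
       .toTable("fact_events"))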
Good to have :
- Knowledge of Databricks concepts such as the Apache Spark API, Delta, Delta Live Tables, feature stores, DBR (Databricks Runtime), job/interactive clusters, notebook management, Repos, and MLflow (a Delta Live Tables sketch follows this list).
- Knowledge of orchestration tools such as Azure Data Factory and Airflow.
- Experience with other cloud data warehouses such as Snowflake, BigQuery, and Amazon Redshift.
- Knowledge of any Data Governance tool.
- Knowledge of Unity Catalog.
- Experience in DevOps and CI/CD.
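For context on the Delta Live Tables item above, a minimal sketch of a declarative DLT dataset; this only runs inside a Databricks DLT pipeline, and the dataset names are hypothetical:

import dlt
from pyspark.sql import functions as F

# Declarative aggregate over an upstream DLT dataset; DLT manages the
# underlying Delta table, its dependencies, and refresh scheduling.
@dlt.table(comment="Daily sales totals derived from raw_events")
def daily_sales():
    return (
        dlt.read("raw_events")
           .groupBy("sale_date")
           .agg(F.sum("amount").alias("total_amount"))
    )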
Functional Areas: Other