Data Engineer - Data Build Tool/Snowflake DB (4-7 yrs)
ResourceTree
posted 14hr ago
Key skills for the job
Primary Skills: DBT, Snowflake
Secondary Skills: ADF, Databricks, Python, Airflow, Fivetran, Glue
Role Description :
This data engineering role requires creating and managing the technological infrastructure of a data platform: being in charge of, or involved in, architecting, building, and managing data flows/pipelines, and constructing data stores (NoSQL, SQL), big-data tooling (Hadoop, Kafka), and integration tools that connect sources and other databases.
Role Responsibility :
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into code
- Develop efficient code with unit testing and code documentation
- Ensure accuracy and integrity of data and applications through analysis, coding, documentation, testing, and problem solving
- Set up the development environment and configure the development tools
- Communicate project status to all project stakeholders
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules, wherever required
- Be proficient in written, verbal, and presentation communication (English)
- Coordinate with the UAT team
Role Requirement :
- Proficient in basic and advanced SQL programming concepts (procedures, analytical/window functions, etc.; a brief illustration follows this list)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell / PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in Data Profiling and Data validation
- Experience in requirements gathering and documentation processes and performing unit testing
- Understand and implement QA and the various testing processes in the project
- Knowledge in any BI tools will be an added advantage
- Sound aptitude, outstanding logical reasoning, and analytical skills
- Willingness to learn and take initiatives
- Ability to adapt to a fast-paced Agile environment
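As a rough illustration of the analytical-function proficiency mentioned above, the sketch below uses a window function to keep only the latest record per key; the table and column names (raw_customers, customer_id, updated_at) are hypothetical and not taken from this posting.

    -- Hypothetical sketch: keep only the latest row per customer using an
    -- analytical (window) function. Table/column names are illustrative.
    WITH ranked AS (
        SELECT
            customer_id,
            email,
            updated_at,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id
                ORDER BY updated_at DESC
            ) AS rn
        FROM raw_customers
    )
    SELECT customer_id, email, updated_at
    FROM ranked
    WHERE rn = 1;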
Additional Requirement :
- Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Snowflake, ensuring effective transformation and loading of data from diverse sources into the data warehouse or data lake (a minimal DBT model sketch follows this list).
- Implement and manage data models in DBT, guaranteeing accurate data transformation and alignment with business needs.
- Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting.
- Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.
- Establish DBT best practices and processes to improve performance, scalability, and reliability.
- Expertise in SQL and a strong understanding of Data Warehouse concepts and Modern Data Architectures.
- Familiarity with cloud-based platforms (e.g., AWS, Azure, GCP).
- Migrate legacy transformation code into modular DBT data models.
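For context, a minimal sketch of what a modular DBT model on Snowflake can look like; the model, upstream staging model, and column names here are hypothetical, not taken from this posting.

    -- models/marts/dim_customers.sql (hypothetical file/model name)
    -- Minimal DBT model sketch: materialized as a table in Snowflake and
    -- built on an assumed upstream staging model, stg_customers.
    {{ config(materialized='table') }}

    select
        customer_id,
        first_name,
        last_name,
        created_at
    from {{ ref('stg_customers') }}

Here, ref() resolves the upstream model and lets DBT build the dependency graph, which is what makes migrated transformation code modular and testable.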
Functional Areas: Software/Testing/Networking