Intellify Solutions - Data Engineer - ETL (1-2 yrs)
Intellify
posted 22d ago
Flexible timing
We have a job opportunity for you at Intellify Solutions.
Job Title : Data Engineer
About Intellify :
We are a preferred and trusted Technology Consulting and Development Partner for global enterprises, from start-ups to Fortune 500s. We work with stakeholders across the knowledge value chain to provide the Business Intelligence, Data Analytics, and Technology solutions that enable them to make informed decisions and solve business challenges.
Our Services include Business Intelligence, Data Analytics, Data Visualization, Artificial Intelligence, Power BI Services, Power Apps, Power Automate, Product Development, Generative AI, Low-Code/No-Code, Microsoft Fabric.
We have been in the market for 10 successful and progressive years, with 100+ projects delivered.
Intellify Solutions started its regional office in the USA in 2023.
Intellify has extensive global experience and capabilities to provide technology support across major parts of the world.
Job Description :
- The Data Engineer designs, implements, and documents data architecture and data modeling solutions, including relational and dimensional databases.
- These solutions support enterprise Information Management and Business Intelligence.
Experience : 3 to 5 Years.
Relevant Experience : 2 to 4 Years.
Qualification :
- Bachelor's or Master's degree in Computer Science, Data Science, or a related technical field, or equivalent experience.
Roles and responsibilities :
- Minimum 2-4 years of experience in designing, implementing, and supporting Data Warehousing and Business Intelligence solutions on Microsoft Fabric data pipelines.
- Create interactive and visually appealing dashboards and reports using Power BI, with a focus on user experience and data storytelling.
- Identify and implement optimizations to enhance the performance and efficiency of Power BI reports and dashboards.
- Design and implement scalable and efficient data pipelines using Azure Data Factory, PySpark notebooks, Spark SQL, and Python.
- This includes data ingestion, data transformation, and data loading processes.
- Implement ETL processes to extract data from diverse sources, transform it into suitable formats, and load it into the data warehouse or analytical systems.
- Hands-on experience in design, development, and implementation of Microsoft Fabric, Azure Data Analytics Service (Azure Data Factory - ADF, Data Lake, Azure Synapse, Azure SQL DWH, and Databricks).
- Experience in writing optimized SQL queries on MS Azure Synapse Analytics (dedicated and serverless SQL pools, etc.).
- Troubleshoot and resolve complex customer issues through deep, code-level analysis of Spark, covering Spark core internals, Spark SQL, Structured Streaming, and Delta.
- Continuously monitor and fine-tune data pipelines and processing workflows to enhance overall performance and efficiency, considering large-scale data sets.
- Experience with hybrid cloud deployments and integration between on-premises and cloud environments.
- Ensure data security and compliance with data privacy regulations throughout the data engineering process.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, and structured and unstructured data.
- Understanding of data engineering best practices such as code modularity, documentation, and version control.
- Collaborate with business stakeholders to gather requirements and create comprehensive technical solutions and documentation.
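The ETL responsibilities above follow a standard extract-transform-load pattern. As a minimal illustrative sketch only (using Python's standard library with made-up sample data, not Intellify's actual Azure Data Factory / PySpark stack), the three stages look like this:

```python
import csv
import io
import sqlite3

# Hypothetical raw data standing in for an external source system.
RAW_CSV = """order_id,region,amount
1,East,100.50
2,West,200.00
3,East,abc
"""

def extract(text):
    """Extract: parse raw CSV rows from the source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast fields to proper types, dropping malformed rows."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["order_id"]), row["region"], float(row["amount"])))
        except ValueError:
            continue  # skip rows that fail validation (e.g. non-numeric amount)
    return clean

def load(rows, conn):
    """Load: write validated rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 300.5 (the malformed third row is dropped)
```

In a production Fabric or Synapse pipeline, the same extract/transform/load split would map onto Data Factory activities and PySpark notebooks rather than in-process Python.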
Skills And Qualifications :
- Proficiency in data modeling tools.
- Experience in designing and implementing database structures.
- Experience with data warehouse, data lake & Microsoft Fabric is a plus.
- Good knowledge of Power BI, Azure Databricks, Microsoft Fabric, data modeling, and related tools (Erwin, ER Studio, or similar) required.
Certifications Good to Have :
- DP-600 certification, or equivalent knowledge, is a major plus.
Functional Areas: Software/Testing/Networking