Infiniti Research - Databricks Developer - Big Data Tools (6-8 yrs)
Flexible timing
About Quantzig :
Quantzig is a global analytics and advisory firm with offices in the US, UK, Canada, China, and India.
We have assisted clients across the globe with the implementation of end-to-end advanced analytics, visual storyboarding, machine learning, and data engineering solutions for prudent decision making.
We are a rapidly growing organization that is built and operated by high-performance champions.
If you have what it takes to be a champion, with the business and functional skills to take ownership of an entire project end to end and to help build a team with a strong work ethic and a drive to learn, you are the one we're looking for.
Clients love us for our solutioning capability and enthusiasm, and we expect you to be a part of our growth story.
We have developed exceptional expertise in advanced analytics and business intelligence solutions for transforming global organizations.
Harness the power of our intelligent, actionable insights to solve complex problems and inspire innovation, change, and growth across your organization's value chain.
Role : Databricks Developer
Experience : 6 to 8 years
Notice Period : Immediate joiners only
Location : Bengaluru (work from office, 5 days a week)
Job Description :
- We are seeking a Data Engineer with 6-8 years of experience in Databricks, Python, and SQL.
- The primary responsibility of this role is to migrate on-premises big data Spark and Impala/Hive scripts to the Databricks environment (a representative sketch follows this list).
- The ideal candidate will have a strong background in data migration projects and be proficient in transforming ETL pipelines to Databricks.
- The role requires excellent problem-solving skills and the ability to work independently on complex data migration tasks.
- Experience with big data technologies and cloud platforms (Azure) is essential.
- Join our team to lead the migration efforts and optimize our data infrastructure on Databricks.
- Excellent problem-solving skills and a passion for data accessibility.
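To make the migration task concrete, here is a minimal, hypothetical sketch: an on-premises Hive/Impala-style aggregation re-expressed as a Databricks PySpark job that persists its result to a Delta table. The database, table, and column names are illustrative only and are not taken from the posting.

```python
# Hypothetical sketch only: a legacy Hive/Impala-style aggregation
# rewritten as a Databricks PySpark job writing to a Delta table.
# Database, table, and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hive_to_databricks_migration").getOrCreate()

# Legacy logic (roughly): SELECT customer_id, SUM(amount) AS total_amount
#                         FROM sales.transactions GROUP BY customer_id
transactions = spark.read.table("sales.transactions")

customer_totals = (
    transactions
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("total_amount"))
)

# Persist the result as a managed Delta table on Databricks.
(
    customer_totals.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.customer_totals")
)
```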
Key Responsibilities :
- Lead the migration of existing Spark and Impala/Hive scripts from on-premises environments to the Databricks platform.
- Analyze existing ETL pipelines and optimize them for performance and efficiency within the Databricks environment.
- Develop and implement robust data ingestion and transformation pipelines using Databricks Delta, Spark SQL, and Python.
- Develop and maintain high-quality Databricks notebooks and jobs.
- Leverage Databricks features such as Delta Lake, MLflow, and Databricks SQL to enhance data processing capabilities.
- Implement data quality checks and validations throughout the data pipeline (a representative check is sketched after this list).
- Integrate Databricks with other Azure cloud services, such as Azure Data Factory, Azure Blob Storage, and Azure Synapse Analytics.
- Leverage cloud-native technologies and services to optimize data processing and storage costs.
- Diagnose and troubleshoot data quality issues, performance bottlenecks, and other technical challenges.
- Provide technical guidance and support to other team members.
- Collaborate effectively with data engineers, data analysts, and business stakeholders.
- Clearly communicate technical concepts and project progress to both technical and non-technical audiences.
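The data quality checks mentioned above could, for example, take the shape of lightweight validations run after a pipeline stage. The sketch below is a hypothetical example that assumes the Delta table from the earlier migration sketch; the column names and checks are illustrative, not prescribed by the posting.

```python
# Hypothetical sketch only: simple data quality validations on a Delta table
# produced earlier in the pipeline. Table and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data_quality_checks").getOrCreate()

df = spark.read.table("analytics.customer_totals")

# Check 1: the table must not be empty.
row_count = df.count()
assert row_count > 0, "Data quality check failed: table is empty"

# Check 2: the key column must not contain nulls.
null_keys = df.filter(F.col("customer_id").isNull()).count()
assert null_keys == 0, f"Data quality check failed: {null_keys} null customer_id rows"

# Check 3: aggregated amounts should be non-negative.
negative_totals = df.filter(F.col("total_amount") < 0).count()
assert negative_totals == 0, f"Data quality check failed: {negative_totals} negative totals"

print(f"All data quality checks passed ({row_count} rows validated).")
```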
Required Skills & Experience :
Strong Proficiency in :
- Databricks platform and its core components (Spark, Delta Lake, SQL)
- Python for data engineering tasks (Pandas, PySpark)
- SQL (including Hive and Impala)
- Data warehousing concepts and best practices
- ETL/ELT methodologies
Experience with :
- Big data technologies (Hadoop, Spark, Hive)
- Cloud platforms, particularly Azure
- Agile development methodologies
- Ability to analyze complex data problems and identify effective solutions.
- Ability to work effectively in a team environment and communicate technical concepts clearly.
- A strong interest in data engineering and a desire to build high-quality data solutions.
Preferred Skills :
- Experience with data visualization tools (Tableau, Power BI)
- Knowledge of machine learning and AI concepts
- Experience with containerization technologies (Docker, Kubernetes)
Functional Areas: Software/Testing/Networking