Ideas2IT Technologies - Data Engineer - ETL/Python (5-12 yrs)
Flexible timing
Job Role:
As a Data Engineer, you'll build and maintain data pipelines and architectures. Responsibilities include optimizing databases and ETL processes using Python or SQL, and collaborating with data teams to support informed decision-making.
Why Choose Ideas2IT:
Ideas2IT has the best attributes of both a product startup and a services company. Because we build and launch our own products, you will have ample opportunities to learn and contribute. And unlike single-product companies, which tend to stagnate in the technologies they use, our multiple product initiatives and customer-facing projects give you the opportunity to work on a variety of technologies.
AGI is going to change the world, and big companies like Microsoft are betting heavily on it. We are following suit. As a Data Engineer, you will focus exclusively on engineering data pipelines for complex products.
What's in it for you?
You will work on diverse technology challenges like:
- A robust distributed platform to manage a self-healing swarm of bots on unreliable network/compute
- Large-scale Cloud-Native applications
- A document comprehension engine leveraging RNNs and the latest OCR techniques
- Completely data-driven low-code platform
- You will leverage cutting-edge technologies like Blockchain, IoT, and Data Science as you work on projects for leading Silicon Valley startups.
- Your role does not start or end with just data engineering; you will enjoy the freedom to share your suggestions on the choice of tech stacks throughout the project
- If there is a particular technology you would like to explore, you can build your own technical PoCs
Work in a culture that values capability over experience and treats continuous learning as a core tenet.
Here's what you'll bring:
- Proficiency in SQL and experience with database technologies (e.g., MySQL, PostgreSQL, SQL Server).
- Experience in at least one cloud environment: AWS or Azure
- Experience with data modeling, data warehousing, and building ETL pipelines.
- Experience building large-scale data pipelines and data-centric applications using any distributed storage platform
- Experience with data processing tools like Pandas and PySpark (a minimal sketch follows this list)
- Experience in cloud services like S3, Lambda, SQS, Redshift, Azure Data Factory, ADLS, Function Apps, etc.
- Expertise in one or more high-level languages (Python/Scala)
- Ability to handle large-scale structured and unstructured data from internal and third-party sources
- Ability to collaborate with analytics and business teams to improve data models that feed business intelligence tools, increase data accessibility, and foster data-driven decision-making across the organization
- Experience with data visualization tools like Power BI and Tableau
- Experience with containerization technologies like Docker and Kubernetes
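
To make the Pandas and cloud-storage expectations above concrete, here is a minimal illustrative ETL sketch, assuming a hypothetical orders CSV on S3 and a hypothetical PostgreSQL target; the bucket, key, table, and connection string are placeholders, not part of this posting:

```python
# Minimal illustrative ETL sketch (not an Ideas2IT code sample).
# Assumptions: a hypothetical "orders" CSV on S3, a hypothetical PostgreSQL target,
# and placeholder bucket/key/table/connection values.
import pandas as pd
from sqlalchemy import create_engine


def extract(bucket: str, key: str) -> pd.DataFrame:
    # pandas can read s3:// paths directly when s3fs is installed
    return pd.read_csv(f"s3://{bucket}/{key}")


def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Example cleanup: deduplicate and derive a total column
    df = df.drop_duplicates()
    df["order_total"] = df["quantity"] * df["unit_price"]
    return df


def load(df: pd.DataFrame, table: str, conn_str: str) -> None:
    # Write the cleaned frame to a relational target
    engine = create_engine(conn_str)
    df.to_sql(table, engine, if_exists="replace", index=False)


if __name__ == "__main__":
    frame = transform(extract("example-bucket", "raw/orders.csv"))
    load(frame, "orders_clean", "postgresql://user:password@localhost:5432/analytics")
```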
Functional Areas: Software/Testing/Networking