StaidLogic Software

5.0 (based on 15 reviews)


StaidLogic - Data Engineer - Google Cloud Platform (10-12 yrs)

10-12 years

StaidLogic Software

posted 4d ago

Job Role Insights: Flexible timing

Job Description

Role: Cloud Data Engineer

Location: Pune

Experience: 10 years total

Role Description:

- This is a contract role for 2 Cloud Data Engineers located on-site in Pune.

- The Cloud Data Engineers will be responsible for data engineering, data modeling, ETL processes, data warehousing, and data analytics on a day-to-day basis.

- Proficient in working with Azure Blob Storage, Azure Data Lake Storage, Azure Data Factory, Azure SQL Data Warehouse, and Azure Databricks, as well as Python, SQL, and PL/SQL concepts.

- Worked on migrating and transforming data from on-premises systems to the cloud, and between cloud services, by creating Azure Data Factory pipelines and data flows using different ADF activities and components.

- Experience in writing Spark applications (Python/Scala) that connect to cloud services such as Azure SQL DB, Azure Postgres, Azure Synapse Analytics, ADLS, and AWS S3, and perform data transformations based on business requirements.

- Familiar with Azure DevOps to deploy ADF pipelines and ARM Templates into other environments by creating release pipelines.

- Experience in legacy data migration projects, such as Big Data to AWS Redshift migration, i.e., from on-premises to the AWS Cloud and the Snowflake data warehouse.

- Experience with Snowflake SQL and Snowflake pipelines to pull data from AWS S3 and traditional databases.

- Worked on a source-consumer application in Google Cloud using GCP services such as Google Cloud Storage, Cloud Dataproc, BigQuery, Cloud Pub/Sub, and Cloud Composer.

- Experienced in creating Airflow DAGs using the most common Airflow operators (PythonOperator, BashOperator, and the Google Cloud operators) to orchestrate data flow across Google Cloud services.

- Experienced in writing Spark applications (Python/Scala) that connect to cloud services such as Google Cloud Storage and Cloud SQL, and perform data transformations based on business requirements.

- Proficient in working with AWS S3, AWS Glue, the Redshift data warehouse, and AWS EC2, as well as Python, SQL, and PL/SQL concepts.

- Responsible for designing and implementing data pipelines using AWS services such as S3, Glue, EC2, and EMR.

- Building and maintaining data warehouses and data lakes using AWS Redshift, and managing data security and access controls using AWS IAM.

- Design, develop, and maintain ETL (Extract, Transform, Load) processes using AWS Glue to extract data from various sources, transform it to meet business requirements, and load it into target data sources.

- Worked with the DevOps team to create environment-specific configuration YAML files to deploy code through the CI/CD process, creating artifacts from a central repository.

- Familiar with Processing Real-Time Streaming data using Azure Event Hubs and Azure Stream Analytics and visualizing the results using Power BI and Tableau.

- Experienced in Fact-Dimensional modeling (Star Schema, Snowflake Schema), transactional modeling, and SCD (Slowly Changing Dimension).

- Experienced in designing and creating RDBMS tables and views, user-defined data types, indexes, stored procedures, cursors, triggers, and transactions.

- Good knowledge of and hands-on experience with the ETL tools Informatica and Talend: understanding existing flows, modifying them, and creating data flows based on requirements.

- Worked extensively with source code management and version control tools like Git, GitHub, and GitLab.

- Familiar with the Agile methodology; used JIRA to track work progress and Confluence to prepare and manage technical documentation.

- Familiar with components of the Hadoop ecosystem such as HDFS and Hive.
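The AWS Glue bullet above centres on extract-transform-load work. As a rough illustration of the ETL pattern itself (not the Glue API, which is PySpark-based), here is a minimal sketch in plain Python; the field names and the salary-band rule are hypothetical:

```python
# Minimal sketch of the extract-transform-load pattern.
# Field names and the banding rule are invented for illustration;
# a real AWS Glue job would use PySpark and Glue DynamicFrames.
import csv
import io

RAW_CSV = """id,name,salary
1,Asha,1200000
2,Ravi,950000
"""

def extract(text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and derive a salary band (assumed business rule)."""
    out = []
    for r in rows:
        salary = int(r["salary"])
        out.append({
            "id": int(r["id"]),
            "name": r["name"].strip(),
            "salary": salary,
            "band": "senior" if salary >= 1_000_000 else "mid",
        })
    return out

def load(rows, target):
    """Load: append rows into the target store (a list standing in for a table)."""
    target.extend(rows)
    return target

warehouse = []
load(transform(extract(RAW_CSV)), warehouse)
print(warehouse[0]["band"])  # -> senior
```

In a Glue job, the same three stages map to reading from a source (e.g. S3 via a crawler-defined catalog table), applying transforms, and writing to the target data store.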
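The dimensional-modeling bullet above mentions Slowly Changing Dimensions. A minimal Type 2 (history-preserving) merge can be sketched in plain Python; the `key`/`city` columns are hypothetical, and production systems would express this as a MERGE statement in the warehouse:

```python
# Minimal sketch of a Type 2 Slowly Changing Dimension (SCD2) merge.
# Column names are invented for illustration.
from datetime import date

def scd2_merge(dimension, incoming, today):
    """Expire changed current rows and insert new versions (Type 2 history)."""
    current = {r["key"]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        old = current.get(rec["key"])
        if old is None or old["city"] != rec["city"]:
            if old is not None:
                old["is_current"] = False   # expire the old version
                old["end_date"] = today
            dimension.append({              # insert the new current version
                "key": rec["key"], "city": rec["city"],
                "start_date": today, "end_date": None, "is_current": True,
            })
    return dimension

dim = [{"key": 1, "city": "Pune", "start_date": date(2020, 1, 1),
        "end_date": None, "is_current": True}]
scd2_merge(dim, [{"key": 1, "city": "Mumbai"}], date(2024, 6, 1))
# dim now holds two rows for key 1: the expired Pune row and the current Mumbai row.
```

The same expire-and-insert logic is what a star-schema dimension table carries out on each load, keeping full history instead of overwriting (which would be Type 1).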
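The RDBMS bullet above lists tables, views, indexes, and triggers. These objects can be demonstrated end to end with Python's built-in sqlite3 module; the `orders` schema is invented for the example:

```python
# Illustration of common RDBMS objects (table, index, view, trigger)
# using Python's stdlib sqlite3. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    amount REAL NOT NULL,
    status TEXT DEFAULT 'new'
);
CREATE INDEX idx_orders_status ON orders(status);

-- View: only completed orders
CREATE VIEW completed_orders AS
    SELECT id, amount FROM orders WHERE status = 'done';

-- Trigger: audit every insert into a log table
CREATE TABLE order_log (order_id INTEGER, note TEXT);
CREATE TRIGGER trg_order_insert AFTER INSERT ON orders
BEGIN
    INSERT INTO order_log VALUES (NEW.id, 'created');
END;
""")

cur.execute("INSERT INTO orders (amount, status) VALUES (250.0, 'done')")
cur.execute("INSERT INTO orders (amount) VALUES (99.5)")
conn.commit()

views = cur.execute("SELECT COUNT(*) FROM completed_orders").fetchone()[0]
logged = cur.execute("SELECT COUNT(*) FROM order_log").fetchone()[0]
print(views, logged)  # -> 1 2
```

SQLite lacks stored procedures and server-side cursors, so those items from the bullet would be exercised in an engine like SQL Server, Oracle, or Postgres instead.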


Functional Areas: Software/Testing/Networking



StaidLogic Software Benefits

Health Insurance
Soft Skill Training
Work From Home
Job Training
Team Outings
Education Assistance +6 more

