Amazon
Logical reasoning, verbal communication
Facing difficulties is a common challenge in leadership roles, requiring resilience and problem-solving skills.
Stay calm and composed under pressure
Seek advice and support from mentors or colleagues
Break down the problem into smaller tasks to tackle it effectively
Learn from past experiences and adapt your approach accordingly
I applied via Job Fair and was interviewed before Aug 2022. There were 5 interview rounds.
It's a general aptitude test which includes written communication, English grammar, some logical-reasoning questions, and listening skills.
The second round depends on your profile. For business/data analyst roles, there is a written test of 20 questions: 16 based on SQL/Python basics, and the last 4 requiring a SQL query or a Python script.
It is tough, and if you pass it, roughly 50% of the interview is over.
I applied via Naukri.com
I applied via Approached by Company and was interviewed in Jul 2024. There were 2 interview rounds.
I applied via LinkedIn and was interviewed in Jun 2024. There were 2 interview rounds.
Snowflake has limitations such as maximum table size, maximum number of columns, and maximum number of concurrent queries.
Snowflake has a maximum table size of 16TB for all tables, including temporary and transient tables.
There is a limit of 1600 columns per table in Snowflake.
Snowflake has a maximum of 10,000 concurrent queries per account.
There are also limits on the number of objects (databases, schemas, tables, etc.) per account.
Use SQL functions like SUBSTRING and CHARINDEX to split a staged data row into separate columns (see the sketch after this list)
Use SUBSTRING function to extract specific parts of the row
Use CHARINDEX function to find the position of a specific character in the row
Use CASE statements to create separate columns based on conditions
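A minimal sketch of this approach, assuming the staged file has been loaded into a single-column table RAW_STAGE(raw_line) holding comma-separated values; the table and column names here are hypothetical:

-- First field: everything before the first comma.
-- Second field: text between the first and second commas.
SELECT
  SUBSTRING(raw_line, 1, CHARINDEX(',', raw_line) - 1) AS first_col,
  SUBSTRING(
    raw_line,
    CHARINDEX(',', raw_line) + 1,
    CHARINDEX(',', raw_line, CHARINDEX(',', raw_line) + 1) - CHARINDEX(',', raw_line) - 1
  ) AS second_col,
  -- CASE can flag rows that did not contain the expected delimiter.
  CASE WHEN CHARINDEX(',', raw_line) = 0 THEN 'unsplit' ELSE 'split' END AS parse_status
FROM RAW_STAGE;

In Snowflake, SPLIT_PART(raw_line, ',', n) is often a simpler alternative for the same task.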
Monitor an overnight data load job in Snowflake (a sample monitoring query follows this list)
Set up alerts and notifications for job completion or failure
Check job logs for any errors or issues
Monitor resource usage during the data load process
Use Snowflake's query history to track job progress
Implement automated retries in case of failures
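A minimal sketch of such a check, assuming the overnight job loads a table named SALES_FACT (a hypothetical name) via COPY INTO; it uses Snowflake's INFORMATION_SCHEMA.COPY_HISTORY table function:

-- Files loaded in the last 12 hours, with row counts, errors, and load status.
SELECT file_name,
       last_load_time,
       row_count,
       error_count,
       status
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
       TABLE_NAME => 'SALES_FACT',
       START_TIME => DATEADD(hour, -12, CURRENT_TIMESTAMP())))
ORDER BY last_load_time DESC;

Alerts and retries on failure would then be wired up through whatever schedules the job, for example the orchestration tool or Snowflake tasks.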
I applied via Approached by Company and was interviewed in May 2024. There was 1 interview round.
Joins in SQL are used to combine rows from two or more tables based on a related column between them.
Joins are used to retrieve data from multiple tables based on a related column between them
Types of joins include INNER JOIN, LEFT JOIN, RIGHT JOIN, and FULL JOIN
Example: SELECT * FROM table1 INNER JOIN table2 ON table1.column = table2.column
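A short worked sketch of how join types differ, assuming two hypothetical tables orders(order_id, customer_id) and customers(customer_id, name):

-- INNER JOIN: only orders whose customer_id has a match in customers.
SELECT o.order_id, c.name
FROM orders o
INNER JOIN customers c ON o.customer_id = c.customer_id;

-- LEFT JOIN: every order, with NULL in name when no matching customer exists.
SELECT o.order_id, c.name
FROM orders o
LEFT JOIN customers c ON o.customer_id = c.customer_id;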
I applied via LinkedIn and was interviewed in Jul 2024. There was 1 interview round.
Python DSA questions in LeetCode style; prepare for these.
I have a strong background in data engineering with experience in various industries.
Bachelor's degree in Computer Science with a focus on data engineering
Worked as a Data Engineer at XYZ Company, where I developed and maintained data pipelines
Implemented data quality checks and automated data validation processes
Collaborated with cross-functional teams to design and implement scalable data solutions
Experience with clo...
Facebook is a leading social media platform with a vast user base and cutting-edge technology.
Facebook has over 2.8 billion monthly active users, providing a massive data source for analysis and engineering.
The company has a strong focus on innovation and constantly develops new technologies and tools.
Facebook's data infrastructure is highly advanced, allowing for complex data processing and analysis.
Working at Facebook ...
Spark is a distributed computing framework used for big data processing.
Spark is an open-source project under Apache Software Foundation.
It can process data in real-time and batch mode.
Spark provides APIs for programming in Java, Scala, Python, and R.
It can be used for various big data processing tasks like machine learning, graph processing, and SQL queries.
Spark uses in-memory processing for faster data processing.
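A minimal sketch of Spark's SQL interface, runnable in the spark-sql shell; the file path and column names are hypothetical assumptions:

-- Register a CSV file as a temporary view using Spark's built-in csv data source.
CREATE TEMPORARY VIEW clicks
USING csv
OPTIONS (path '/data/clicks.csv', header 'true', inferSchema 'true');

-- Spark distributes this aggregation across the cluster, in memory where possible.
SELECT user_id, COUNT(*) AS click_count
FROM clicks
GROUP BY user_id
ORDER BY click_count DESC
LIMIT 10;

The same pipeline can also be written with the DataFrame APIs in Java, Scala, Python, or R.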
Coding questions on SQL: given 2 tables, join them and find the results after the join.
I applied via Company Website and was interviewed before Mar 2023. There were 3 interview rounds.
1. Basic SQL
2. Python-based questions
3. Data modelling
4. Spark
5. Cloud-based questions
SQL, Python, data modeling, and project-based questions
Interview experience: based on 16 reviews
Designation | Salaries reported | Salary range
Customer Service Associate | 4.2k | ₹0.6 L/yr - ₹5 L/yr
Transaction Risk Investigator | 3.1k | ₹2 L/yr - ₹6.5 L/yr
Associate | 2.8k | ₹0.8 L/yr - ₹6.8 L/yr
Senior Associate | 2.4k | ₹2 L/yr - ₹10.1 L/yr
Program Manager | 2.1k | ₹9 L/yr - ₹36 L/yr
Flipkart
TCS
Netflix