I applied via campus placement at Andhra University and was interviewed before Mar 2023. There were 3 interview rounds.
Mostly it was based on Python coding.
They gave us one topic to check our communication skills.
I applied via Company Website
EMR is a managed Hadoop framework for processing large amounts of data, while EC2 is a scalable virtual server in AWS.
EMR stands for Elastic MapReduce and is a managed Hadoop framework for processing large amounts of data.
EC2 stands for Elastic Compute Cloud and is a scalable virtual server in Amazon Web Services (AWS).
EMR allows for easy provisioning and scaling of Hadoop clusters, while EC2 provides resizable compute capacity in the cloud.
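For illustration only, a minimal boto3 sketch of the difference: provisioning a small EMR cluster versus a single EC2 instance. All names, the AMI ID, instance types, and roles below are placeholders, not from the interview.

```python
import boto3

# Hypothetical example: EMR spins up a managed Hadoop/Spark cluster,
# while EC2 gives you raw virtual machines that you manage yourself.
emr = boto3.client("emr", region_name="us-east-1")
cluster = emr.run_job_flow(
    Name="demo-etl-cluster",                       # placeholder name
    ReleaseLabel="emr-6.15.0",
    Applications=[{"Name": "Spark"}, {"Name": "Hadoop"}],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)

ec2 = boto3.client("ec2", region_name="us-east-1")
instance = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",               # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
```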
I have experience working with both Star and Snowflake schemas in my projects.
Star schema is a denormalized schema where one central fact table is connected to multiple dimension tables.
Snowflake schema is a normalized schema where dimension tables are further normalized into sub-dimension tables.
Used Star schema for simpler, smaller datasets where performance is a priority.
Used Snowflake schema for complex, larger datasets where reducing data redundancy matters.
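As a rough sketch of the difference, a hedged PySpark example; the table and column names (fact_sales, dim_product, dim_category, product_id, category_id) are made up for illustration.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("schema-demo").getOrCreate()

# Star schema: the fact table joins directly to a denormalized dimension.
fact_sales = spark.table("fact_sales")
dim_product = spark.table("dim_product")   # category attributes stored inline
star = fact_sales.join(dim_product, "product_id")

# Snowflake schema: the dimension is normalized further, so the same
# question needs an extra join to the sub-dimension table.
dim_category = spark.table("dim_category")
snowflake = (fact_sales
             .join(dim_product, "product_id")
             .join(dim_category, "category_id"))
```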
Yes, I have used Python and PySpark in my projects for data engineering tasks.
I have used Python for data manipulation, analysis, and visualization.
I have used PySpark for big data processing and distributed computing.
I have experience in writing PySpark jobs to process large datasets efficiently.
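A short PySpark sketch of that kind of job; the S3 paths and column names are assumptions for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-agg").getOrCreate()

# Read a large CSV export, cast types, and aggregate per day.
orders = spark.read.option("header", True).csv("s3://example-bucket/orders/")
daily = (orders
         .withColumn("amount", F.col("amount").cast("double"))
         .groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.countDistinct("customer_id").alias("customers")))

daily.write.mode("overwrite").parquet("s3://example-bucket/agg/daily/")
```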
Yes, I have experience with serverless schema.
I have worked with AWS Lambda to build serverless applications.
I have experience using serverless frameworks like Serverless Framework or AWS SAM.
I have designed and implemented serverless architectures using services like AWS API Gateway and AWS DynamoDB.
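A minimal sketch of the kind of Lambda handler that could sit behind API Gateway and write to DynamoDB; the table name "events" and the payload shape are assumptions.

```python
import json
import boto3

# Hypothetical handler: API Gateway invokes this Lambda with a JSON body,
# and the item is stored in a DynamoDB table named "events" (placeholder).
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("events")

def lambda_handler(event, context):
    body = json.loads(event.get("body") or "{}")
    table.put_item(Item={"id": body["id"], "payload": body})
    return {"statusCode": 200, "body": json.dumps({"status": "stored"})}
```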
Display in Databricks is used to visualize data in a tabular format or as charts/graphs.
Display function is used to show data in a tabular format in Databricks notebooks.
It can also be used to create visualizations like charts and graphs.
Display can be customized with different options like title, labels, and chart types.
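For example, in a Databricks notebook (where display() is a notebook built-in; the table name below is just a placeholder):

```python
# Any Spark DataFrame works; the sample table name is a placeholder.
df = spark.table("samples.nyctaxi.trips")

# Renders an interactive table in the output cell; the chart picker there
# lets you switch to bar, line, or scatter visualizations.
display(df)
```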
To create a workflow in Databricks, use Databricks Jobs or Databricks Notebooks with scheduling capabilities.
Use Databricks Jobs to create and schedule workflows in Databricks.
Utilize Databricks Notebooks to define the workflow steps and dependencies.
Leverage Databricks Jobs API for programmatic workflow creation and management.
Use Databricks Jobs UI to visually design and schedule workflows.
Integrate with Databricks D...
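A hedged sketch of creating a simple scheduled job through the Jobs API (2.1); the workspace URL, token, notebook path, and cluster settings are all placeholders.

```python
import requests

host = "https://<your-workspace>.cloud.databricks.com"    # placeholder workspace
headers = {"Authorization": "Bearer <personal-access-token>"}

# One-task workflow: run a notebook nightly at 02:00 UTC on a new cluster.
job_spec = {
    "name": "nightly-etl",
    "tasks": [{
        "task_key": "ingest",
        "notebook_task": {"notebook_path": "/Workspace/etl/ingest"},
        "new_cluster": {
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
    }],
    "schedule": {"quartz_cron_expression": "0 0 2 * * ?", "timezone_id": "UTC"},
}

resp = requests.post(f"{host}/api/2.1/jobs/create", headers=headers, json=job_spec)
print(resp.json())   # returns the new job_id on success
```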
I applied via Company Website and was interviewed in Mar 2024. There was 1 interview round.
I applied via Recruitment Consultant and was interviewed in Sep 2023. There were 2 interview rounds.
Use the find command with the -mtime option to find files that are 30 days old in Linux.
Use the find command with the -mtime option to specify the number of days.
For example, to find files that are exactly 30 days old: find /path/to/directory -mtime 30
To find files that are older than 30 days: find /path/to/directory -mtime +30
To find files that are newer than 30 days: find /path/to/directory -mtime -30
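For comparison, a rough Python equivalent of the +30 case (the directory path is a placeholder):

```python
import os
import time

# Roughly equivalent to: find /path/to/directory -mtime +30
# (files whose modification time is more than 30 days ago).
cutoff = time.time() - 30 * 24 * 60 * 60
for root, _dirs, files in os.walk("/path/to/directory"):
    for name in files:
        path = os.path.join(root, name)
        if os.path.getmtime(path) < cutoff:
            print(path)
```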
Use the COPY command in Redshift to load data from AWS S3.
Use the COPY command in Redshift to load data from an S3 bucket.
Specify the IAM role with necessary permissions in the COPY command.
Provide the S3 file path and Redshift table name in the COPY command.
Ensure the Redshift cluster has the necessary permissions to access S3.
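A hedged sketch of running that COPY from Python with psycopg2; the cluster endpoint, credentials, table, bucket, and IAM role ARN are all placeholders.

```python
import psycopg2

# Placeholder connection details for a Redshift cluster.
conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="admin", password="********",
)

# COPY pulls the files directly from S3 using the attached IAM role.
with conn, conn.cursor() as cur:
    cur.execute("""
        COPY public.sales
        FROM 's3://example-bucket/exports/sales/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
        FORMAT AS CSV
        IGNOREHEADER 1;
    """)
```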
I applied via Naukri.com and was interviewed in Jan 2024. There was 1 interview round.
I applied via Job Portal and was interviewed in Feb 2024. There was 1 interview round.
The first round was DAX, the second was SQL queries, and the third was Python.
I applied via Recruitment Consultant and was interviewed in May 2023. There were 3 interview rounds.
Machine learning algorithms are used to train models on data to make predictions or decisions.
Supervised learning algorithms include linear regression, decision trees, and neural networks.
Unsupervised learning algorithms include clustering and dimensionality reduction.
Reinforcement learning algorithms involve learning through trial and error.
Examples of machine learning applications include image recognition and natural language processing.
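A tiny scikit-learn sketch of the supervised case, on synthetic data (purely illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic data standing in for a real labelled dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Train a decision tree (one of the supervised algorithms mentioned above).
model = DecisionTreeClassifier(max_depth=4, random_state=42)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```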
posted on 7 Oct 2023
Basic dynamic programming (DP) and array questions.
Data Analyst | 7 salaries | ₹4.1 L/yr - ₹8 L/yr
Trainer | 7 salaries | ₹3 L/yr - ₹4.8 L/yr
SME | 6 salaries | ₹2.4 L/yr - ₹5 L/yr
Data Scientist | 4 salaries | ₹4 L/yr - ₹5.5 L/yr
Corporate Trainer | 4 salaries | ₹3 L/yr - ₹6.6 L/yr