
Coforge Technical Analyst Interview Questions and Answers

Updated 5 Jun 2025

48 Interview questions

A Technical Analyst was asked 1mo ago
Q. What is the name of the student with the second highest score from the dictionary {"Sam":10,"Goutham":90,"Adil":70,"Vikas":99}?
Ans. 

The student with the second highest score is Goutham, who scored 90; Vikas is highest with 99, and Adil is third with 70.

  • Score Ranking: The scores are ranked as follows: Vikas (99), Goutham (90), Adil (70), and Sam (10).

  • Identifying Second Highest: To find the second highest, we can sort the scores and select the second entry.

  • Data Structure: The data is stored in a dictionary format, which allows for easy access to scores by student name.
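The sorting approach described above can be sketched in a few lines of Python (variable names are illustrative):

```python
scores = {"Sam": 10, "Goutham": 90, "Adil": 70, "Vikas": 99}

# Rank student names by their score, highest first.
ranked = sorted(scores, key=scores.get, reverse=True)
second_highest = ranked[1]
print(second_highest)  # Goutham
```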

A Technical Analyst was asked 3mo ago
Q. How do you handle upsert operations in GCP BigQuery?
Ans. 

Upsertion in GCP BigQuery allows for efficient data updates and inserts using SQL syntax.

  • Upsertion combines INSERT and UPDATE operations based on whether a record exists.

  • Use the MERGE statement for upsertion: MERGE INTO target_table USING source_table ON condition.

  • Example: MERGE INTO target_table USING source_table ON target.id = source.id WHEN MATCHED THEN UPDATE SET target.value = source.value WHEN NOT MATCHED THEN INSERT (id, value) VALUES (source.id, source.value).
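BigQuery's MERGE cannot be run locally, but the same upsert semantics can be sketched against an in-memory SQLite table, whose ON CONFLICT clause plays the role of WHEN MATCHED / WHEN NOT MATCHED (table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, value TEXT)")
conn.execute("INSERT INTO target VALUES (1, 'old'), (2, 'keep')")

# Update the row when the key already exists, insert it otherwise.
conn.executemany(
    "INSERT INTO target (id, value) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET value = excluded.value",
    [(1, "new"), (3, "inserted")],
)
rows = conn.execute("SELECT id, value FROM target ORDER BY id").fetchall()
print(rows)  # [(1, 'new'), (2, 'keep'), (3, 'inserted')]
```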

Technical Analyst Interview Questions Asked at Other Companies

asked in Coforge
Q1. Write a program to get a list of employees whose salary is greate ...
Q2. How would you optimize the job scheduling code written in the fir ...
asked in Coforge
Q3. Explain Security authentication implementation and what are the d ...
asked in Coforge
Q4. How did you implement the microservices in your project; what a ...
asked in ION Group
Q5. You are specialized in data analysis, so what is the diff betwee ...
A Technical Analyst was asked 3mo ago
Q. How do you load a CSV file to BigQuery?
Ans. 

Loading a CSV file to BigQuery can be done using the web UI, command line, or API.

  • Use the BigQuery web UI: Navigate to BigQuery, select your dataset, click 'Create Table', and upload your CSV file.

  • Command-line tool: Use the bq command with 'bq load' command. Example: bq load --source_format=CSV dataset.table gs://bucket/file.csv

  • Using Python client library: Use 'google-cloud-bigquery' to load CSV. Example: client.load_table_from_uri(uri, table_id, job_config=job_config).

A Technical Analyst was asked 3mo ago
Q. Write an SQL query to find the third highest salary of an employee in each department.
Ans. 

Retrieve the third highest salary of employees within each department using SQL queries with ranking functions.

  • Use of RANK() or DENSE_RANK(): These functions help in assigning a rank to each salary within a department.

  • Common Table Expression (CTE): A CTE can be used to simplify the query structure and improve readability.

  • Example Query: WITH RankedSalaries AS (SELECT department_id, salary, DENSE_RANK() OVER (PARTITION BY department_id ORDER BY salary DESC) AS rnk FROM employee) SELECT department_id, salary FROM RankedSalaries WHERE rnk = 3;
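The DENSE_RANK approach runs unchanged on SQLite, so it can be verified locally with Python's sqlite3 module (the sample data is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (department_id INT, salary INT)")
conn.executemany(
    "INSERT INTO employee VALUES (?, ?)",
    [(1, 100), (1, 90), (1, 80), (1, 70), (2, 60), (2, 50), (2, 40)],
)

# Rank salaries within each department and keep rank 3.
rows = conn.execute("""
    WITH RankedSalaries AS (
        SELECT department_id, salary,
               DENSE_RANK() OVER (PARTITION BY department_id
                                  ORDER BY salary DESC) AS rnk
        FROM employee
    )
    SELECT department_id, salary FROM RankedSalaries
    WHERE rnk = 3 ORDER BY department_id
""").fetchall()
print(rows)  # [(1, 80), (2, 40)]
```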

A Technical Analyst was asked 3mo ago
Q. What are the differences between Avro and Parquet files in GCP?
Ans. 

Avro and Parquet are both widely used storage formats in GCP, each with unique features and use cases: Avro is row-based, while Parquet is columnar.

  • Avro is a row-based storage format, while Parquet is a columnar storage format.

  • Avro is best for write-heavy operations and supports schema evolution, making it suitable for streaming data.

  • Parquet is optimized for read-heavy operations and is efficient for analytical queries, making it ideal for big data processing.

A Technical Analyst was asked 3mo ago
Q. What are the types of partitioning in GCP BigQuery?
Ans. 

GCP BigQuery supports time-based and integer range partitioning for efficient data management and querying.

  • Time-based partitioning: Automatically partitions data based on a TIMESTAMP or DATE column. Example: daily partitions for sales data.

  • Integer range partitioning: Divides data into ranges based on an INTEGER column. Example: partitioning user IDs into ranges.

  • Partitioned tables improve query performance and reduce cost by limiting the data scanned.

A Technical Analyst was asked 3mo ago
Q. Write an SQL query to find the complete employee hierarchy from a given employee to the CEO, assuming the employee table contains the hierarchy. The result should be in a single row.
Ans. 

SQL query to retrieve the complete hierarchy of an employee in a single row from an employee table.

  • Use a Common Table Expression (CTE) to recursively fetch the hierarchy.

  • Example CTE (the join is ON e.id = eh.manager_id so the recursion walks upward toward the CEO): WITH RECURSIVE EmployeeHierarchy AS (SELECT id, name, manager_id FROM employee WHERE id = ? UNION ALL SELECT e.id, e.name, e.manager_id FROM employee e INNER JOIN EmployeeHierarchy eh ON e.id = eh.manager_id) SELECT * FROM EmployeeHierarchy;

  • Replace '?' with the employee's ID; to collapse the chain into one row, aggregate the names (e.g. with STRING_AGG).
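A runnable sketch with SQLite, using an upward join (e.id = c.manager_id) and GROUP_CONCAT standing in for STRING_AGG to collapse the chain into a single row (the sample data is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INT, name TEXT, manager_id INT)")
conn.executemany(
    "INSERT INTO employee VALUES (?, ?, ?)",
    [(1, "CEO", None), (2, "VP", 1), (3, "Manager", 2), (4, "Analyst", 3)],
)

# Walk upward from employee 4 to the CEO, then collapse to one row.
chain = conn.execute("""
    WITH RECURSIVE chain AS (
        SELECT id, name, manager_id FROM employee WHERE id = ?
        UNION ALL
        SELECT e.id, e.name, e.manager_id
        FROM employee e JOIN chain c ON e.id = c.manager_id
    )
    SELECT GROUP_CONCAT(name, ' -> ') FROM chain
""", (4,)).fetchone()[0]
print(chain)  # e.g. Analyst -> Manager -> VP -> CEO
```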

A Technical Analyst was asked 3mo ago
Q. Given an employee table with missing employee IDs that should follow a sequence (e.g., 101 to 150), write an SQL query to find the missing employee IDs.
Ans. 

SQL query to identify missing employee IDs in a sequential range from 101 to 150.

  • Use a Common Table Expression (CTE) to generate a sequence of numbers from 101 to 150.

  • Join the generated sequence with the Employee table to find missing IDs.

  • Example SQL query (Oracle syntax): WITH seq AS (SELECT 101 + LEVEL - 1 AS emp_id FROM dual CONNECT BY LEVEL <= 50) SELECT emp_id FROM seq LEFT JOIN Employee ON seq.emp_id = Employee.emp_id WHERE Employee.emp_id IS NULL;
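CONNECT BY is Oracle-specific; a recursive CTE is the portable way to generate the sequence. A runnable SQLite sketch (the gaps at 105, 117 and 149 are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (emp_id INT)")
conn.executemany(
    "INSERT INTO employee VALUES (?)",
    [(i,) for i in range(101, 151) if i not in (105, 117, 149)],
)

# Generate 101..150 with a recursive CTE, then anti-join to find gaps.
missing = sorted(r[0] for r in conn.execute("""
    WITH RECURSIVE seq(emp_id) AS (
        SELECT 101 UNION ALL SELECT emp_id + 1 FROM seq WHERE emp_id < 150
    )
    SELECT s.emp_id FROM seq s
    LEFT JOIN employee e ON s.emp_id = e.emp_id
    WHERE e.emp_id IS NULL
"""))
print(missing)  # [105, 117, 149]
```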

A Technical Analyst was asked 3mo ago
Q. What do you know about Dataproc in GCP?
Ans. 

Google Cloud Dataproc is a fully managed service for running Apache Spark and Hadoop clusters in the cloud.

  • Quickly create and manage clusters for big data processing.

  • Supports popular open-source tools like Apache Spark, Hadoop, and Hive.

  • Integrates seamlessly with other GCP services like BigQuery and Cloud Storage.

  • Allows for autoscaling of clusters based on workload demands.

  • Example: Use Dataproc to process large datasets with Spark jobs.

A Technical Analyst was asked 3mo ago
Q. What are the different optimization techniques in Spark?
Ans. 

Spark optimization techniques enhance performance by improving resource utilization and reducing execution time.

  • 1. Catalyst Optimizer: Automatically optimizes query plans in Spark SQL, improving execution efficiency.

  • 2. Tungsten Execution Engine: Focuses on memory management and code generation for better performance.

  • 3. Data Serialization: Use efficient serialization formats like Kryo to reduce data transfer time.

  • 4...

Coforge Technical Analyst Interview Experiences

32 interviews found

Interview experience: 2 (Poor)
Difficulty level: Moderate
Process Duration: 4-6 weeks
Result: No response

I appeared for an interview in Feb 2025.

Round 1 - Technical 

(4 Questions)

  • Q1. Explain Spark Architecture.
  • Ans. 

    Apache Spark is a distributed computing system designed for fast data processing and analytics.

    • Spark operates on a master-slave architecture with a Driver and Executors.

    • The Driver program coordinates the execution of tasks and maintains the SparkContext.

    • Executors are worker nodes that execute tasks and store data in memory for fast access.

    • Spark uses Resilient Distributed Datasets (RDDs) for fault tolerance and parallel processing.

  • Answered by AI
  • Q2. What are the different optimization techniques in Spark.
  • Ans. 

    Spark optimization techniques enhance performance by improving resource utilization and reducing execution time.

    • 1. Catalyst Optimizer: Automatically optimizes query plans in Spark SQL, improving execution efficiency.

    • 2. Tungsten Execution Engine: Focuses on memory management and code generation for better performance.

    • 3. Data Serialization: Use efficient serialization formats like Kryo to reduce data transfer time.

    • 4. Bro...

  • Answered by AI
  • Q3. Difference between SparkSession and SparkContext.
  • Ans. 

    SparkSession is the entry point for Spark SQL, while SparkContext is the entry point for Spark Core functionalities.

    • SparkSession encapsulates SparkContext and provides a unified entry point for DataFrame and SQL operations.

    • SparkContext is used to connect to a Spark cluster and is the primary interface for Spark Core functionalities.

    • You can create a SparkSession using: `SparkSession.builder.appName('example').getOrCreate()`

  • Answered by AI
  • Q4. What are different read and write modes.
  • Ans. 

    Read and write modes define how data is accessed and modified in files or streams, impacting data integrity and performance.

    • Read Mode (r): Opens a file for reading only. Example: file = open('data.txt', 'r')

    • Write Mode (w): Opens a file for writing, truncating the file if it exists. Example: file = open('data.txt', 'w')

    • Append Mode (a): Opens a file for writing, appending data to the end without truncating. Example: file = open('data.txt', 'a')

  • Answered by AI
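The three modes can be demonstrated in a short, self-contained Python snippet (the temp-file path is illustrative):

```python
import os, tempfile

path = os.path.join(tempfile.mkdtemp(), "data.txt")

with open(path, "w") as f:   # 'w' creates or truncates the file
    f.write("first\n")
with open(path, "a") as f:   # 'a' appends without truncating
    f.write("second\n")
with open(path, "r") as f:   # 'r' reads only
    content = f.read()
print(content)  # first\nsecond\n
```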
Round 2 - Technical 

(2 Questions)

  • Q1. What are the challenges you faced in previous project.
  • Ans. 

    Faced challenges in data accuracy, stakeholder communication, and adapting to market changes in previous projects.

    • Data Accuracy: Encountered discrepancies in historical data which required extensive validation and cleaning before analysis.

    • Stakeholder Communication: Misalignment with stakeholders on project goals led to revisions; implemented regular updates to ensure clarity.

    • Market Changes: Rapid shifts in market trend...

  • Answered by AI
  • Q2. What are your preferences whether Azure or GCP.
  • Ans. 

    Both Azure and GCP have unique strengths; preference depends on specific project needs and organizational goals.

    • Azure offers seamless integration with Microsoft products, ideal for enterprises using Windows Server and SQL Server.

    • GCP excels in data analytics and machine learning, with tools like BigQuery and TensorFlow for advanced data processing.

    • Azure has a strong hybrid cloud strategy, allowing businesses to integrat...

  • Answered by AI
Interview experience: 1 (Bad)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: Not Selected

I applied via a recruitment consultant and was interviewed in Sep 2024. There were 2 interview rounds.

Round 1 - One-on-one 

(2 Questions)

  • Q1. Create Custom hook to fetch data
  • Ans. 

    Custom hook to fetch data in React

    • Create a custom hook using the 'useEffect' and 'useState' hooks

    • Use 'fetch' or any other method to fetch data from an API

    • Return the fetched data from the custom hook

  • Answered by AI
  • Q2. Javascript concept and output
Round 2 - One-on-one 

(2 Questions)

  • Q1. React hooks and lifecycle
  • Q2. Custom hook and some tricky output question


Technical Analyst Interview Questions & Answers

Manojkumar Muralitharan

posted on 14 Jan 2025

Interview experience: 5 (Excellent)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - Aptitude Test 

Good; it boosted my confidence.

Round 2 - Coding Test 

Write a program to convert digits into numbers.

Round 3 - Technical 

(1 Question)

  • Q1. Scenario of data migration
  • Ans. 

    Data migration involves transferring data from one system to another while ensuring data integrity and consistency.

    • Plan the migration process carefully to minimize downtime and data loss.

    • Backup all data before starting the migration process.

    • Verify data integrity after migration to ensure all data has been successfully transferred.

    • Consider using tools or scripts to automate the migration process.

    • Communicate with stakeho...

  • Answered by AI
Interview experience: 3 (Average)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: No response

I appeared for an interview in Feb 2025, where I was asked the following questions.

  • Q1. Tell me about yourself?
  • Q2. Types of partitioning in gcp big query?
  • Ans. 

    GCP BigQuery supports time-based and integer range partitioning for efficient data management and querying.

    • Time-based partitioning: Automatically partitions data based on a TIMESTAMP or DATE column. Example: daily partitions for sales data.

    • Integer range partitioning: Divides data into ranges based on an INTEGER column. Example: partitioning user IDs into ranges.

    • Partitioned tables improve query performance and reduce cost by limiting the data scanned.

  • Answered by AI
  • Q3. Sql query to find the third highest salary of employee in each department?
  • Ans. 

    Retrieve the third highest salary of employees within each department using SQL queries with ranking functions.

    • Use of RANK() or DENSE_RANK(): These functions help in assigning a rank to each salary within a department.

    • Common Table Expression (CTE): A CTE can be used to simplify the query structure and improve readability.

    • Example Query: WITH RankedSalaries AS (SELECT department_id, salary, DENSE_RANK() OVER (PARTITION BY department_id ORDER BY salary DESC) AS rnk FROM employee) SELECT department_id, salary FROM RankedSalaries WHERE rnk = 3;

  • Answered by AI
  • Q4. How do you load a csv file to big query?
  • Ans. 

    Loading a CSV file to BigQuery can be done using the web UI, command line, or API.

    • Use the BigQuery web UI: Navigate to BigQuery, select your dataset, click 'Create Table', and upload your CSV file.

    • Command-line tool: Use the bq command with 'bq load' command. Example: bq load --source_format=CSV dataset.table gs://bucket/file.csv

    • Using Python client library: Use 'google-cloud-bigquery' to load CSV. Example: client.load_table_from_uri(uri, table_id, job_config=job_config).

  • Answered by AI
  • Q5. Upsertion in gcp big query?
  • Ans. 

    Upsertion in GCP BigQuery allows for efficient data updates and inserts using SQL syntax.

    • Upsertion combines INSERT and UPDATE operations based on whether a record exists.

    • Use the MERGE statement for upsertion: MERGE INTO target_table USING source_table ON condition.

    • Example: MERGE INTO target_table USING source_table ON target.id = source.id WHEN MATCHED THEN UPDATE SET target.value = source.value WHEN NOT MATCHED THEN INSERT (id, value) VALUES (source.id, source.value).

  • Answered by AI
  • Q6. How do you debug a long running sql query in gcp big query?
  • Ans. 

    Debugging long-running SQL queries in GCP BigQuery involves analyzing execution plans, optimizing queries, and monitoring performance.

    • Use the BigQuery Query Execution Details to analyze the execution plan and identify bottlenecks.

    • Check for large data scans; use SELECT statements to limit the amount of data processed.

    • Optimize joins by ensuring that you are using the correct join types and filtering data early.

    • Consider u...

  • Answered by AI
  • Q7. About dataproc in gcp?
  • Ans. 

    Google Cloud Dataproc is a fully managed service for running Apache Spark and Hadoop clusters in the cloud.

    • Quickly create and manage clusters for big data processing.

    • Supports popular open-source tools like Apache Spark, Hadoop, and Hive.

    • Integrates seamlessly with other GCP services like BigQuery and Cloud Storage.

    • Allows for autoscaling of clusters based on workload demands.

    • Example: Use Dataproc to process large datasets with Spark jobs.

  • Answered by AI
  • Q8. Avro and parquet file difference in gcp?
  • Ans. 

    Avro and Parquet are both widely used storage formats in GCP, each with unique features and use cases: Avro is row-based, while Parquet is columnar.

    • Avro is a row-based storage format, while Parquet is a columnar storage format.

    • Avro is best for write-heavy operations and supports schema evolution, making it suitable for streaming data.

    • Parquet is optimized for read-heavy operations and is efficient for analytical queries, making it ideal for big data processing.

    • Av...

  • Answered by AI
  • Q9. There is an employee to ceo level complete hierarchy in employee table, sql query to find the complete hierarchy of a given employee? Result should be in one row only.
  • Ans. 

    SQL query to retrieve the complete hierarchy of an employee in a single row from an employee table.

    • Use a Common Table Expression (CTE) to recursively fetch the hierarchy.

    • Example CTE (the join is ON e.id = eh.manager_id so the recursion walks upward toward the CEO): WITH RECURSIVE EmployeeHierarchy AS (SELECT id, name, manager_id FROM employee WHERE id = ? UNION ALL SELECT e.id, e.name, e.manager_id FROM employee e INNER JOIN EmployeeHierarchy eh ON e.id = eh.manager_id) SELECT * FROM EmployeeHierarchy;

    • Replace '?' with the employee's ID; to collapse the chain into one row, aggregate the names (e.g. with STRING_AGG).

  • Answered by AI
  • Q10. Employee table has some missing employee id records which normally follow a sequence for employee id ranging from suppose say 101 to 150. Sql query to find missing employee ids?
  • Ans. 

    SQL query to identify missing employee IDs in a sequential range from 101 to 150.

    • Use a Common Table Expression (CTE) to generate a sequence of numbers from 101 to 150.

    • Join the generated sequence with the Employee table to find missing IDs.

    • Example SQL query (Oracle syntax): WITH seq AS (SELECT 101 + LEVEL - 1 AS emp_id FROM dual CONNECT BY LEVEL <= 50) SELECT emp_id FROM seq LEFT JOIN Employee ON seq.emp_id = Employee.emp_id WHERE Employee.emp_id IS NULL;

  • Answered by AI
Interview experience: 4 (Good)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - Technical 

(2 Questions)

  • Q1. OOPS concepts with examples
  • Ans. 

    OOP concepts include inheritance, encapsulation, polymorphism, and abstraction.

    • Inheritance: Allows a class to inherit properties and behavior from another class. Example: class Dog extends Animal.

    • Encapsulation: Bundles data and methods that operate on the data into a single unit. Example: private variables with public methods.

    • Polymorphism: Allows objects of different classes to be treated as objects of a common superclass, enabling method overriding.

  • Answered by AI
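The four concepts can be shown together in a small Python sketch (the class names are illustrative):

```python
class Animal:
    def __init__(self, name):
        self.__name = name          # encapsulation: attribute kept private

    @property
    def name(self):                 # controlled read access to the private field
        return self.__name

    def speak(self):                # abstraction: subclasses must override
        raise NotImplementedError

class Dog(Animal):                  # inheritance
    def speak(self):
        return f"{self.name} says woof"

class Cat(Animal):
    def speak(self):
        return f"{self.name} says meow"

# polymorphism: one call site, different behaviour per class
sounds = [a.speak() for a in (Dog("Rex"), Cat("Tom"))]
print(sounds)  # ['Rex says woof', 'Tom says meow']
```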
  • Q2. Explain NF in sql & ACID concepts
  • Ans. 

    NF in SQL refers to Normal Form which is used to eliminate redundancy in database design. ACID concepts ensure data integrity in transactions.

    • NF in SQL stands for Normal Form and is used to organize data in a database to eliminate redundancy and dependency.

    • There are different levels of NF such as 1NF, 2NF, 3NF, and BCNF, each with specific rules to follow.

    • ACID concepts (Atomicity, Consistency, Isolation, Durability) ensure that database transactions are processed reliably.

  • Answered by AI

Interview Preparation Tips

Interview preparation tips for other job seekers - Basics are very important
Interview experience: 5 (Excellent)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: Not Selected

I appeared for an interview in Apr 2025, where I was asked the following questions.

  • Q1. What is the name of the student with the second highest score from the dict {"Sam":10,"Goutham":90,"Adil":70,"Vikas":99}?
  • Ans. 

    The student with the second highest score is Goutham, who scored 90; Vikas is highest with 99, and Adil is third with 70.

    • Score Ranking: The scores are ranked as follows: Vikas (99), Goutham (90), Adil (70), and Sam (10).

    • Identifying Second Highest: To find the second highest, we can sort the scores and select the second entry.

    • Data Structure: The data is stored in a dictionary format, which allows for easy access to scores by student name.

  • Answered by AI
  • Q2. Can you tell me about yourself?
Interview experience: 4 (Good)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - Technical 

(1 Question)

  • Q1. What are the Python class lifecycle methods?
  • Ans. 

    Python class lifecycle methods are special methods that are automatically called at different points in the life cycle of a class object.

    • Constructor method (__init__): Called when a new instance of the class is created.

    • Destructor method (__del__): Called when an instance of the class is about to be destroyed.

    • String representation method (__str__): Called when the object needs to be represented as a string.

    • Getter and setter methods: Control read and write access to attributes (e.g. via @property).

  • Answered by AI
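A minimal sketch of the lifecycle hooks mentioned above (the class is illustrative):

```python
class Resource:
    def __init__(self, name):   # constructor: runs when the object is created
        self.name = name

    def __str__(self):          # string representation, used by str() and print()
        return f"Resource({self.name})"

    def __del__(self):          # destructor: runs before the object is destroyed
        pass                    # e.g. close files or network handles here

r = Resource("db")
text = str(r)
print(text)  # Resource(db)
```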


Interview experience: 5 (Excellent)
Difficulty level: Moderate
Process Duration: Less than 2 weeks
Result: Selected

I applied via Referral and was interviewed in May 2024. There were 3 interview rounds.

Round 1 - Technical 

(2 Questions)

  • Q1. Python coding on data type conversion and indexes
  • Ans. 

    Understanding data type conversion and index manipulation in Python is crucial for effective data analysis.

    • Data types in Python include int, float, str, list, tuple, dict, and set.

    • Type conversion can be done using functions like int(), float(), str(), list(), etc.

    • Example: int('10') converts string '10' to integer 10.

    • Indexing in Python starts at 0; for example, list = [1, 2, 3], list[0] returns 1.

    • Negative indexing allows access from the end; for example, list[-1] returns the last element.

  • Answered by AI
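The conversion and indexing rules above in runnable form:

```python
# Type conversion with built-in constructors
print(int("10"), float("2.5"), list("abc"))   # 10 2.5 ['a', 'b', 'c']

# Indexing is 0-based; negative indices count from the end
nums = [1, 2, 3]
first, last = nums[0], nums[-1]
print(first, last)        # 1 3

# Slicing returns a new list
tail = nums[1:]
print(tail)               # [2, 3]
```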
  • Q2. SQL windows functions
Round 2 - One-on-one 

(2 Questions)

  • Q1. Behavioral questions
  • Q2. Technical questions scenario based
Round 3 - HR 

(1 Question)

  • Q1. Salary discussion

Interview Preparation Tips

Interview preparation tips for other job seekers - Prepare well accordingly as per JD

Technical Analyst Interview Questions & Answers

Dhruv Bindoria

posted on 9 Sep 2024

Interview experience: 4 (Good)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - Technical 

(2 Questions)

  • Q1. What are higher order functions in javascript
  • Ans. 

    Higher order functions in JavaScript are functions that can take other functions as arguments or return functions as output.

    • Higher order functions can be used to create more flexible and reusable code.

    • Examples include functions like map, filter, and reduce in JavaScript.

    • They allow for functions to be passed as parameters, making code more concise and readable.

  • Answered by AI
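The interview answer is about JavaScript, but the idea is language-agnostic; here it is sketched in Python, where map and filter likewise take functions as arguments:

```python
def compose(f, g):          # returns a new function: a higher-order function
    return lambda x: f(g(x))

double = lambda x: x * 2
inc = lambda x: x + 1

# map/filter accept functions as arguments, like JS Array.map/filter
odds_doubled = list(map(double, filter(lambda x: x % 2, [1, 2, 3, 4])))
print(odds_doubled)             # [2, 6]
print(compose(double, inc)(3))  # 8
```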
  • Q2. Make a component for utilizing data from an API
  • Ans. 

    Create a component to fetch and display data from an API

    • Use a library like Axios or Fetch to make API requests

    • Parse the JSON data received from the API

    • Display the data in a user-friendly format on the front end

  • Answered by AI


Interview experience: 3 (Average)
Difficulty level: -
Process Duration: -
Result: -
Round 1 - One-on-one 

(1 Question)

  • Q1. Java 8, Spring Boot, and microservices core concepts
Round 2 - Technical 

(1 Question)

  • Q1. Project Designing


Coforge Interview FAQs

How many rounds are there in Coforge Technical Analyst interview?
Coforge interview process usually has 1-2 rounds. The most common rounds in the Coforge interview process are Technical, One-on-one Round and HR.
How to prepare for Coforge Technical Analyst interview?
Go through your CV in detail and study all the technologies mentioned in your CV. Prepare at least two technologies or languages in depth if you are appearing for a technical interview at Coforge. The most common topics and skills that interviewers at Coforge expect are Technical Analysis, SQL, Java, Javascript and Microservices.
What are the top questions asked in Coforge Technical Analyst interview?

Some of the top questions asked at the Coforge Technical Analyst interview -

  1. Write a program to get an employee list whose salary is greater than 50k and gra...
  2. Explain Security authentication implementation and what are the delegati...
  3. How did you implement the microservices in your project; what are the API mon...
How long is the Coforge Technical Analyst interview process?

The duration of the Coforge Technical Analyst interview process can vary, but it typically takes less than 2 weeks to complete.


Overall Interview Experience Rating

3.6/5

based on 37 interview experiences

Difficulty level

Easy 7%
Moderate 93%

Duration

Less than 2 weeks 71%
2-4 weeks 21%
4-6 weeks 7%
Coforge Technical Analyst Salary
based on 2.8k salaries
₹9.7 L/yr - ₹38.4 L/yr
110% more than the average Technical Analyst Salary in India

Coforge Technical Analyst Reviews and Ratings

based on 260 reviews

3.0/5

Rating in categories:

  • Skill development: 2.8
  • Work-life balance: 3.1
  • Salary: 3.0
  • Job security: 2.9
  • Company culture: 2.9
  • Promotions: 2.3
  • Work satisfaction: 2.8
TPF - Technical Analyst: Bangalore / Bengaluru, Greater Noida +1 · 7-12 Yrs · ₹10-28.5 LPA
Senior Software Engineer (4.9k salaries): ₹6.2 L/yr - ₹23.1 L/yr
Technical Analyst (2.8k salaries): ₹17.8 L/yr - ₹32 L/yr
Software Engineer (2.2k salaries): ₹3.5 L/yr - ₹8 L/yr
Senior Test Engineer (1.8k salaries): ₹4.8 L/yr - ₹20 L/yr
Technology Specialist (1.2k salaries): ₹12 L/yr - ₹42 L/yr
Compare Coforge with: Capgemini (3.7), Cognizant (3.7), Accenture (3.8), Infosys (3.6)