Coforge
I appeared for an interview in Feb 2025.
Apache Spark is a distributed computing system designed for fast data processing and analytics.
Spark operates on a master-slave architecture with a Driver and Executors.
The Driver program coordinates the execution of tasks and maintains the SparkContext.
Executors are worker nodes that execute tasks and store data in memory for fast access.
Spark uses Resilient Distributed Datasets (RDDs) for fault tolerance and parallel processing.
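To make the Driver/Executor split concrete, here is a minimal PySpark sketch (assuming a local `pyspark` installation; the app name and data are illustrative):

```python
from pyspark import SparkConf, SparkContext

# The Driver runs this script, creates the SparkContext, and schedules tasks.
conf = SparkConf().setAppName("rdd-demo").setMaster("local[*]")
sc = SparkContext(conf=conf)

# An RDD is split into partitions processed by Executors; lineage makes it fault tolerant.
rdd = sc.parallelize(range(1, 11), numSlices=4)
squares = rdd.map(lambda x: x * x)         # transformation (lazy)
print(squares.reduce(lambda a, b: a + b))  # action triggers distributed execution

sc.stop()
```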
Spark optimization techniques enhance performance by improving resource utilization and reducing execution time.
1. Catalyst Optimizer: Automatically optimizes query plans in Spark SQL, improving execution efficiency.
2. Tungsten Execution Engine: Focuses on memory management and code generation for better performance.
3. Data Serialization: Use efficient serialization formats like Kryo to reduce data transfer time.
4. Bro...
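As a small, hedged illustration of point 3, Kryo serialization is enabled through the Spark configuration; the adaptive execution flag shown alongside it is one common companion setting, not part of the original answer:

```python
from pyspark.sql import SparkSession

# A minimal sketch: enable Kryo serialization and adaptive query execution.
spark = (
    SparkSession.builder
    .appName("optimization-demo")
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .config("spark.sql.adaptive.enabled", "true")  # lets Spark re-optimize plans at runtime
    .getOrCreate()
)
```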
SparkSession is the entry point for Spark SQL, while SparkContext is the entry point for Spark Core functionalities.
SparkSession encapsulates SparkContext and provides a unified entry point for DataFrame and SQL operations.
SparkContext is used to connect to a Spark cluster and is the primary interface for Spark Core functionalities.
You can create a SparkSession using: `SparkSession.builder.appName('example').getOrCreate()`
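A short sketch showing both entry points together (the app name and sample data are arbitrary):

```python
from pyspark.sql import SparkSession

# SparkSession is the unified entry point for DataFrame and SQL work.
spark = SparkSession.builder.appName("example").getOrCreate()

# The underlying SparkContext is still available for RDD (Spark Core) APIs.
sc = spark.sparkContext
rdd = sc.parallelize([1, 2, 3])
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "letter"])
df.show()
```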
Read and write modes define how data is accessed and modified in files or streams, impacting data integrity and performance.
Read Mode (r): Opens a file for reading only. Example: `file = open('data.txt', 'r')`
Write Mode (w): Opens a file for writing, truncating the file if it exists. Example: `file = open('data.txt', 'w')`
Append Mode (a): Opens a file for writing, appending data to the end without truncating. Example: `file = open('data.txt', 'a')`
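A small self-contained sketch of the three modes (the file name is illustrative):

```python
# Write mode: creates the file or truncates it if it already exists.
with open("data.txt", "w") as f:
    f.write("first line\n")

# Append mode: adds to the end without truncating.
with open("data.txt", "a") as f:
    f.write("second line\n")

# Read mode: read-only access.
with open("data.txt", "r") as f:
    print(f.read())
```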
Faced challenges in data accuracy, stakeholder communication, and adapting to market changes in previous projects.
Data Accuracy: Encountered discrepancies in historical data which required extensive validation and cleaning before analysis.
Stakeholder Communication: Misalignment with stakeholders on project goals led to revisions; implemented regular updates to ensure clarity.
Market Changes: Rapid shifts in market trend...
Both Azure and GCP have unique strengths; preference depends on specific project needs and organizational goals.
Azure offers seamless integration with Microsoft products, ideal for enterprises using Windows Server and SQL Server.
GCP excels in data analytics and machine learning, with tools like BigQuery and TensorFlow for advanced data processing.
Azure has a strong hybrid cloud strategy, allowing businesses to integrate on-premises infrastructure with cloud services.
I applied via Recruitment Consultant and was interviewed in Sep 2024. There were 2 interview rounds.
Custom hook to fetch data in React
Create a custom hook using the 'useEffect' and 'useState' hooks
Use 'fetch' or any other method to fetch data from an API
Return the fetched data from the custom hook
A good round that increases your confidence level.
Write a program to convert digits into numbers
Data migration involves transferring data from one system to another while ensuring data integrity and consistency.
Plan the migration process carefully to minimize downtime and data loss.
Backup all data before starting the migration process.
Verify data integrity after migration to ensure all data has been successfully transferred.
Consider using tools or scripts to automate the migration process.
Communicate with stakeholders throughout the migration process.
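As a hedged sketch of the verification step, a simple row-count check between source and target databases might look like this (the database paths and table name are assumptions; real migrations would also compare checksums or sampled rows):

```python
import sqlite3

def row_count(db_path: str, table: str) -> int:
    """Return the number of rows in a table of a SQLite database."""
    conn = sqlite3.connect(db_path)
    try:
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    finally:
        conn.close()

source_rows = row_count("source.db", "customers")
target_rows = row_count("target.db", "customers")
assert source_rows == target_rows, "Row counts differ; investigate before cutover"
```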
I appeared for an interview in Feb 2025, where I was asked the following questions.
GCP BigQuery supports time-based and integer range partitioning for efficient data management and querying.
Time-based partitioning: Automatically partitions data based on a TIMESTAMP or DATE column. Example: daily partitions for sales data.
Integer range partitioning: Divides data into ranges based on an INTEGER column. Example: partitioning user IDs into ranges.
Partitioned tables improve query performance and reduce costs by scanning only the relevant partitions.
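A hedged sketch of creating a time-partitioned table with the Python client (`google-cloud-bigquery`); the project, dataset, and schema are assumptions:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application default credentials

table = bigquery.Table(
    "my-project.sales_dataset.daily_sales",  # hypothetical table id
    schema=[
        bigquery.SchemaField("sale_date", "DATE"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)
# Daily time-based partitioning on the sale_date column.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="sale_date"
)
client.create_table(table)
```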
Loading a CSV file to BigQuery can be done using the web UI, command line, or API.
Use the BigQuery web UI: Navigate to BigQuery, select your dataset, click 'Create Table', and upload your CSV file.
Command-line tool: Use the `bq load` command. Example: `bq load --source_format=CSV dataset.table gs://bucket/file.csv`
Using Python client library: Use 'google-cloud-bigquery' to load CSV. Example: client.load_t...
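A hedged sketch of option 3 using the `google-cloud-bigquery` client and a CSV file in Cloud Storage (the bucket, project, dataset, and table names are assumptions):

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # infer the schema from the file
)

load_job = client.load_table_from_uri(
    "gs://bucket/file.csv",        # source file from the example above
    "my-project.dataset.table",    # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish
```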
Upsertion in GCP BigQuery allows for efficient data updates and inserts using SQL syntax.
Upsertion combines INSERT and UPDATE operations based on whether a record exists.
Use the MERGE statement for upsertion: MERGE INTO target_table USING source_table ON condition.
Example: `MERGE INTO target_table AS target USING source_table AS source ON target.id = source.id WHEN MATCHED THEN UPDATE SET target.value = source.value WHEN NOT MATCHED THEN INSERT (id, value) VALUES (source.id, source.value)`
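A hedged sketch of running that MERGE from Python (the project, dataset, table names, and columns follow the example above and are assumptions):

```python
from google.cloud import bigquery

client = bigquery.Client()

merge_sql = """
MERGE INTO `my-project.dataset.target_table` AS target
USING `my-project.dataset.source_table` AS source
ON target.id = source.id
WHEN MATCHED THEN
  UPDATE SET target.value = source.value
WHEN NOT MATCHED THEN
  INSERT (id, value) VALUES (source.id, source.value)
"""

client.query(merge_sql).result()  # the upsert completes when the job finishes
```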
Debugging long-running SQL queries in GCP BigQuery involves analyzing execution plans, optimizing queries, and monitoring performance.
Use the BigQuery Query Execution Details to analyze the execution plan and identify bottlenecks.
Check for large data scans; select only the columns you need (avoid SELECT *) to limit the amount of data processed.
Optimize joins by ensuring that you are using the correct join types and filtering data early.
Consider u...
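One hedged way to see how much data a query will scan before running it is a dry run (the query and table are illustrative):

```python
from google.cloud import bigquery

client = bigquery.Client()

# A dry run validates the query and reports bytes processed without executing it.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    "SELECT order_id, amount FROM `my-project.dataset.orders` "
    "WHERE order_date = '2025-01-01'",
    job_config=job_config,
)
print(f"This query would process {job.total_bytes_processed} bytes")
```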
Avro and Parquet are both widely used storage formats in GCP, each with unique features and use cases.
Avro is a row-based storage format, while Parquet is a columnar storage format.
Avro is best for write-heavy operations and supports schema evolution, making it suitable for streaming data.
Parquet is optimized for read-heavy operations and is efficient for analytical queries, making it ideal for big data processing.
Av...
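A hedged PySpark sketch of writing the same DataFrame in both formats (writing Avro requires the separate `spark-avro` package, which is an assumption about the environment; paths and data are illustrative):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("formats-demo").getOrCreate()
df = spark.createDataFrame([(1, "alpha"), (2, "beta")], ["id", "name"])

# Columnar Parquet: efficient for analytical, read-heavy queries.
df.write.mode("overwrite").parquet("/tmp/demo_parquet")

# Row-based Avro: suited to write-heavy pipelines and schema evolution.
# Requires the org.apache.spark:spark-avro package on the classpath.
df.write.mode("overwrite").format("avro").save("/tmp/demo_avro")
```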
OOP concepts include inheritance, encapsulation, polymorphism, and abstraction.
Inheritance: Allows a class to inherit properties and behavior from another class. Example: class Dog extends Animal.
Encapsulation: Bundles data and methods that operate on the data into a single unit. Example: private variables with public methods.
Polymorphism: Allows objects of different classes to be treated as objects of a common superclass.
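A minimal Python sketch covering all four concepts (class names are illustrative):

```python
from abc import ABC, abstractmethod

class Animal(ABC):                      # abstraction: abstract base class
    def __init__(self, name):
        self._name = name               # encapsulation: internal attribute behind methods

    @abstractmethod
    def speak(self) -> str: ...

class Dog(Animal):                      # inheritance: Dog extends Animal
    def speak(self) -> str:
        return f"{self._name} says woof"

class Cat(Animal):
    def speak(self) -> str:
        return f"{self._name} says meow"

# Polymorphism: the same call works on any Animal subclass.
for pet in (Dog("Rex"), Cat("Mia")):
    print(pet.speak())
```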
NF in SQL refers to Normal Form which is used to eliminate redundancy in database design. ACID concepts ensure data integrity in transactions.
NF in SQL stands for Normal Form and is used to organize data in a database to eliminate redundancy and dependency.
There are different levels of NF such as 1NF, 2NF, 3NF, and BCNF, each with specific rules to follow.
ACID concepts (Atomicity, Consistency, Isolation, Durability) ensure that database transactions are processed reliably, preserving data integrity.
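A hedged sqlite3 sketch illustrating atomicity: either both updates commit or neither does (the table and values are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 50)])

try:
    with conn:  # the transaction commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 30 WHERE id = 1")
        conn.execute("UPDATE accounts SET balance = balance + 30 WHERE id = 2")
except sqlite3.Error:
    print("Transfer rolled back; balances unchanged")

print(conn.execute("SELECT * FROM accounts").fetchall())
conn.close()
```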
Python class lifecycle methods are special methods that are automatically called at different points in the life cycle of a class object.
Constructor method (__init__): Called when a new instance of the class is created.
Destructor method (__del__): Called when an instance of the class is about to be destroyed.
String representation method (__str__): Called when the object needs to be represented as a string.
Getter and se...
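A short sketch showing these special methods in one class (the class name is illustrative):

```python
class Resource:
    def __init__(self, name):   # constructor: runs when the instance is created
        self.name = name
        print(f"{self.name} created")

    def __str__(self):          # string representation used by print() and str()
        return f"Resource({self.name!r})"

    def __del__(self):          # destructor: runs when the object is reclaimed
        print(f"{self.name} destroyed")

r = Resource("db-connection")
print(r)   # calls __str__
del r      # drops the last reference; in CPython __del__ runs here
```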
I applied via Referral and was interviewed in May 2024. There were 3 interview rounds.
Higher order functions in JavaScript are functions that can take other functions as arguments or return functions as output.
Higher order functions can be used to create more flexible and reusable code.
Examples include functions like map, filter, and reduce in JavaScript.
They allow for functions to be passed as parameters, making code more concise and readable.
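The idea is not specific to JavaScript; since the other sketches in this write-up use Python, here is the same pattern with Python's map, filter, reduce, and a function returned from a function:

```python
from functools import reduce

# Higher-order functions take other functions as arguments...
numbers = [1, 2, 3, 4, 5]
evens = list(filter(lambda n: n % 2 == 0, numbers))
doubled = list(map(lambda n: n * 2, numbers))
total = reduce(lambda a, b: a + b, numbers)

# ...or return a function as their result.
def multiplier(factor):
    return lambda n: n * factor

triple = multiplier(3)
print(evens, doubled, total, triple(7))
```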
Create a component to fetch and display data from an API
Use a library like Axios or Fetch to make API requests
Parse the JSON data received from the API
Display the data in a user-friendly format on the front end
The duration of the Coforge Technical Analyst interview process can vary, but it typically takes less than 2 weeks to complete.