CitiusTech
Python is a high-level, interpreted programming language known for its readability and versatility in various applications.
Easy to learn and use, making it ideal for beginners.
Supports multiple programming paradigms, including procedural, object-oriented, and functional programming.
Rich ecosystem of libraries and frameworks, such as Django for web development and Pandas for data analysis.
Widely used in data science...
Experienced Technical Lead with a strong background in software development, team management, and project delivery across various industries.
Over 10 years of experience in software development, specializing in Java and Python.
Led a team of 15 developers in a successful project that improved system performance by 30%.
Implemented Agile methodologies, resulting in a 25% increase in team productivity.
Collaborated with...
AI enhances efficiency, decision-making, and innovation across various fields, transforming industries and daily life.
Automation: AI can automate repetitive tasks, such as data entry, freeing up human resources for more complex work.
Data Analysis: AI algorithms can analyze vast amounts of data quickly, identifying patterns and insights that humans might miss.
Personalization: AI powers recommendation systems, like ...
Apache Spark is a distributed computing framework designed for big data processing with speed and ease of use.
Spark operates on a cluster of machines, allowing for parallel processing of large datasets.
It uses a driver-worker (master-worker) architecture, where the driver program coordinates the job and executors on worker nodes run the tasks.
Data is processed in-memory, which significantly speeds up data processing compared to traditional disk...
A DevOps pipeline automates the process of building, testing, and deploying applications using YAML configuration files.
Define stages: Use YAML to specify stages like build, test, and deploy.
Example: 'stages: [build, test, deploy]'
Use jobs: Each stage can have multiple jobs defined in YAML.
Example: 'jobs: { build: { script: 'npm install' } }'
Environment variables: Set environment variables for different stages.
Exa...
Terraform is an Infrastructure as Code tool for managing cloud resources declaratively.
Define provider: Specify the cloud provider, e.g., AWS, Azure, GCP.
Resource creation: Use 'resource' blocks to define cloud resources, e.g., EC2 instances.
Variables: Use 'variable' blocks for dynamic configurations.
Outputs: Use 'output' blocks to display important information after deployment.
State management: Terraform maintains a state file that maps your configuration to the real resources it manages.
Recursive CTEs in SQL allow for hierarchical data retrieval, enabling complex queries like traversing tree structures.
Recursive CTEs consist of two parts: the anchor member and the recursive member.
Example: To find all subordinates in an employee hierarchy, use a CTE that references itself.
Syntax: WITH RECURSIVE cte_name AS (SELECT ... UNION ALL SELECT ...)
Common use cases include organizational charts, bills of materials, and other hierarchical data.
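The employee-hierarchy example above can be run end to end with Python's bundled sqlite3 module, which supports recursive CTEs. The 'employees' table and its rows are made up for illustration; the anchor member selects the root and the recursive member joins back on 'manager_id':

```python
import sqlite3

# Hypothetical employee hierarchy: Alice manages Bob, Bob manages Carol.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER, name TEXT, manager_id INTEGER)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)", [
    (1, "Alice", None),
    (2, "Bob", 1),
    (3, "Carol", 2),
])

# Anchor member: the starting employee. Recursive member: everyone
# whose manager is already in the CTE's result set.
rows = conn.execute("""
    WITH RECURSIVE subordinates AS (
        SELECT id, name FROM employees WHERE id = 1
        UNION ALL
        SELECT e.id, e.name
        FROM employees e
        JOIN subordinates s ON e.manager_id = s.id
    )
    SELECT name FROM subordinates
""").fetchall()

print(sorted(r[0] for r in rows))  # ['Alice', 'Bob', 'Carol']
```

Note that SQLite requires the RECURSIVE keyword, while SQL Server accepts a plain WITH for the same construct.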
OOP concepts include encapsulation, inheritance, polymorphism, and abstraction, essential for software design.
Encapsulation: Bundling data and methods. Example: A class 'Car' with properties like 'speed' and methods like 'accelerate()'.
Inheritance: Deriving new classes from existing ones. Example: 'ElectricCar' inherits from 'Car', adding features like 'batteryCapacity'.
Polymorphism: Methods behaving differently b...
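The Car/ElectricCar example in the bullets above can be sketched in a few lines of Python; the attribute names are taken from the answer, everything else is illustrative:

```python
class Car:
    """Encapsulation: speed is bundled with the methods that change it."""

    def __init__(self):
        self._speed = 0

    def accelerate(self):
        self._speed += 10

    def describe(self):
        return f"car at {self._speed} km/h"


class ElectricCar(Car):
    """Inheritance: reuses Car and adds battery capacity."""

    def __init__(self, battery_capacity):
        super().__init__()
        self.battery_capacity = battery_capacity

    def describe(self):
        # Polymorphism: the same call behaves differently per subclass.
        return f"electric car at {self._speed} km/h ({self.battery_capacity} kWh)"


for vehicle in (Car(), ElectricCar(75)):
    vehicle.accelerate()
    print(vehicle.describe())
```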
Find pairs in an array that sum up to a specified number using efficient algorithms.
Use a hash map to store elements and their indices for O(n) time complexity.
Example: For array [1, 2, 3, 4] and target 5, pairs are (1, 4) and (2, 3).
Iterate through the array, checking if target - current element exists in the map.
Return pairs or indices based on requirements.
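The steps above translate directly into a short O(n) Python sketch; a set is enough when only the values are needed, and a dict of value-to-index would return indices instead:

```python
def find_pairs(nums, target):
    """Return value pairs summing to target in one pass using a hash set."""
    seen = set()
    pairs = []
    for n in nums:
        if target - n in seen:
            pairs.append((target - n, n))
        seen.add(n)
    return pairs


print(find_pairs([1, 2, 3, 4], 5))  # [(2, 3), (1, 4)]
```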
Design patterns provide reusable solutions to common problems, enhancing code maintainability and scalability.
Promote code reusability: For example, the Singleton pattern ensures a class has only one instance, which can be reused throughout the application.
Improve code readability: Patterns like MVC (Model-View-Controller) separate concerns, making the code easier to understand.
Facilitate communication: Using comm...
I applied via Referral and was interviewed in Apr 2021. There were 4 interview rounds.
Authentication in .NET involves using various authentication mechanisms such as Forms Authentication, Windows Authentication, and OAuth.
Use Forms Authentication for web applications
Use Windows Authentication for intranet applications
Use OAuth for third-party authentication
Implement authentication using ASP.NET Identity
Use secure password storage mechanisms such as hashing and salting
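The hashing-and-salting advice is language-agnostic. As a minimal sketch (in Python rather than .NET, using the standard library's PBKDF2; ASP.NET Identity performs a comparable key derivation internally):

```python
import hashlib
import hmac
import os


def hash_password(password, salt=None):
    """Derive a salted hash; store (salt, digest), never the plaintext."""
    salt = salt or os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest


def verify_password(password, salt, digest):
    # compare_digest avoids timing side channels on the comparison.
    return hmac.compare_digest(hash_password(password, salt)[1], digest)


salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))  # True
print(verify_password("wrong", salt, digest))   # False
```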
Dependency injection is a design pattern where an object's dependencies are provided externally rather than created internally.
Dependency injection helps to decouple components and make them more modular.
It allows for easier testing and maintenance of code.
There are three types of dependency injection: constructor injection, setter injection, and interface injection.
Example: Instead of creating a database connection ob...
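The database-connection example above can be sketched as constructor injection in Python; the class names are hypothetical and chosen only to mirror the answer:

```python
class Database:
    def query(self, sql):
        return f"rows for {sql}"


class UserService:
    def __init__(self, db):
        # Constructor injection: the dependency is passed in,
        # not created with Database() inside this class.
        self.db = db

    def all_users(self):
        return self.db.query("SELECT * FROM users")


class FakeDatabase:
    """Test double: same interface, no real connection needed."""

    def query(self, sql):
        return "fake rows"


print(UserService(Database()).all_users())
print(UserService(FakeDatabase()).all_users())  # easy to swap for tests
```

Because UserService never constructs its own dependency, tests can hand it a FakeDatabase, which is exactly the decoupling and testability benefit the bullets describe.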
A pipe is a feature in Angular that allows you to transform data before displaying it.
Pipes are used in Angular templates with the '|' symbol.
There are built-in pipes like 'uppercase' and 'date'.
Custom pipes can be created using the 'Pipe' decorator and implementing the 'PipeTransform' interface.
Custom pipes can take arguments to modify their behavior.
Pipes can be chained together to perform multiple transformations on...
Middleware is software that acts as a bridge between different applications, allowing them to communicate and exchange data.
Middleware is a layer of software that sits between applications and operating systems
It provides services such as message routing, authentication, and data transformation
Examples include Apache Tomcat, Microsoft IIS, and IBM WebSphere
No. Assigning null to an array variable does not give you an empty array; it replaces the array reference, so the variable now holds null.
After 'arr = null', the variable no longer points to an array at all.
Accessing the length property of null throws a TypeError, so 'arr.length' fails rather than returning 0.
To actually empty an array while keeping the same reference, set 'arr.length = 0' instead.
Null value in JavaScript represents absence of any object value.
Null is a primitive value in JavaScript.
It is assigned to a variable to indicate that it has no value.
It is different from undefined, which means a variable has been declared but not assigned a value.
Null is falsy in JavaScript, meaning it is considered false in a boolean context.
Null can be used to clear the value of an object property.
Performing operations on an array of objects using JavaScript.
Use array methods like map, filter, reduce for operations on object array.
Access object properties using dot notation or bracket notation.
Iterate through the array using loops like for loop or forEach method.
Example: Calculate total sum of 'price' property in an array of products.
To create slice and combine reducers in React, use the createSlice and combineReducers functions from Redux toolkit.
Use createSlice function to define a slice of state with reducers and actions.
Example: const counterSlice = createSlice({ name: 'counter', initialState: 0, reducers: { increment: state => state + 1, decrement: state => state - 1 } })
Use combineReducers function to combine multiple slices into a sing...
To create and optimize a React application, focus on efficient component structure, state management, code splitting, lazy loading, and performance monitoring.
Use functional components and hooks for better performance.
Implement state management with tools like Redux or Context API.
Split code into smaller chunks and lazy load components for faster initial load times.
Optimize performance by minimizing re-renders and usin...
Day to day activities involve coding, debugging, testing, collaborating with team members. Salary negotiation involves research, preparation, and effective communication.
Coding and developing new features
Debugging and fixing issues
Testing code for quality assurance
Collaborating with team members for project progress
Researching market rates for salary negotiation
Preparing a strong case for desired salary
Effectively comm...
A hashmap is a data structure that stores key-value pairs and uses a hash function to map keys to their corresponding values.
Hashmap uses a hash function to determine the index of the key-value pair in the underlying array.
Collisions can occur when two keys hash to the same index, which is resolved using techniques like chaining or open addressing.
Hashmap typically has an underlying array where each element is a linked...
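The array-plus-chaining layout described above can be sketched as a toy Python class (real implementations also resize and rehash as the load factor grows, which this sketch omits):

```python
class ChainedHashMap:
    """Toy hash map: an array of buckets, collisions resolved by chaining."""

    def __init__(self, capacity=8):
        self.buckets = [[] for _ in range(capacity)]

    def _index(self, key):
        # The hash function maps a key to a slot in the underlying array.
        return hash(key) % len(self.buckets)

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:          # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))  # collision or new key: append to chain

    def get(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)


m = ChainedHashMap()
m.put("a", 1)
m.put("a", 2)  # overwrite
print(m.get("a"))  # 2
```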
LRU cache is a data structure that stores the most recently used items and removes the least recently used items when full.
Use a doubly linked list to keep track of the order of items based on their usage.
Use a hashmap to quickly access items in the cache.
When an item is accessed, move it to the front of the linked list to mark it as the most recently used.
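As a compact sketch of the same idea, Python's OrderedDict can stand in for the hand-rolled doubly linked list plus hashmap described above (move_to_end plays the role of moving a node to the front):

```python
from collections import OrderedDict


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # insertion order == usage order

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used


c = LRUCache(2)
c.put("a", 1)
c.put("b", 2)
c.get("a")      # touch "a" so "b" becomes least recently used
c.put("c", 3)   # capacity exceeded: evicts "b"
print(c.get("b"))  # -1
```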
SOLID principles are a set of five design principles in object-oriented programming to make software designs more understandable, flexible, and maintainable.
Single Responsibility Principle (SRP) - A class should have only one reason to change.
Open/Closed Principle (OCP) - Software entities should be open for extension but closed for modification.
Liskov Substitution Principle (LSP) - Objects of a superclass should be re...
I have worked with design patterns such as Singleton, Factory, Observer, and Strategy.
Singleton pattern ensures a class has only one instance and provides a global point of access to it.
Factory pattern creates objects without specifying the exact class of object that will be created.
Observer pattern defines a one-to-many dependency between objects so that when one object changes state, all its dependents are notified a...
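As one concrete illustration of the patterns listed, here is a minimal Singleton in Python (one common variant among several; modules and decorators are also idiomatic ways to get a single shared instance):

```python
class Singleton:
    _instance = None

    def __new__(cls):
        # Create the instance once; every later call returns the same object.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance


print(Singleton() is Singleton())  # True
```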
I appeared for an interview in Aug 2024.
Manual testing ensures software quality by identifying bugs before automation is implemented.
Manual testing involves executing test cases without automation tools.
It is useful for exploratory testing, where testers explore the application.
Example: A tester manually checks a login feature by entering various credentials.
Manual testing is crucial for usability testing to assess user experience.
Regression testing can be d...
I was approached by the company and was interviewed in Oct 2024. There were 3 interview rounds.
I have a diverse background in leadership, strategy, and operations, driving growth and innovation across various sectors.
Leadership Experience: Over 15 years in executive roles, leading teams of up to 200 people in multinational corporations.
Strategic Planning: Developed and executed a 5-year strategic plan that increased market share by 30% in a competitive industry.
Operational Excellence: Implemented lean management...
Successfully reduced expenses by 15% while maintaining service quality through strategic vendor negotiations and process optimization.
Vendor Negotiations: Engaged with suppliers to renegotiate contracts, resulting in a 10% reduction in material costs without compromising quality.
Process Optimization: Implemented lean management techniques that streamlined operations, reducing waste and saving an additional 5% in operat...
I will forecast expenses by analyzing historical data, market trends, and budget projections.
Analyze historical data to identify patterns and trends in expenses
Consider market trends and economic indicators that may impact expenses
Collaborate with department heads to gather budget projections and forecasts
Use financial modeling techniques to predict future expenses based on various scenarios
Regularly review and adjust ...
Major financial statements include income statement, balance sheet, and cash flow statement, which are interconnected through net income and retained earnings.
Income statement shows revenues and expenses, resulting in net income.
Balance sheet displays assets, liabilities, and equity, with net income affecting retained earnings.
Cash flow statement details cash inflows and outflows, reconciling with changes in cash on th...
I seek to switch roles to leverage my skills in a new environment and tackle fresh challenges that align with my career goals.
Career Growth: I believe this new position offers greater opportunities for advancement, allowing me to take on leadership roles and drive strategic initiatives.
Skill Utilization: I want to apply my expertise in a different context, such as using my project management skills to lead cross-functi...
I applied to the company because of its strong reputation in the industry and its commitment to innovation and employee development.
Reputation of the company in the industry
Commitment to innovation
Opportunities for employee development
I applied via Naukri.com and was interviewed in Oct 2024. There were 2 interview rounds.
A function to flatten a nested array and sort its elements in ascending order.
Use Array.prototype.flat() to flatten the array. Example: [1, [2, 3], [4, [5]]] becomes [1, 2, 3, 4, 5].
Use Array.prototype.sort() to sort the flattened array. Example: [3, 1, 2] becomes [1, 2, 3].
Combine both methods in a single function for efficiency.
Ensure to handle different data types if necessary, e.g., strings and numbers.
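The question is JavaScript-flavoured (flat() plus sort()); the same idea in Python, with recursion standing in for a depth-unbounded flat(), looks like this sketch:

```python
def flatten_and_sort(nested):
    """Recursively flatten a nested list, then sort ascending."""
    flat = []
    for item in nested:
        if isinstance(item, list):
            flat.extend(flatten_and_sort(item))  # flatten sub-lists
        else:
            flat.append(item)
    return sorted(flat)


print(flatten_and_sort([1, [2, 3], [4, [5]]]))  # [1, 2, 3, 4, 5]
```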
As a React developer, I faced challenges like state management, performance optimization, and integrating APIs effectively.
State Management: Managing complex state across components can be challenging. For example, using Redux or Context API to handle global state.
Performance Optimization: Ensuring components re-render only when necessary. Implementing memoization with React.memo or useMemo can help.
API Integration: Ha...
A function to count how many times a specific word appears in a given string.
Use the String.prototype.split() method to break the string into an array of words.
Filter the array to count occurrences of the target word.
Example: 'hello world hello' with target 'hello' returns 2.
Consider case sensitivity; use toLowerCase() for case-insensitive counting.
Return the count as a number.
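The split-then-filter steps above, written as Python rather than the JavaScript the answer implies, with the suggested case-insensitive handling:

```python
def count_word(text, target):
    """Count case-insensitive occurrences of a word in a string."""
    words = text.lower().split()          # break the string into words
    return sum(1 for w in words if w == target.lower())


print(count_word("hello world hello", "Hello"))  # 2
```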
I applied via LinkedIn and was interviewed in Nov 2024. There were 2 interview rounds.
I am a seasoned technical lead with over 10 years of experience in leading software development teams and delivering high-quality products.
Over 10 years of experience in software development
Proven track record of leading successful development teams
Strong expertise in various programming languages and technologies
Excellent communication and problem-solving skills
I appeared for an interview in Jan 2025.
Program to remove duplicates from an array of strings
Iterate through the array and store each element in a set to keep track of unique elements
Create a new array with the unique elements from the set
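The set-based approach above can be written in two lines of Python; dict.fromkeys is used here because, unlike a plain set, it also preserves the first-seen order (an assumption the original answer does not state):

```python
def remove_duplicates(items):
    """Drop duplicate strings, keeping the first occurrence of each."""
    # dict preserves insertion order (Python 3.7+), acting as an ordered set.
    return list(dict.fromkeys(items))


print(remove_duplicates(["a", "b", "a", "c", "b"]))  # ['a', 'b', 'c']
```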
I applied via Naukri.com and was interviewed in Oct 2024. There were 2 interview rounds.
Spark performance problems can arise due to inefficient code, data skew, resource constraints, and improper configuration.
Inefficient code can lead to slow performance, such as using collect() on large datasets.
Data skew can cause uneven distribution of data across partitions, impacting processing time.
Resource constraints like insufficient memory or CPU can result in slow Spark jobs.
Improper configuration settings, su...
I applied via Naukri.com and was interviewed in Sep 2024. There was 1 interview round.
To create a pipeline in ADF, you can use the Azure Data Factory UI or code-based approach.
Use Azure Data Factory UI to visually create and manage pipelines
Use code-based approach with JSON to define pipelines and activities
Add activities such as data movement, data transformation, and data processing to the pipeline
Set up triggers and schedules for the pipeline to run automatically
Activities in pipelines include data extraction, transformation, loading, and monitoring.
Data extraction: Retrieving data from various sources such as databases, APIs, and files.
Data transformation: Cleaning, filtering, and structuring data for analysis.
Data loading: Loading processed data into a data warehouse or database.
Monitoring: Tracking the performance and health of the pipeline to ensure data quality and reliab...
getmetadata is used to retrieve metadata information about a dataset or data source.
getmetadata can provide information about the structure, format, and properties of the data.
It can be used to understand the data schema, column names, data types, and any constraints or relationships.
This information is helpful for data engineers to properly process, transform, and analyze the data.
For example, getmetadata can be used ...
Triggers in databases are special stored procedures that are automatically executed when certain events occur.
Types of triggers include: DML triggers (for INSERT, UPDATE, DELETE operations), DDL triggers (for CREATE, ALTER, DROP operations), and logon triggers.
Triggers can be classified as row-level triggers (executed once for each row affected by the triggering event) or statement-level triggers (executed once for eac...
Normal cluster is used for interactive workloads while job cluster is used for batch processing in Databricks.
Normal cluster is used for ad-hoc queries and exploratory data analysis.
Job cluster is used for running scheduled jobs and batch processing tasks.
Normal cluster is terminated after a period of inactivity, while job cluster is terminated after the job completes.
Normal cluster is more cost-effective for short-liv...
Slowly changing dimensions refer to data warehouse dimensions that change slowly over time.
SCDs are used to track historical changes in data over time.
There are three types of SCDs - Type 1, Type 2, and Type 3.
Type 1 SCDs overwrite old data with new data, Type 2 creates new records for changes, and Type 3 maintains both old and new data in separate columns.
Example: A customer's address changing would be a Type 2 SCD.
Ex...
Use Python's 'with' statement to ensure proper resource management and exception handling.
Use 'with' statement to automatically close files after use
Helps in managing resources like database connections
Ensures proper cleanup even in case of exceptions
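The points above can be sketched as follows; the file part uses a temporary path, and managed_resource is a hypothetical stand-in for something like a database connection:

```python
import os
import tempfile
from contextlib import contextmanager

# 'with' closes the file automatically, even if the body raises.
path = os.path.join(tempfile.gettempdir(), "with_demo.txt")
with open(path, "w") as f:
    f.write("hello")
print(f.closed)  # True: closed on exiting the block


@contextmanager
def managed_resource(name):
    """Acquire a resource, yield it, and guarantee cleanup."""
    events.append(f"acquire {name}")
    try:
        yield name
    finally:
        events.append(f"release {name}")  # runs even on exceptions


events = []
try:
    with managed_resource("db-connection"):
        raise RuntimeError("boom")
except RuntimeError:
    pass
print(events)  # ['acquire db-connection', 'release db-connection']
```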
List is mutable, tuple is immutable in Python.
List can be modified after creation, tuple cannot be modified.
List uses square brackets [], tuple uses parentheses ().
Lists are used for collections of items that may need to be changed, tuples are used for fixed collections of items.
Example: list_example = [1, 2, 3], tuple_example = (4, 5, 6)
Datalake 1 and Datalake 2 are both storage systems for big data, but they may differ in terms of architecture, scalability, and use cases.
Datalake 1 may use a Hadoop-based architecture while Datalake 2 may use a cloud-based architecture like AWS S3 or Azure Data Lake Storage.
Datalake 1 may be more suitable for on-premise data storage and processing, while Datalake 2 may offer better scalability and flexibility for clou...
To read a file in Databricks, you can use the Databricks File System (DBFS) or Spark APIs.
Use dbutils.fs.ls('dbfs:/path/to/file') to list files in DBFS
Use spark.read.format('csv').load('dbfs:/path/to/file') to read a CSV file
Use spark.read.format('parquet').load('dbfs:/path/to/file') to read a Parquet file
Star schema is denormalized with one central fact table surrounded by dimension tables, while snowflake schema is normalized with multiple related dimension tables.
Star schema is easier to understand and query due to denormalization.
Snowflake schema saves storage space by normalizing data.
Star schema is better for data warehousing and OLAP applications.
Snowflake schema is better for OLTP systems with complex relationsh...
repartition increases partitions while coalesce decreases partitions in Spark
repartition shuffles data and can be used for increasing partitions for parallelism
coalesce reduces partitions without shuffling data, useful for reducing overhead
repartition is more expensive than coalesce as it involves data movement
example: df.repartition(10) vs df.coalesce(5)
Parquet file format is a columnar storage format used for efficient data storage and processing.
Parquet files store data in a columnar format, which allows for efficient querying and processing of specific columns without reading the entire file.
It supports complex nested data structures like arrays and maps.
Parquet files are highly compressed, reducing storage space and improving query performance.
It is commonly used ...
The CitiusTech interview process varies in length but typically takes less than two weeks to complete, based on 231 interview experiences; company ratings draw on 1.8k reviews.
| Role | Reported salaries | Salary range |
| Senior Software Engineer | 2.7k | ₹5.8 L/yr - ₹20.2 L/yr |
| Technical Lead | 2.1k | ₹7.5 L/yr - ₹28 L/yr |
| Software Engineer | 1.3k | ₹3 L/yr - ₹11.1 L/yr |
| Technical Lead 1 | 392 | ₹12.1 L/yr - ₹21 L/yr |
| Technical Lead 2 | 334 | ₹8 L/yr - ₹29 L/yr |